Developing guidance for selecting evaluation methods
Situation:
Our client needed a tailored resource to guide its evaluation specialists and commissioners in selecting the most suitable methods for evaluating a diverse range of programs. The resource had to ensure value for money and provide a framework for synthesising findings across a portfolio of programs. The department aimed to enhance its evaluation capability through a comprehensive toolkit that would help staff identify the most appropriate evaluation techniques for different types of programs, particularly those focused on innovation, transformation, and broader portfolio objectives.
Task:
The client engaged Grosvenor to create a comprehensive, user-friendly resource that drew on the firm’s expertise in applying diverse evaluation methods within a government context. The primary objective was to develop a toolkit that would enable the department’s Strategic Evaluation team to prioritise and select suitable evaluation approaches based on the unique context and goals of their programs. This resource needed to accommodate the client’s varied program portfolio, which includes initiatives aimed at innovation, industry transformation, and sustainability, while supporting the evaluation of both individual programs and collective portfolio objectives.
To accomplish this, the toolkit needed to be specifically tailored to the department’s context, addressing the complex challenges of evaluating programs of varying sizes, scopes, and intended outcomes. The resource also had to incorporate best practices and methodologies aligned with government standards, while offering practical guidance on applying these methods across different program types. The final deliverable needed to support the synthesis of findings and the development of actionable recommendations, enhancing the department’s ability to conduct efficient, effective evaluations that demonstrate value for money.
Actions:
Understanding the Context: Grosvenor, led by Kristy Hornby as Project Director, started by gaining a thorough understanding of the client’s program evaluation context. The team reviewed a wide range of sample evaluation plans and program typologies provided by the client to identify the types of programs typically requiring evaluation, such as those designed to facilitate innovation and transformation within industries. This analysis highlighted the unique challenges of evaluating a portfolio of programs collectively to determine if the overall objectives were being achieved.
Developing Fact Sheets: To provide practical, skills-focused guidance, Grosvenor developed 12 detailed fact sheets, each focusing on a specific evaluation approach or technique and explaining how it can be applied within a government evaluation context. The fact sheets covered diverse topics such as:
- Reflective Practice: Methods for using ongoing reflection to refine program design and implementation.
- Evaluating Innovation: Techniques for assessing programs aimed at driving innovation, where traditional evaluation methods may not apply.
- Participatory Evaluation: Approaches that involve stakeholders in the evaluation process to ensure relevance and buy-in.
- Portfolio-level Evaluation: Strategies for assessing multiple programs together to determine whether portfolio-level goals are being met.
Each fact sheet was designed to be user-friendly and provided concise yet comprehensive guidance on when and how to apply each evaluation method, including practical examples and case studies illustrating real-world applications.
Developing the Guidance Document and Prioritisation Model: Grosvenor created a guidance document to help the client navigate the fact sheets and apply a prioritisation model to specific program evaluation needs. This document was structured to assist users in understanding which evaluation approaches were most appropriate for different program types and contexts. The prioritisation model provided a clear, step-by-step process for selecting the best-suited evaluation method, considering factors such as program objectives, stakeholder needs, available resources, and desired outcomes.
Delivering Training: Grosvenor conducted a comprehensive training session for the Strategic Evaluation team on effectively using the new resource, focusing on delivering practical skills in evaluation. The training emphasised applying the prioritisation model and two of the fact sheets identified as being of particular interest: ‘Evaluating Innovation’ and ‘Portfolio-level Evaluation.’ The session included interactive elements, such as case study exercises and group discussions, reinforcing learning and ensuring participants felt confident in using the toolkit in their daily work.
Collaborative Development: Throughout the project, Grosvenor maintained a close, collaborative relationship with the client project team. Regular workshops and feedback sessions were held to iteratively develop, test, and refine the resource. This collaborative approach ensured the final product was tailored to the client’s specific needs, incorporated feedback from the department’s stakeholders, and achieved strong acceptance and usability.
Results:
The client’s project team accepted all deliverables and commended Grosvenor for the high quality of the product. They specifically noted the effectiveness of the toolkit in guiding their evaluation specialists to select appropriate methods tailored to their unique programmatic needs. The department appreciated the iterative approach Grosvenor used, which ensured that their feedback was consistently incorporated, resulting in a resource that was well-aligned with their objectives and easy for their staff to use.
Impact and Value: The toolkit enhanced the client’s internal evaluation capabilities by providing a clear, structured approach to selecting evaluation methods that ensure value for money. The resource was particularly valuable in contexts where programs were aimed at fostering innovation or required evaluation at the portfolio level. By embedding practical examples and step-by-step guidance, the resource empowered evaluation specialists to make informed decisions, ultimately leading to more effective and efficient program evaluations across the department.
Sustainability: The training provided by Grosvenor ensured that the department’s staff could independently use the resource and apply its principles to future evaluations. The Plain English design and practical focus of the fact sheets and guidance document also made the resource accessible and easy to understand, promoting its long-term use within the department.
Client Satisfaction: The client expressed satisfaction with the deliverables, noting Grosvenor’s responsiveness to their needs and the high quality of the final resource. They highlighted the resource’s value in enhancing their evaluation processes and its potential to improve the overall effectiveness of program evaluations within the department.
Conclusion:
Grosvenor’s partnership with the client resulted in a tailored, practical toolkit that significantly enhanced the department’s evaluation capacity. Through a deep understanding of the client’s programmatic context, meticulous development of guidance materials, and effective training delivery, the project demonstrated Grosvenor’s evaluation expertise, particularly in applying methods suited to government settings. It not only delivered immediate value by improving current evaluation practices but also contributed to long-term capacity building within the department.
The positive feedback from the client underscores Grosvenor’s commitment to delivering high-quality, client-focused solutions. By creating a resource that is both comprehensive and easy to use, Grosvenor helped the client achieve its objective of ensuring value for money in program evaluations. This project highlights Grosvenor’s expertise in developing customised evaluation frameworks that are responsive to the unique needs of government clients, further solidifying its reputation as a trusted partner in public sector evaluation and program improvement.
For an independent program evaluation, contact our Program Evaluation team lead, Dana Cross.