Evaluation Capacity Building: Identifying and Addressing the Field’s Needs
Learning Objectives • Understanding of the field’s current evaluation capacity • Understanding of CNCS’s strategy for building the field’s evaluation capacity
Assessing Current Capacity • Discussions with State Service Commissions and National Direct Grantees • Reviews of AmeriCorps Grantee Evaluation Plans and Reports • Focus group with AmeriCorps Program Officers • Inventory of CNCS Evaluation Resources and Tools
Key Discussion Questions • What materials, tools, and technical assistance do grantees need to fulfill their CNCS evaluation requirements? • What is the grantee’s current capacity to conduct evaluations? • What is the State Commission’s current capacity to support and advise its sub-grantees on evaluation? • What materials do the State Service Commissions need to set their own evaluation standards and assist their sub-grantees in fulfilling their evaluation requirements?
Findings from the Field • Grantees who participated in this assessment have limited knowledge of basic evaluation methodology and concepts • Both grantees and state service commission staff are unclear about CNCS’s evaluation requirements, specifically when products (e.g., evaluation plans and reports) are due • Both grantees and state service commission staff need guidance to determine which evaluation designs will allow them to demonstrate project impact and comply with the requirements
Findings from the Field • Grantees are not allocating sufficient funds to conduct experimental and quasi-experimental evaluations • Grantees do not readily distinguish between performance measurement and program evaluation • Grantees want access to resources other than their local evaluator for advice on their evaluations
Reviews of Evaluation Plans and Reports • The CNCS Research and Evaluation office reviewed evaluation plans and reports from 33 small applicants (awards of less than $500,000) • The NORC team reviewed evaluation plans and reports from 23 large applicants (awards of $500,000 or more)
Assessment Forms: Evaluation Plans & Reports • Description of the intervention/program • Problem/Issue statement • Program impact theory • Evaluation questions and design • Evaluation objectives and research questions • Evaluation methodology/design • Outcomes
Assessment Forms: Evaluation Plans & Reports • Data collection methodology and procedures • Types and sources of data collected • Population/Sample • Data analysis • Evaluation results and conclusions • Intended use of evaluation results • Conclusions and potential next steps (Reports only) • Limitations of the evaluation (Reports only)
Assessment Findings: Small Grantees • Most evaluation reports did not meet CNCS evaluation requirements as defined in the CFR • Evaluation plans did not describe evaluations capable of determining program impact in accordance with the CFR • Many grantees seem to equate performance measurement and monitoring with program evaluation
Assessment Findings: Small Grantees • Capacity for evaluation is promising • Most applications described implicit theories of change connecting community needs to program resources, activities, outputs and outcomes • Some applicants described program models informed by evidence such as prior program performance data, peer-reviewed research on similar practices or activities, and research on national models
Assessment Findings: Small Grantees • Evaluation Capacity Continued… • Many of the applicants described reasonable pre/post program outcomes and had quality data sources; some applicants also gathered data post-program participation • All of the evaluation reports or summaries reviewed (with one exception) were process or formative evaluations; this is an important program practice to ensure quality programming
Assessment Findings: Large Grantees • Only two of the six evaluation plans reviewed included sufficient detail about the evaluation approach to assess particular aspects of the design • A majority of grantees are not designing and implementing evaluations that address questions about program impact in accordance with the CFR
Assessment Findings: Large Grantees • Capacity for evaluation is promising • A few grantees are moving in the right direction by conducting evaluations that gather pre/post data • A few grantees are implementing experimental or quasi-experimental designs
Focus Group Discussion Themes • What feedback have you received on the quality and usefulness of the CNCS evaluation resources and tools that are available to applicants and grantees? • What questions do you typically receive from AmeriCorps grantees about their evaluations? • What specific types of technical assistance do AmeriCorps grantees request on evaluation design and implementation? • What additional materials, tools, or assistance do AmeriCorps grantees need to successfully design and execute evaluations of their projects? • What training and support do you need to assist your grantees in fulfilling the evaluation requirements?
Focus Group Findings • Both Program Officers and grantees expressed an interest in learning more about evaluation and evidence • There is a need to increase awareness and understanding of CNCS’s evaluation requirements
Inventory of CNCS Evaluation Resources and Tools • CNCS evaluation resources and tools available to grantees via the Knowledge Network were reviewed and inventoried. • To review and inventory the material systematically, the team mapped its contents and developed an assessment form to standardize the review.
Inventory Findings • Evaluation resources were not easy to access via the Knowledge Network website • Many resources were dated and did not reflect the most current thinking in evaluation methods • Informational gaps include guidance on: • evaluation planning • evaluation management • data analysis • use of existing data
Strengthening Capacity • Findings from these activities have informed our capacity-building strategy for the coming year
Evaluation Capacity Building: Summer 2013 • Disseminate Evaluation FAQs • Conduct Webinars: • Evaluation Capacity Building • CNCS Evaluation Requirements and FAQs • Create central location for evaluation materials on website
Evaluation Capacity Building: Fall 2013 • Grantee Symposium, Performance Measurement & Evaluation Track • Performance Measurement Pre-Conference Workshop • Developing evaluation plans and reports • Evaluation 101 • Initiate individualized technical assistance • 1:1 coaching to help finalize evaluation plans, use performance monitoring and evaluation for program improvement, and develop evaluation reports
Evaluation Capacity Building: 2013 - 2014 • Individualized technical assistance • Evaluation Core Curriculum (webinars, online courses, downloadable resources) • Developing evaluation plans and reports • Evaluation 101 • Logic models • How to report evaluation findings • How to manage an external evaluation • Budgeting for evaluation
Evaluation Capacity Building: Your Technical Assistance Team • AmeriCorps Program Officers • Primary point of contact • CNCS Office of Research and Evaluation Staff • Partnered with program officers to provide evaluation expertise • NORC Team • Partnered with program officers and research/evaluation staff to increase CNCS’s evaluation technical assistance capacity
Q & A