Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice
Monica Hargraves, PhD
Manager of Evaluation for Extension and Outreach, Cornell University
October 2009 | mjh51@cornell.edu
Brief outline
• What we do (the “Evaluation Partnership” approach)
• Key steps in the training (with stories from extension program partners)
• “Swimming against the tide…”
• Making it feasible and sustainable
Evaluation Partnerships
• CORE provides training and brings evaluation expertise
• Partners bring experience and expertise in their programs, their communities, their “systems”
The Planning Phase is a one-year commitment, with intentions and clarity of roles captured in an MOU.
What the EP entails in the “planning year”
Stages:
• Preparation for Partnership (Jan – March)
• Modeling (intensive!) (April – June)
• Evaluation Planning (July – Oct/Nov)
Formats this year:
• Two in-person, full-day training meetings
• Web-conferences
• Listserv, e-mail, and phone support
History of the Project within Cornell Cooperative Extension (CCE) 2006: NYC
History of the Project within Cornell Cooperative Extension (CCE) 2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster
History of the Project within Cornell Cooperative Extension (CCE) 2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne
Stakeholder Analysis
CCE-Jefferson 4-H Dairy Program: Stakeholder Map. Stakeholders identified include: Other Youth Programs, NYS 4-H, Cornell University, SUNY Morrisville & Cobleskill, Local School Districts, Taxpayers, CCE Staff, Youth, Funders, JCADCA, Dairy Program Volunteers, Parents, NYS Jr. Holstein Association, Local Ag Businesses, FFA, Teachers, Breed Associations, CCE Board of Directors, National Dairy Industry, 4-H Members, Jefferson County Fair Board, Surrounding County Youth, Jefferson County Dairy Producers, State Fair, Jefferson County Legislature, and the Media.
Quick “poll” on formal modeling…
Think of programs you are evaluating, or wish to evaluate. How many of those have a written-down model (Logic Model, or something similar)?
A – all
B – many
C – some
D – few
E – none
Pathway Model Development 4-H “SET-To-Go” (an after-school science program), CCE-Cortland County Pathway Model, October 2009
Comments from an Evaluation Partner… Shawn Smith 4-H Issue Area Leader & Evaluation Project Manager CCE – Cortland County (CCECC)
Program Life Cycle
(Figure: program Impact plotted over Time, moving through the stages of initiation, growth, maturity, and transformation.)
Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler
Quick Poll on Program Lifecycles
Think about a program you are evaluating or are going to be evaluating. What lifecycle stage is it in?
A – early development, pilot
B – still revising/tweaking
C – implemented consistently
D – consistent across sites/facilitators and documented
E – well-established, stable, candidate for replication
Program & Evaluation Alignment Program Lifecycle Evaluation Lifecycle Is program in initial implementation(s)? Process assessment and post-only evaluation of participant reactions and satisfaction. Phase IA Process & Response Phase I Initiation Post-only assessment of outcomes, implementation assessment, outcome measurement development and assessment of internal consistency (reliability). Is program in revision or reimplementation? Phase IB Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement. Is program being implemented consistently? Phase IIA Phase II Development Change Does program have formal written procedures/protocol? Matched pretest and posttest of outcomes. Verify reliability and validity of change. Human subjects review. Phase IIB Controls and comparisons (control groups, control variables or statistical controls). Is program associated with change in outcomes? Phase IIIA Comparison & Control Phase III Maturity Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing the program effectiveness. Does program have evidence of effectiveness? Phase IIIB Evaluation Special Projects Multi-site analysis of integrated large data sets over multiple waves of program implementation. Is effective program being implemented in multiple-sites? Phase IVA Phase IV Generalizability Dissemination Formal assessment across multiple program implementations that enable general assertions about this program in a wide variety of contexts (e.g., meta-analysis). Is evidence-based program being widely distributed? Phase IVB
Determining Evaluation Scope
It's all about making GOOD CHOICES…
• What kind of evaluation is appropriate for the program lifecycle stage?
• What are the key outcomes this program should be attaining?
• What do important stakeholders care most about?
• What will “work best” in this kind of program?
• What kind of evaluation is feasible for this year? What should wait until a future year?
Determining Evaluation Scope
(Figure: a generic pathway model — Activities → Outputs → Short-Term Outcomes → Middle-Term Outcomes → Long-Term Outcomes — annotated with the components used to narrow the evaluation: key outcomes, key links, key pathways, stakeholders, internal priorities, and the resulting evaluation scope.)
Comments from another Evaluation Partner… Linda Schoffel Rural Youth Services Program Coordinator CCE – Tompkins County (CCETC)
Using the Pathway Model for making evaluation choices – RYS Rocketry Program
Evaluation should “fit” the program… (lifecycle, stakeholders, context, etc.)
Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging)
RYS Rocketry Program, CCE-Tompkins County Pathway Model, October 2009
“Swimming against the Tide”
• The most frequently cited challenge to program evaluation is lack of time. The systems approach involves spending a lot of time before you even get to the point of choosing measures…
• Programs often face significant pressure for more evaluation, and for evidence of “impact”…
• The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program.
Making it feasible for the long-term
Key ingredients that help:
• focusing on the most valuable elements (choosing well)
• identifying interim benefits of the process
• integrating with other needs
• building on others' progress
• sharing resources
Wrapping Up … Thank you!
Any questions for any of us, before returning to Bill…?
For follow-up questions later:
Monica Hargraves: mjh51@cornell.edu
Shawn Smith: scs239@cornell.edu
Linda Schoffel: ljs48@cornell.edu
Also see our website at http://core.human.cornell.edu/