Evaluation Capacity Building Sampler
Michael Duttweiler, Assistant Director for Program Development and Accountability
Monica Hargraves, Manager of Evaluation for Extension and Outreach
Cornell Office for Research on Evaluation
Topics
• A little context about CCE
• Supporting and Sustaining Evaluation Practice
  • Considerations
  • Traditional Approaches
  • Not-so-Traditional Approaches
• Featured Evaluation: Family Economics and Resource Management
Cornell Cooperative Extension
• Large and decentralized system
• Local units are subordinate agencies of state government in partnership with county government and Cornell
• Local staff are not Cornell employees
• Strong local determination of programming
• Funding is predominantly local
Statewide evaluation practice is glued together by common needs, good will, and strong relationships.
Quick Poll – Agree/Disagree
“The biggest barrier to stronger program evaluation in extension is lack of practical evaluation approaches and tools.”
(Poll responses: Agree / Disagree)
Supporting and Sustaining Evaluation Practice
• Concept mapping process involving program staff, county directors, extension administration, and evaluation consultants
  “Concept mapping seeks the open contribution of participant stakeholders’ ideas on a specific issue, organizes the ideas, and portrays them in pictures or maps that are readily understood.” (Kane and Trochim, 2007, Concept Mapping for Planning and Evaluation)
• Focus question: “One specific thing an Extension organization can do to support the practice of evaluation is…”
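The slides stop at the definition, so as an illustration only, here is a minimal Python sketch of the quantitative core of concept mapping as Kane and Trochim describe it: participants’ card sorts of brainstormed statements are aggregated into a co-occurrence matrix, projected into two dimensions with multidimensional scaling (MDS), and grouped by hierarchical clustering. The statements and sort data below are invented placeholders, not CCE data, and scikit-learn is an assumed tool, not anything the presenters used.

```python
# Minimal sketch of the concept-mapping analysis described by Kane & Trochim
# (2007): aggregate participants' card sorts into a co-occurrence matrix,
# project statements into 2-D with MDS, then cluster the map into themes.
# Statements and sorts are invented placeholders, not CCE data.
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import AgglomerativeClustering

statements = [
    "Build evaluation into job descriptions",
    "Provide hands-on evaluation training",
    "Share evaluation results with funders",
    "Recognize staff who use evaluation well",
]

# Each participant sorts statement indices into piles they find similar.
sorts = [
    [{0, 3}, {1}, {2}],
    [{0}, {1, 2}, {3}],
    [{0, 1}, {2, 3}],
]

n = len(statements)
co = np.zeros((n, n))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co[i, j] += 1

# Convert co-occurrence counts to distances: statements often sorted
# together are "close"; statements never sorted together are far apart.
dist = 1.0 - co / len(sorts)
np.fill_diagonal(dist, 0.0)

# 2-D point map (the "picture" participants interpret).
xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)

# Cluster the map into candidate themes
# (scikit-learn >= 1.2; older versions use affinity= instead of metric=).
labels = AgglomerativeClustering(
    n_clusters=2, metric="precomputed", linkage="average"
).fit_predict(dist)

for s, (x, y), c in zip(statements, xy, labels):
    print(f"cluster {c}  ({x:+.2f}, {y:+.2f})  {s}")
```

In practice the clustered map, not the raw numbers, is what stakeholders review and label together.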
Conclusions …
• There is a strong human element inherent in supporting and sustaining evaluation practice.
• People at all levels of an organization, including paid and volunteer staff, need to have their own good reasons to care about evaluation.
• Evaluation policies should address all phases of evaluation: planning, implementation, and utilization. For example, “good use” of evaluation is important in motivating staff.
• Effective evaluation is not just about the programs; it is about the organization.
• Evaluation should be integrated into the organization in every way: job descriptions and performance reviews, strategic planning, staff discussions, external reporting, proposal development, etc.
Traditional Capacity Building in CCE
• In-service Education
  • “Getting Started with Program Evaluation”
  • “Exploring Public Value”
• Web Resources
  • Course Modules
  • References and Tools
• Technical Assistance
  • Plan of Work Outcomes and Reporting
  • Targeted Program Evaluation
Program Leadership Certificate
• What it is
  • Comprehensive professional development experience
  • At least 12 modules and an applied project over roughly 18 months
• Evaluation Components
  • Intro to Program Evaluation
  • Accountability and Evaluation
  • Evaluation Topics
  • Applied Projects
Evaluation Partnerships (EP): Using the Systems Evaluation Protocol
• EP Planning Phase entails …
  • Program recruitment and selection, MOU, cohort formation
  • Extensive program modeling
  • Evaluation plan development
  • Mix of in-person trainings, web conferences, listserv, etc.
• EP Implementation Support Phase entails …
  • Program and evaluation timelines clarified
  • In-person “Kickoff” meeting with presentations, experts, and resources supporting hands-on work time
  • Follow-up Q&A web conferences on topics as needed
  • End-of-year “Capstone” meeting for closure and accountability
Sequence of EP Cohorts in CCE (July 2010)
Rough counts:
• Associations: 34 with EP programs; 27 with EP-trained staff in-house
• CCE staff participants: over 200 staff on the Netway through EP; 54 staff trained directly by CORE as EPMs or as working group members
• Programs: 102 programs listed in the Netway (37 added since the initial EP work)
[Map of CCE associations by cohort planning-phase year (2006, 2007, 2007 and 2009, 2009, 2009 and 2010, 2010) omitted; the original slide marked the associations where 2010-cohort staff are located.]
Evaluation Partnership Lessons
• Sustaining evaluation
  • Key is to integrate evaluation with existing work (program development, reporting, funding, etc.)
  • Higher levels of the “system” need to be aware, involved, and supportive
• Side benefits matter
  • Staff understanding of the program; improved “ownership” by staff, volunteers, and other stakeholders
  • Builds a network of educators with common tools and language, and better opportunities for sharing resources and solutions
More on Evaluation Partnerships
• October 2009 Webinar: “Systems Evaluation Protocol: The Right Tools Through the Evaluation Cycle”
• Cornell Office for Research on Evaluation: http://core.human.cornell.edu/
Questions?
Targeted Programs
• Work with teams of educators and faculty
• Develop an outcome framework
• Develop an evaluation framework
• Documentation
• Immediate feedback
• Follow-up feedback
Financial Education Evaluation
• 2006: Modified CCE Plan of Work
• 2007: Introduced the NEFE Evaluation Toolbook; work group selected priority outcomes
• 2008: Piloted instruments for two programs; wide use of revised instruments
• 2009: Pilot data summaries; wide use of standard forms
• 2010: Pilot follow-up survey
Educator’s Perspective
• Ann Gifford, Consumer & Financial Management Program Coordinator, CCE Tompkins County
“Making Ends Meet” Highlights
• 89% rated the program as “useful” or “very useful”
• 98% would recommend the program
• 71–74% indicated increased confidence on five key behaviors
• 47–54% said they would take five key actions (12–20% were already taking the actions)
• 51% identified one or more new things learned
• 31% identified at least one additional thing they would do differently
Follow-Up Evaluation Pilot
• What?
  • Small-sample (target of 30) follow-up evaluation for Making Ends Meet
• Why?
  • To see if learning “sticks”
  • Identify practice changes
  • Solicit feedback
• Where? When? Who?
  • CCE Tompkins
  • April–May 2010
  • Volunteer interviewer
The Questions (paraphrased)
• Things gained?
• Changes made as a result of attending the workshop?
• Since the workshop, have you: set goals, tracked spending, developed a plan, made payments on time?
• Learn anything from tracking expenses? Make any changes?
• Use any new community resources?
• Confidence in managing money?
• Any other comments?
Changes Made (93% made one or more changes)
• Started tracking (8)
• Started saving money (6)
• Saving on unnecessary items (4)
• Started budgeting (3)
• Paying credit cards/bills on time (2)
• Reduced credit card debt (2)
• Went to a financial advisor (2)
• Misc. comments (10)
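For illustration, tallies like the ones above can be produced in a few lines once open-ended responses have been hand-coded into categories. This is a hypothetical sketch with invented category codes and respondent data, not the workbook CCE actually used.

```python
# Hypothetical sketch of tallying hand-coded open-ended responses, like the
# "Changes Made" list above. Codes and respondent data are invented.
from collections import Counter

# One list of category codes per respondent (empty list = no change reported).
coded_responses = [
    ["tracking", "saving"],
    ["budgeting"],
    [],
    ["tracking"],
    ["paid_on_time", "tracking"],
]

counts = Counter(code for resp in coded_responses for code in resp)
made_change = sum(1 for resp in coded_responses if resp)

print(f"{made_change / len(coded_responses):.0%} made one or more changes")
for code, n in counts.most_common():
    print(f"  {code}: {n}")
```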
Quotable Quotes
• “It was very eye-opening to see how much and where I spent my money.”
• “I realized I had lost control of my credit card spending and did something about it.”
• “I got a piggy bank for my son and have him save for things he wants.”
• “I’m able to save money even though I don’t have much to work with.”
• “The workshop made finances less scary.”
• “I learned how to be responsible.”
• “This workshop helped me get my home loan. I have a home!”
• “It gave me hope.”
Putting It All Together
[Diagram showing how the evaluation pieces fit together: statewide needs info, program description, immediate feedback, follow-up results, and follow-up quotes, with optional local results.]
Next Up: Field Crops Programs
• This year: basic program documentation and immediate-feedback pilot testing
• Next year: standard near-term evaluation
• Third year: follow-up impact documentation
Questions/Comments?
Presentation slides, documents, and links will be available at the usual site: http://nc4-heval.wikispaces.com/Webinars+for+2010