Program Evaluation Presented by Jim Kitterman, Evaluation Specialist, Maryland Humanities Council
Why Program Evaluation is Critical • You learn what your audience thinks about your programs, what works well, and what needs improvement. • Funders often require evidence of program efficacy as part of funding applications.
The Logic Model is… • …a way to think abstractly about a process, program, or business. • …a way to understand what is going on in an organization, from start to finish. • …a way for everyone who contributes to the process to communicate about business problems.
Simplest form of logic model: INPUTS → OUTPUTS → OUTCOMES (University of Wisconsin-Extension, Program Development and Evaluation)
Logical chain of connections showing what the program is to accomplish (University of Wisconsin-Extension, Program Development and Evaluation)
• INPUTS: program investments (what we invest)
• OUTPUTS: activities (what we do) and participation (who we reach)
• OUTCOMES: short-, medium-, and long-term results (what results)
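To make the chain concrete, here is a minimal sketch, in Python, that expresses a logic model as a plain dictionary. The program, its activities, and its outcomes are all invented for illustration, not taken from an actual MHC program.

```python
# Illustrative sketch only: a logic model for a hypothetical reading program,
# expressed as a plain Python dictionary. Every entry is an invented example.
logic_model = {
    "inputs": ["staff time", "grant funding", "partner libraries"],  # what we invest
    "outputs": {
        "activities": ["book discussions", "author talks"],          # what we do
        "participation": ["adult readers", "local educators"],       # who we reach
    },
    "outcomes": {
        "short_term": ["increased awareness of local authors"],      # what results
        "medium_term": ["sustained participation in programs"],
        "long_term": ["stronger community engagement with the humanities"],
    },
}

# Walk the chain from investments to results.
for stage, detail in logic_model.items():
    print(stage.upper(), "->", detail)
```

Writing the model down this explicitly, even informally, makes gaps in the chain easy to spot, such as an activity with no intended outcome or an outcome with no supporting activity.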
Fully detailed logic model (figure; University of Wisconsin-Extension, Program Development and Evaluation)
The Logic Model is useful for… • Planning a new program: identify and define community problems; understand what impacts the problem • Evaluating an existing program: improve program efficiency, quality, and results; prove program value to stakeholders; allocate resources
Logic model and common types of evaluation (University of Wisconsin-Extension, Program Development and Evaluation)
• Process evaluation: How is the program implemented? Are activities delivered as intended? Is implementation faithful to the design? Are participants being reached as intended? What are participant reactions?
• Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are potential barriers and facilitators? What is most appropriate to do?
• Outcome evaluation: To what extent are desired changes occurring? Are goals being met? Who is benefiting and who is not, and how? What seems to work, and what does not? What are the unintended outcomes?
• Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Levels of Evidence • Proof of Cause and Effect • Validated assessment instrument • Pre- and post-service testing • Comparison group (random or matched) • Evaluator independence • Peer review • Evidence from many studies
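As a rough illustration of pre- and post-service testing, the sketch below runs a paired t-test on invented scores using SciPy. This is only one piece of the evidence chain; as the list above notes, a validated instrument, a comparison group, and evaluator independence strengthen the case for cause and effect.

```python
# A minimal sketch of pre- and post-service testing on paired scores from the
# same participants. All scores below are invented for illustration.
from scipy import stats

pre_scores = [62, 70, 58, 75, 66, 71, 60, 68]    # before the program
post_scores = [68, 74, 63, 80, 70, 78, 65, 72]   # after the program

# Paired t-test: did the same individuals' scores change significantly?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"mean gain: {mean_gain:.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```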
Levels of Evidence (continued) • Proof of Performance • Level of proof sufficient for the intended audience • Design must fit within the program's operational and resource constraints • Benchmark comparison: prior periods, internal goals, results from other organizations
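A benchmark comparison can be as simple as the sketch below, which compares one result against a prior period, an internal goal, and a peer organization; all of the attendance figures are invented.

```python
# Sketch of a benchmark comparison. All attendance figures are invented.
current_attendance = 480

benchmarks = {
    "prior year": 430,
    "internal goal": 500,
    "peer organization": 455,
}

for name, value in benchmarks.items():
    change = 100 * (current_attendance - value) / value
    print(f"vs {name} ({value}): {change:+.1f}%")
```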
Form Design/Outputs • Event Log • Registration Information • Administrative Forms
Form Design/Outcomes • Design the survey to address program goals • Question types: ratings, open-ended, demographics • Audience types: participants, partners, community
Form Design/Outcomes (continued) • Question design • Ask one question at a time • Make each question unambiguous • Get ideas from other survey forms • Rating scales: 3 to 10 points
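To show what tabulating one rating question might look like, here is a minimal sketch that summarizes a single 5-point item; the responses are invented, with 5 meaning "excellent" and 1 meaning "poor".

```python
# Sketch: summarize one 5-point rating question. Responses are invented.
from collections import Counter

responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 5, 4]

counts = Counter(responses)
mean = sum(responses) / len(responses)

print(f"n = {len(responses)}, mean rating = {mean:.2f}")
for rating in range(5, 0, -1):
    n = counts.get(rating, 0)
    print(f"  {rating}: {n:2d} ({100 * n / len(responses):.0f}%)")
```

Reporting both the mean and the full distribution avoids hiding a split audience behind a middling average.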
Types of Surveys • Paper (e.g., MHC program participant survey) • Scannable paper (e.g., MHC program participant scannable survey) • Electronic (e.g., SurveyMonkey)
Online Resources available for more information about Logic Model Evaluation • Wikipedia • Tutorial On-Line (U. of Wisconsin Extension) • W.K. Kellogg Foundation: Logic Model Development Guide and Evaluation Handbook