1. Planning for Results: Program Evaluation 101
Presented by Barbara Dunn
for
The Wythe-Bland Community Foundation
January 10, 2008
2. Purpose & Objectives
This presentation will discuss the fundamental role of evaluation in organizational capacity-building.
At the conclusion of this session, participants will:
Distinguish between evaluation for accountability & evaluation for learning (evaluative learning)
Describe basic terms, concepts & processes related to planning & evaluation
Apply this information to their own nonprofit work
3. The Big Picture (Innovation Network, 2005)
The big questions…
How are we doing?
What difference are we making?
Are we meeting our mission & vision?
The answers…
4. Program Planning & Evaluation Model (Va. Dept. of Education, 2002)
5. Program Planning Process
1. Identify community or organizational needs
2. Develop goals to meet those needs
Achieving goals generates an impact on clients/the organization
3. Establish objectives to work toward goals
Meeting objectives produces outcomes for clients/the organization
4. Design structured activities that address established objectives
Implementing activities creates organizational processes
5. Evaluate the results & report your findings
Evaluating the results provides information needed to document success & improve performance
6. Continuous Quality Improvement (CQI) Framework (NCWRCOI & Casey Family Programs, 2005)
7. Organizational Planning Cycles
Annual self-assessment:
The self-assessment is diagnostic—it tells you where you are
Evaluate effectiveness of programs, services & management systems against pre-determined objectives with measurable outcomes
Evaluate functioning & effectiveness of staff, committees & Board against established annual goals
Gather information to determine changes in the community’s need for services (needs assessment)
Assess other external issues & events that affect the organization (environmental scan)
(see related handout)
8. Proving vs. Improving (Innovation Network/Evaluation Handbook, 2005)
The old thinking: Evaluation to prove something to someone else
The new thinking: Evaluation to improve something for yourself & your community
This new perspective on evaluation emphasizes learning: evaluative learning (TCC Group, 2003)
The goal: A meaningful process & valuable findings
9. An Evaluation Primer
Definition: What is evaluation?
A systematic process of obtaining credible information to be used by interested persons for purposes of assessment & improvement
10. An Evaluation Primer
Why is evaluation important?
Within the organization:
To document, verify & quantify activities & their effects
To justify costs & demonstrate efficient & effective use of resources
To improve program effectiveness & service delivery to clients
To provide information for better management, decision-making & planning
11. An Evaluation Primer
Why is evaluation important?
For other audiences:
To document achievement of stated goals & objectives (accountability)
To demonstrate effective/efficient operations to funders/supporters
To identify unplanned benefits, unexpected challenges, unintended consequences & other lessons learned
To describe the program for possible replication
To distinguish evidence-based practices (models that work)
12. An Evaluation Primer
Does evaluation mean research? Yes & No!
Yes, in that it uses the research process: identifying & defining problems, collecting data, analyzing & interpreting information, & reporting findings.
No, in that it doesn’t have the same controls or objectivity as formal research & causal inferences cannot be made. Program evaluation is action-oriented & participatory.
13. Definition of Terms
Different sources use evaluation terms in various ways that are not always consistent
For example, collapsing or combining terms: impact goal, outcome objective, and outcome indicator
Regardless of the terms used, the following definitions are basic to any program planning and evaluation process
14. Definition of Terms
Program: A structured service area or administrative initiative that in some instances may be more appropriately called a project.
These activities are organized to meet certain objectives and produce certain outcomes.
Examples:
Home visiting (service area)
Asthma case management (service project)
Capacity-building (administrative initiative)
Strategic planning (administrative project)
15. Definition of Terms
Outcome: A measurable result or benefit that you want to achieve, usually to change some condition in a particular population.
AKA outcome objective, outcome goal, results indicator
These outcomes will assist in meeting objectives that support your goals.
Examples: (Hint: start with an active verb)
Reduce school absenteeism (in children with asthma)
Develop a strategic plan
16. Definition of Terms
Indicator or Measure: A descriptor that tells what is being measured (indicator) & how you’re measuring it (measure) to determine whether the outcome has been achieved.
AKA outcome indicator, performance measure
Examples:
Number of days absent (due to asthma-related illness)—a number
Percent reduction in days absent—a percent
Completed strategic plan—a milestone (yes/no)
17. Definition of Terms
Target or Benchmark: Specifies how much change to how many participants in what time period.
AKA performance standard, performance target or goal
To establish a target/benchmark, your guideline should be your previous experience (baseline), the experience of others, a standard in the field, or your best estimate.
Examples:
20 (days absent; baseline previous year was 30 days)
33% (reduction in absenteeism; from 30 to 20 days)
True/yes (strategic plan completed)
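A quick check of the arithmetic behind these examples (a worked illustration, not part of the original slides):
Percent reduction = (baseline - target) / baseline = (30 - 20) / 30 ≈ 33%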
18. Definition of Terms
Baseline: This is your starting point or previous experience with this indicator. If you have no previous experience, you can use the experience of others or what research suggests.
The target is often set in relation to this baseline, in which case the results will be compared to the target & to the baseline.
Examples:
Grades at entry in the program
Pre-test scores on life skills inventory
Last year’s participation rate or attendance
19. Definition of Terms
Conditions or Limitations: The criteria for inclusion in the group that further describe or define the population or project being evaluated. These may also include the typical size of the group or expected dates of completion.
Examples:
Includes all clients provided services during the program year (estimated 2,000)
Includes only children with asthma enrolled for at least 6 months
Includes all women over 18 years with at least one well-woman visit
20. Summary Examples
Program: Asthma case management
Outcome: Reduce school absenteeism
Indicator: Percent reduction in days absent
Target/benchmark: 33% (to 20 days)
Baseline: 30 days last year
Conditions: Children in grades 1-3 enrolled for at least 6 months
Program: Capacity-building
Outcome: Develop a strategic plan
Indicator: Completed strategic plan
Target/benchmark: True/Yes
Baseline: Consultant to be hired by 12/31/06
Conditions: Plan to be completed by 6/30/07
(see related handout)
21. Making the Connections: Evaluation Framework
One method of organizing the elements of program planning is to describe the underlying assumptions on which the program is based—an if-then message of what is intended.
For example:
If free immunizations are provided at convenient times & places, then parents are more likely to get their children fully immunized by age two.
These underlying assumptions are the theory (of action or change) behind the program activities & strategies.
When this underlying theory or conceptual framework is supported by research documenting its success and effectiveness (it works!), it may become a best practice or evidence-based practice.
22. Case Study: Youth Violence
Theoretical Model: Social Ecology/Social Development Domains
23. Making the Connections: Evaluation Framework
This framework helps make the connections among:
Needs → Goals → Objectives → Activities
Activities are designed to achieve established objectives, which support goals that address identified needs
Logic models are popular tools that can be used to represent these relationships both graphically & in narrative form
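For example (an illustrative sketch based on the asthma case-management example above, not taken from the original deck), a narrative logic model might read:
Inputs (case manager, asthma action plans) → Activities (home visits, family education) → Outputs (number of visits completed) → Outcomes (fewer asthma-related school absences)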
24. Making the Connections: Evaluation Framework (Taylor-Powell, University of Wisconsin-Extension, 2003)
Where are you going?
How will you get there?
What will tell you that you’ve arrived?
A logic model is your program ROAD MAP
25. Making the Connections: Evaluation Framework
Logic models are the foundation of evaluation planning. Looking at your logic model, you will find questions about your program that you hope to answer. The purpose of evaluation planning is to identify these questions & plan a way to find the answers.
Two major types of evaluation help answer these questions.
Implementation or Process Evaluation: Are you performing the services or activities as planned? Are you reaching the intended target population? Are you reaching the intended number of participants? Is it leading to the products you expected? How do the participants perceive these services and activities? These questions are about implementation.
Outcomes Evaluation: Is your target audience experiencing the changes in knowledge, attitudes, behaviors, or awareness that you sought? What are the results of your work? What is it accomplishing among your target audience? These questions are about outcomes.
(Innovation Network, Evaluation Plan Workbook, 2005)
26. Implementation Issues
“Systems trump programs…”
Patrick McCarthy, Annie E. Casey Foundation, 2002
27. Best Practices
Effective outcomes require both effective intervention & effective implementation (that is, a program or strategy that works & a working program)
Evaluation of both implementation process and client-level outcomes is necessary
Best practices are only meaningful when there is evidence of intended results
28. Making the Connections, Logically (Innovation Network, Logic Model Workbook, 2005)
30. Chain of Outcomes (Innovation Network)
33. Making the Connections: Evaluation Framework
34. Evaluation Resources
Need an evaluation tutorial?
Try CSAP’s Prevention Pathways, free online courses: Evaluation for the Unevaluated (Program Evaluation 101 & 102). See http://pathwayscourses.samhsa.gov/index.htm.
Evaluation Process & Case Study
American Academy of Pediatrics (2006). Evaluating Your Community Program: Part 1. Designing Your Evaluation. (www.aap.org/commpeds/htpcp)
35. References & Resources
Continuous Quality Improvement Framework: National Child Welfare Resource Center for Organizational Improvement & Casey Family Programs (2005). (http://muskie.usm.maine.edu/helpkids/rcpdfs/CQIframework.pdf)
Innovation Network (www.innonet.org).
Point K Learning Center (2005). The Big Picture. See website for: Organizational Assessment Tool; Logic Model Workbook; Evaluation Plan Workbook
TCC Group Briefing Papers (www.tccgrp.com):
Connolly, PM (2007). Deeper Capacity Building for Greater Impact.
York, PJ (2003). Learning as We Go: Making Evaluation Work for Everyone. (Handout: Exhibit B. The Evaluative Learning Continuum)
United Way Services of Greater Richmond (www.yourunitedway.org):
A Guide to Developing an Outcome Logic Model & Measurement Plan. (PowerPoint)
University of Wisconsin-Extension (www.uwex.edu/ces/lmcourse):
Taylor-Powell, E. (2003). Logic Models to Enhance Program Performance.
Va. Dept. of Education (www.pen.k12.va.us) (2002). Planning for Results Manual.
36. Thank You!
Barbara H. Dunn, PhD, RN
Consultant, Health & Human Services;
Associate Professor & Acting Director,
Community Nursing Organization,
VCU School of Nursing
Emails: bhdunn114@comcast.net; bhdunn@vcu.edu
Phone: 804-330-8906 (home office); 804-828-2011 (VCU)