Evaluating the Initiative Community Tool Box Curriculum Module 12
DIALOGUE: What have been your previous experiences in evaluation? Is your partnership or coalition currently undergoing any type of evaluation?
Concepts and Attributes of Evaluation Evaluation: Systematic investigation of the merit, worth, or significance of an object or effort
Concepts and Attributes of Evaluation Evaluation as: • A Process: ongoing, iterative (repeating) • A Product: information used for continuous improvement
Why an Evaluation? • Helps us understand how efforts work • Provides ongoing feedback to improve community work • Gives community members a voice and an opportunity to improve efforts • Holds accountable the groups doing, supporting, and funding the work
A Bill of Rights and Responsibilities for Community Evaluation • Contribute to understanding • Contribute to improvement • Encourage participation of all stakeholders • Respond to interests of different stakeholders • Capture the dynamic nature of the work • Provide clear and timely information • Be practical, with the benefits outweighing the costs • Do no harm to the community • Help clarify the initiative’s contribution to more distant outcomes • Strengthen capacity
Concepts and Attributes of Evaluation DIALOGUE: Which of these attributes is particularly important to your community initiative? How will this attribute be assured in your evaluation?
Overview of an Evaluation Plan • Identify stakeholders • Describe the program • Focus the evaluation design • Gather credible evidence • Make sense of data and justify conclusions • Use information to celebrate, make adjustments and communicate lessons learned
“We should make things as simple as possible, but not simpler.” -- Albert Einstein
Determining Who Cares and What They Care About Identify Who Cares – Stakeholders: • Those involved in operating the program or initiative • Those served or affected • Primary intended users of the evaluation (e.g., funders, staff, researchers)
DIALOGUE: Who cares about your effort and its effects?
Identify What They Care About: • Different stakeholders may have different perspectives, so ask them. What evaluation questions or aspects are of interest to: • Community groups • Grantmakers and funders • Outside researchers
Evaluation Stakeholders & Interests Program or Initiative: ______________________
Developing Evaluation Questions • An evaluation question asks what stakeholders want to know about how the program is functioning • Examples: • Did the intervention have the desired effect? • With whom? • Under what conditions?
Categories of Evaluation Questions • Process • Planning and implementation issues • Outcome • Attainment of objectives • Impact on participants • Impact on the community
Process Measures Measures for planning and implementation • How important are the goals? • How well was the effort planned? • How well was it implemented? • Who participated? How many? • Did those most affected contribute to planning, implementation, and evaluation? • How satisfied are stakeholders with the program?
Outcome Measures Attainment of objectives • How well has the program met its stated objectives?
Developing Outcome Measures Impact on participants • How much and what kind of difference has the program made for its targets (e.g., in knowledge, behaviors, outcomes)? • From the perspective of participants (and outside experts), how significant were the effects?
Developing Outcome Measures Impact on the community • How much and what kind of difference has the effort made on the community (e.g., in community and systems changes, rates of behavior, population-level outcomes)? • From the perspective of participants (and outside experts), how significant were the effects? • Were there any unintended consequences? Positive? Negative? • Is the community’s capacity to address problems improved—across issues, over time?
ACTIVITY: What evaluation questions related to process or outcome measures is your group interested in?
Evaluation Questions Program or Initiative: _______________________
Developing Meaningful Questions Consider: • What exactly was it that you originally hoped to accomplish? • How did you plan to do so? • What does your strategic plan suggest? • What does your logic model suggest—activities, outputs, and outcomes?
Evaluating the Initiative “Wonder is the beginning of wisdom.” --Anonymous
Gathering Evidence to Address the Evaluation Questions • Evidence—information that could be used to assess the merit or worth of a program • Gathering credible evidence—Overview: • Indicators of Success • Sources of Evidence • Quality of Evidence • Quantity of Evidence • Logistics for Gathering Information
Indicators of Success Translate expected effects into specific measurable units • Examples include: • Program outputs (e.g., services delivered) • Participation rates • Levels of satisfaction • Intervention exposure, or dose • Changes in communities and systems • Changes in behaviors of participants, populations • Population-level outcomes
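Where a group tracks several indicators, it helps to record each one in a consistent, structured form. Below is a minimal sketch in Python, purely illustrative: the field names and numbers are hypothetical, not part of the Community Tool Box curriculum.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One expected effect, translated into a specific measurable unit."""
    effect: str      # the expected effect, in plain language
    measure: str     # the specific, measurable unit
    baseline: float  # value before the initiative
    target: float    # value the initiative aims for

# Hypothetical examples, one per indicator category above
indicators = [
    Indicator("Increased participation",
              "residents attending monthly sessions", 40, 100),
    Indicator("Behavior change",
              "% of participants active 30+ min/day", 22.0, 35.0),
]

for ind in indicators:
    print(f"{ind.effect}: {ind.measure} ({ind.baseline} -> {ind.target})")
```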
DIALOGUE: What are some indicators of success for the Austin Healthy STEPS Initiative?
Sources of Evidence • People and Surveys • Documents (e.g., archival data, statistics) • Observation and Experimentation • More than one source helps make evidence more compelling
DIALOGUE: What are some possible sources of evidence for the Austin Healthy STEPS Initiative?
Quality of Evidence • Appropriateness and integrity of information • Reliability • Validity • Relationship to the evaluation questions • High-quality data are both accurate and sensitive
Quantity of Evidence • Enough data to answer the evaluation questions • Adequate for data analysis • Sufficient to draw conclusions • Not more than is necessary
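One way to make "enough to draw conclusions" concrete is the standard sample-size formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2. A minimal sketch with hypothetical defaults follows; a real study should confirm its design with a statistician or a power-analysis tool.

```python
import math

def sample_size_for_proportion(margin_of_error=0.05,
                               z=1.96,  # z-score for 95% confidence
                               p=0.5):  # p=0.5 is the most conservative guess
    """Classic formula: n = z^2 * p * (1 - p) / e^2, rounded up."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# e.g., estimating a satisfaction rate to within +/-5 points
# at 95% confidence requires about 385 respondents:
print(sample_size_for_proportion())  # 385
```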
Logistics for Gathering Information • Methods • Timing • Infrastructure • Attention to cultural norms • Guarantee of confidentiality
Selecting Evaluation Methods • Surveys about satisfaction and importance of the effort • Goal attainment reports • Behavioral surveys • Interviews with key participants • Archival records
Selecting Evaluation Methods • Observations • Self-reports, logs, or diaries • Documentation systems and analysis of contribution • Community-level indicators of impact • Case studies and experiments
Ongoing Documentation and Evaluation • Document the unfolding of the intervention: • Community and systems changes (i.e., new or modified programs, policies, and practices) • Analysis of contribution to population-level outcomes, with the amount of change by: • Goal • Intensity of behavior change strategy • Duration • Penetration to Targets through Sectors in Places
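In practice, ongoing documentation can be as simple as a dated log of changes that is then tallied by category. Here is a rough sketch, assuming a hypothetical record format; the field names are illustrative, not the Community Tool Box's actual documentation system.

```python
from collections import Counter
from datetime import date

# Hypothetical log of documented community/systems changes
changes = [
    {"date": date(2024, 3, 1), "kind": "new program",
     "goal": "physical activity", "sector": "schools"},
    {"date": date(2024, 5, 12), "kind": "policy change",
     "goal": "healthy eating", "sector": "government"},
    {"date": date(2024, 6, 30), "kind": "practice change",
     "goal": "physical activity", "sector": "worksites"},
]

# Tally the amount of change by goal and by sector, to show where
# the initiative is (and is not) producing change over time.
print("By goal:  ", dict(Counter(c["goal"] for c in changes)))
print("By sector:", dict(Counter(c["sector"] for c in changes)))
```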
Assembling Evidence Chart Program or Initiative: ______________________
Gathering Evidence to Address the Evaluation Questions DIALOGUE: What types of evaluation methods would be useful to the STEPS initiative? DIALOGUE: Which of these evaluation questions would be of interest to key stakeholders for your initiative? What are indicators of success? What evaluation methods would you use?
Evaluating the Initiative “The best test … is how well the modeler can answer the questions, ‘What do you know now that you did not know before?’ and ‘How can you find out if it is true?’” --James M. Brower
Using Evaluation Data to Learn and Make Adjustments Making Sense of the Data and Justifying Conclusions • Standards • Analysis and Synthesis • Sense Making and Interpretation • Judgments • Recommendations
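As a toy illustration of "analysis" against "standards": summarize the data, then compare the result with a success standard the group agreed on in advance. The numbers below are invented for the sketch.

```python
# Hypothetical pre/post survey scores (1-4 scale) for the same participants
pre = [2.1, 2.4, 2.0, 2.8, 2.5]
post = [3.0, 3.2, 2.7, 3.5, 3.1]

mean_pre = sum(pre) / len(pre)
mean_post = sum(post) / len(post)
change = mean_post - mean_pre

# Standard agreed on before the evaluation (hypothetical):
# an average improvement of at least 0.5 points counts as success.
STANDARD = 0.5
print(f"Mean change: {change:.2f}; meets standard: {change >= STANDARD}")
```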
Using Evaluation Data to Learn and Make Adjustments DIALOGUE: How will the Austin Healthy STEPS group use the information to make judgments about the merit of the effort?
Some Ways to Use the Information • Celebrate accomplishments • Make adjustments • Communicate lessons learned
Using the Information • Plan for use • Consider the actual evaluation results • Encourage use of learning from the evaluation process
Keeping Your “Eyes On The Prize” • The ultimate goal is to improve outcomes • Distant outcomes may take too long to appear to be useful for ongoing feedback • It is crucial to document intermediate outcomes that show movement toward more distant outcomes
Using Evaluation Data to Learn and Make Adjustments DIALOGUE: How might the STEPS initiative use its findings to learn and make adjustments? ACTIVITY: Who might use the evaluation findings to learn and make adjustments to your program? Use the following chart to map out the key questions you are going to ask, potential findings, and related recommendations.
Using an Evaluation Information Chart • Evaluation questions of interest to key stakeholders (e.g., Did the program result in behavior change? Do people like the program?) • Actual (potential) findings and key conclusions (e.g., The results showed that there was only a slight increase in… The findings suggest that…) • Actual (potential) recommendations (e.g., Consistent with these findings, we recommend that… including…)
Communicating the Findings to Relevant Audiences How, when, and what to communicate from an evaluation is a strategic decision.
Sharing Lessons Learned Strategy Issues • Timing • Style • Tone • Message Source • Vehicle • Format of the information products
Communicating the Findings to Relevant Audiences Broadening Your Dissemination • Formal evaluation • Methods for delivering evaluation data: • Press releases • Internet distribution, web pages, links • Storytelling • Presentations and publications • Consulting by group members or teams • Use multiple and diverse methods
ACTIVITY: Use the chart on the next slide to identify the message, audience, plan and channel to communicate the findings of your evaluation.
Communicating Results: A Communications Plan • THE MESSAGE: Actual (potential) findings plus recommendations (e.g., The results show… The findings suggest… We recommend…) • AUDIENCE: Who should receive this message? • SOURCE: Who should deliver it? • CHANNEL: How should it be delivered? (e.g., personal contact, written report, media, professional presentation or publication)