Evaluation 101 Presentation at The Conference for Family Literacy, Louisville, Kentucky By Apter & O’Connor Associates, April 2013
Coalition A group of individuals representing diverse organizations or constituencies who agree to work together to achieve a common GOAL - Feighery & Rogers, 1990
Evaluation is . . . - the systematic collection of information . . . to reduce uncertainties, improve effectiveness, and make decisions (Michael Q. Patton, 1988)
Why Evaluate? Provide Accountability to Community, Funders & Stakeholders: Quality • Efficiency • Effectiveness
Why Evaluate? What gets measured, gets done • If you don’t measure results, you can’t tell success from failure • If you can’t see success, you can’t reward it • If you can’t reward success, you’re probably rewarding failure • If you can’t see success, you can’t learn from it • If you can’t recognize failure, you can’t correct it Adapted from: Reinventing Government, Osborne and Gaebler, 1992
Why Evaluate? • Monitor overall progress toward goals • Determine whether individual interventions are producing the desired progress • Permit comparisons among groups • Support continuous quality improvement • Ensure only effective programs are maintained • Justify the need for further funding
Types of Evaluation • Formative - infrastructure, functions, and procedures • Process - extent of implementation • Outcome - realization of vision
Collective Impact Model • Common agenda - a vision • Shared measurement system • Mutually reinforcing activities • Continuous communication • Backbone support organization
Vision & Stakeholder Engagement → Implementation of Mutually Reinforcing Activities → Collective IMPACT
Formative Evaluation • Why is the collaboration needed? • Do we have the resources needed? • Do we have strong leadership? • Are the right stakeholders represented? • Is a collaboration the best approach? • Is there a shared vision?
Process Evaluation STRATEGIES: • Are you implementing things as planned? • Are you reaching the target population? • Are you implementing with quality? • How many are you reaching? COALITION: • Are the right people participating? • Are meetings productive? • Are workgroup charges clear? • Is the work beginning?
Outcome Evaluation • What has changed or improved? • Are we achieving our intended goals? • Was the effort worth the time & costs? Short → Intermediate → Long Term
Evaluating a Coalition Can be tricky • Members need to evaluate their own accomplishments without undermining the success of the whole • All are in this together, but how do we distinguish the contribution of one agency or one stakeholder from another?
CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned
Engaging Stakeholders • Include stakeholders who are: implementers, partners, participants (those affected), decision-makers • Establish the evaluation team at the outset, with areas for stakeholder input • Obtain buy-in & commitment to the plan
CDC’s Framework for Evaluation: Set Goals & Plan Programs
Set Goals and Develop Plan • Problem Statement - define the need Who? Where? Why? • Envision the Future - Set Your Goals • Set the Context • Select Strategies and Set Targets • Connect the Dots . . . create a Logic Model
Logic Models A logic model is a road map for the shared work of all of the stakeholders... it answers the questions: • Where are we now? • Where are we going? • How will we get there?
A Logic Model . . . Can it get any simpler? Needs and Strengths Strategies Outcomes
Logical Chain of Connections
Needs & Strengths (where we are) → Strategies [Resources → Activities (what we do) → Outputs (who we reach)] → Outcomes: Short, Medium, Long-term (our results)
University of Wisconsin-Extension, Program Development and Evaluation
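To make the chain concrete, here is a minimal sketch of how a coalition might record its logic model in machine-readable form. The structure mirrors the UW-Extension chain above; every program entry (tutoring sessions, reading scores, and so on) is invented for illustration.

```python
# A logic model held as plain Python data - a minimal sketch.
# All program entries below are illustrative, not from the presentation.
logic_model = {
    "needs_and_strengths": ["Low adult literacy rates", "Strong library network"],
    "strategies": {
        "resources":  ["Volunteer tutors", "Grant funding"],   # what we invest
        "activities": ["Weekly tutoring sessions"],            # what we do
        "outputs":    ["Number of learners tutored"],          # who we reach
    },
    "outcomes": {                                              # our results
        "short":  ["Improved reading scores"],
        "medium": ["GED completion"],
        "long":   ["Higher family income"],
    },
}

# Walk the chain in order, mirroring the diagram above.
for stage in ("needs_and_strengths", "strategies", "outcomes"):
    print(stage, "->", logic_model[stage])
```

Keeping the model as data rather than only as a diagram makes it easy to check that every outcome traces back to at least one strategy.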
Detailed logic model (diagram) - University of Wisconsin-Extension, Program Development and Evaluation
CDC’s Framework for Evaluation: Focus the Evaluation Design
Focus the Evaluation Design What do we want to know? • Coalition • Programs • Participants • Outcomes • Coalition impact • Influencing factors
Develop Indicators • What Will Change? • For Whom? • By How Much? • By When? • Indicators for activities - process indicators • Indicators for outcomes - outcome indicators • There can be more than one indicator for each activity or outcome
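As one way to make the "what, for whom, by how much, by when" pattern concrete, here is a hypothetical indicator record in Python. The Indicator class, its field names, and the literacy figures are all assumptions for illustration, not part of the CDC framework.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Indicator:
    """One measurable answer to: what changes, for whom, by how much, by when."""
    what: str          # what will change
    whom: str          # for whom
    baseline: float
    target: float      # by how much
    deadline: date     # by when

    def progress(self, current: float) -> float:
        """Fraction of the baseline-to-target gap closed so far."""
        return (current - self.baseline) / (self.target - self.baseline)

# Hypothetical outcome indicator for a family literacy program.
reading = Indicator(
    what="3rd-grade reading proficiency rate",
    whom="children of enrolled families",
    baseline=0.40,
    target=0.60,
    deadline=date(2014, 6, 30),
)
print(f"{reading.progress(0.52):.0%} of the gap closed")  # prints: 60% of the gap closed
```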
Coalition Evaluation Questions • Are we meeting our members’ needs? • Do our work groups function well? • Have we improved community awareness? • Are we influencing policies & practices? • Are we building organizational/community capacity? • Are we building strategic partnerships? • Have we strengthened our base of support? • Are we reaching our priority audiences? • Which strategies are effective? • Are we making a difference?
CDC’s Framework for Evaluation: Choose Methods & Collect Data
Choose Methods and Collect Data • Collect enough data to be reliable, but consider the burden on respondents • Consider existing data sources • Don’t try to measure everything • Use mixed methods: Qualitative & Quantitative
Methods • Focus Groups • Interviews • Structured Observations • Document/Record Review • Case Studies • Surveys • Participant Assessments • Statistical Analysis of Program Data • Cost-Benefit Analysis
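Of the methods listed above, cost-benefit analysis comes down to simple arithmetic once benefits have been monetized. A toy sketch follows; every dollar figure is invented for illustration.

```python
# Toy cost-benefit arithmetic - all figures invented.
program_costs = 50_000        # staff, materials, space
monetized_benefits = 120_000  # e.g., estimated earnings gains for participants

bcr = monetized_benefits / program_costs         # benefit-cost ratio
net_benefit = monetized_benefits - program_costs

print(f"Benefit-cost ratio: {bcr:.1f}")   # 2.4 -> $2.40 returned per $1 spent
print(f"Net benefit: ${net_benefit:,}")   # $70,000
```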
CDC’s Framework for Evaluation: Analyze Data & Interpret Results
Analyze Data & Interpret Results • Organize and classify the data • Tabulate - counts, numbers, descriptive statistics • Identify themes • Stratify - look at data by variables/demographics • Make comparisons: pre-post, between groups • Present data in a clear format - use narratives, charts, tables, graphs, maps
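A small worked example of the tabulate, stratify, and compare steps, using pandas on invented pre/post participant scores; the column names, sites, and numbers are illustrative only.

```python
import pandas as pd

# Invented pre/post scores for five participants at two sites.
df = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B"],
    "pre":  [40, 55, 38, 60, 50],
    "post": [52, 63, 45, 72, 58],
})
df["gain"] = df["post"] - df["pre"]  # pre-post comparison

print(df["gain"].describe())              # tabulate: counts and descriptive statistics
print(df.groupby("site")["gain"].mean())  # stratify: compare gains between groups
```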
CDC’s Framework for Evaluation: Ensure Use & Share Lessons Learned
Ensure Use & Share Lessons Learned • Recommendations • Preparation - Engage & Guide Stakeholders • Feedback • Follow-up • Dissemination
The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them. - George Bernard Shaw
RESOURCES Full resource list on the Literacy Powerline website • University of Wisconsin-Extension: www.uwex.edu/ces/pdande • U.S. Dept. of HHS, CDC - Strategy & Innovation: www.cdc.gov/eval/guide/CDCEvalManual.pdf • Annie E. Casey Foundation: www.aecf.org • Two reports by Organizational Research Services: A Practical Guide to Documenting Influence and Leverage; A Guide to Measuring Advocacy and Policy
For more information… Literacy Powerline www.literacypowerline.com Apter & O’Connor Associates www.apteroconnor.com dianne@apteroconnor.com cynthia@apteroconnor.com 315-427-5747
Research vs. Evaluation Research seeks to prove: • Investigator-controlled • Authoritative • Scientific method - isolate/control variables • Limited number of sources - accuracy • Facts - descriptions, associations, effects Evaluation seeks to improve: • Stakeholder-controlled • Collaborative • Incorporates variables - accounts for circumstances • Multiple sources - triangulation • Values - quality, value, importance