Join us on January 23, 2014, for a training session on measuring results in school-based programs hosted by Communities in Schools Chicago. Learn how to design relevant programs, measure outcomes, and communicate results to donors for improved program effectiveness. By the end of this training, participants will be able to apply a results framework to program design, identify components of an M&E plan, and link indicators to design frameworks. Discover the importance of measurement in decision-making and donor relations, and develop a results framework that conveys logic to donors. Don't miss the chance to enhance your skills in measuring and evaluating program outcomes!
Measuring Results in School-based Programming • January 23, 2014 • Host: Communities in Schools Chicago • Facilitators: Emily Ardell, Joanna Cohen, Sydney Bern-Story • Heartland Alliance International
Training Agenda • Morning: • Introductions • Designing Relevant Programs • Laying the Foundation for Measuring Outcomes • Break for Lunch • Afternoon: • Why Measurement Matters • Strategies for Measuring Outcomes • Communicating Outcomes to Donors • Wrap-up
Training Overview • By the end of this training, you will be able to: • Apply a results framework to program design • Identify the main components of an M&E plan • Understand how to link indicators to design frameworks • Describe how information can be used for decision-making and donor relations
What do these three things have in common? Moon landing iPhone Vaccines
First Steps • Understand your problem • Tool: problem tree analysis • Brainstorm solutions • Generate lots of ideas • Defer judgment • Embrace crazy, off-the-wall ideas • Be visual • Stay on topic • Build on others’ ideas
Your Challenge • Create a BASIC problem tree • Brainstorm as many solutions as you can
Make Your Mark • Consider organization’s mission/vision, resources available, strategy, multiplier effect, sustainability • MEASUREMENT STARTS NOW: Put it together in a way that makes sense and lends itself to measurement
How do we convey our logic to our donors? • Need to determine the sequence of steps needed to achieve objectives, in measurable terms • Sequence of steps is portrayed in a visual summary of the project design that clearly outlines its logical flow • Why does this matter? Do donors really care?
Enter the Results Framework • Communicates project components • Improves project monitoring and reporting • Strengthens project implementation
Building a Results Framework • [Diagram: the Goal sits at the top and branches into Strategic Objectives (SO 1, SO 2); each SO branches into Intermediate Results (IR 1.1, IR 1.2, IR 2.1, IR 2.2, IR 2.3); each IR branches into its Activities (e.g., Activities 1.1.1 through 1.1.3). Arrows labeled HOW and WHY mark the logic: reading down the tree answers how, reading up answers why.]
Components of the Results Framework • Goal: broad statement about the long-term outcome to which the project will contribute • Strategic Objectives (SO): desired outcomes that will be accomplished during the life of the project • Intermediate Results (IR): expected changes, resulting from activities, that are necessary to achieve an SO • Activities: hands-on events that occur during the life of the project and are necessary to achieve an IR
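To make this hierarchy concrete, here is a minimal sketch in Python (not part of the original training materials) of a results framework captured as nested data; the goal, objective, result, and activity text are hypothetical placeholders.

```python
# A hypothetical results framework captured as nested data.
# Each level mirrors the components above: Goal -> SOs -> IRs -> Activities.
results_framework = {
    "goal": "Students in partner schools thrive academically and socially",
    "strategic_objectives": [
        {
            "id": "SO 1",
            "statement": "Students improve conflict-resolution skills",
            "intermediate_results": [
                {
                    "id": "IR 1.1",
                    "statement": "Students apply mediation techniques in class",
                    "activities": [
                        "Activity 1.1.1: Train students in peer mediation",
                        "Activity 1.1.2: Hold monthly practice sessions",
                    ],
                },
            ],
        },
    ],
}

# Walking the structure top-down answers "how"; bottom-up answers "why".
for so in results_framework["strategic_objectives"]:
    print(so["id"], "-", so["statement"])
    for ir in so["intermediate_results"]:
        print(" ", ir["id"], "-", ir["statement"])
        for activity in ir["activities"]:
            print("   ", activity)
```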
Tips for strong SOs and IRs • Emphasize what you’re going to change; this is the heart of your strategy • Make sure the change is big enough to matter • Agents of change should be the subject of the sentence • Every statement should have ONE focus • Use strong action verbs
Is it SMART? • Ex. Students increase their attendance record at school. • Specific?: No, more clarification is needed to know the target population. • Measurable?: Yes, attendance is measurable. • Appropriate?: Unknown, more information is needed about the program’s goal. • Realistic?: Yes, because attendance is already being tracked in schools. • Timely?: No, the time within which the objective is to be achieved is not specified. • Verdict: NO. Go back to the drawing board. A SMART revision might read, for example: “By the end of the school year, students at participating schools increase their attendance rate by 10% over baseline.”
A few final thoughts on results frameworks: • Prepare for the worst • Expect an iterative process • Keep measurement in mind • The time investment is worth it, we promise!
When you read that “the incidence of violence in a school has been reduced by 20%,” have you ever wondered how this figure was derived? Or when you hear that “the percentage of teenage girls of reproductive age in a school district who are using a modern contraceptive method rose from 52% to 73% in 2013,” do you wonder how people know this? These types of statistics result from “monitoring & evaluation” efforts.
Monitoring & Evaluation (M&E) The process by which data are collected and analyzed in order to provide information to program staff, policy makers, and others for use in program planning and project management.
M&E – “Monitoring” • Ongoing collection of data • Ex. Number of in-school trainings • Tracks changes over time • Ex. Number of students who visit museums in 2013 vs. 2014 • Helps with planning • Actual vs. Planned targets • Facilitates regular reporting and shared accountability • Delays, accomplishments, clarifications
Monitoring in Action • Graph 1: # of students trained in mediation, leadership and advocacy over time, from project start through Year 1 to project end. The line rises from 50 students at project start to 250 at project end, a 400% increase in the number of students trained in mediation, leadership and advocacy.
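As a quick check of the arithmetic behind such a claim, here is a minimal sketch in Python, assuming (as the chart suggests) a rise from 50 to 250 students trained, of how a percent increase against a baseline is calculated.

```python
# Percent increase = (current - baseline) / baseline * 100
baseline_students = 50    # students trained at project start (from Graph 1)
current_students = 250    # students trained at project end (from Graph 1)

percent_increase = (current_students - baseline_students) / baseline_students * 100
print(f"{percent_increase:.0f}% increase")  # -> 400% increase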
M&E – “Evaluation” • Periodic collection of data • Ex. beginning, middle, and end of project • Measures the extent to which the project has met expectations • Ex. Exceeded project target • Facilitates learning • What works and what doesn’t?
Why is M&E important? • Enables informed decision-making • Ex. Which program activities were more effective and which were less effective? What adjustments need to be made, if any? • Ensures effective and efficient use of resources • Ex. Was the program implemented as planned? • Meets organization reporting and other requirements • Ex. Did the target population benefit from the program, and at what cost? • Convinces donors their investments are worth it! • Ex. What evidence do you have that the program works?
Developing an M&E plan • What, how, when • Usually includes an outline of the project components, indicators, baseline numbers, target numbers, data sources, and frequency of data collection and/or reporting • Visually communicates an outline of project • Guides the implementation of M&E activities in a standardized and coordinated way • Provides accountability and institutional memory
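As an illustration only, here is a minimal sketch in Python of one row of a hypothetical M&E plan covering the elements listed above; the indicator, baseline, target, data source, and frequencies are invented, not drawn from the training.

```python
# One row of a hypothetical M&E plan: project component, indicator, baseline,
# target, data source, and collection/reporting frequency.
me_plan = [
    {
        "project_component": "IR 1.1: Students apply mediation techniques",
        "indicator": "# of students trained in peer mediation",
        "baseline": 0,
        "target": 250,
        "data_source": "Training attendance sheets",
        "collection_frequency": "Monthly",
        "reporting_frequency": "Quarterly",
    },
]

for row in me_plan:
    print(f"{row['indicator']}: baseline {row['baseline']}, target {row['target']}")
```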
Indicators • Variables that measure one aspect of a program • Only as good as one’s ability to measure them accurately and consistently • Can be either quantitative or qualitative
Types of Measurement • Quantitative:measures of quantities or amounts • Ex. % of students that implement community-based projects following training • Qualitative:descriptive observations usually related to people’s knowledge, behavior, or attitudes • Ex. Presence of social cohesion in classroom • Can also be quantified
Types of Indicators • Impact: changes in circumstances proven to be caused by a program; counterfactual necessary • Outcome:changes in knowledge, attitudes, or behaviors • Ex. % of students ages 12-14 that practice healthy eating habits, as compared to baseline • Output:immediate results of a program’s activities • Ex. # of students ages 9-10 trained in karate
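For instance, here is a minimal sketch in Python, using made-up survey counts and a made-up baseline, of how the outcome indicator above could be computed and compared to baseline.

```python
# Hypothetical data for the outcome indicator
# "% of students ages 12-14 that practice healthy eating habits".
baseline_pct = 40.0                # % at baseline (made up)
students_surveyed = 200            # students ages 12-14 surveyed at follow-up
students_healthy_eating = 130      # of those, reporting healthy eating habits

followup_pct = students_healthy_eating / students_surveyed * 100
change = followup_pct - baseline_pct
print(f"Outcome indicator: {followup_pct:.0f}% "
      f"(baseline {baseline_pct:.0f}%, change +{change:.0f} points)")
```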
Identifying Indicators • Organizational standards • Industry standards • Resource: www.performwell.org • Build your own
Developing Good Indicators • Show what is going to change • Identify target population • Reflect changes in the condition over time (where possible)
Developing Good Indicators • Make them SMART • Determine how you’re going to measure it • the calculation or formula on which the indicator is based • Ex. % of CIS schools that score at least 85% on knowledge survey • Clarify poorly defined terms • Ex. number of students trained
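Here is a minimal sketch in Python, with hypothetical school scores, of the kind of calculation behind the example indicator above, “% of CIS schools that score at least 85% on knowledge survey.”

```python
# Hypothetical average knowledge-survey scores, one per CIS school.
school_scores = {"School A": 92, "School B": 78, "School C": 88, "School D": 85}

threshold = 85
schools_meeting = [name for name, score in school_scores.items() if score >= threshold]
pct_meeting = len(schools_meeting) / len(school_scores) * 100

print(f"% of CIS schools scoring at least {threshold}%: {pct_meeting:.0f}%")
# -> 75% (School A, School C, School D)
```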
Setting Project Targets • Research baseline (primary or secondary data) • Focus on what the program should achieve; be ambitious, but realistic • How much change is enough? What is success? • Orient stakeholders to the task to be accomplished • Motivate & Monitor
Indicator Exercise • DIRECTIONS: Develop at least 1 outcome indicator for your SO or IR, and an output indicator for each activity that you have listed.
Data Sources & Collection Methods • Data sources are the resources used to obtain data for M&E activities • Can be pre-existing or you can build your own • Resource: www.performwell.org • Quality vs. feasibility • Use mixed sources (quantitative & qualitative) whenever possible
Examples of Data Collection Methods and Tools • Quantitative: • Checklist • Testing • Questionnaire (enumerated and self-administered) • Survey • Qualitative: • Direct Observation • Document Review • Focus Group Discussions • Interviews (semi-structured and structured) • Stories of change
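As a small illustration in Python (the responses and themes are invented, not from the training), quantitative tool output is typically tallied numerically, while qualitative notes are coded and grouped into themes.

```python
from collections import Counter

# Quantitative: tally yes/no questionnaire responses (hypothetical data).
questionnaire = ["yes", "no", "yes", "yes", "no", "yes"]
tally = Counter(questionnaire)
print(f"Reported feeling safe at school: {tally['yes']}/{len(questionnaire)}")

# Qualitative: count recurring themes coded from focus group notes (hypothetical).
coded_themes = ["peer support", "bullying", "peer support", "teacher trust"]
print(Counter(coded_themes).most_common(2))
```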
Types of Information Use • Program Management: inform decisions to guide and improve ongoing project implementation • Learning and Knowledge-sharing: advance organizational learning and knowledge-sharing for future programming, both within and outside the organization • Accountability and Compliance: demonstrate how and what work has been completed, and whether it met specific donor, organizational, or international standards • Business Development and Advocacy: highlight and promote accomplishments and achievements to mobilize resources and policy initiatives
Pop quiz: Which of the following statements packs more punch? • Our programs improve the lives of school-aged adolescents • 95% of the youth that have participated in our program demonstrate increased self-esteem and an improved ability to protect themselves against bullying by their peers
Donors want proof, and now you’ve got it! • Donors want to see their money support real change • Impact, impact, impact (outcome, outcome, outcome!) • Information has been gathered by monitoring and tracking progress against indicators • Data has been analyzed • There is both quantitative and qualitative information that serves as clear proof that your intervention “works”
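As a minimal sketch in Python, with invented pre/post survey scores, a statement like the pop-quiz example (“95% of participating youth demonstrate increased self-esteem”) typically comes from comparing each participant’s follow-up score to their baseline.

```python
# Hypothetical pre/post self-esteem scores for participating youth (paired by index).
pre_scores  = [12, 15, 10, 18, 14, 11, 16, 13, 17, 12]
post_scores = [16, 18, 15, 20, 17, 15, 19, 12, 21, 16]

improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
pct_improved = improved / len(pre_scores) * 100
print(f"{pct_improved:.0f}% of participants showed increased self-esteem")  # -> 90%
```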
What information is worth sharing? Take cues from your donor: • What interests them most? Choose a handful of key indicators that align with their strategic priorities. • What kind of change is most important to them? If they are more interested in numbers reached than behavior change, focus on outputs rather than outcomes (or vice versa). • Real proof of effectiveness is rare among non-profits competing for funds, so use it to your advantage!
Sharing information on outcomes • In-person meetings to “pitch” a concept, armed with relevant materials • Written proposals and concept notes • Key tips for success • Networking gatherings and public events, with prepared pitch and key data findings
Sharing results on paper • Avoid technical terms that the audience might not understand • Adopt a conversational style, if appropriate • Consider using bullets to break up long sentences • Use boxes, pictures, text boxes, tables, and charts to convey information • Write in active voice and avoid passive language
Develop a Communication and Reporting Strategy • Source: Adapted from Torres et al. 2005
Donor communication tips • Know your audience • Remain timely and frequent • Variety is the spice of life • Keep it simple and clear