Understand how evaluation helps improve program effectiveness, informs decision-making, and supports accountability. Learn to focus on implementation and outcomes through logic models. Explore principles and methods for evaluating activities, outputs, and outcomes to enhance program impact and communicate results effectively.
Session Purpose
• To understand how evaluation can be useful
• To understand how your logic model helps to focus an evaluation
• To understand both implementation and outcome evaluation
What is Evaluation?
The systematic collection of information about a program in order to enable stakeholders to better understand the program, to improve program effectiveness, and/or to make decisions about future programming.
What’s in it for you?
• Understand and improve your program
• Test the theory underlying your program
• Tell your program’s story
• Be accountable
• Inform the field
• Support fundraising efforts
Evaluation Principles
Evaluation is most effective when it:
• Is connected to program planning and delivery
• Involves the participation of stakeholders
• Supports an organization’s capacity to learn and reflect
• Respects the community served by the program
• Enables the collection of the most information with the least effort
Logic Model
Program Goals: overall aims or intended impacts
• Resources: the resources dedicated to or consumed by the program
• Activities: the actions that the program takes to achieve desired outcomes
• Outputs: the tangible, direct results of a program’s activities
• Outcomes: the benefits to clients, communities, systems, or organizations
External Factors: what else affects the program
Putting Your Plans Together
Logic Model: Resources → Activities → Outputs → Outcomes
Evaluation Plan (Implementation): Activities | Outputs | Data Collection Method | Effort
Evaluation Plan (Outcomes): Outcomes | Indicators | Data Collection Method | Effort
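The evaluation plan above pairs each outcome with its indicators, data collection method, and level of effort. As a purely illustrative sketch (the outcomes, indicators, and field names below are invented, not from the slides), a team could capture those rows as plain records and quickly summarize them:

```python
# Hypothetical sketch: an outcomes evaluation plan as plain records,
# so the team can review what will be measured, how, and at what effort.
evaluation_plan = [
    {"outcome": "Participants gain job-search skills",
     "indicator": "% of participants completing a resume",
     "method": "document review",
     "effort": "low"},
    {"outcome": "Participants gain job-search skills",
     "indicator": "Self-reported confidence on exit survey",
     "method": "written survey",
     "effort": "medium"},
]

def methods_used(plan):
    """Return the distinct data collection methods in the plan."""
    return sorted({row["method"] for row in plan})

print(methods_used(evaluation_plan))  # ['document review', 'written survey']
```

A summary like this makes it easy to spot when many indicators depend on one high-effort method, which feeds directly into the "level of effort" prioritization discussed later.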
Implementation and Outcomes
• Evaluating Outcomes: What changes occurred as a result of your work?
• Evaluating Implementation: What did you do? How well did you do it?
Evaluating Outcomes
• Outcomes: the changes you expect to see as a result of your work
• Indicators: the specific, measurable characteristics or changes that represent achievement of an outcome. They answer the question: How will I know it?
Evaluating Outcomes: Indicators = What to Measure
Good indicators are:
• Meaningful
• Direct
• Useful
• Practical
Evaluating Implementation
• Activities and Outputs: the "what": the work you did, and the tangible results of that work
• Additional Questions: the "why": how well you did it, and why
Evaluating Implementation: Understanding How Well You Did
What information will help you understand your program implementation? Think about:
• Participation
• Quality
• Satisfaction
• Context
Data Collection
Determine what methods you will use to collect the information you need:
• Choose the method
• Decide which people or records will be the source of the information
• Determine the level of effort involved in using that method with that population
Data Collection Methods
• Review documents
• Observe
• Talk to people
• Collect written information
• Pictorial/multimedia
Issues to Consider
• Resist pressure to "prove"
• Start with what you already collect
• Consider the level of effort it will take to gather the data
• Prioritize: what do you need to collect now, and what can wait until later?
Data Collection: Level of Effort
• Instrument development
• Cost/practicality of actually collecting data
• Cost of analyzing and presenting data
Qualitative Data
• Usually in narrative form, not expressed in numbers
• Collected through focus groups, interviews, and open-ended questionnaire items, but also poetry, stories, diaries, and notes from observations
Quantitative Data
• Pieces of information that can be expressed in numerical terms, counted, or compared on a scale
• Collected in surveys, attendance logs, etc.
Both Types of Data are Valuable
• Qualitative information can provide depth and insight about quantitative data
• Some information can only be collected and communicated qualitatively
• Both methods require a systematic approach
What do your data tell you?
• Are there patterns that emerge?
• Patterns for sub-groups of your population?
• Patterns for different components of your program?
• What questions do the data raise?
• What is surprising? What stands out?
• What are other ways the data should be analyzed?
• What additional information do you need to collect?
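One common way to start answering the sub-group question is to compare a simple summary statistic across groups. The sketch below is a minimal, hypothetical example (the records, group names, and "score" field are invented for illustration) of averaging a quantitative outcome measure within each participant sub-group:

```python
# Minimal sketch of looking for sub-group patterns in quantitative data:
# average an outcome score within each participant group.
from collections import defaultdict

records = [
    {"group": "youth", "score": 4},
    {"group": "youth", "score": 5},
    {"group": "adult", "score": 3},
    {"group": "adult", "score": 4},
]

def average_by_group(rows):
    """Average the 'score' field within each 'group'."""
    scores = defaultdict(list)
    for row in rows:
        scores[row["group"]].append(row["score"])
    return {group: sum(vals) / len(vals) for group, vals in scores.items()}

print(average_by_group(records))  # {'youth': 4.5, 'adult': 3.5}
```

A gap between group averages is not a conclusion in itself; as the questions above suggest, it should prompt further analysis and, often, qualitative follow-up to understand why the pattern appears.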
Communicating Findings
• Who is the information for?
• How will you tell them what you know?
“Information that is not effectively shared with others will not be effectively used.”
Source: Building a Successful Evaluation, Center for Substance Abuse Prevention
Audience: Who needs the findings, and what do they need?
Who are the audiences for your results? Which results?
• Staff
• Board
• Funders
• Partners
• Other agencies
• Public
Different Ways to Communicate
Decide what format is appropriate for different audiences:
• Written report
• Short summaries
• Film or videotape
• Pictures, displays
• PowerPoint presentations
• Graphs and other visuals
Whatever communication strategy you choose:
• Link the findings to the program’s desired outcomes
• Include “the good, the bad, and the ugly”
• Be sure you can support your claims
• Acknowledge knowledge gaps
Continuous Learning Cycle
Logic Model → Evaluation Planning → Data Collection → Reflection/Improvement → (back to the Logic Model)
Thanks for Your Participation!
Measure results. Make informed decisions. Create lasting change.
Innovation Network, Inc.
1625 K Street, NW, 11th Floor
Washington, DC 20006
(202) 728-0727
Website: www.innonet.org
Robin: Extension 104; rkane@innonet.org
Veena: Extension 107; vpankaj@innonet.org