This article discusses mainstream and emerging evaluation approaches, their assumptions, and effectiveness in measuring the impact of interventions. It also provides examples of evaluating goals and offers strategies to add value to your work while ensuring donor satisfaction.
Mainstream and Emerging Evaluation Approaches
(and how to stay out of trouble with the donors)
by Ellen Sprenger, AWID Forum, Nov 16th, 2008
Evaluation = “measuring and reflecting on the effectiveness of an intervention”
Evaluative questions:
• Am I realizing my goals?
• Are my assumptions about the path to realizing my goals correct? (= theory of change)
Examples:
Goal 1: Build a new bridge to strengthen the municipal economy
Goal 2: Promote a feminist vision and practice of leadership
Mainstream approaches
Are based on the following assumptions about how change happens:
• The world is logical (“smart people can figure it out”)
• Change is linear (cause and effect, predictable, controlled)
• Evaluation is about gathering value-free, objective information
• Change can be attributed to a single intervention or actor
Mainstream approaches work very well when:
• Dealing with a technical problem
• The problem is well defined
• The answer is known (or can be known)
• Implementation is clear
• Solutions can be imposed by one organization/entity
Log Frame example 1: build new bridge to solve traffic problems
Emerging approaches
Are based on the following assumptions:
• The world is political (“change is about transforming power relations”)
• Change is complex and multi-dimensional
• Evaluation is about learning what works and what doesn’t
• Change is the result of many different actions, actors, and circumstances (“contributions to change”)
Emerging approaches work very well when:
• Dealing with adaptive challenges (i.e. when people need to change!)
• Challenges are complex
• Answers are not known
• Implementation requires learning (changes in values, behavior)
Where things go wrong
If a mainstream/technical approach is applied to an “adaptive challenge”:
• rigid frameworks take over
• unrealistic claims are made
• groups compete for impact: individual organizations take credit for collaborative work (“attribution versus contribution”)
• limited learning takes place
• evaluation becomes a chore, a burden
The second-generation log frame provides more space…
• Quantitative and qualitative indicators
• Focus on outcome areas that your organization can influence
• Talk about “probability of impact,” including other factors that contribute to impact:
  • Other actors, including collaboration partners
  • Assumptions
  • Risk analysis
The good news is that the evaluation field is changing…
• Many more ‘alternative’ frameworks are becoming available
• A growing number of women’s rights organizations are starting to figure it out
• Donors are more open to alternatives (“logframe is the best of a bunch of bad options available”, in The Use and Abuse of Logframe, SIDA 2005)
How can you evaluate in ways that add value to your work AND keep you out of trouble with your donors?
First…
• Remember that donors, too, are not completely sure what they are doing
• Develop your own theory of change
Next?
• Develop or use an evaluation framework that FITS with your theory of change
• If you have to use a log frame… use it creatively
• Invest in organizational capacities
And…
• Educate your donors
• Own your way of doing things and push back!