What is Impact Evaluation … and How Do We Use It? Deon Filmer, Development Research Group, The World Bank. Evidence-Based Decision-Making in Education Workshop, Africa Program for Education Impact Evaluation (APEIE), Accra, Ghana, May 10–14, 2010
Some examples • Should my government distribute free textbooks to students to promote learning? • Should we distribute scholarships to poor children to promote attendance? • Should teachers be rewarded for learning improvements of their students? • Should management decisions be devolved to the school level? Impact evaluation is a way to start answering those questions using rigorous evidence
What is Impact Evaluation? Example • You would like to distribute textbooks to students as a way of improving learning outcomes • Your intuition tells you that textbooks should matter • But what is that intuition based on? • Own experience • “Common sense” • Observing children in schools • Comparisons between children in schools with textbooks and those without • Impact evaluation, in this situation, would aim to provide rigorous evidence, based on actual experience, of what the actual impact of providing textbooks is
What is Impact Evaluation? • How would impact evaluation achieve this aim? • By establishing the causal impact of textbooks on learning outcomes • This is the ultimate goal of impact evaluation. • This workshop will be about • What makes for a good estimate • How to estimate that impact • How to interpret the estimate
Why do we use impact evaluation? • Understand if policies work • Might an intervention work (“proof of concept”)? • Can an intervention be done on a large scale? • What are alternative interventions to achieve a particular goal, and how do they compare?
Why do we use impact evaluation? • Understand the net benefits of the program, and cost-effectiveness of alternatives • Requires good cost and benefit data • Understand the distribution of gains and losses • Budget constraints force selectivity • Bad policies and programs are wasteful and can be hurtful
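Once impact and cost estimates exist, the cost-effectiveness comparison mentioned above is simple arithmetic: cost per unit of outcome gained. A minimal sketch with invented impacts and costs (the intervention names and all numbers are hypothetical, not from the slides):

```python
# Comparing cost-effectiveness of two hypothetical interventions.
# Impacts and costs are invented purely for illustration.

interventions = {
    # name: (extra years of schooling gained per child, cost per child in $)
    "scholarships": (0.30, 60.0),
    "textbooks":    (0.05, 5.0),
}

for name, (impact, cost) in interventions.items():
    # Cost-effectiveness ratio: dollars spent per additional year of schooling
    ratio = cost / impact
    print(f"{name}: ${ratio:.0f} per extra year of schooling")
```

With these made-up numbers the cheaper intervention per child (textbooks) is also the more cost-effective one, but only the ratio, not the raw cost, tells you that.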
Why do we use impact evaluation? • Demonstrate to politicians, population, donors that a program is effective • This can be key to sustainability • Informs beliefs and expectations
Putting impact evaluation in context • Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned • Evaluation: Analytical efforts to answer specific questions about the performance of a program/activities • Impact evaluation: Analytical efforts to relate cause and effect; the key part is establishing “what would have happened in the absence of the intervention”
Monitoring, Evaluation, and Impact Evaluation • Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned • Periodically collect data on the indicators and compare actual results with targets • Identify bottlenecks and red flags (time lags, fund flows) • Point to what should be further investigated
Monitoring, Evaluation, and Impact Evaluation • Evaluation: Analytical efforts to answer specific questions about the performance of a program/activities • Analyzes why intended results were or were not achieved • Explores targeting effectiveness • Explores unintended results • Provides lessons learned and recommendations for improvement
Monitoring, Evaluation, and Impact Evaluation • Impact evaluation: Analytical efforts to relate cause and effect; the key part is establishing “what would have happened in the absence of the intervention” • What is the effect of the program on outcomes? • How much better off are beneficiaries because of the intervention? • How would outcomes change under alternative program designs? • Does the program impact people differently (e.g. females, the poor, minorities)? • Is the program cost-effective?
The central problem in Impact Evaluation Analysis: The counterfactual • In order to establish the impact of the program, we need to know what would have happened in the absence of the program • Not in general, but specifically for the people who actually received the program
The central problem in Impact Evaluation Analysis: The counterfactual • What is the effect of a scholarship on school enrollment? • We want to observe the units of treatment in two states • What’s wrong with this picture? Elizabeth on 1 July 2010 with scholarship Elizabeth on 1 July 2010 without scholarship
The central problem in Impact Evaluation Analysis: The counterfactual • This is impossible! • We never observe the same individual with and without the program at the same point in time • The counterfactual is never actually observed • It needs to be estimated • Impact Evaluation Analysis is all about alternative approaches to estimating the counterfactual
Why is the counterfactual important? • The next session will discuss the counterfactual in more detail • Here, just one illustration
Illustration of the importance of the counterfactual • Question: What is the best estimate of the impact of the program on enrollment? [Chart: enrollment over time, before (2008, point A) and after (2010, point B) the program; the before/after change suggests the impact is the rise from A to B]
Illustration of the importance of the counterfactual • Question: What is the best estimate of the impact of the program on enrollment? [Chart: the same points A (2008) and B (2010), plus two possible counterfactual paths, C and D; the estimated impact depends on whether enrollment would have reached C or D without the program]
What is impact evaluation? • Impact is … the difference between outcomes with the program and without it • Impact evaluation involves … estimating the counterfactual so that changes in outcomes can be attributed to the program
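The definition above can be made concrete with a small simulation (all numbers are hypothetical, not from the slides): each child has a potential enrollment outcome with and without a scholarship, the true impact is the average difference between the two, and a randomized control group recovers the counterfactual mean that can never be observed for any individual child.

```python
import random

random.seed(0)
N = 10_000

# Potential outcomes for each child (hypothetical probabilities):
# y0 = enrolls WITHOUT the scholarship, y1 = enrolls WITH it
children = []
for _ in range(N):
    y0 = 1 if random.random() < 0.60 else 0          # 60% enroll anyway
    y1 = 1 if y0 or random.random() < 0.25 else 0    # scholarship adds some enrollees
    children.append((y0, y1))

# True impact: average of y1 - y0 (never observable for a real child,
# since we only ever see one of the two outcomes)
true_impact = sum(y1 - y0 for y0, y1 in children) / N

# Randomized experiment: coin-flip assignment, observe one outcome per child
treated, control = [], []
for y0, y1 in children:
    if random.random() < 0.5:
        treated.append(y1)   # observed "with scholarship" outcome
    else:
        control.append(y0)   # observed "without scholarship" outcome

# The control-group mean estimates the counterfactual for the treated group
estimate = sum(treated) / len(treated) - sum(control) / len(control)
print(f"true impact ~ {true_impact:.3f}, experimental estimate ~ {estimate:.3f}")
```

The point of the sketch is the comment on `true_impact`: the quantity of interest is defined over outcomes we can never jointly observe, so the control group stands in for the missing counterfactual.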
What is involved in implementing an impact evaluation? • Determine why an IE is called for • Understand the program and its results chain • Determine what to measure • Determine the methodology • Carry out • Data collection (baseline, follow-up) • Program implementation • Analysis and reporting • Adjust policy
Determine why the evaluation is called for • Specific intervention • Cash transfer to specific students • Specific teacher training program with a particular curriculum • School grant program with a particular structure • Alternative/complementary interventions • Teacher training versus school grants as a way to improve outcomes • Information and school grants as a way to boost school accountability and performance • Entire program/cluster of activities • Reform program • What is the audience (policymakers, technocrats, the public at large)?
Understand the program using the results chain: Inputs → Activities (what the program does) → Outputs (goods & services) → Outcomes → Long-term results • Inputs: teachers, textbooks, grants • Activities/Outputs: teachers’ training, school councils established, conditional cash transfers, salary incentives • Outcomes: increased enrollment, lower teacher absenteeism, higher primary education completion rates, higher student learning achievement • Long-term results: lower unemployment, poverty reduction, better income distribution • This results chain provides guidance on what to measure
Determine what to measure • Based on the results chain • Choose indicators for the evaluation • Carefully defined indicators • Can be measured in a precise way • Are expected to be affected by the program… • … within the timeframe of the evaluation • What are the important sub-populations • E.g. age, gender, urban/rural, SES…
Determine indicators using the results chain: Inputs → Activities → Outputs → Outcomes → Long-term results • Indicators related to the implementation of a program cover the early links: inputs (teachers, textbooks, grants) and activities/outputs (teachers’ training, school councils established, conditional cash transfers, salary incentives)
Determine indicators using the results chain: Inputs → Activities → Outputs → Outcomes → Long-term results • Indicators related to the results of a program cover the later links: outcomes (increased enrollment, lower teacher absenteeism, higher primary education completion rates, higher student learning achievement) and long-term results (lower unemployment, poverty reduction, better income distribution)
Determine the methodology • We’ll be talking a lot more about this • Experimental methods • Quasi-experimental methods • Some principles: • Prefer method that complements program best • Prefer method that does not alter program design or implementation substantively • Prefer method that does not deny anyone benefits
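One widely used quasi-experimental method of the kind mentioned above is difference-in-differences, which uses a comparison group’s change over time as the counterfactual trend. A minimal sketch with made-up enrollment rates (the numbers are invented for illustration):

```python
# Difference-in-differences: a common quasi-experimental estimator.
# All enrollment rates below are made up for illustration.

# Share of children enrolled, before and after the program period
program_before, program_after = 0.60, 0.74        # districts with the program
comparison_before, comparison_after = 0.62, 0.66  # similar districts without it

# Naive before/after comparison attributes ALL of the change to the program
naive = program_after - program_before

# Diff-in-diff subtracts the comparison group's change (the counterfactual
# trend: what would likely have happened anyway without the program)
did = (program_after - program_before) - (comparison_after - comparison_before)

print(f"before/after: {naive:+.2f}, diff-in-diff: {did:+.2f}")
# prints: before/after: +0.14, diff-in-diff: +0.10
```

With these numbers, enrollment was drifting up everywhere, so the naive before/after estimate (+0.14) overstates the program’s effect relative to the diff-in-diff estimate (+0.10). The method’s key assumption, that both groups would have followed parallel trends without the program, is exactly the counterfactual judgment the earlier slides emphasize.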
The IE “cycle”: Design phase → Baseline data collection → Program implementation → Follow-up data collection → Data analysis and reporting → Program adjustments (and back to design)
The goal of impact evaluation • To improve policies • For example, to find out how to turn this teacher…
The goal of impact evaluation • To improve policies • …into this teacher