Issues in impact evaluation in design Howard White International Initiative for Impact Evaluation (3ie)
Impact evaluation An impact evaluation seeks to attribute all, or part, of the observed change in outcomes to a specific intervention.
So what is impact? • Focus on final welfare outcomes, e.g. • Infant mortality • Income poverty • Security • Usually long-term, though not necessarily so (if not, sustainability becomes an issue)
What sort of things can we do impact analysis on? • Projects (or specific interventions) • Individual projects are the ‘backbone’ of impact analysis • But even then it may only be possible to do rigorous impact analysis of some components • Programmes • Sector-wide programmes can be conceived of as supporting a range of interventions, many of which can be subject to rigorous impact evaluation • Policies • In general different approaches are required, such as computable general equilibrium (CGE) models – these are not being discussed today
Engagement 1 Pick a named intervention for an impact evaluation and make a short list of indicators (using the log frame) for evaluation of this intervention
What do we need to measure impact? Girls’ secondary enrolment
Post-treatment control comparison But we don’t know if the groups were similar before the intervention… though there are ways of checking this
Before versus after comparison Sometimes this can work … but usually not
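The two naive comparisons above can be combined into a difference-in-differences estimate, which nets out both the secular trend (which biases before-versus-after) and pre-existing group differences (which bias the post-only comparison). A minimal sketch on simulated enrolment data – all numbers are hypothetical, chosen only to illustrate the two biases:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, n)  # 1 = programme area

# Hypothetical enrolment rates (%): treated areas start 5 points lower
baseline = 50 - 5 * treated + rng.normal(0, 2, n)
# +3-point secular trend everywhere; true programme effect is +6 points
endline = baseline + 3 + 6 * treated + rng.normal(0, 2, n)

# Before-versus-after in treated areas: confounded by the trend (~9, too high)
before_after = endline[treated == 1].mean() - baseline[treated == 1].mean()
# Post-treatment comparison only: confounded by the baseline gap (~1, too low)
post_only = endline[treated == 1].mean() - endline[treated == 0].mean()
# Difference-in-differences: subtract the control group's change (~6, the true effect)
did = before_after - (endline[treated == 0].mean() - baseline[treated == 0].mean())
```

This is exactly why baseline data matter (next slide): without the pre-intervention measurement, the double difference cannot be computed.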
THE IMPORTANCE OF BASELINE DATA • Ex ante design is preferred to ex post: the impact evaluation design is much stronger if baseline data are available (though evaluation may still be possible without them) • Means collecting data before the intervention starts, and can affect the design of the intervention • But can sometimes use secondary data, that is, an existing survey
Issues in conducting impact evaluation • Confounding factors • Selection effects • Spillovers and contagion • Impact heterogeneity • Ensuring policy relevance
Confounding factors • Other things happen – so before versus after rarely sufficient • So get a control group… but different things may happen there • So collect data on more than just outcome and impact indicators • And collect baseline data • But …
Selection bias • Program placement and self-selection • Program beneficiaries have particular characteristics correlated with outcomes – so impact estimates are biased • Need to use experimental or quasi-experimental methods to cope with this; this is what has been meant by rigorous impact evaluation • But it is just one facet of impact evaluation design • Other things can also bias impact estimates
How to address selection bias • Experimental (randomized) design: • Limited application, but there are applications and it is a powerful approach • Many concerns are raised (e.g. budget and ethics), not all of them valid • Quasi-experimental design (regression based): • Propensity score matching is the most common • Regression discontinuity • Interrupted time series • Regression modelling of outcomes
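As an illustration of the matching idea, the sketch below simulates self-selection: better-off households are more likely to join a programme, so the naive treated-versus-untreated comparison overstates the effect, while matching each treated unit to the untreated unit with the nearest propensity score recovers something close to the true effect. For simplicity the true selection probability is used as the score; in practice it would be estimated (e.g. by logistic regression on observed covariates). All parameters are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
wealth = rng.normal(0, 1, n)                 # confounder driving selection
p = 1 / (1 + np.exp(-1.5 * wealth))          # richer households self-select in
treated = rng.random(n) < p
# Outcome depends on wealth AND treatment; true treatment effect = 4
outcome = 2.0 * wealth + 4.0 * treated + rng.normal(0, 1, n)

# Naive comparison is biased upward: treated units were better off anyway
naive = outcome[treated].mean() - outcome[~treated].mean()

# One-to-one nearest-neighbour matching on the score (with replacement)
ctrl_idx = np.where(~treated)[0]
matches = ctrl_idx[np.abs(p[treated][:, None] - p[ctrl_idx][None, :]).argmin(axis=1)]
att = (outcome[treated] - outcome[matches]).mean()  # close to the true effect of 4
```

Matching, like all quasi-experimental methods, only removes bias from *observed* characteristics; randomization also balances the unobserved ones.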
Spillover and contagion • Spillover – positive and negative impacts on non-beneficiaries • Contagion – similar interventions in control areas • Need to collect data on these aspects and may need to revise evaluation design
Engagement 2 WHAT ARE THE MAJOR CONFOUNDING FACTORS FOR YOUR OUTCOME AND IMPACT INDICATORS? HOW MIGHT SELECTION BIAS, SPILLOVER AND CONTAGION AFFECT THE EVALUATION OF THE INTERVENTION YOU HAVE SELECTED?
Impact heterogeneity • Impact varies by intervention (design), beneficiary and context • ‘Averages’ can be misleading • Strong implications for evaluation design
Impact heterogeneity by design: complements or substitutes? • Is the impact of doing X and Y together bigger than, equal to, or less than the sum of the impacts of doing X and Y separately? • For example, hygiene promotion and sanitation facilities • Evidence suggests they are substitutes – either one reduces the incidence of child diarrhoea by 40–50%, but combining the two does not reduce it further
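The substitutes pattern can be checked by comparing the four cells of a two-by-two design. In this stylised simulation (all rates invented to mirror the slide's 40–50% figure), either intervention alone cuts diarrhoea incidence by about 45%, and the combined cell is no lower than the single-intervention cells – the signature of substitutes rather than complements:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4000
hygiene = rng.integers(0, 2, n)
sanitation = rng.integers(0, 2, n)
# Stylised risk: either intervention cuts incidence ~45%; combining adds nothing
risk = 0.40 * np.where((hygiene == 1) | (sanitation == 1), 0.55, 1.0)
ill = rng.random(n) < risk

def rate(h, s):
    """Observed diarrhoea incidence in the (hygiene, sanitation) cell."""
    return ill[(hygiene == h) & (sanitation == s)].mean()

# rate(0, 0) is ~0.40; rate(1, 0), rate(0, 1) and rate(1, 1) are all ~0.22
```

If the interventions were complements, rate(1, 1) would sit well below rate(1, 0) and rate(0, 1); here it does not – which is why the evaluation design needs all four cells, not just "any intervention" versus "none".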
Impact heterogeneity by beneficiary: nutrition interventions • Irreparable damage to physical and cognitive development results from nutritional deprivation in the first two years of life • Hence interventions to infants have greater long-run impact on many outcomes than do those aimed at older children (such as school feeding programs)
Impact heterogeneity by context: expected impact of irrigation project under different scenarios
ENGAGEMENT 3 What sort of differences in impact would you expect for your intervention with respect to intervention (design), context and beneficiary?
Ensuring policy relevance • Process • Stakeholder engagement • Packaging messages • Design • Theory-based approach • Mixed methods • Capture all costs and benefits, including cross-sectoral effects • Cost effectiveness and CBA
THEORY-BASED EVALUATION • Make explicit underlying theory about how inputs lead to intended outcomes and impacts • Documents every step in causal chain • Draws on multiple data sources and approaches • Stresses context of why or why not working
Data collection • Need to collect survey data at the unit of intervention (child, firm, etc.) • Will also need facility/project data • Need data across the log frame and on confounding factors – and on your instrumental variables (lack of valid instruments is the major obstacle to performing IE) • Designing data collection instruments takes time and should be iterated with qualitative data
Group exercise OUTLINE YOUR PROPOSED EVALUATION DESIGN (TIMING OF DATA COLLECTION, IDENTIFICATION OF CONTROL, IF ANY) WHAT DATA SOURCES WOULD YOU USE FOR YOUR PROPOSED EVALUATION?
Thank you VISIT WWW.3IEIMPACT.ORG