Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank
Impact Evaluation and Gender in Finance and Private Sector & Agriculture and Rural Development: What, How and for Whom
Two new DIME programs
• AADAPT: Impact Evaluation in Agricultural Adaptations
• DIME-FPD: Impact Evaluation in Finance and Private Sector
These programs aim to:
• Improve the quality of ARD and FPD operations
• Build capacity for evidence-based policy in relevant government agencies
• Generate knowledge on ARD and FPD policies and interventions
With global outreach
• AADAPT: Africa, Latin America, MENA, South Asia
• DIME-FPD: Africa, Europe, Latin America, MENA, South Asia
DIME's new business model
• Strengthen the analytical content of operations
• The Bank's research team works with operations throughout the project cycle
• Each project develops a set of operational, testable options
• Options are tested, and the better ones are selected for scale-up
• Improve the quality of operations from design to completion
• Use results-based operations to demonstrate to governments how evidence-based policy-making works in practice
The evaluative process is not one-shot
• The impact evaluation product delivers advice to clients at many points in the policy decision cycle
• Training and sharing of evidence during project preparation ensure that prior evidence is included in the design
• The availability of data improves reporting and strengthens the monitoring function
• Testing operational alternatives guides implementation
• Measures of program effectiveness validate the assumptions in the results framework
Align local incentives
• Local knowledge matters, and solutions need to be tailored
• Through a process of capacity development and facilitated discussion, each evaluation responds to a client-defined learning agenda
• This aligns the incentives for knowledge generation with local needs by focusing on client ownership and operational relevance
• Local knowledge feeds into the improvement of local programs
Capacity is built through
• Formal training,
• Networking with a larger community of practitioners, and
• Learning by doing through joint evaluations with the Bank.
This helps policy makers understand the tools put at their disposal and take ownership.
Communities of practice
• Clients become members of a club of peers from multiple countries, with access to international experts
• Periodic cross-country activities give clients and Bank operations a forum to compare and benchmark their results and learn from the experience of others
• Here, general lessons can be drawn from local experience
Teasing out global lessons
• Once a significant body of work is completed, DIME works to synthesize results
• The synthesis is facilitated when the individual evaluations share a common analytical and measurement framework
What is impact evaluation?
• Impact evaluation measures the effect of an intervention on outcomes of interest relative to a counterfactual (what would have happened in the absence of the intervention)
• It identifies the causal effect of an intervention on an outcome, separately from the effect of other time-varying conditions
To do this we need a good counterfactual
• The treated and control groups should have identical observed and unobserved characteristics
• Then the only reason for a difference in outcomes is the intervention
How?
• Assign the intervention to some eligible populations and not others, at random or on the basis of clear and measurable criteria
• This yields a treatment group and a control group
• Measure and compare outcomes in those groups over time
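The randomize-then-compare recipe above can be illustrated with a small simulation. Everything in this sketch (population size, a true effect of 5 units, the noise levels) is invented for illustration; it is not DIME code or data.

```python
# Minimal sketch of randomized assignment with simulated data.
# All numbers (sample size, true effect of 5, noise) are invented.
import random

random.seed(42)

# A hypothetical eligible population with a baseline-determined outcome.
population = [{"id": i, "outcome": random.gauss(100.0, 10.0)} for i in range(1000)]

# Randomly assign half to treatment, half to control.
random.shuffle(population)
treatment, control = population[:500], population[500:]

# Simulate the intervention: raise treated outcomes by about 5 units.
for unit in treatment:
    unit["outcome"] += 5.0 + random.gauss(0.0, 2.0)

def mean_outcome(group):
    return sum(u["outcome"] for u in group) / len(group)

# Because assignment was random, the control group approximates the
# counterfactual, so the difference in means estimates the impact.
impact_estimate = mean_outcome(treatment) - mean_outcome(control)
print(f"Estimated impact: {impact_estimate:.2f}")  # close to the true effect of 5
```

The key point is that the control group's mean stands in for the unobservable counterfactual, which is only valid because assignment was random.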
What, how and for whom?
• Impact evaluation measures the effect of a program (accountability): Does it work?
• It can guide the implementation of a program by comparing alternatives, managing for results on the basis of rigorous evidence: How?
• It can help us understand for whom the program works: For whom?
The decision process is complex
• A few big decisions are taken during design, but many more are taken during roll-out and implementation
• There is a large number of operational choices
How can it be made to work?
• Experimentally compare operational alternatives side by side
• Scale up the better options and discard the others
• Understanding how a policy can be made to work in a specific context is the key to success
Does one size fit all?
• Impact evaluation measures average effects
• Effects differ across people (young/old, rich/poor, educated/non-educated, male/female)
• When we incorporate these dimensions into the analysis, we learn how to shape interventions for different populations
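A toy calculation shows why the average effect alone can mislead. All numbers here are invented: the overall impact looks moderate while the subgroup impacts are very different.

```python
# Toy example with invented numbers: the average effect can mask
# heterogeneity across subgroups such as men and women.
records = [
    # (group, treated, outcome)
    ("women", True, 12), ("women", True, 14),
    ("women", False, 5), ("women", False, 7),
    ("men", True, 7), ("men", True, 9),
    ("men", False, 6), ("men", False, 8),
]

def mean_outcome(group=None, treated=None):
    """Mean outcome, optionally restricted to one subgroup."""
    vals = [o for g, t, o in records
            if (group is None or g == group) and t == treated]
    return sum(vals) / len(vals)

# Overall treatment-control difference vs. the same difference by subgroup.
average_effect = mean_outcome(treated=True) - mean_outcome(treated=False)
effect_women = mean_outcome("women", True) - mean_outcome("women", False)
effect_men = mean_outcome("men", True) - mean_outcome("men", False)

print(average_effect, effect_women, effect_men)  # 4.0 overall hides 7.0 vs 1.0
```

An evaluation that reported only the 4.0 average would miss that the program works mainly for one group, which is exactly the information needed to shape the intervention.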
Why gender?
• Incorporate gender dimensions into policy interventions and learn from impact evaluation how to make gender policy work for development
• Much evidence suggests that gender matters for development
Channels through which gender affects development
• Women have different preferences and make different decisions than men, both at home and over policy
• A position of weakness in the household may reduce the household's overall productivity through unequal sharing of resources
• Rules, constraints and disadvantages may reduce productivity in the wider economy
Gender factors that can be addressed through policy
• Perceptions
• Differential access to land, inputs, capital and output markets
• Traditional rules on duties, movement and household decisions
• Different formal or informal property rights
How impact evaluation can help
• Hypothesize which factors may induce inefficiencies in the context of your program
• Think about which policy interventions may address them
• Test policy alternatives rigorously
• Impact evaluation isolates the effect of a particular intervention from that of other interventions or factors
• There is currently little impact evaluation evidence on gender-differentiated program effects
• AADAPT and DIME-FPD, in collaboration with the GAP, will support governments in building that evidence
How to measure gender-differentiated effects
• Measure differential effects on men and women of the same interventions; this requires:
• Larger samples
• A different data collection strategy
• Additional indicators
• For each type of intervention, measure effects on the targeted individual as well as spillovers onto other members of the household who may be affected (the wife of the head, daughters)
• Or target men and women with different interventions and measure the effects on each
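The "larger samples" point can be made concrete with a standard back-of-the-envelope power calculation (a two-sided 5% test with 80% power). The outcome standard deviation and effect size below are hypothetical; this is a generic textbook formula, not DIME's own sample-size tool.

```python
# Back-of-the-envelope power calculation (textbook formula, invented
# inputs): detecting gender-differentiated effects needs larger samples,
# because each subgroup comparison uses only part of the sample.

def n_per_arm(sigma, delta):
    # Approximate n per arm for a two-sided 5% test with 80% power:
    # n ~= 2 * (z_0.975 + z_0.80)^2 * sigma^2 / delta^2
    # with (1.96 + 0.84)^2 ~= 7.85, so the constant is ~15.7.
    return 15.7 * sigma**2 / delta**2

sigma = 10.0   # hypothetical outcome standard deviation
delta = 2.5    # hypothetical minimum effect size worth detecting

n_overall = n_per_arm(sigma, delta)

# To estimate the same effect size separately for men and women, each
# subgroup needs n_overall per arm, roughly doubling the total sample.
n_with_gender_split = 2 * n_overall

print(round(n_overall), round(n_with_gender_split))
```

This is why the data collection strategy has to be decided up front: a sample sized for the overall effect is generally too small to say anything reliable about effects within each gender.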
Conclusions
• It is not enough to know whether a policy works on average
• We must also know how to make it work better, and for whom
• Impact evaluation can help on all three counts
• The DIME-GAP collaboration is a step in this direction and is available to help you do it