
Impact Evaluation for International Development: Why we need it and why it is hard


Presentation Transcript


  1. Impact Evaluation for International Development: Why we need it and why it is hard. Irene Guijt, March 25, 2013, CDI Conference on Impact Evaluation

  2. Copyright statement • Intellectual Property Rights apply to the ideas asserted in this PowerPoint. • Appropriate citation: Guijt, I. (2013). Impact Evaluation for International Development: Why we need it and why it is hard. Keynote speech for the CDI conference on Impact Evaluation, March 25 and 26, 2013.

  3. Pendulum of IE

  4. Key messages • Crucial and growing area of evaluation expertise • Much contestation – definitions, methods, standards, utility • The utility of IEs remains to be proven

  5. Impact Evaluation • What is it? • Why do it? When is it appropriate? • Who’s involved? • How? • Options for key tasks • Quality • Use. Controversial? All of the above!

  6. Small ‘e’, big ‘E’: ‘Results and targets’ vs ‘(Why) do or don’t things work?’ Seeds of interest: • Aid ‘failure’ • Financial crisis • Shifting relationships (economic & players) • Others?

  7. When to invest • Testing innovation for scaling up • Risky context, investment utility • Accountability for high investments • Downward accountability (?) Policy (and practice) direction for the future: • Will the past be a good predictor of the future? • When what works, doesn’t work? • And when what doesn’t work, works? Learning vs accountability

  8. Questions (that) matter • Which questions matter? • Whose questions are heard? • Which theories or strategies do we need to question? • Who casts a vote? • 3ie: enduring questions, e.g. are school vouchers the solution, what works in reducing FGM, market access for the poor, etc. • MethodsLab: rolling out an innovation within a flagship sector (25% of budget)

  9. Defining Impact • the positive and negative, intended and unintended, direct and indirect, primary and secondary effects produced by an intervention (OECD) • impact (I) = Y1 – Y0: evaluation as a rigorous estimation of the difference between the indicator of interest with the intervention (Y1) and without the intervention (Y0) (Howard White, 3ie)
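Read as code, White's definition is simply a difference between the indicator measured with the intervention and an estimate of it without. The sketch below is a minimal, hypothetical illustration (invented numbers and group labels, not from the presentation), in which a comparison group stands in for the "without" scenario:

```python
# Toy illustration of I = Y1 - Y0: impact as the difference between the
# indicator of interest with the intervention (Y1) and without it (Y0).
# All data and group labels are hypothetical.
treatment_outcomes = [62, 58, 71, 65, 60]   # indicator measured with the intervention
comparison_outcomes = [55, 57, 54, 59, 52]  # comparison group standing in for "without"

y1 = sum(treatment_outcomes) / len(treatment_outcomes)
y0 = sum(comparison_outcomes) / len(comparison_outcomes)

impact_estimate = y1 - y0
print(f"Y1 (with) = {y1:.1f}, Y0 (without) = {y0:.1f}, estimated impact = {impact_estimate:.1f}")
```

The whole contest over "rigour" in the following slides is about how credibly that second term, the "without" scenario, is constructed.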

  10. People and Money “An IE industry has developed which believes it is doing good work.” (Morton et al 2012) Demand for IE • Development banks • Bilateral aid • Multi-donor programs • NGOs Supply of IE expertise • J-PAL network (2003, Abdul Latif Jameel Poverty Action Lab): “Conducting Rigorous Impact Evaluations: J-PAL researchers conduct randomized evaluations to test and improve the effectiveness of programs and policies aimed at reducing poverty.” • 3ie (2009) • university departments, esp. economists • CLEAR Network (World Bank funded) • consultants

  11. The case of 3ie • 2004 Origin with the Closing the Evaluation Gap Initiative (Center for Global Development), Hewlett Foundation, BMGF • 2006 paper ‘When Will We Ever Learn?’, the manifesto of the Impact Evaluation movement • 2008/09 3ie established, $55.14 million (2010-2013) Supply and demand • Quality standards for rigorous evaluations • Review process for designs/studies • Identifying priority topics • Providing grants for IE designs

  12. 3ie Review Conclusions (Morton et al 2012) • Membership disappointingly low • 10 bilateral and 1 multilateral donor agencies; 6 developing country governments/agencies; 2 philanthropic foundations; 3 INGOs • Policymakers, implementers and donors: IE low priority; a sizeable minority of practitioners not in favour of experimental IE • What is considered ‘good’ is funded, and what is funded shapes the face of IE • 139 specialists, wide review net, esp. US universities focused on experimental IE • Journal of Development Effectiveness since 2009, respected, used (4000 times in 2010) • To date 9 systematic reviews (SRs) and 9 IEs finalised, but the website includes many more studies

  13. Others emerging (partial listing!) • INGOs • ICCO, World Vision, Oxfam, Plan International, CARE, Hivos, Save the Children, Veco, Freedom from Hunger • Multilaterals • RCT studies undertaken for many UN agencies • IFAD developing own approach (PIALA) • Bilaterals • GIZ, AFD, AusAID, DfID, USAID, DGIS… • Philanthropic Foundations • BMGF, Hewlett, Packard, Rockefeller…

  14. It has to be ‘rigorous’ • Rigour = value judgement • USAID: ‘attributable to a defined intervention; impact evaluations are based on models of cause and effect and require a credible and rigorously defined counterfactual’ • PIALA: data, sensemaking, utility, low resource efforts • Five standards in evaluation – which one wins? • Utility or accuracy • Relevance or objectivity • Hard in practice • E.g. 3ie’s own standards: the 2012 review was very critical of IEs funded through 3ie

  15. How does one do IE? • Gold standard of what? • Statistics or rigour of thought? • Experimentalism vs ‘other’ • A method or naïve experimentalist belief? • Attribution • Attribution as one form of contribution • Counterfactual • Rigour or philosophically dead?
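One common way of constructing the counterfactual this slide debates, without a randomised control, is a difference-in-differences comparison: the comparison group's before/after change stands in for what would have happened to the treated group anyway. The sketch below is a generic illustration with invented figures, not a method endorsed in the talk:

```python
# Difference-in-differences: use the comparison group's before/after change as the
# counterfactual trend for the treated group. All figures are hypothetical.
treated_before, treated_after = 40.0, 55.0
comparison_before, comparison_after = 42.0, 48.0

treated_change = treated_after - treated_before                 # change with the intervention
counterfactual_change = comparison_after - comparison_before    # stand-in for change without it

did_estimate = treated_change - counterfactual_change
print(f"Difference-in-differences impact estimate: {did_estimate:.1f}")
```

The estimate is only as credible as the assumption that both groups would have followed the same trend without the intervention, which is exactly the kind of judgement the slide's "rigour of thought" question points at.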

  16. Beyond a counterfactual: the shape of change

  17. Emerging alternatives • Realist evaluation framing • Contribution analysis • People’s narratives and surveys for attitudes / behaviour shifts • Qualitative Comparative Analysis • Participatory impact assessment and learning approach • PADEV …
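Of the alternatives listed, Qualitative Comparative Analysis lends itself most readily to a mechanical illustration: cases are coded 0/1 on conditions and on the outcome, grouped into configurations, and each configuration's consistency with the outcome is inspected. A minimal crisp-set sketch with invented cases and condition names (not drawn from the presentation):

```python
from collections import defaultdict

# Crisp-set QCA sketch: each hypothetical case is coded 0/1 on two conditions
# (e.g. "strong local ownership", "adequate funding") and on the outcome.
cases = [
    {"ownership": 1, "funding": 1, "outcome": 1},
    {"ownership": 1, "funding": 1, "outcome": 1},
    {"ownership": 1, "funding": 0, "outcome": 0},
    {"ownership": 0, "funding": 1, "outcome": 0},
    {"ownership": 0, "funding": 0, "outcome": 0},
]

# Build the truth table: group cases by their configuration of conditions.
table = defaultdict(list)
for case in cases:
    config = (case["ownership"], case["funding"])
    table[config].append(case["outcome"])

# Consistency = share of cases in a configuration that show the outcome.
for config, outcomes in sorted(table.items()):
    consistency = sum(outcomes) / len(outcomes)
    print(f"ownership={config[0]}, funding={config[1]}: "
          f"n={len(outcomes)}, consistency={consistency:.2f}")
```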

  18. Participatory impact assessment and learning approach (PIALA) • IFAD committed to 30 IEs • Needs an approach that works for low-resource programs, must show progress towards 80 million, weak baselines • PIALA = statistical sampling in a nested hierarchy, participatory rural appraisal (PRA) and survey data collection plus project data, collective sensemaking
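The "statistical sampling in a nested hierarchy" element can be pictured as multi-stage sampling: first-stage units (e.g. districts) are sampled, then villages within the sampled districts, then households within the sampled villages for the PRA and survey work. The sketch below is a generic two-stage illustration with invented names and sizes, not PIALA's actual sampling protocol:

```python
import random

random.seed(1)  # for a reproducible illustration

# Hypothetical sampling frame: villages nested within districts.
frame = {
    "District A": [f"A-village-{i}" for i in range(1, 11)],
    "District B": [f"B-village-{i}" for i in range(1, 11)],
    "District C": [f"C-village-{i}" for i in range(1, 11)],
}

# Stage 1: sample districts; Stage 2: sample villages within each sampled district.
sampled_districts = random.sample(list(frame), k=2)
sample = {d: random.sample(frame[d], k=3) for d in sampled_districts}

for district, villages in sample.items():
    print(district, "->", villages)
    # Stage 3 (not shown): sample households within each selected village
    # for the PRA exercises and the household survey.
```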

  19. GirlHub (DfID and Nike Foundation) • Impact of ‘starting a national conversation to revalue girls’ • Focus – attitude and behaviour change as a result of social media work • Control group hard – radio, magazine • Purposive, stratified sample: detailed survey and self-signified stories of change, before and after, supplementary context analysis

  20. Beyond only data collection: IE is about more than just describing ‘impact’ well (betterevaluation.org)

  21. Use Uptake? • Evidence of evidence use is weak • Assumptions about uptake: naïve politics – many, statistically rigorous, published IEs will lead to use • IDRC: 90+ variables that influence uptake • Have to deal with the boundary between research and policy uptake • Uptake requires fostering interest = we need to understand the psychology of use, as well as the politics of use

  22. Assumptions in 3ie’s Theory of Change • overriding need is for more rigorous evidence • quality will automatically be a result of more IEs of a certain type • research community will generate enough policy-relevant proposals and it will only be necessary to select the most policy relevant • policy influence requires major effort from the onset of research

  23. Future of IE • Audience • Convince policymakers that rigorous IE is important for the policy process • Feasibility and rigour • Fit to context = (quasi-)experimental studies difficult (conditions, questions, capacities) • Relevance for policy • Inverse relationship? The most policy-relevant programmes are least amenable to experimental IE techniques

  24. Pendulum of IE
