The Evaluation Gap: An International Initiative to Build Knowledge Presentation by William D. Savedoff, Sr. Partner, Social Insight; Project Director, Center for Global Development Meeting of the DAC Evaluation Network, Paris, November 10, 2004
The Evaluation Gap: Overview • Who are we? • What do we mean by an evaluation gap? • Why does this evaluation gap occur? • What other initiatives are being taken? • What are we doing? • Where are we going?
Who are we? • Funding • Secretariat • Project Director
Working Group Members: Members do not represent their institutions but participate in their individual capacity, bringing a wide range of experience from: • Evaluation Offices • NGOs • Development Agencies • Universities • Philanthropic Foundations
Working Group Members: Nancy Birdsall, Francois Bourguignon, Esther Duflo, Paul Gertler, Judith Gueron, Indrani Gupta, Jean-Pierre Habicht, Dean Jamison, Patience Kuruneri, Ruth Levine, Richard Manning, Stephen Quick, William D. Savedoff, Raj Shah, Smita Singh, Miguel Szekely, Cesar Victora
What is “the evaluation gap”? Range of Analytical Work • Project preparation studies • Monitoring implementation • Process & operational evaluation • Outcome evaluation • Impact evaluation
Impact Evaluation: studies that measure changes in a target population that can be attributed to a particular program or policy.
Importance of Attribution • Different studies answer different questions • Attribution should not be ignored • Imagine a hypothetical project for combating HIV/AIDS in a large country with limited funds, phased in across First Round Beneficiary Communities and, later, Second Round Communities …
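The attribution logic behind this hypothetical phased rollout can be made concrete with a small simulation. The sketch below is illustrative only and is not part of the presentation: it assumes the first-round communities are chosen at random, invents a hypothetical infection rate and program effect, and shows how comparing first-round (treated) with second-round (not-yet-treated) communities yields an impact estimate that can be attributed to the program rather than to countrywide trends.

```python
# Minimal illustrative sketch (not part of the presentation): with limited funds,
# the program is phased in, and randomizing which communities go first makes the
# second-round communities a valid comparison group. The difference in outcomes
# between the two groups can then be attributed to the program rather than to
# countrywide trends. All numbers below are hypothetical.
import random
from statistics import mean

random.seed(0)

N_COMMUNITIES = 100
communities = list(range(N_COMMUNITIES))

# Randomly choose the first-round beneficiary communities; the rest wait for round two.
first_round = set(random.sample(communities, N_COMMUNITIES // 2))
second_round = [c for c in communities if c not in first_round]


def infection_rate(treated: bool) -> float:
    """Hypothetical post-rollout HIV infection rate per 1,000 adults."""
    background = 50 + random.gauss(0, 5)   # secular trend that affects every community
    program_effect = -8 if treated else 0  # assumed true effect of the program
    return background + program_effect


treated_rates = [infection_rate(treated=True) for _ in first_round]
comparison_rates = [infection_rate(treated=False) for _ in second_round]

estimated_impact = mean(treated_rates) - mean(comparison_rates)

print(f"First-round mean rate:   {mean(treated_rates):.1f} per 1,000")
print(f"Second-round mean rate:  {mean(comparison_rates):.1f} per 1,000")
print(f"Estimated impact attributable to the program: {estimated_impact:.1f} per 1,000")
```

Without the randomized comparison group, the same before-and-after change in the first-round communities could reflect the epidemic's overall trajectory as easily as the program itself, which is the attribution problem the slide is pointing to.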
Results from Millions Saved: cases of public health interventions • Included: 17 cases identified and documented • Excluded: 27 because impact could not be documented; 12 because too early or too small in scale
Results from Millions Saved: cases of public health interventions. “The gap in evaluation inhibits the documentation of successes, and prevents policymakers from being able to tell the difference between a well told story and a hard fact as they make decisions about which programs to support.”
Community Health Insurance • Community health insurance was proposed as early as 1978 in a WHO Technical Expert Report • What have we learned about community health insurance in the 26 years that have followed?
Community Health Insurance: ILO/Universitas Review. None of the studies measured impact on health.
Community Health Insurance: Ekman, Health Policy & Planning (2004) • Review of studies that analyzed resource mobilization and financial protection • Study methods: Descriptive (14), Multivariate statistical (5), Descriptive non-statistical (22), Other (2)
Why does this gap occur? • Knowledge from impact evaluations is a public good • Costs are visible, benefits seem far off, & resources are limited • Demand is diffuse (across institutions & time) • “It Pays to be Ignorant” • Low-quality studies crowd out the good • Methodological challenges
Guarded optimism • Changing profile of agency staff • Worldwide capacity for good studies • Recognition of the value of impact evaluation • Pressure from skeptical donors • Methodological & practical advances in research design • Highly visible & successful examples: PROGRESA/Oportunidades, IMCI, Guinea Worm, etc.
Other Initiatives • Increasing access to existing information • Developing aggregate indicators • Improving capacity • Promoting evaluation with funds & data • Conducting research & demonstrating good evaluation practices
Other Initiatives: Some Examples • OECD/DAC Evaluation Network • World Bank Research Department • Health Metrics Network • UN M.E.R.G. • USAID/MACRO Surveys • MIT Poverty Action Lab
We need an initiative that: • Focuses specifically on the Public Good aspect of impact evaluation • Develops a collective response to the problem • Mobilizes & appropriately channels new funds • Acts selectively where the most can be learned
The Evaluation Gap: Working Group Process • Preliminary interviews & research • Convene working group • Meetings, teleconferences, and e-list debates • Consultation group • Draft paper & action plan • Dissemination and broader debate
Likely characteristics of recommendations • Identifying a reliable source of funding • Establishing a collective mechanism for selecting “enduring questions”, providing guidance, and securing involvement & commitment • Developing institutional mechanisms for channeling funds into appropriate studies & projects
What do you think? • What is the fundamental problem from your perspective? • Do you have other examples (or reviews)? • What are your current initiatives? • What kinds of solutions should be considered? • Who else should we consult?
Contact Us: Center for Global Development, www.cgdev.org • Nancy Birdsall, President, nbirdsall@cgdev.org • Ruth Levine, Senior Fellow, rlevine@cgdev.org • William D. Savedoff, Project Director, savedoff@socialinsight.org
References • Christensen, Jon. “Asking the Do-Gooders to Prove They Do Good.” The New York Times, January 3, 2004. • Development Assistance Committee. Principles for Evaluation of Development Assistance. Paris: OECD, 1991. • Development Assistance Committee. Review of the DAC Principles for Evaluation of Development Assistance. Paris: OECD, 1998, pp. 1-120. • Development Assistance Committee. Glossary of Key Terms in Evaluation and Results Based Management. Evaluation and Aid Effectiveness No. 6, pp. 1-37. Paris: OECD, 2002.
References • Dugger, Celia. “World Bank Challenged: Are the Poor Really Helped?” The New York Times, July 28, 2004. • Ekman, Björn. “Community-based health insurance in low-income countries: a systematic review of the evidence.” Health Policy and Planning, 2004, 19 (5), 249-270. • France, Ministère de l'Économie, des Finances et de l'Industrie. Partners in Development Evaluation: Learning and Accountability. Paris, March 25, 2003.
References • International Labour Office. “Extending Social Protection in Health Through Community Based Health Organizations: Evidence and Challenges.” Discussion Paper, Universitas Programme, ILO, Geneva, 2002. • Kremer, Michael. “Randomized Evaluations of Educational Programs in Developing Countries: Some Lessons.” American Economic Review Papers and Proceedings, 2003, 93 (2), 102-115. • Pritchett, Lant. “It Pays to Be Ignorant: A Simple Political Economy of Rigorous Program Evaluation.” The Journal of Policy Reform, December 2002, 5 (4), 251-269.
References • Victora, Cesar G., Habicht, Jean-Pierre, Bryce, Jennifer. “Evidence-Based Public Health: Moving Beyond Randomized Trials.” American Journal of Public Health, March 2004, 94 (3), 400-405. • World Bank. Influential Evaluations: Evaluations that Improved Performance and Impacts of Development Programs. Washington, DC: World Bank, 2004, pp. 1-24. • World Health Organization. Financing of Health Services. Technical Report Series No. 625. Geneva: WHO, 1978.
Evaluation Gap Working Group Members • Nancy Birdsall, President, Center for Global Development • Francois Bourguignon, Chief Economist & Sr. Vice President, World Bank • Esther Duflo, Associate Professor of Economics, MIT • Paul Gertler, Professor of Economics, Haas School of Business • Judith Gueron, President, MDRC • Indrani Gupta, Reader, Institute of Economic Growth • Jean-Pierre Habicht, Professor, Cornell University • Dean Jamison, Senior Fellow, National Institutes of Health • Patience Kuruneri, Senior Policy Analyst, World Health Organization • Ruth Levine, Senior Fellow, Center for Global Development • Richard Manning, Chair, Development Assistance Committee • Stephen Quick, Director, Inter-American Development Bank • William D. Savedoff, Senior Partner, Social Insight • Raj Shah, Senior Policy Officer & Senior Economist, Bill & Melinda Gates Foundation • Smita Singh, Special Advisor for Global Affairs, William & Flora Hewlett Foundation • Miguel Szekely, Ministry of Social Development, Mexico • Cesar Victora, Professor, Universidade Federal de Pelotas, Brazil