The Value of Disciplinary Diversity: Evaluating Community-Driven Development. Vijayendra Rao, World Bank
OUTLINE • Some Limitations of Impact Evaluations • Some Concerns about Randomization • Alternatives and Complements • Added Value of Qualitative Methods • Evidence for Community-Driven Development (CDD) • Open Questions about CDD • Methodology of the Indonesia CDD Evaluation
General Concerns • The Question Should Drive the Method, Not the Method the Question • Intervention versus Nothing • Internal versus External Validity • Impact versus Process
Impact Versus Process • "Process": What was the series of events, instigated by the intervention, that led to the outcome? (e.g. local politics, interactions between a govt. official and a school headmaster) • Such events cannot easily be anticipated well enough to fit into a structured questionnaire • What is the real "impact"? Positive and negative externalities (e.g. a project to improve condom use leads to the empowerment of prostitutes)
Internal Versus External Validity • Contextual differences may be missed (the role of culture, social structure, politics, and geography in determining impact) • Differences in implementation (learning by doing) • Was there any spill-over, and to what extent? How did it happen? • Where the rubber hits the road (scaling up)
Additional Concerns About Randomized Trials • Solves the problem of selection bias, BUT general concerns can be accentuated: randomization bias, Hawthorne effects • Experimental versus real-world conditions • Political constraints (short-term political gains override scientific concerns) • Ethical concerns
Imperfect Solutions Substitutes and Complements
Non-Experimental Methods • Propensity Score Matching (depends on an exhaustive set of observed indicators) • Regression Discontinuity (super-local validity) • Instrumental Variables (valid instruments are difficult to find)
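To make the first of these concrete, here is a minimal sketch of propensity score matching in Python (not part of the original slides); the dataframe, covariate names, and synthetic data-generating process are illustrative assumptions only.

```python
# Illustrative propensity score matching sketch. The covariate names and the
# simulated data below are hypothetical assumptions, not the evaluation data
# discussed in these slides.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "hh_income": rng.normal(100, 20, n),   # assumed observed indicators
    "hh_size": rng.integers(1, 8, n),
    "education": rng.integers(0, 12, n),
})
# Selection into treatment depends on observables (the bias PSM tries to address).
logit = 0.02 * df["hh_income"] - 0.1 * df["hh_size"] - 2.0
df["treated"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
df["outcome"] = 5 + 2 * df["treated"] + 0.05 * df["hh_income"] + rng.normal(0, 1, n)

covariates = ["hh_income", "hh_size", "education"]

# 1. Estimate the propensity score P(treated | covariates).
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each treated unit to its nearest control on the propensity score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_controls = control.iloc[idx.ravel()]

# 3. Average treatment effect on the treated: mean difference in outcomes.
att = treated["outcome"].mean() - matched_controls["outcome"].mean()
print(f"Estimated ATT: {att:.3f}")
```

In practice one would also check common support and covariate balance after matching; the slide's caveat stands, since the method only removes bias from the observed indicators.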
Monitoring As Evaluation • Complementary and Essential • MIS Data Bases • Pilots within an Intervention (E.g. Olken) • Facilitators as Key Informants • Helps Process of Learning by Doing
Qualitative Methods • Complementary but Adds to Cost • Participant Observation • Focus Group Discussions/PRA • In-Depth Interviews (Key Informants) • Textual Analysis (e.g. Local Newspapers)
Added Value of Qualitative Methods • Think Quantitatively, Act Qualitatively • Helps Choose an Identification Strategy (e.g. Sex Workers Study) • Helps Measure Outcomes (e.g. CDD Evaluation in Indonesia) • Helps Track Processes (e.g. NREGA Study, Indonesia CDD Evaluation) • Triangulation
Community-Driven Development (CDD) • Directly Gives Communities Access to Untied Funds • Community-Based Targeting • Community-Based Management • $7 Billion at the World Bank
Some Claims of CDD • Well Targeted • Improves Supply and Quality of Public Services • Improves Capacity for Collective Action ("Social Capital") • More Sustainable • Risk of Elite Capture is Low • Can Be Scaled Up
Poverty Targeting • Easier to Target Poor Communities than the Poor within Communities • Generally speaking, community-based targeting (CBT) works better than external methods • Heterogeneity Matters • "Preference Targeting" of the Poor
Service Delivery • Community involvement generally seems to produce effective projects • No evidence that it causes projects to improve • No evidence on whether non-participatory alternatives work better
Participation • Lots of evidence that high levels of participation are positively correlated with project effectiveness. • NO CONVINCING EVIDENCE OF CAUSAL LINK
Social Capital Creation • Lots of Evidence on Strong Correlation • But Very Little Convincing Evidence on Causal Links
Sustainability • AGAIN LIMITED EVIDENCE • Anthropological work points to crucial role of support from higher levels of government.
Elite Capture • Social Networks affect who benefits • Generally speaking – elites tend to dominate • Benevolent vs Malevolent Capture
Role of External Agents • Central to local-level project effectiveness, but understudied • Good facilitators are at once charismatic leaders, trainers, anthropologists, engineers, economists, and accountants
Scaling-Up Challenges • Low experience and poor training of facilitators • Poor monitoring and evaluation ("praise culture") • "Supply-driven demand-driven development" (Voices of the Bank)
UPP2 Project - Indonesia • $200 Million in 3 provinces • “Urban” • “Kelurahan” Based • $20,000 per community of 10,000 • Committee Elected to Manage (BKM) • Village Infrastructure Groups • Micro-Credit Groups
Quantitative Evaluation • Regression Discontinuity Design • Sample: approx. 200 treatment, 200 control • Baseline, Mid-Term, Follow-Up
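As a rough illustration of the discontinuity logic (not the actual UPP2 specification), a sharp regression discontinuity estimate might look like the sketch below; the running variable, cutoff, bandwidth, and simulated data are all assumptions for illustration.

```python
# Illustrative sharp regression discontinuity sketch. The running variable,
# cutoff, bandwidth, and simulated data are assumptions, not the UPP2 design.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
score = rng.uniform(-1, 1, n)          # hypothetical eligibility/assignment score
treated = (score >= 0).astype(int)     # treatment switches on at the cutoff (0)
y = 1.0 + 0.5 * score + 0.8 * treated + rng.normal(0, 0.3, n)
df = pd.DataFrame({"y": y, "score": score, "treated": treated})

# Local linear regression within a bandwidth of the cutoff, with separate
# slopes on each side; the coefficient on `treated` is the estimated jump.
bandwidth = 0.25
local = df[df["score"].abs() <= bandwidth]
model = smf.ols("y ~ treated + score + treated:score", data=local).fit()
print(f"Estimated discontinuity at the cutoff: {model.params['treated']:.3f}")
```

The "super-local validity" caveat from the earlier slide applies: the estimate is informative only near the cutoff, and results can be sensitive to the bandwidth choice.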
Questionnaires • 3 weeks fieldwork on UPP1 projects • “Social Capital/Gotong Royong” • Govt. and Community Projects • Credit Group Participation • Socio-economic • Questionnaires to Leaders and Activists • FGD – Oligarchy, Community Activities • Facilitators
Qualitative • 12 Control, 12 Treatment communities • A team of 4 spends 10 days per round • 6 FGDs • 20-30 In-Depth Interviews • Mapping of the Community
Qualitative (continued) • Baseline: tracks initial conditions • Mid-Term: tracks formation of groups (BKM, infrastructure, and credit) and the facilitation process • Final
Monitoring Systems • MIS Data Base • UPP Website • Facilitator Feedback and Learning
Baseline Findings • Public Goods (40% Govt., 60% Community Contributions) • High-Level of Civic Participation (Gotong Royong) • Overlap between Religious and Govt. Institutions
References • E. Duflo, R. Glennerster, and M. Kremer, "Using Randomization in Development Economics Research: A Toolkit," CEPR Discussion Paper No. 6059, January 2007 • G. Mansuri and V. Rao, "Community-Based and -Driven Development: A Critical Review," World Bank Research Observer, vol. 19, no. 1, pp. 1-39, 2004 • V. Rao and M. Woolcock, "Integrating Qualitative and Quantitative Approaches to Program Evaluation," Chapter 8 in Francois Bourguignon and Luiz A. Pereira da Silva (eds.), The Impact of Economic Policies on Poverty and Income Distribution: Evaluation Techniques and Tools, World Bank and Oxford University Press, 2004 • M. Ravallion, "Evaluating Anti-Poverty Programs," in T. P. Schultz and J. Strauss (eds.), Handbook of Development Economics (forthcoming)