
Impact Evaluation in Mexico


Presentation Transcript


  1. Impact Evaluation in Mexico Harry Anthony Patrinos HDNED

  2. Country Context for Evaluation • Capacity in M&E and evaluation precedents • Good education data/information • Lack of openness about education information • Lack of capacity in education for IE

  3. Progresa/Oportunidades • Large-scale CCT (conditional cash transfer) program • Randomized design and phased rollout • External and internal capacity for IE • Results published and used for programming • Led to evaluation institutions, capacity building throughout, and a legal framework • Pressure on other sectors

  4. Education Impact Evaluations pre-2005 • Isolated efforts, not large scale, few published • No culture of IE • No randomized experiments • But interest in results and promoting good programs

  5. Education Impact Evaluations post-2005 • PAREIB – Compensatory rural education • PEC – School-based management in urban disadvantaged areas

  6. Education Impact Evaluations post-2005
     • PAREIB – Compensatory rural education
       - Large rural program targeting disadvantaged rural schools; expanded to all 32 states based on an evaluation in the early 1990s (led by the Bank)
       - Little interest in IE after that, but considerable capacity to collect information and monitor the program
       - Considered a successful program
     • PEC – School-based management in urban disadvantaged areas
       - "Pilot" program launched in 2001, one of the first to start with an evaluation
       - But the design was deficient and not randomized, leading to comparison problems
       - Considered unsuccessful until last year

  7. Education Impact Evaluations post-2005
     • PAREIB – Compensatory rural education
       - Bank undertakes a simple matching study (see the sketch below)
       - Initial resistance from the authorities
       - Positive results led to interest and further work
     • PEC – School-based management in urban disadvantaged areas
       - Bank asked to explain the lack of significant findings
       - Incorporates PEC in AAA (analytical and advisory activities) and conducts an IE
       - Shown to be positive (verified by external evaluations as well)
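
  The "simple matching study" above refers to comparing program schools with observably similar non-program schools rather than running a randomized experiment. Below is a minimal, illustrative sketch of a propensity-score matching estimate of this kind; the file name, column names, and covariates are hypothetical and are not taken from the actual PAREIB data or the Bank's study.

    # Illustrative propensity-score matching sketch (hypothetical data).
    # Assumed columns: "treated" (1 = program school), "outcome" (e.g. a test score),
    # plus the observable covariates used for matching.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    df = pd.read_csv("schools.csv")  # hypothetical administrative data
    covariates = ["enrollment", "marginality_index", "baseline_score"]

    # 1. Estimate each school's probability of being in the program (propensity score).
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df["pscore"] = ps.predict_proba(df[covariates])[:, 1]

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. Match each treated school to the nearest control school on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = control.iloc[idx.ravel()]

    # 3. Average treatment effect on the treated: mean outcome gap across matched pairs.
    att = treated["outcome"].mean() - matched["outcome"].mean()
    print(f"Estimated ATT: {att:.3f}")

  An estimate like this is only as credible as the covariates and the overlap between the two groups, which is one reason the later work described in slide 9 moved toward randomizing a component of the program.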

  8. Pragmatic Design • Used existing administrative data, project data, MoE data and surveys, and Oportunidades data • Initially Bank-led, then partnered with the authorities and local researchers • Now the client is carrying out, publishing, and designing evaluations

  9. Results
     • PAREIB: matching study before the project
       - New team focused on research; first randomization of one component planned
     • PEC: weak design before approval
       - Bank study
       - Harvard study
       - Led to a new design in the project

  10. Results 2 • Quality of dialogue improves • Project design more solid • Project financing becomes sustainable

  11. Results 3 • Justified the follow-on operation • Lent credibility to the investments and to the Bank's role • Being used in public debate

  12. Taking Advantage of Opportunities • Look at administrative data • Involve operational staff • Partner with program operators • Know incentives of operators and client
