
Implementing Quantitative Service Delivery Surveys: Some lessons from schools surveys


Presentation Transcript


  1. Implementing Quantitative Service Delivery Surveys: Some lessons from schools surveys Deon Filmer, Development Research Group, The World Bank. Are You Being Served? November 3, 2009

  2. Why carry out a Quantitative Service Delivery Survey? • Outcomes are low …

  3. Starting points: Learning outcomes are low. Inequality in TIMSS 2003 Mathematics test scores. Source: Analysis of TIMSS 2003 database

  4. Why carry out a Quantitative Service Delivery Survey? • Outcomes are low … … is it a lack of money?

  5. Public spending is not enough to improve outcomes. Pattern across countries. * Difference in logs (x100) from rate predicted by GDP per capita. Source: WDR 2004

  6. Public spending is not enough to improve outcomes Pattern across primary schools in Mauritania

  7. Similar changes in public spending can be associated with vastly different changes in outcomes… Source: WDR 2004

  8. …and vastly different changes in spending can be associated with similar changes in outcomes.

  9. How to explain the lack of association between spending and outcomes? • Public spending benefits the rich more than the poor • Expenditure incidence analysis of public spending for diagnosis • Lack of demand by households • Impact evaluation of programs to promote demand • Money fails to reach frontline service providers • Public expenditure tracking surveys (PETS) • Poor-quality services • Quantitative Service Delivery Survey (QSDS) • e.g., absenteeism, time on task

  10. Often, public spending benefits the rich more than the poor Expenditure incidence of public spending on education

  11. Expenditure incidence can be changed through reallocation of public spending. Expenditure incidence of public spending: South Africa*, Malawi. * Primary and Secondary only

  12. Lack of demand by households: Impact of demand-side programs. Impact of a conditional cash transfer on girls’ and boys’ middle school enrollment. Impact of a scholarship program on girls’ enrollment in Cambodia: Enrollment probability among recipient and non-recipient girls by economic status decile. Source: Filmer and Schady (2008)

  13. Does disbursed public spending on school grants actually reach schools? Percent of school grants that actually reach schools

  14. Classic approach to analyzing education outcomes: money ($$) flows to the school, inputs produce outputs/outcomes and the quality of public services. QSDS are a way to get inside the “black box” of service delivery at the facility level

  15. What are Quantitative Service Delivery Surveys? • Take the facility (or staff) as the unit of analysis • Could be complemented with a household/users survey • Collect quantitative information about • Physical infrastructure • Staff characteristics • Income and expenditures • Governance and management • Characteristics/Quality of service provision • Outcomes

  16. Two different sets of surveys • Indonesia: • December 1998: Early days of economic crisis … were schools feeling any impact? • April/May 2000: Longer-run school-level impacts of the crisis, decentralization looming • PNG • April/May 2002: little knowledge about the status of services in PNG; particular interest in decentralization; explicit concern about expenditure tracking

  17. Some lessons from experience, with a focus on two different sets of surveys • Indonesia: • 600 schools • 5 purposively selected provinces • 15 districts (40 schools per district) • PNG • 220 schools • 8 purposively selected provinces • 2 districts (10 schools per district)

  18. Activity structure • Indonesia: • Close collaboration with the research department of the ministry of education. • Ministry staff served as full partners in pilot/questionnaire development and as regional survey supervisors. • Gave the survey some legs within the ministry and enabled substantially lower costs … but at a cost in terms of capacity and experience. • Study conceived of as a stand-alone survey, with the Ministry/policymakers as the primary audience. • PNG: • Partnered with the National Research Institute, an independent agency. • Overseen by a “working group” with various government, NGO, and other representatives. • Little hands-on input from the Ministry of Education. • Study conceived of as part of a WB Poverty Assessment.

  19. What worked well • Indonesia • Trends in enrollments at the school level • Non-conventional-wisdom result that enrollment impacts were mainly urban and at the secondary level, and in non-private/non-secular schools. • But a difficulty: enrollment levels/trends … not enrollment rates. • Perceptions of impact of crisis • Identified “general impact” and “school functioning” as the two main impacts (exploratory principal components analysis) • Status of crisis-relief government programs (scholarship and grant programs) • School grants: coverage; use; interesting substitution between grants and other sources of government (especially local government) funding • Scholarships: coverage (among students) • Trends in charging of fees

  20. Substitution between grants and local government funding. Indonesia 2000: Sources of school funding by grant receipt and public/private status. Primary schools; Junior Secondary schools. In public schools, local government spending adjusted in response to the grant; no adjustment in private schools

  21. Substitution between grants and household spending. Zambia 2001: Effect of a 100 Kwacha increase in expected and unexpected school grants on household expenditures on education. Household spending falls by about 45 Kwacha for each additional 100 Kwacha of anticipated grants. Source: Jishnu Das, Stefan Dercon, James Habyarimana, Pramila Krishnan (2004)

  22. What worked well • PNG • Descriptive status of schools (very little prior information) • Good documentation of delays in subsidies / teacher pay • Reasonable assessment of teacher absenteeism (pre-announced window for visit) • Good data to construct “ghost teacher” estimate (with substantial effort in matching to government payroll records)

  23. Delay in ability to use subsidy: PNG 2001. Chart: percent who received any subsidy, by weeks of delay. Note: Q1, Q3 = National; Q2, Q4 = Provincial

  24. Absence rates among teachers and health workers Note: Surveys were all fielded in 2002 or 2003. Sources: Chaudhury et al (2006) except for PNG, World Bank (2004) and Zambia, Das et al (2005).

  25. Results from QSDS: Effective supply of teachers. PNG 2002: Depletion of the effective supply of teachers. Source: PESD 2002.

  26. Beyond absenteeism: Effective supply of teaching. Percent of time officially allocated to schooling; when a teacher is present; and spent in teaching and learning activities. Sources: Egypt, Yemen and Lebanon from Lane and Millot (2002); Tunisia, Pernambuco, Morocco and Ghana from Abadzi, Millot and Prouty (2006); Cambodia from Benveniste, Marshall and Araujo (2008); and Laos from Benveniste, Marshall and Santibanez (2007).

  27. Once there, and teaching … competency? The Gambia: Percent of 4th and 6th grade teachers answering student-level math questions correctly Source: The Gambia Impact Evaluation Team (2009). N=1049 Teachers.

  28. Once there, and teaching … competency? The Gambia: Percent of 4th and 6th grade teachers answering student-level literacy questions correctly Source: The Gambia Impact Evaluation Team (2009). N=1049 Teachers.

  29. Investigating accountability in education service delivery PNG 2002: Teacher absence declines with parent and community involvement Source: PNG PESD 2002.

  30. What was harder • Indonesia: • Trends in overall school incomes — it was never clear we had the full picture (what we did have was worrisome, especially for private schools) • But, an incredibly complicated system … is this worth doing when the system is so complex? • PNG: • Complex funding system … but we were able to track some specific payments (school subsidies) • But … school financial data were very spotty • Only about half of the schools had documentation about spending, and half about receipts • Only 30% of schools had both expenditure and receipt documentation

  31. Funding education in PNG, 2001 (million Kina). Chart: Q1, Q3 vs. Q2, Q4. Source: Based on information collected during the PESD 2002 survey.

  32. What I would think twice about doing again • Enrollment trends (unless one has information on the universe of schools and on population trends by area) • Hard (time-consuming) to collect, hard to interpret • Too many instruments • PNG had 9 instruments, 7 at the school level. • Non-representative/non-random sample of parents

  33. Survey instruments in PNG: • school (head teacher); • teacher roster; • selected teachers; • data appendix; • grade 5 teacher; • board of management member; • parent; • District Education Advisor; • Provincial Education Advisor.

  34. I would think (very) hard about what financial data to collect • The more specific the better • But even then, school officials often don’t associate specific transfers with their “official” names • Anything more than tracking a clearly defined transfer is very hard. Even that is hard: • missing information at schools; • missing records at the provincial level; • defining the “base”: • official declarations in Government Circulars • budget disbursements • school-level expectations

  35. What I would never do again • Data entry using a package not designed for that purpose • A complex survey/tracking exercise in a country where the policy environment is not conducive to using the information
