Measuring Service Delivery
Markus Goldstein, DECRG/AFTPM
Spending ≠ outcomes [chart]
And the same for health… [chart]
Control for leakage and things look better… [chart]
Source: Gauthier and Wane 2006
Outline
• An introduction to how we measure it
• Levels of analysis
• Rich set of tools for measurement, but they measure different things
• Why measure service delivery?
  • Accountability
  • Measuring poverty & designing a response
  • Evaluation
  • Policy-relevant research
General organization of public service provision
[Diagram:] Central ministry → District/State gov't → Facility → Service providers → Clients (potential and current)
Administrative data
[Diagram, by level:]
• Central ministry: information on resource flows; information on clients served
• District/State gov't: information on resource flows; information on clients served
• Facility: information on clients served
Tools: Administrative data
• The basic tool to measure quality (and quantity) of service delivery
• Data collected from different levels
• Can provide extensive coverage (all clients) and pictures at different levels
• The other tools we will discuss are not substitutes for improving administrative data – they should come in addition
Some ideas on decent-quality administrative data
• Quality = credibility
• Timeliness – key for use and relevance
• Focus attention on a small set of relevant core indicators
• Make the analysis accessible and relevant to policymakers, service providers, managers, and other users
• These things increase demand, which in turn leads to better and more data
Public Expenditure Tracking Surveys
[Diagram:] measures and verifies resource flows at each level – central ministry, district/state gov't, facility
Tools: PETS
• Diagnostic or monitoring tool to understand problems in budget execution:
  • delays/predictability of public funding
  • leakage/shortfalls in public funding
  • discretion in the allocation of resources
• Data collected from different levels of government, including service delivery units
• Relies on record reviews, but also on interviews with school principals/health facility managers
• Design varies with the perceived problems, country, and sector
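As a rough illustration of the core PETS leakage calculation – a minimal sketch with hypothetical facilities and amounts, not a prescribed method:

```python
# Illustrative PETS-style leakage calculation on hypothetical tracking data.
# "Leakage" here is the share of centrally disbursed funds that facility
# records cannot confirm as received; whether a gap is fraud, inefficiency,
# or a legitimate reallocation needs separate investigation.

records = [
    # (facility, disbursed_by_center, received_at_facility)
    ("school_a", 10_000, 7_500),
    ("school_b", 12_000, 12_000),
    ("school_c", 8_000, 4_000),
]

total_disbursed = sum(d for _, d, _ in records)
total_received = sum(r for _, _, r in records)

for name, disbursed, received in records:
    gap = (disbursed - received) / disbursed
    print(f"{name}: {gap:.0%} of disbursed funds unaccounted for")

print(f"Overall leakage: {(total_disbursed - total_received) / total_disbursed:.0%}")
```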
Quantitative Service Delivery Surveys
[Diagram:] facility level – information on facility functioning
Tools: QSDS
• Generally used for evaluating the efficiency of public spending and incentives
• Data can be collected on inputs, throughputs, outputs, quality, costs, pricing, and oversight
Household surveys
[Diagram:]
• Current clients: measure usage, outcomes
• Potential clients: measure use of alternatives, outcomes
Tools: Household surveys
• Examples: Living Standards Measurement Surveys (LSMS), Demographic and Health Surveys, MICS
• Will generally have detailed individual and/or household data on a wide range of characteristics
  • e.g. not just health-seeking behavior, but also wealth levels
  • e.g. not just water source, but also education levels
• Can be combined with facility surveys (e.g. 17 LSMS surveys), as sketched below
• Collect data not only on clients, but also on potential clients
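A minimal sketch of that household–facility link (hypothetical IDs and variables; real LSMS linking is more involved):

```python
import pandas as pd

# Hypothetical household records: each household reports the facility it uses.
households = pd.DataFrame({
    "hh_id": [1, 2, 3, 4],
    "wealth_quintile": [1, 3, 2, 5],
    "facility_id": ["F01", "F01", "F02", "F03"],
})

# Hypothetical facility survey: characteristics of each facility.
facilities = pd.DataFrame({
    "facility_id": ["F01", "F02", "F03"],
    "staff_absence_rate": [0.35, 0.10, 0.20],
    "has_electricity": [False, True, True],
})

# Link households to the facilities they use, so service quality can be
# analyzed against household characteristics such as wealth.
linked = households.merge(facilities, on="facility_id", how="left")
print(linked)
```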
Absenteeism surveys
[Diagram:]
• Facility: characteristics of providers and facilities
• Service providers: check to see when providers are present and working
Tools: Absenteeism surveys
• Enumerators make random, unannounced visits during working hours to check whether doctors/teachers are present and working
• Visits can be randomized over a few months: some studies did two checks over that period; others visited around the facility's official opening and closing times
• In between, the team collects facility-specific and provider-specific information
• No notification of the visit is given before the survey team arrives at the facility
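Turning the spot checks into absence rates is simple arithmetic; a minimal sketch on made-up visit records:

```python
from collections import defaultdict

# Hypothetical spot-check records: (provider, present_and_working) for each
# unannounced visit, with repeat visits spread over a few months.
spot_checks = [
    ("dr_a", True), ("dr_a", False),
    ("dr_b", True), ("dr_b", True),
    ("teacher_c", False), ("teacher_c", False),
]

n_absent = sum(1 for _, present in spot_checks if not present)
print(f"Absence rate across {len(spot_checks)} checks: "
      f"{n_absent / len(spot_checks):.0%}")

# Per-provider rates: repeat visits help separate chronic from occasional absence.
by_provider = defaultdict(list)
for provider, present in spot_checks:
    by_provider[provider].append(present)
for provider, checks in by_provider.items():
    print(f"{provider}: absent on {checks.count(False)}/{len(checks)} visits")
```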
Vignettes
[Diagram:] service providers – quality of provider (e.g. skill set)
Tools: Vignettes
• Goal: test the ability of medical personnel to diagnose and treat common conditions in a setting similar to their normal practice
• Structure: an enumerator is trained to present as a sick person with predetermined illness characteristics; the practitioner must ask questions and perform a physical examination, then make a diagnosis as under normal circumstances
• A competence index is then constructed from the specific questions asked about the history of the case, the examination of the patient, the tests prescribed, and the treatment given
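One simple way such a competence index can be built is as the share of checklist items the provider completes; a sketch with a hypothetical checklist (items and domains are illustrative, not a standard instrument):

```python
# Hypothetical vignette checklist for a diarrhea case, grouped by domain.
checklist = {
    "history": {"asked_duration", "asked_blood_in_stool", "asked_fluid_intake"},
    "exam": {"checked_dehydration", "took_temperature"},
    "tests": {"ordered_stool_test"},
    "treatment": {"prescribed_ors"},
}

# Items the enumerator recorded the practitioner actually doing.
observed = {"asked_duration", "asked_fluid_intake",
            "checked_dehydration", "prescribed_ors"}

# Competence index: share of checklist items completed, by domain and overall.
for domain, items in checklist.items():
    print(f"{domain}: {len(observed & items) / len(items):.0%}")

all_items = set().union(*checklist.values())
print(f"Overall competence index: {len(observed & all_items) / len(all_items):.0%}")
```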
Tools: Vignettes
• Variations: other vignettes use hypothetical scenarios, where the practitioner is either asked to list the specific procedures he would use to diagnose a particular type of patient, or asked whether or not he would perform a particular procedure for a patient with specific symptoms, etc.
• Direct observation is another option, where the behavior of clinicians with their own patients is studied; however, because the case mix varies between clinicians, it is difficult to compare across practitioners and not always relevant
Exit surveys
[Diagram:] current clients – satisfaction, perceptions, informal payments, waiting time, etc.
Tools: Exit surveys
• Exit polls for user satisfaction (can be done for patients alone, or for a sample of households if non-users are to be included)
• Data can also be collected through focus group discussions and report cards
• Limitations of exit polls:
  • Problems in interpreting subjective perceptions of health care quality
  • "Courtesy bias", where individuals may give responses they believe are socially acceptable
  • Difficult to interpret because of important systematic differences across demographic and socio-economic groups, possibly making client perceptions poor proxies for objective assessments of different dimensions of quality
Report cards
[Diagram:] clients – assessment of services and opinions
Tools: Report cards
• Citizen/community-wide report cards: use a range of different tools to get information and opinions on prices, quality, waiting times, courtesy, etc.
• Can also be used to complement and support facility surveys
• Example: the Bangalore report cards by the Public Affairs Center (PAC) summarize citizens' assessments of services provided by public agencies and solicit opinions on specific aspects of service provision, including staff behavior, quality of service and communication of information, bribes paid in connection with service provision, etc.
Tools: Report cards
• Citizen report cards: use a randomized survey questionnaire
• Community report cards: use focus groups
• Citizen reports are easy to aggregate; community report cards, with their need to reach consensus, are hard to aggregate (see the sketch below)
• Both will be colored by expectations (more on this later)
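A toy contrast of the two aggregation problems, with invented ratings:

```python
import statistics

# Hypothetical citizen report card: individual ratings (1-5) from a
# randomized survey aggregate naturally to a mean with a standard error.
ratings = [4, 2, 5, 3, 3, 4, 2, 5, 4, 3]
mean = statistics.mean(ratings)
se = statistics.stdev(ratings) / len(ratings) ** 0.5
print(f"Citizen card: mean rating {mean:.2f} (SE {se:.2f}, n={len(ratings)})")

# A community report card yields one consensus statement per focus group,
# with no individual-level variation left to aggregate across groups.
community_verdict = "staff courteous, but drugs often out of stock"
print(f"Community card: {community_verdict!r}")
```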
Reason 1: Accountability
[Diagram: WDR 2004 accountability triangle, with leg A (provider-citizen) and leg B (government-provider)]
Source: WDR 2004
Provider-citizen leg (A)
• Realized demand
  • How much is used, how much is paid, etc.
  • QSDS, exit surveys, administrative data, household surveys
• Satisfaction
  • e.g. length of wait for a doctor, teacher's performance
  • Report cards, questions in household surveys, exit surveys
  • Is it correlated with objective measures of quality? Not always
    • Lundberg: vitals, examinations not correlated with satisfaction
• Think about why you are doing this…
Reason 1: Accountability
[Accountability triangle repeated, turning to leg B]
Source: WDR 2004
Government-provider leg (B)
• Monitoring (administrative data)
• Most effective when:
  • Routinely collected, available in a timely fashion
  • Of sufficient quality
  • Adequate in breadth, but without overburdening providers
  • Actually used
• Can be used to draw inferences about program performance
  • Combine for impact evaluation, dose response (Galasso, Behrman and King)
• Set service standards and measure relative performance
Government-provider leg (B)
• Absenteeism surveys
  • Admin systems may get these data wrong
• Facility surveys
  • Not a replacement for monitoring
  • Can get at broader, deeper data that would overwhelm a monitoring system
  • Can get at more nuanced issues such as incentives, motivations and behavior
Government-provider leg (B)
• Tracking the flow of resources: PETS
  • In-depth information on flows and losses
  • What is fraud, what is inefficiency, what are legitimate reallocations?
  • If there is a fairly open dialogue, this can feed into thinking about allocation rules in government
And what happens in A & B may impact C
[Diagram: accountability triangle with leg C (citizen-government) added]
Source: WDR 2004
And report cards may provide a way to get "C" moving
[Diagram: accountability triangle, leg C highlighted]
Source: WDR 2004
Reason 2: Understanding poverty & inequality and targeting the response
• Whether we see poverty as income-based or multidimensional, measuring health & education is important
• Understanding poverty & the service environment of the poor
  • LSMS surveys didn't originally contain a facility component; 17+ now do
  • Link households to the facilities they use (e.g. IFLS)
  • Household as the starting point
Targeting the policy response
• Separate out measures of quality that reflect the underlying poverty (development response) from those due to deficiencies in service delivery
• Vignettes, e.g. – why we can't simply use whether a doctor follows a protocol in practice
  • Educated patients might encourage the doctor, etc.
  • Need to put the doctor through a vignette
• Das & Leonard: the poor are served by worse-quality physicians
Targeting: natural disaster response
• Frankenberg et al. – response to the tsunami
• Household surveys + facility surveys + GIS information (in combination)
• What facilities were destroyed
• But also: where the population has moved, so you can build back more appropriately (considering both disaster-hit and surrounding areas) – get at the dynamics
Reason 3: Evaluation, especially impact evaluation
• IE defined: counterfactual construction
• We can see this as part of both the citizen/gov't link (demonstrating validity) and the gov't/provider link (what works)
• Use service delivery data to look at marginal impacts of program exposure
  • Galasso uses phase-in and time of exposure to look at outcomes (such as malnutrition) – a stylized version of such a dose-response regression is sketched below
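A stylized dose-response regression on simulated data (an illustration of the general idea, not Galasso's actual specification):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data: phased rollout means communities vary in months of program
# exposure; the outcome is a child nutrition z-score.
exposure_months = rng.integers(0, 25, size=n)         # 0-24 months
baseline = rng.normal(-1.0, 0.5, size=n)              # pre-program z-score
true_effect = 0.02                                    # assumed gain per month
outcome = baseline + true_effect * exposure_months + rng.normal(0, 0.3, size=n)

# Dose-response regression: outcome on exposure, controlling for baseline.
X = np.column_stack([np.ones(n), exposure_months, baseline])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"Estimated effect per month of exposure: {coef[1]:.3f} "
      f"(simulated truth: {true_effect})")
```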
Evaluation and impact evaluation
• Evaluating a change in management
  • Look at how changes in service delivery (e.g. performance-based pay for health care providers) change welfare outcomes (e.g. child mortality)
  • Look at changes in service provision in their own right
• Look at how increases in client voice/information change service delivery and outcomes
  • Bjorkman & Svensson: information on health provider performance and gov't standards → better health outcomes and perceptions of service
• Look at impacts with heterogeneous treatment
  • Answer the question: in what type of facilities does this intervention have the greatest impact?
Reason 4: Policy-relevant research
• Understanding the link between quality (e.g. skill of provider, infrastructure, etc.) and client outcomes
• Understanding the demand for services
  • Understand who the clients are, who the non-clients are, and why
  • Sampling is tricky… all facilities available, or all those the household uses?
Reason 4: Policy-relevant research
• Understanding facility production processes
  • Going beyond, and deeper than, monitoring
  • e.g. are facilities at the optimal size (efficiency)? Is human and physical capital being used in the right proportions? Which inputs are being wasted?
Thank you
Are You Being Served? on the web: http://go.worldbank.org/F6KIIC0700
Perceptions unpacked (Lundberg)
• Compare facility survey data with exit polls in Uganda
• Significantly correlated with satisfaction:
  • Waiting time (-)
  • Consultation time (-)
  • Treated politely (+)
  • Asked questions (+)
• Not significant:
  • Given physical exam
  • Touched during examination
  • Pulse taken
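The kind of check behind this slide – correlating exit-poll satisfaction with objective visit measures – in a minimal sketch on simulated data (variables and the pattern they produce are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Simulated exit-poll records: a satisfaction flag plus objective measures
# of the visit.
waiting_time = rng.normal(60, 20, size=n)            # minutes
treated_politely = rng.integers(0, 2, size=n)        # 0/1
pulse_taken = rng.integers(0, 2, size=n)             # 0/1, unrelated here
satisfied = ((treated_politely == 1) & (waiting_time < 70)).astype(float)
satisfied = np.where(rng.random(n) < 0.1, 1 - satisfied, satisfied)  # noise

# Correlate each measure with reported satisfaction: in this toy setup the
# process/courtesy measures correlate while the clinical action does not,
# echoing the pattern above.
for name, x in [("waiting time", waiting_time),
                ("treated politely", treated_politely),
                ("pulse taken", pulse_taken)]:
    r = np.corrcoef(x, satisfied)[0, 1]
    print(f"corr(satisfaction, {name}) = {r:+.2f}")
```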