
Evaluating the Challenges of Performance-Based Contracting in Uganda's Health System

This study evaluates the implementation, context, and complexity of health system interventions in Uganda, focusing on a performance-based contracting experiment conducted between 2004 and 2006. The evaluation examines the experiment's design features, its implementation problems, and its impact on hospital performance.





Presentation Transcript


  1. Why performance-based contracting failed in Uganda: evaluating the implementation, context and complexity of health system interventions Freddie Ssengooba, Barbara McPake and Natasha Palmer www.qmu.ac.uk/iihd

  2. Background • The Development Research Group (DRG) of the World Bank and the Uganda Ministry of Health (MoH) instituted a randomised experiment to test performance-based payment for private not-for-profit (PNFP) health providers in 5 districts of Uganda • Implemented between approximately 2004 and 2006

  3. The DRG’s own evaluation is a ‘black box’ type [Diagram: ‘Intervention’ → ‘Performance’]

  4. Dynamic responses model of the health system [Diagram] • De jure system: organisational structures; intended incentives; management procedures; training courses • Dynamic responses: how people (‘users’ and ‘providers’) react and interact in response to formal structures and rules • De facto system: services as experienced by (poor) people, for example access and quality • Source: McPake, Blaauw and Sheaff, 2006

  5. Our evaluation focused on looking inside all three boxes • ‘Intervention’: what did it really consist of? Design features; implementation features • Who came into contact with the intervention? How did they react? How did they influence others? What chains of effects were initiated, and how was hospital performance affected? • ‘Performance’: what has been measured? What has not?

  6. The DRG design in more detail • Randomised trial • ‘Control group’: continued with pre-existing arrangements (received government grants and used them according to guidelines) • ‘Autonomy group’: allowed to allocate government grants without the restrictions of the guidelines • ‘Bonus group’: received the grant as before; allowed to allocate grants with autonomy; received a bonus if they achieved or exceeded targets

  7. Experiment in 5 districts (of 56 at the time) • 68 PNFP and 52 government facilities participated • Service outputs monitored
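The slides do not describe the mechanics of the DRG's randomisation, but a three-arm assignment of this kind is straightforward to sketch. The following Python fragment is illustrative only: the facility identifiers, the fixed seed and the round-robin allocation are all assumptions, not the DRG's actual procedure.

    import random

    ARMS = ["control", "autonomy", "bonus"]  # the three study arms from slide 6

    def assign_arms(facility_ids, seed=42):
        """Randomly assign facilities to the three arms in roughly equal numbers."""
        rng = random.Random(seed)  # fixed seed makes the assignment reproducible
        shuffled = list(facility_ids)
        rng.shuffle(shuffled)
        # Deal the shuffled facilities round-robin into the three arms
        return {fid: ARMS[i % len(ARMS)] for i, fid in enumerate(shuffled)}

    # Hypothetical labels for the 68 participating PNFP facilities
    facilities = [f"pnfp_{n:02d}" for n in range(1, 69)]
    assignment = assign_arms(facilities)

In practice a trial like this would usually stratify (for example by district or facility size) before assigning arms; the sketch omits that for brevity.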

  8. DRG Conclusion: • ‘..assignment to the performance-based bonus scheme has not had a systematic or discernible impact on the production of health care services provided by PNFP facilities. … it appears that facility autonomy in financial decision-making has a positive impact on health care production. Those facilities that were granted the freedom to spend their MoH base grant .. increased their output relative to other facilities in the sample’

  9. Our methods • Participant observation, inception workshop, July 2003 • Participant observation, 3 performance feedback meetings, Jan–Feb 2005 • Review of monthly progress reports and other documents • Interviews with hospital management teams in 2 selected hospitals every 6 months • Interviews with the Pilot Implementation Team (PIT)

  10. Opening up the first box: what is the detail of the programme design? • National workshop, July 2003: major stakeholders invited to a 2-day meeting; pilot explained; pilot districts selected; randomisation undertaken • Baseline survey of outputs by the PIT • Selection of service targets undertaken by participating facility managers at a one-day meeting; further orientation for managers

  11. Signing of performance contracts • Support and mentoring of hospital management teams (HMTs) by the PIT • Six-monthly performance surveillance: check records; measure output volumes for selected targets • Feedback on performance relative to selected targets • Bonus award
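Slide 11's surveillance-and-bonus cycle implies a simple decision rule: pay the bonus when measured outputs reach the selected targets. The Python sketch below makes that rule concrete, but it rests on assumptions: the slides never state the bonus formula, the 10% uplift is inferred from a manager's quote on slide 15, and the requirement that every target be met is a guess.

    def bonus_due(measured, targets):
        """Return True if every selected service target was achieved or exceeded.

        Assumption: the bonus required meeting all targets; the slides do not
        say whether it was instead prorated per target.
        """
        return all(measured.get(service, 0) >= t for service, t in targets.items())

    # Hypothetical figures: targets set 10% above a baseline output survey
    baseline = {"deliveries": 400, "immunisation": 1200, "malaria_treatment": 3000}
    targets = {s: round(v * 1.10) for s, v in baseline.items()}
    measured = {"deliveries": 450, "immunisation": 1350, "malaria_treatment": 2800}
    print(bonus_due(measured, targets))  # False: the malaria target was missed

The malaria shortfall in this made-up example mirrors the manager's complaint on slide 15: once a new home-based fever-management programme diverted cases, a fixed uplift on malaria treatment became unattainable.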

  12. What was really implemented? • Funding shortfall: MoH did not provide counterpart funding as the DRG expected • Initial activities were undertaken, then a long gap while funds for follow-up were sought • The (partial) follow-up funder changed the design: no support for the control group; reduced scope of feedback meetings • Support and mentoring lost to funding shortfalls and the priority given to measurement activities • Measurements rescheduled to save money, leaving no time to respond to the last period's performance review • Further ad hoc changes to the design by the PIT

  13. Opening up the second box: how did participants in the programme react? • Implementers cut corners for the sake of time and cost savings • Selecting service targets: no opportunity for prior planning with the full facility management team; 2-3 members of the hospital management team, including a member of the Board of Governors, were given a few hours in the one-day meeting to make this choice • Implementers changed the rules and refused to allow managers to change the targets for the second year

  14. Implications • Lack of strategic choice in selecting targets • Lack of communication of the programme to other members of staff in hospitals

  15. ‘In certain units we would find a new management team with no information at all about the study .. I found out that some people who had been sensitized had moved out of the health units.’ • ‘we selected .. I think OPD (looks up the file and reads from it) ooh no! … yeah I wish we had selected OPD. We selected maternal deliveries, immunization and malaria treatment …’ • ‘malaria, there is this home based management of fevers (new program) that we did not factor in at the start of PBC. We thought the malaria will always be there but it was not to be. So I really don’t know how we can treat 10% more malaria at this hospital’

  16. The perspective of the control group • After the first round, a meeting of stakeholders was convened to feed back on performance and award those who earned bonuses with their cheques • All facilities (from the control, autonomy and bonus groups) received feedback on their performance • ‘YYY hospital got 10 million and yet zzz hospital is doing far much better. It’s frustrating and lost meaning – done better but no bonus’

  17. ‘I think the lack of cooperation was that some units were not involved … they also did not appreciate the issue of randomization. All they needed was to be in the bonus group’ (PIT member)

  18. Lack of communication within hospitals • Delays after inception of the programme in appointing a coordinator and releasing funds • Lack of institutional memory within hospitals by the time the programme really started • Some hospital managers deliberately kept information about the programme from staff – it is not clear why

  19. ‘we both attended the meeting at the district and we chose these targets … all along I had kept silent about this but when the bonus came, I reminded them about the study and announced the bonus’ • When bonuses were announced, different approaches were taken to deciding on their use • Staff appreciated parties at which all shared in the hospital’s success • They accepted hospital improvements as a good use of funds where they were consulted • They disliked any attempt to reward individuals according to their contributions

  20. Opening up the third box: what performance was really measured? • Primary registers, instead of HMIS reports, were used for performance verification – attributed to the fear that the aggregated HMIS reports were vulnerable to manipulation (a deviation from the contract) • Major workload implications for the PIT • The PIT was concurrently tasked to collect additional data to measure the impact of the intervention: household surveys; a survey of organisational capacity; client exit polls; counts of other service outputs to assess changes in case mix

  21. Implications • Contract-relevant performance verification crowded out by the additional data collection • Reliability of measured output volumes compromised: the PIT was not familiar with clinical shorthand and recording practices in clinical registers • ‘Being able to make correct recording improved the numbers (outputs) in the second round. For some, it was improving the handwriting so that we can understand…some use ‘PF’ for a diagnosis of malaria. But now we are more used to these abbreviations’
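The verification task slides 20-21 describe, recounting outputs from handwritten primary registers full of clinical shorthand, can be pictured with a small tallying sketch in Python. Everything here is hypothetical except the ‘PF’-means-malaria detail, which comes from the quote above: the register format, the shorthand map and the service names are assumptions, not the PIT's documented procedure.

    from collections import Counter

    # Hypothetical shorthand map; 'PF' (Plasmodium falciparum) for malaria is
    # taken from the manager's quote above, the other codes are illustrative
    SHORTHAND = {"PF": "malaria_treatment", "DEL": "deliveries", "IMM": "immunisation"}

    def tally_register(entries):
        """Count verified outputs per service from raw register codes.

        Codes the verifier cannot read or map are skipped, not guessed at:
        this is where unfamiliar shorthand could depress first-round counts.
        """
        counts = Counter()
        for code in entries:
            service = SHORTHAND.get(code.strip().upper())
            if service:
                counts[service] += 1
        return counts

    # One hypothetical page of register entries
    page = ["PF", "pf", "DEL", "URTI", "IMM", "PF"]
    print(tally_register(page))  # Counter({'malaria_treatment': 3, ...})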

  22. DRG insight into why their intervention didn’t work • ‘Why has the performance bonus not worked? One can imagine a number of possible explanations. First, perhaps the performance bonuses were not large enough. … Second, the performance bonus was paid to the facility and not to the individual providers directly. … Third, it is possible that the performance-based contract was too difficult to manage. … Finally, it is possible that the experiment has not had long enough to take effect.’ [10, page 31]

  23. Conclusions 1 • The experiment produced few benefits for hospitals or patients – and may have done harm • The DRG learned very little from the research they conducted – they did not understand what they had implemented, or why it produced the results it did

  24. Conclusions 2 • All programmes depend on their implementation detail – performance-based payment is not a simple, single measure • There needs to be a clear understanding of the mechanisms of effect by which outcomes should be achieved – any break in the chain will prevent the mechanism from working • Good evidence helps to resolve problems and get interventions to work better – it is not helpful simply to know whether an intervention ‘works’ or not
