Changing Organizations through Feedback on Quality R. Adams Dudley, MD, MBA Institute for Health Policy Studies University of California, San Francisco Support: Agency for Healthcare Research and Quality, Robert Wood Johnson Foundation Investigator Award in Health Policy, California Health Care Foundation
Outline of Talk • Why So Much Pressure to Measure Performance? • Do Providers Respond to Performance Data? • Generating and Benchmarking Performance Data: The CHART Example • Conclusions
Cost Pressures – No End in Sight. Source: Kaiser/HRET Survey of Employer-Sponsored Health Benefits, 2003. Dental work by Arnie Milstein, MD. Note: data on premium increases reflect the cost of health insurance premiums for a family of four.
The Invisible Problem: Quality Shortfalls. Source: McGlynn EA, et al. NEJM 2003;348:2635-45
OR MAYBE NOT SO INVISIBLE: the range of performance on risk-adjusted ICU mortality (observed/expected) among CHART hospitals appears to be wide, with a 3-fold difference in mortality between the red-oval and green-oval hospitals (a sketch of the observed/expected ratio follows). Source: www.CalHospitalCompare.org, run by the California Hospital Assessment and Reporting Taskforce (CHART), CHCF, and UCSF
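For readers unfamiliar with the observed/expected (O/E) metric, here is a minimal sketch of how such a ratio can be computed, assuming an upstream risk model supplies each patient's predicted probability of death. The function name and all numbers are illustrative; CHART's actual risk-adjustment methodology is more elaborate than this.

```python
# Minimal sketch of an observed/expected (O/E) mortality ratio.
# Assumes a risk model has already produced each ICU patient's
# predicted probability of death (illustrative values below).

def oe_mortality_ratio(observed_deaths: int, predicted_risks: list[float]) -> float:
    """O/E ratio: observed deaths divided by the sum of predicted death
    probabilities (the model's expected death count). A ratio of 1.0
    means the hospital matches its risk-adjusted expectation; < 1.0 is
    better than expected, > 1.0 worse."""
    expected_deaths = sum(predicted_risks)
    return observed_deaths / expected_deaths

# Hypothetical ICU: 12 observed deaths among 200 patients whose
# predicted risks sum to 18 expected deaths.
print(round(oe_mortality_ratio(12, [0.09] * 200), 2))  # 0.67 -> better than expected
```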
So What Is a CEO to Do? • US business leaders: • 1) thought about how they influence behavior in their own industries --> focus on incentives to achieve desired outcomes • 2) decided they could only tackle the problem collectively --> national movement to create “XYZ Business Group on Health”
Some Key Components of Providers’ Initial Responses to Business Leaders • “We’re professionals, we don’t respond to incentives.”
Incentives: Question #1 • Outcome variables: • Are Vanderbilt pediatrics residents present for well-child visits for their patients? • Do they make extra trips to clinic when their patients have acute illnesses? • Intervention: randomize them to receive (in addition to their usual salary) either: • $2 per visit scheduled • $20 per month for attending clinic • What will happen???
Incentives: Question #1 • Answer: Hickson et al. Pediatrics 1987;80(3):344 • Residents in the $2-per-visit arm did better on both measures (a back-of-the-envelope comparison of the two arms follows)
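A quick arithmetic comparison shows why the per-visit arm rewards attendance at the margin while the flat arm does not. The visit counts below are hypothetical inputs for illustration, not figures from the paper.

```python
# Hypothetical comparison of the two payment arms in the Hickson study:
# $2 per scheduled visit versus a flat $20/month for attending clinic.
# Visit counts are made up for illustration.

def monthly_bonus(visits: int, per_visit: bool) -> float:
    """Bonus under the per-visit arm ($2 x visits) or the flat arm ($20/month)."""
    return 2.0 * visits if per_visit else 20.0

for visits in (5, 10, 20):
    print(visits, monthly_bonus(visits, True), monthly_bonus(visits, False))
# 5 -> $10 vs $20; 10 -> $20 vs $20; 20 -> $40 vs $20.
# Past 10 visits/month, only the per-visit arm pays anything for each
# additional visit, so it aligns pay with the measured behaviors.
```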
Using Reputational Incentives: Question #2 • Outcome variables: • Do US hospitals engage in quality improvement activities? • Do patients change hospitals? • Intervention: • HCFA (the former name for CMS) releases a report showing each hospital's overall mortality rate • What will happen???
Using Reputational Incentives: Question #2 • Answers: • Hospital leaders said they didn't use the data because they thought it was inaccurate, though hospitals rated as doing poorly were slightly more likely to use it • Not much impact on bed occupancy for hospitals in New York
Using Reputational Incentives: Question #3 • Outcome variables: • Do Wisconsin hospitals engage in quality improvement activities in obstetrics? • Intervention: three groups in this study: • Public report of performance, aggressively pushed by the local business group to the media and employees, with a big focus on making the data understandable to consumers • Confidential report of performance • No report at all • What will happen??? Hibbard et al. Health Affairs 2003;22(2):84
Average number of quality improvement activities to reduce obstetrical complications: the public-report group has more quality improvement (p < .01, n = 93). Chart categories: best practices around c-sections; best practices around VBACs; reducing 3rd- or 4th-degree lacerations; reducing hemorrhage; reducing pre-natal complications; reducing post-surgical complications; other.
Hospitals with poor OB scores: the public-report group has the most OB QI activities (p = .001, n = 34)
Hospitals with poor OB scores: the public-report group has more QI on reducing hemorrhage, a key factor in the poor scores (p < .001, n = 34)
Using Reputational Incentives (Public Reporting of Quality): Preliminary Conclusion • Reputational incentives work! • …except when they don't!
Provider Incentive Framework
• Environmental variables: general approach to payment; regulatory and market factors
• Design of the incentive program: financial characteristics (e.g., revenue potential, cost of compliance); reputational aspects (e.g., extent of efforts to market data to patients and peers); psychological dimensions (e.g., salience of quality measures to the provider's practice)
• Provider group: predisposing/enabling factors; organizational factors (if applicable, e.g., the organization's internal incentive programs or information technology)
• Provider decision-maker
• Patient factors (e.g., education, income, cost sharing)
• Provider response: change in care structure or process
• Change in outcomes: clinical performance measures; non-financial outcomes for the provider (e.g., provider satisfaction); financial results for the provider
Source: Frolich et al. Health Policy 2007;80(1):179
Reasons to Create Clinical Data and Provide Feedback • Clinicians need data to understand their performance and what is achievable • There is growing interest in using performance measurement as part of public reporting and pay-for-performance (reputational and financial incentives)
Performance Measurement: A Real World Example • The California Hospital Assessment and Reporting Taskforce (CHART)
Participants in CHART All the stakeholders: • Hospitals: e.g., HASC & CHA, hospital systems, individual hospitals • Physicians: e.g., California Medical Association • Nurses: ANA-C • Consumers/Labor: e.g., Consumers Union/California Health Care Coalition • Employers: e.g., PBGH, CalPERS • Health Plans: Aetna, Blue Shield, CIGNA, HealthNet, Kaiser, PacifiCare, Wellpoint • Regulators: e.g., JCAHO, OSHPD, NQF
Goals of CHART • To develop an agreed-upon measure set • To increase the standardization of measures (across hospitals and with JCAHO, CMS, etc.) • To provide high-quality data management and reporting • To provide and maintain transparency in hospital performance ratings
How Labels and Icons Were Developed • Formal focus testing with consumers and industry representatives • Most accurate choice plus qualitative comments • Color-coded icon with the word in the center • RMAG review – no formal recommendation to the Steering Committee • Steering Committee discussion
Initial Steering Committee Principles • More than just the usual 3 groups (average, above average, below average, using 95% CIs) • Consider alternative approaches – cluster methodology analysis, multiple benchmarks • Let the data dictate how many groups are created (upper limit of 5) • No ranking, not even quintiles • Use confidence intervals (to account for sample size)
The Process • Engage a well-known biostatistician with strong public-reporting experience • Create a work group to interface with the biostatistician
Eventual Steering Committee Decision After hearing from the biostatistician that the multiple benchmark approach (see next slides) was valid, the Steering Committee decided: • Use the multiple benchmark approach • Use national (meaning JCAHO/CMS/HQA) benchmarks when available, and California benchmarks when necessary • No upper or lower thresholds (except that any performance ≥98% will always be considered in the top group, even if national benchmarks are 99% or 100%)
Eventual Steering Committee Decision The Multiple Benchmark approach (a worked sketch follows this list): • Choose 3 clinically relevant benchmarks • Compare hospital performance not just to mean or expected performance, but to all three benchmarks • For each hospital, estimate the interval within which we believe the hospital's performance is most likely to fall (e.g., "Hospital X administers thrombolytics within 30 minutes to patients having an acute myocardial infarction between 58% and 69% of the time") • Ask which of the benchmarks this interval includes • This can result in more than the usual 3 groups of "above expected/above average", "expected/average", and "below expected/below average"
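To make the mechanics concrete, here is a minimal sketch of the comparison, assuming the measure is a simple proportion. The Wilson score interval, the three benchmark values, and Hospital X's numbers are illustrative assumptions, not CHART's production methodology.

```python
# Minimal sketch of the multiple-benchmark comparison. The Wilson
# interval, the benchmark values, and Hospital X's numbers are
# illustrative assumptions, not CHART's production methodology.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for an observed rate successes/n."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

def benchmark_comparison(successes: int, n: int, benchmarks) -> list[str]:
    """For each benchmark, report whether the hospital's interval lies
    entirely above it, entirely below it, or includes it (i.e., the
    hospital is statistically indistinguishable from that benchmark)."""
    low, high = wilson_ci(successes, n)
    return ["above" if low > b else "below" if high < b else "includes"
            for b in sorted(benchmarks)]

# Hospital X: thrombolytics within 30 minutes for 188 of 296 AMI patients,
# giving an interval of roughly 58%-69% (echoing the example above).
# Hypothetical benchmarks: 50% (minimum), 63% (typical), 80% (superior).
print(benchmark_comparison(188, 296, (0.50, 0.63, 0.80)))
# -> ['above', 'includes', 'below']: clearly above the minimum,
#    indistinguishable from the typical benchmark, below the superior one.
```

Distinguishing "includes" from "above" and "below" against three benchmarks is what can yield more than the usual three performance groups.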
Public Reporting of Hospital Performance: A Voluntary Initiative www.CalHospitalCompare.org (for public report of hospital performance in California) CHART.ucsf.edu (to see how we manage CalHospitalCompare)
PBGH & Where to Get More Information To learn more: • www.pbgh.org – an overview of PBGH programs and initiatives • www.HealthScope.org – consumer Web site with health plan and provider quality measurements • www.PacAdvantage.org – small-group purchasing pool • http://chooser.pacadvantage.org – sample site to assist enrollees in plan selection • To subscribe to the PBGH E-Letter, go to www.pbgh.org/news/eletters
Leapfrog’s Hospital Rewards Program Changes • A revised Leapfrog Hospital Rewards Program™ (LHRP) will be based solely on Leapfrog survey data • Leapfrog hospital survey results also reported to the public • Leapfrog assesses performance based on (summarized): • Use of computerized order entry • Availability of intensivist physicians • Hospital volume (or risk-adjusted mortality) for several conditions • Responses to survey questions about implementation of several “safe practices” Source: Karen Linscott, COO Leapfrog 3
Performance Measurement and Incentives: A Global Phenomenon Source: McNamara P. Int J Qual Health Care 2005;17(4):357
Conclusions • Performance varies widely in many domains in clinical medicine • Without performance measurement, providers cannot know how they are doing, nor can transparency be achieved or incentives used • CHART is one example of a multistakeholder process leading to novel approaches to performance assessment • Experimentation is fine, change is inevitable
Please visit • www.CalHospitalCompare.org (for the public report of hospital performance in California) • CHART.ucsf.edu (to see how we manage CalHospitalCompare)