Does publicly reported stroke data identify which institutions provide better stroke care?
Adam G. Kelly, MD
May 20, 2010
Disclosures
• Financial disclosures: None
• Unlabeled/unapproved uses disclosure: None
• Other disclosures: Faculty member/attending neurologist at Strong Memorial and Highland Hospitals
Outline
• Describe the current state of publicly reported stroke quality data
  • How much data are available to the public?
  • What is the content of publicly available data?
• Evaluate the utility of the current data
  • Which attributes matter to consumers of publicly available data?
• Conclusions and recommendations
History of public reporting
• Little reporting of outcomes until the 1980s
• 1986: release of hospital-based mortality for all Medicare patients, and for those admitted with 9 specific diagnoses or procedures
• Steady increase since then in the amount of publicly available outcome data, driven by:
  • Competition for a limited patient pool
  • Internet availability
History of public reporting
• Much of the publicly available data is surgical in nature; data for common medical conditions (MI, pneumonia, CHF, etc.) have been growing
• Stroke would appear to be a prime candidate for public reporting:
  • Large public health burden
  • High morbidity and mortality
  • Available and validated performance measures
Quality data for stroke
• What is the current amount of publicly available stroke quality data?
• Data source: Agency for Healthcare Research and Quality (AHRQ) Report Card Compendium
  • First released in November 2006
  • Updated periodically
  • Free and publicly available
Results
• 221 report cards were included in the AHRQ Compendium as of Spring 2008; 16 were not accessible
• Of the remaining 205 report cards, 19 (9%) reported data on stroke quality
  • 17 reported hospital-based data
  • 16 of these 17 sites combined data for ischemic stroke, intracerebral hemorrhage, and subarachnoid hemorrhage
Quality data for stroke
• What is the content of the quality data contained in the report cards reporting stroke data?
• 5 separate categories of data:
  • Outcomes
  • Process
  • Structure
  • Utilization
  • Financial
Results
• 17 report cards presented hospital-based stroke quality data
• Utilization measures were the most frequently reported type of data (15 sites):
  • Case volumes, lengths of stay (risk-adjusted)
• Outcome measures were reported by 14 sites:
  • Mortality rates (inpatient out to 180 days), complication rates, re-admission rates, all risk-adjusted
Results
• Financial measures were reported by 4 sites:
  • Costs, charges
• Structure measures were infrequently reported (2 sites):
  • Presence of a dedicated stroke unit, number of beds in the stroke unit, Joint Commission Stroke Center or state Primary Stroke Center designation
Results
• Process measures were reported by a single site (based in the UK, not the USA):
  • Use of CT scans, review of cases by a neurologist, arrangement of therapy at the time of discharge
• Patient/family satisfaction with care was not reported by any site
Summary
• Few publicly available report cards provide stroke quality data
• The stroke quality data that are available consist largely of administrative data (utilization, mortality, and financial data)
Utility of publicly available data
• Which attributes matter to consumers of publicly available data?
  • Timeliness
  • Reliability
  • Sensitivity to change in hospital performance
  • Ability to discriminate between high- and low-performing hospitals
  • Validity
• How do current report cards measure up in these areas?
Timeliness
• Currently available quality data range in age from 1 to 3 years
• Some sites use data from 4-5 years prior
• Do data from this timeframe truly reflect current hospital performance?
• Reporting should strive to make data as close to real-time as possible
Reliability
• 14 report cards provide ratings based on mortality; do they agree on hospital performance?
• Frequent disagreement has been noted in surgical report card ratings, though it has not been quantified
• What is the agreement rate for report card ratings of stroke care at all New York State hospitals?
Reliability
• 157 of 214 NYS hospitals were evaluated by two separate report cards:
  • One from a non-profit agency, one from a for-profit corporation
• Both report cards use a 3-tiered rating system to evaluate inpatient stroke mortality
  • One compares hospital mortality to the state average; the other compares observed hospital mortality to risk-adjusted expected mortality (see the sketch below)
• Ratings were congruent (in agreement) for only 61% of hospitals
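To make the disagreement concrete, here is a minimal Python sketch of the two rating approaches described above. All thresholds, case counts, and expected-death figures are hypothetical, invented for illustration; the talk does not specify the cut-points either report card actually uses.

```python
# Minimal sketch (hypothetical thresholds and data) of how two report cards
# can rate the same hospital differently. Card A compares a hospital's crude
# mortality rate to the state average; card B compares observed deaths to a
# risk-adjusted expected count (the O/E ratio). Both map results to a 3-tier
# scale, but because the comparators differ, the tiers need not agree.

def rate_vs_state_average(hospital_rate, state_rate, margin=0.02):
    """Tier by comparing crude mortality to the state average (card A)."""
    if hospital_rate < state_rate - margin:
        return "above average"   # lower mortality than the state
    if hospital_rate > state_rate + margin:
        return "below average"
    return "average"

def rate_observed_vs_expected(observed_deaths, expected_deaths, margin=0.15):
    """Tier by the observed-to-expected (risk-adjusted) ratio (card B)."""
    oe = observed_deaths / expected_deaths
    if oe < 1 - margin:
        return "above average"   # fewer deaths than risk adjustment predicts
    if oe > 1 + margin:
        return "below average"
    return "average"

# A hospital with sicker-than-average patients: crude mortality looks poor,
# but risk adjustment predicts even more deaths than were observed.
state_rate = 0.10
hospital = {"cases": 200, "deaths": 26, "expected_deaths": 32.0}

tier_a = rate_vs_state_average(hospital["deaths"] / hospital["cases"], state_rate)
tier_b = rate_observed_vs_expected(hospital["deaths"], hospital["expected_deaths"])
print(tier_a, "vs", tier_b)  # -> below average vs above average
```

Because one card benchmarks against the crude state average while the other benchmarks against a risk-adjusted expectation, a hospital with a sicker-than-average case mix can legitimately land in opposite tiers of the two systems.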
Reliability
• Results from other states with two ratings of inpatient mortality:
  • Texas: agreement rate 74.3%
  • Pennsylvania: agreement rate 67.4%
  • Massachusetts: agreement rate 50%
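These raw agreement rates also have to be read against what chance alone would produce: with only three tiers and most hospitals rated "average", two systems will often agree by accident. A minimal sketch of Cohen's kappa, using invented rating distributions (the actual marginals are not given in the talk), illustrates the point:

```python
# Minimal sketch (hypothetical marginals): raw agreement can overstate how
# much two rating systems actually concur. If both systems rate most
# hospitals "average", substantial agreement is expected by chance alone;
# Cohen's kappa corrects for that. The distributions below are invented to
# show the calculation, not taken from the NYS data.

def cohens_kappa(observed_agreement, marginals_a, marginals_b):
    """Kappa = (p_o - p_e) / (1 - p_e), with chance agreement p_e from marginals."""
    p_e = sum(marginals_a[t] * marginals_b[t] for t in marginals_a)
    return (observed_agreement - p_e) / (1 - p_e)

tiers_a = {"above": 0.15, "average": 0.70, "below": 0.15}  # system A's rating shares
tiers_b = {"above": 0.20, "average": 0.65, "below": 0.15}  # system B's rating shares

kappa = cohens_kappa(0.61, tiers_a, tiers_b)
print(f"kappa = {kappa:.2f}")  # -> kappa = 0.21
```

Under these assumed distributions, the 61% raw agreement observed in NYS corresponds to a kappa of roughly 0.21, conventionally read as only fair agreement beyond chance.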
Reliability
• What are the reasons for poor agreement on hospital mortality ratings?
  • Differences in patient populations
  • Different risk-adjustment and statistical techniques
• What are the implications of poor agreement?
  • Eroded trust among patients/consumers
Sensitivity to change
• How frequently do mortality-based hospital ratings change over time?
  • 20% of NYS hospital ratings changed over a one-year timeframe
  • Only 5% of hospitals had their ratings changed by both evaluating systems
• Are these changes indicative of:
  • A change in hospital performance?
  • A change in patient population?
  • A change in methods of evaluation?
Discriminative ability
• Can currently available stroke quality data discriminate between low- and high-performing hospitals?
• Mortality-based hospital ratings rest on 95% confidence intervals and comparisons to expected mortality rates
• Ratings may be more sensitive for high-volume hospitals; there is limited ability to discriminate performance at low- and medium-volume institutions (see the sketch after the figure below)
Discriminative ability
[Figure: hospital mortality rates with 95% confidence intervals, plotted against case volume]
• Highland, Park Ridge, and Strong Memorial Hospitals are all assigned 2-star (average) ratings; Rochester General Hospital is assigned a 1-star (below average) rating
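The volume effect shown in the figure can be reproduced with a short sketch. Assuming a simple binomial model, with illustrative case volumes and mortality rates rather than the actual Rochester-area data, the 95% confidence interval around an observed mortality rate narrows only with the square root of case volume:

```python
import math

# Minimal sketch: why mortality-based ratings struggle to separate hospitals
# with low or medium case volumes. For a fixed observed mortality rate, the
# 95% confidence interval (normal approximation to the binomial) narrows only
# as the square root of case volume, so small hospitals have intervals too
# wide to distinguish from the state average. Volumes and rates below are
# illustrative, not taken from the talk's data.

def mortality_ci(rate, n, z=1.96):
    """95% CI for an observed mortality rate based on n cases."""
    half_width = z * math.sqrt(rate * (1 - rate) / n)
    return max(0.0, rate - half_width), rate + half_width

state_average = 0.10
for volume in (25, 100, 400, 1600):
    lo, hi = mortality_ci(0.12, volume)  # observed rate above the state average
    overlaps = lo <= state_average <= hi
    print(f"n={volume:5d}: 95% CI = ({lo:.3f}, {hi:.3f}); "
          f"state average inside CI: {overlaps}")
```

Only at the largest volume does the interval exclude the state average, which is why low- and medium-volume hospitals tend to cluster in the middle "average" tier regardless of their true performance.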
Validity
• Difficult to determine which quality measure is most valid
  • Unclear which measure is most important to patients and the public
• What are we hoping to accomplish with public reporting of quality data?
• If the goal is to increase transparency and better inform patient decisions:
  • Financial data have a limited role
  • Utilization data have an uncertain role
Is mortality a valid measure?
• Mortality has many desirable attributes of an endpoint:
  • Definitive/objective
  • Quantifiable
  • Clinically relevant
  • Easily accessible
  • Should be easily comprehended
Is mortality a valid measure?
• Should diseases with markedly different mortality rates be combined? (see the sketch below)
  • Ischemic stroke: 8-12%
  • Intracerebral hemorrhage: 37-44%
  • Subarachnoid hemorrhage: > 50%
• Does mortality correlate with the structure/process of care?
  • Differences in adherence to performance measures explain < 10% of the variation in mortality rates
  • The distribution of mortality ratings was no different among the 107 NYS Designated Stroke Centers
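A quick worked example shows why pooling the three diagnoses is problematic. The case-mix proportions below are hypothetical; the disease-specific mortality rates are taken from the ranges listed above. Two hospitals with identical disease-specific mortality end up with very different combined rates purely because of case mix:

```python
# Minimal sketch (illustrative numbers): two hospitals with identical
# disease-specific mortality but different case mixes produce different
# combined "stroke" mortality when ischemic stroke, ICH, and SAH are pooled.
mortality = {"ischemic": 0.10, "ICH": 0.40, "SAH": 0.50}  # within the ranges above

def combined_rate(case_mix):
    """Pooled mortality = case-mix-weighted average of disease-specific rates."""
    return sum(mortality[dx] * share for dx, share in case_mix.items())

community = {"ischemic": 0.90, "ICH": 0.07, "SAH": 0.03}  # mostly ischemic stroke
referral  = {"ischemic": 0.70, "ICH": 0.20, "SAH": 0.10}  # more hemorrhages referred in

print(f"community hospital: {combined_rate(community):.1%}")  # -> 13.3%
print(f"referral center:    {combined_rate(referral):.1%}")   # -> 20.0%
```

The referral center's pooled mortality is roughly 50% higher even though its care, diagnosis for diagnosis, is identical to the community hospital's.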
Is mortality a valid measure?
• Is it a marker of unsafe care?
  • Unsafe practices are implicated in, and potentially responsible for, < 10% of short-term deaths
• Or is it a marker of patient/family preferences?
  • The majority of in-hospital deaths on a neurology service follow a patient/family decision to withdraw care
• Does it send the correct message?
  • It reinforces the notion that death is universally and unconditionally a negative outcome
Conclusions
• Publicly available stroke quality data is limited in its ability to identify high-performing stroke centers due to:
  • Narrow scope
  • Over-reliance on utilization and other administrative data
  • Lack of real-time data
  • Inconsistency across multiple sites
• Inpatient mortality may not be the most appropriate marker of quality stroke care
Recommendations
• Provide separate measures for ischemic stroke, intracerebral hemorrhage, and subarachnoid hemorrhage
• Develop methods of reporting data on a more timely basis
• Increase skepticism about mortality as a primary measure of quality care
  • Separate deaths due to unsafe practices from those due to patient/family preference
Recommendations
• Develop a standard set of process measures to be tracked and reported
  • Harmonized with pre-existing measures recommended by the Brain Attack Coalition, the Joint Commission, and state-specific guidelines
  • Examples: IV t-PA consideration/utilization, use of antiplatelets, use of warfarin for AF, DVT prophylaxis, etc.
Recommendations
• Incorporate patient/family satisfaction into publicly reported data
• Encourage mandatory reporting of all measures