
Benchmarking CDI: A Study of 37 Academic Medical Centers


Presentation Transcript


  1. Benchmarking CDI: A Study of 37 Academic Medical Centers Kathy Vermoch, MPH Project Manager, Quality Operations University HealthSystem Consortium vermoch@uhc.edu Donald A. Butler, RN, BSN Manager, Clinical Documentation Improvement Pitt County Memorial Hospital, Greenville, NC dbutler@pcmh.com

  2. Session Objectives • Receive highlights from the UHC Clinical Documentation Improvement (CDI) Benchmarking Project • Learn about successful approaches used by academic medical centers (AMCs) to promote the accuracy of clinical documentation and coded data • Discover how one AMC used project data to identify and address documentation and coding improvement opportunities

  3. What Is UHC? The University HealthSystem Consortium (UHC), Oak Brook, Illinois (founded 1984), is an alliance of: • 112 academic medical centers • 256 affiliated hospitals • 80 faculty practice groups • Mission: To advance knowledge, foster collaboration, and promote change to help members succeed in their respective markets • Vision: To be a catalyst for change, accelerating the achievement of clinical and operational excellence

  4. UHC Members in Florida • Cleveland Clinic Hospital (Weston) • Jackson Health System (Jackson Memorial Hospital) • Jackson North Medical Center • Jackson South Community Hospital • Mayo Clinic Jacksonville • Shands HealthCare (Shands at the University of Florida) • Shands Jacksonville • Shands Lake Shore • Shands Live Oak • Shands Starke • University of Florida College of Medicine, Faculty Group Practice • Tampa General Hospital • University of Miami Medical Group • University of South Florida Physicians Group Boldface: AMC members – Plain text: Associate members – Italics: Faculty practice groups

  5. CDI Project Goals • The steering committee set the following project goals: • Evaluate clinical documentation practices • Identify critical success factors for documentation quality • Learn about CDI performance measures used by AMCs • Communicate successful CDI strategies

  6. Project Participation and Data • In 2009, 37 AMCs completed a CDI survey • UHC Clinical Data Base (CDB) data were also used to: • Evaluate reporting of comorbidities and complications (CCs and major CCs) • Assess documentation quality • Evaluate data quality • Assess present on admission (POA) reporting • 41 innovative strategy reports were submitted • 4 better-performing organizations were interviewed • In 2010, the UHC Clinical Documentation Improvement Collaborative was conducted

  7. Benchmarking Project Participants • Denver Health • Hennepin County Medical Center • IU Health (Clarian) • MCG Health, Inc. • Medical Univ. of South Carolina • Mount Sinai Medical Center • NYU Langone Medical Center • New York-Presbyterian Hospital • North Carolina Baptist Hospital (Wake Forest) • Northwestern Memorial Hospital • Olive View-UCLA Medical Center • Oregon Health & Science University Medical Center • Penn State M.S. Hershey Medical Center • Shands at the University of FL • Stanford Hospital and Clinics • Stony Brook Univ. Medical Center • SUNY Downstate Medical Center/University Hospital • Tampa General Hospital • The Methodist Hospital System • The Ohio State University Med. Ctr. • The University of Michigan Hospitals and Health Centers • UC Davis Medical Center • UC Irvine Medical Center • University Health Systems of Eastern Carolina (PCMH) • University Hospital of the SUNY Upstate Medical University • University Hospitals Case Med. Ctr. • University Medical Center • University of Arkansas for Medical Sciences (UAMS) Medical Center • University of Kentucky Hospital • University of Maryland Medical Center • University of Missouri Health Care (University Hospital) • University of North Carolina Hospitals • University of Pennsylvania Health System (Hosp. of the Univ. of Penn.) • University of Toledo Medical Center • Vanderbilt University Medical Center • Virginia Commonwealth University Health System (VCUHS) • Wishard Health Services

  8. Project Highlights

  9. CDI Program Profile • 44% of participants are part of multihospital systems • Of these, 50% have implemented standardized clinical documentation assurance practices across the system • 32 of 37 (86%) have a formal CDI program; of these: • 59% in place > 3 years, 31% > 5 years • 59% of the programs focus on Medicare patients

  10. Oversight of CDI Programs (chart: department responsible for the CDI program)

  11. CDI Staffing • Clinical documentation specialists (CDSs) are nurses (69%), coders (25%), or a mix of both (6%) • Dedicated nonsupervisory CDS FTEs: mean 5.4 (range: 1.0–11.5) • Mean annual inpatient discharges per CDS FTE: 7,914 (range: 2,103–21,709) • Dedicated supervisory CDS FTEs: mean 0.9 (range: 0.0–2.0) • 28% have paid, dedicated CDI physician FTEs, with a mean of 0.5 MD FTEs (range: 0.3–1.0) • All or most CDI staff are hospital employees for 100% of respondents
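
As a rough illustration of the staffing ratio above, discharges per CDS FTE is simply annual inpatient discharges divided by dedicated nonsupervisory CDS FTEs; the input figures in this sketch are hypothetical, not study data.

```python
# Illustrative calculation of annual inpatient discharges per CDS FTE.
# The input figures are hypothetical examples, not study data.
annual_inpatient_discharges = 42_000
dedicated_cds_ftes = 5.4  # nonsupervisory clinical documentation specialists

discharges_per_cds_fte = annual_inpatient_discharges / dedicated_cds_ftes
print(f"{discharges_per_cds_fte:,.0f} discharges per CDS FTE")  # ~7,778
```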

  12. CC and MCC Capture Rate Report: Benchmarks CC and MCC capture by service line across UHC’s AMC members and suggests possible revenue opportunities

  13. Millions in Revenue May Be Missed • Median Medicare revenue opportunity of $3.4 million if CCs and MCCs were reported at the top-quartile rate of UHC’s AMC members • Range: $462,000 to $21.1 million • Average: $4.3 million • The Medicare revenue opportunity doubled after MS-DRGs were implemented • Source: UHC Clinical and Financial Data Base for discharges from 89 AMC members
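
The revenue-opportunity figures above are benchmarking estimates. A minimal sketch of one way such an estimate might be computed follows; all inputs are hypothetical, and the actual UHC methodology is not detailed in this presentation.

```python
# Hedged sketch: estimated Medicare revenue opportunity if a hospital's
# CC/MCC capture rate rose to the peer top-quartile rate.
# All inputs are hypothetical; the actual UHC methodology may differ.
medicare_cases = 8_000            # annual Medicare discharges in scope
current_capture_rate = 0.52       # share of cases with a CC/MCC reported
top_quartile_rate = 0.58          # peer top-quartile capture rate
avg_payment_lift = 7_500          # assumed average payment increase per case
                                  # when a CC/MCC moves it to a higher-weighted MS-DRG

additional_cases = medicare_cases * (top_quartile_rate - current_capture_rate)
revenue_opportunity = additional_cases * avg_payment_lift
print(f"Estimated opportunity: ${revenue_opportunity:,.0f}")  # $3,600,000
```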

  14. Data Quality • CDB data analysis shows 15% of project participants have POA data quality issues: • POA indicator uncertain for > 10% of records • Nonexempt POA specified as exempt for > 5% of records • POA indicator not reported at all • Yet 97% responded on the survey that POA is coded for all payers and patients
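
A minimal sketch of how the POA screens above might be applied to coded records; the field names, POA codes, and record layout are illustrative assumptions, not the UHC CDB format.

```python
# Hedged sketch of the POA data-quality screens described above.
# Field names, codes, and record layout are illustrative assumptions,
# not the actual UHC Clinical Data Base format.
diagnosis_records = [
    {"poa": "U", "poa_exempt_code": False},  # U = insufficient documentation
    {"poa": "1", "poa_exempt_code": False},  # reported as exempt
    {"poa": "Y", "poa_exempt_code": False},
    # ... one entry per coded diagnosis
]

n = len(diagnosis_records)
uncertain = sum(r["poa"] == "U" for r in diagnosis_records)
nonexempt_as_exempt = sum(r["poa"] == "1" and not r["poa_exempt_code"]
                          for r in diagnosis_records)
missing = sum(not r["poa"] for r in diagnosis_records)

issues = {
    "poa_uncertain_gt_10pct": uncertain / n > 0.10,
    "nonexempt_reported_exempt_gt_5pct": nonexempt_as_exempt / n > 0.05,
    "poa_not_reported": missing > 0,
}
print(issues)
```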

  15. Coding Profile Report: A CDB report that benchmarks key data elements to identify possible documentation and coding opportunities

  16. Data Analysis • Medical CMI was < 1.0 for 7.4% of AMC hospitals • MS-DRG 313 (chest pain) was in the top 20 MS-DRGs based on case volume for 55% of hospitals • More than 5% of cases in MS-DRGs w/o CC/MCC had LOS >= 95th percentile in 86% of hospitals • Admit source was missing/unknown for > 2% of cases in 1% of hospitals (range: 0–20%) • Admit status was missing/unknown for > 1% of cases in 5% of hospitals (range: 0–8.5%) Source: UHC CDB data, CY 2009, 3 million discharges from 95 AMCs

  17. Data Analysis • Race was missing/unknown for > 2% of cases in 2% of hospitals (range: 0–88%) • More than 0.5% of AMI cases were coded 410.X0 in 41% of hospitals (range: 0–5%) • More than 1% of AMI cases were coded 410.X0 in 26% of hospitals • More than 0.5% of cases with mechanical ventilation were coded 96.70 in 9.5% of hospitals (range: 0–2%) • The percentage of cases with any AHRQ comorbidity reported ranged from 45% to 89% Source: UHC CDB data, CY 2009, 3 million discharges from 95 AMCs
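
The screens on the two slides above amount to threshold checks over coded case data. A minimal sketch follows, under assumed field names and sample records that are not the actual CDB schema.

```python
# Hedged sketch of the data-quality screens described on the two slides above.
# Field names, sample records, and structure are illustrative assumptions.
cases = [
    {"admit_source": "unknown", "race": "white",   "principal_dx": "410.90"},
    {"admit_source": "ER",      "race": "unknown", "principal_dx": "410.71"},
    # ... one record per discharge
]

def pct(numerator, denominator):
    return 100.0 * numerator / denominator if denominator else 0.0

n = len(cases)
admit_source_missing = sum(c["admit_source"] in ("missing", "unknown") for c in cases)
race_missing = sum(c["race"] in ("missing", "unknown") for c in cases)

ami_cases = [c for c in cases if c["principal_dx"].startswith("410")]
ami_unspecified = sum(c["principal_dx"].endswith("0") for c in ami_cases)  # 410.X0

# Threshold checks mirroring the screens cited above.
flags = {
    "admit_source_missing_gt_2pct": pct(admit_source_missing, n) > 2.0,
    "race_missing_gt_2pct":         pct(race_missing, n) > 2.0,
    "ami_coded_410x0_gt_0_5pct":    pct(ami_unspecified, len(ami_cases)) > 0.5,
}
print(flags)
```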

  18. CDI Bundled Score • A formal clinical documentation improvement program has been implemented • Documentation clarification query response rates are tracked • Strong likelihood of querying providers for documentation clarification for: • Patients with an extended LOS and an SOI value of < 3 • Unexpected deaths of patients with a ROM value of < 3 or 4 • Inadequate documentation of the patient’s history, demographics, or admission, discharge, and transfer (ADT) status • Inadequate documentation of medical necessity
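
A minimal sketch of the kind of query-trigger rule implied by the bundled-score items above; the field names and the extended-LOS definition are assumptions, and the thresholds mirror the slide.

```python
# Hedged sketch of a documentation-query trigger based on the criteria above.
# Field names and the extended-LOS definition are illustrative assumptions.
def should_query(case):
    """Return True when a documentation-clarification query looks warranted."""
    extended_los = case["los_days"] > case["expected_los_days"]  # assumed definition
    if extended_los and case["soi"] < 3:            # long stay but low coded severity
        return True
    if case["died"] and case["rom"] < 3:            # unexpected death with low ROM
        return True
    if case.get("history_or_adt_docs_inadequate"):  # history/demographics/ADT gaps
        return True
    if case.get("medical_necessity_docs_inadequate"):
        return True
    return False

example = {"los_days": 14, "expected_los_days": 6, "soi": 2,
           "died": False, "rom": 1}
print(should_query(example))  # True: extended LOS with SOI < 3
```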

  19. Self-Assessment Score

  20. Key Performance Measures Source: UHC Clinical Data Base analysis and survey results

  21. Better-Performing Organizations • IU Health (IN) • The Methodist Hospital System (TX) • University Medical Center (AZ) • Virginia Commonwealth University Health System (MCV Hospital)

  22. Clinical Documentation Critical Success Factors • Provide strong medical and administrative leadership support • Communicate and collaborate • Perform robust data tracking and mining • Offer ongoing education • Implement an effective clinical documentation improvement program

  23. Strong Medical and Administrative Leadership Support • Resources: Invest in CDI FTEs, training, hardware and software, tools and databases, consultants, auditors, etc. • Physician leaders: Gain buy-in, provide peer-to-peer education, and offer documentation clarification • Accountability: Hold providers responsible for compliance • Informed decision-making: Use high-quality data for planning and performance improvement

  24. Communicate and Collaborate • Coders and CDS staff meet often for education, data analysis, and clarification • CDS have open, consistent communications with doctors • Physicians take part in designing and implementing documentation improvement initiatives • CDI communicates and collaborates with all areas that use coded data

  25. Perform Robust Data Tracking and Mining • Track query response rates and share data with medical leaders to hold providers accountable for compliance • Analyze query data for educational opportunities • Analyze coded, clinical data to benchmark performance and surface improvement opportunities

  26. Offer Ongoing Education • Share timely, service-specific examples of the impact of documentation and coding • Provide easy, convenient point-of-care tools • Do not overemphasize reimbursement and revenue issues with physicians • Use clear, specific language and examples

  27. Implement an Effective CDI Program • Hire/train expert CDI staff to conduct concurrent chart reviews • Educate doctors about the impact on quality data • Implement practices that support documentation improvement • Hold providers accountable for compliance with requirements • Benchmark performance and communicate data

  28. Performance Opportunity Summary: A report card of each organization’s performance on key performance measures

  29. Self-Assessment Scorecard: Questions derived from project findings and better-performer interviews

  30. Clinical Documentation Improvement Collaborative • Benchmarking project findings and resources were communicated • 34 organizations completed a gap analysis • Participants selected their teams • Collaborative workgroups formed • Initiatives launched locally • Monthly networking conference calls held for 6 months • Web conferences conducted

  31. Collaborative Participants • Fletcher Allen Health Care • Hennepin County Medical Center • IU Health • New York-Presbyterian Health System • Northwestern Memorial Hospital • NYU Langone Medical Center • Oregon Health & Science University • Parkland Health & Hospital System • Penn State M.S. Hershey Medical Center • Riverside County Regional Medical Center • Shands at the University of Florida • Shands Jacksonville • Stanford Hospital & Clinics • Stony Brook University Medical Center • SUNY Downstate Medical Center/University Hospital • The Ohio State University Medical Center • Thomas Jefferson University Hospital • UC Davis Medical Center • University Health Systems of Eastern Carolina (Pitt County Memorial Hospital) • University Hospitals Case Medical Center • University Medical Center • University of Arkansas for Medical Sciences (UAMS) Medical Center • University of Kentucky Hospital • University of Michigan Hospitals & Health Centers • University of Mississippi Health Care • University of Missouri Health Care (University Hospital) • University of Pennsylvania Health System (Hospital of the University of Pennsylvania) • University of Rochester Medical Center • University of Texas Medical Branch, Galveston • University of Washington Medical Center • UNM Hospitals • Upstate University Hospital • Virginia Commonwealth University Health System (VCUHS) • Wishard Health Services

  32. Examples of Collaborative Achievements • UC Davis: Improved concurrent query response rate from 46% to 76% • UH Case: Improved time for initial CDS chart review (Goal: Working DRG within 24 hours) • Stony Brook: Implemented a mortality review process for deaths with “less than extreme ROM” • TJUH: Enhanced retrospective query procedures, improved response rates from 82% to 100% • OHSU: Revised query policies and templates, developed a training program for new CDI staff • Stanford: Expanded mortality reviews and achieved a 7% increase in extreme ROM cases • Shands UF: Developed a CDI mission statement and selected CDI performance measures

  33. Using Data to Create Change • Next, Don Butler from University Health Systems of Eastern Carolina (Pitt County Memorial Hospital) demonstrates how his CDI team acted on project data to validate and address documentation and coding improvement opportunities • The success of benchmarking comes from implementation, not the data

  34. Drilling for Answers: CT Surgery Documentation Improvement Project

  35. Overview • Present a detailed overview of all phases of this project as an example of data-driven performance improvement • Trigger report • Team formation & analysis • Findings & planning • Actions • Results • Next?

  36. The Trigger: UHC Report Sparks Interest (chart annotation: “What about these?”)

  37. The Trigger: UHC Report Sparks Interest • Several areas of potential interest: CT surgery & vascular • Cardiology & general surgery don’t appear to offer the same initial opportunity

  38. The Trigger: UHC Report Sparks Interest (chart)

  39. Defining the Opportunities: Select a Team & Conversations • Members of the cardiovascular measurement team and the clinical documentation improvement team were tasked to explore the UHC report • Initial conversations focused on possible statistical explanations: • Ratio of valve cases vs. CABG • Ratio of cases with vs. without cath • Impression: the question is how large the opportunity is, not whether improvement is needed • Reviewed and acknowledged improvements already obtained with the CT surgery consult/H&P form • Perform further analysis: • Data analysis (UHC) • Focused chart review (based on the STS database) • CDI experiences

  40. Defining the Opportunities: First Cut: Procedure Grouping Focus • Largest volume: CABG & valves (77% of all cases) • Lessons learned ought to translate to other procedural groups

  41. Defining the Opportunities: Second Cut: 2-Procedure Focus • Red dot indicates ECHI performance during the UHC study; graphed are data from Oct. to Feb. ’08 • Note real improvements in capture of CCs/MCCs with valve cases

  42. Defining the Opportunities: Third Cut: CABG & Valves vs. NC & U.S. • Significant difference between ECHI and both the U.S. and NC • However, the U.S. and NC are close to each other

  43. Defining the Opportunities: Validate: APR DRG SOI • Similar correlations when looking at SOI measures (for 2 NC centers, A & B)

  44. Mortality Index – NC Comparison (chart comparing three NC centers: A, B, and C)

  45. Defining the Opportunities: Prevalence of CABG MCCs Across Institutions • Selected 2 NC peers and drilled down to the ICD-9 level (CABGs) • 4 most prevalent MCC diagnoses: real difference in frequency of capture • Caveat: NSTEMI is likely a mix of principal (PDX) and other (ODX) diagnoses

  46. Defining the Opportunities: Clinical Review • Case selection: • All reviewed cases came from the STS database of cardiothoracic surgical cases • A total of 129 cases were reviewed from a 5-month period identical to the analysis above • Selection was based on the presence of STS data elements AND the absence of a coded CC or MCC: AMI 1–7 days; HF; renal failure; prolonged vent; asthma/COPD; confusion; ptx/hemothorax; op/reop bleed/tamponade; 8 additional elements with infrequent occurrence • The correlation between the STS data and the documentation and clinical indicators seen in the chart was not as strong as expected, specifically for HF and renal failure • Results cannot be extrapolated to all CABG & valve cases because of the selection criteria

  47. Defining the Opportunities: Clinical Review Findings • Clinical indicators were identified to support an MD query in: • 12% of cases for a CC/MCC • > 7% of cases for an additional severity diagnosis • Again, results cannot be extrapolated • Possible additional revenue at varied confidence levels for obtaining the diagnosis by MD query: 50% = $126,940.20; 75% = $190,410.30; 100% = $253,880.40
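
The confidence-weighted figures above scale linearly from the 100% estimate; the short snippet below simply reproduces that arithmetic.

```python
# The confidence-weighted revenue estimates above are linear scalings of the
# 100%-confidence figure; this reproduces that arithmetic.
full_confidence_revenue = 253_880.40
for confidence in (0.50, 0.75, 1.00):
    print(f"{confidence:.0%}: ${full_confidence_revenue * confidence:,.2f}")
# 50%: $126,940.20   75%: $190,410.30   100%: $253,880.40
```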

  48. Defining the Opportunities: Bonus: Physician Rates (chart; UHC mean shown for reference)

  49. Defining the Opportunities: MD (& PA/NP) CDI Behaviors • MD response to CDI activities (FY08) • CDI concurrent reviews, FY08: • 1,335 CT surgery service line cases • 84% reviewed • 8% of reviews achieved confirmed improvements • Physician advisor reviews (new program): • Reviewed 103 cases (a screened selection of cases) • 19% of cases (20) identified for a recommended query
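
The percentages on the slide above imply approximate case counts, derived here by straightforward arithmetic (the presentation reports the rates, not these derived counts).

```python
# Approximate case counts implied by the rates on the slide above
# (derived by arithmetic; the study reports the percentages, not these counts).
ct_surgery_cases = 1335
reviewed = round(ct_surgery_cases * 0.84)        # ~1,121 concurrent CDS reviews
confirmed_improvements = round(reviewed * 0.08)  # ~90 reviews with confirmed improvement

advisor_reviews = 103
recommended_queries = round(advisor_reviews * 0.19)  # ~20 cases, matching the slide
print(reviewed, confirmed_improvements, recommended_queries)
```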
