
Reliability of Hospital-Abstracted Data: A Comparison with CDAC Abstraction


    1. Reliability of Hospital-Abstracted Data: A Comparison with CDAC Abstraction
       Andrei Kuznetsov, MA, MissouriPRO

    3. 7SOW Measurement
       Two parallel measurement processes:
       - State-level surveillance: CMS Task 1c monitoring through a random sample representative of Medicare discharges
       - Hospital-level tracking: Task 2b, data contributed through ORYX Core Measures and/or CART to the QNet Exchange clinical data repository

    4. Measurement processes linked
       Surveillance (1c): CMS includes a record in the surveillance sample; if the record is already in the repository, the electronic version is used
       Repository (2b): the hospital contributes an abstracted record and may save money by not having to copy the paper chart

    5. 2b: Accuracy Required
       7SOW RFP: "Hospitals that consistently perform below 80 percent reliability… will be required to provide hardcopy versions of charts when data are requested… Those hospitals performing at or above 80 percent reliability will be allowed to submit electronic versions of abstracted data when requested."
       Operationalize accurate abstraction as agreement with the "gold standard" = CDAC
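
A minimal sketch of how the 80 percent reliability threshold could be operationalized as simple percent agreement with the CDAC "gold standard" abstraction; the function names and data layout here are illustrative assumptions, not part of the 7SOW specification:

```python
def reliability_rate(hospital_values, cdac_values):
    """Percent agreement between hospital-abstracted and CDAC-abstracted values."""
    assert len(hospital_values) == len(cdac_values)
    matches = sum(h == c for h, c in zip(hospital_values, cdac_values))
    return matches / len(hospital_values)

def may_submit_electronically(hospital_values, cdac_values, threshold=0.80):
    """True if the hospital meets the 80% reliability bar described in the RFP."""
    return reliability_rate(hospital_values, cdac_values) >= threshold
```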

    6. A test case: ORYX Pilot
       MO was one of the 5 states in the Pilot of ORYX Core Measures
       HF, AMI and Pneumonia discharges (CY 2001) were reviewed by hospitals
       JCAHO used 6SOW inclusion/exclusion criteria for the Pilot

    7. ORYX Pilot in Missouri
       18 hospitals took part
       The AMI and Pne abstraction tools, created by MissouriPRO in MedQuest, collected all information for the official 6SOW indicators on 2 sides of one sheet
       The HF abstraction tool was designed by MPRO (the Michigan QIO) as a modification of the national tool (NHF)

    8. ORYX Pilot: Steps to ensure accuracy
       Training in the use of abstraction tools was conducted up front
       A support hotline was operated
       IRR testing was conducted as a condition of admitting an abstractor, with kappa = 0.4 used as a threshold
       New abstractors had to undergo IRR testing
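
The kappa = 0.4 admission threshold points to a Cohen's kappa computation along these lines; this is a generic sketch of the statistic, not the pilot's actual IRR tooling, and the function names are assumptions:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two abstractors over the same set of categorical answers."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from the two raters' marginal answer frequencies.
    expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)  # assumes expected < 1

def admit_abstractor(candidate_answers, reference_answers, threshold=0.4):
    """Admit a new abstractor only if IRR kappa meets the pilot's 0.4 threshold."""
    return cohens_kappa(candidate_answers, reference_answers) >= threshold
```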

    9. Comparison with CDAC data
       For Pne (155 charts) and AMI (135 charts), comparisons can only be made at the level of numerator/denominator status for an indicator. Example:
       - AMI QI-1 denominator: Eligible for ASA at admission? Hospital: Yes, CDAC: Yes => Agreement
       - AMI QI-1 numerator: Received ASA at admission? Hospital: Yes, CDAC: No => Disagreement
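
A sketch of the agreement/disagreement classification the slide describes, assuming a per-chart record that carries each abstractor's denominator (eligibility) and numerator (treatment received) answers; the dict layout is hypothetical:

```python
def compare_indicator(hospital, cdac):
    """Compare hospital vs. CDAC answers for one indicator on one chart.

    Each argument is a dict like {"denominator": True, "numerator": True},
    where 'denominator' means the patient was judged eligible and
    'numerator' means the treatment was judged received (illustrative layout).
    """
    result = {"denominator_agree": hospital["denominator"] == cdac["denominator"]}
    if hospital["denominator"] and cdac["denominator"]:
        # The numerator can only be compared when both agree the patient is eligible.
        result["numerator_agree"] = hospital["numerator"] == cdac["numerator"]
    return result

# Example from the slide: both say eligible, but only the hospital says ASA was given.
print(compare_indicator({"denominator": True, "numerator": True},
                        {"denominator": True, "numerator": False}))
# {'denominator_agree': True, 'numerator_agree': False}
```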

    10. Comparison to CDAC, cont’d For 49 variables in the HF module, comparisons could be made directly between CDAC data and hospital-generated data Data available on 90 HF charts

    11. Compare abstraction results: AMI
        Indicator                               CDAC   Provider
        QI-1: ASA at admission                   84%      89%
        QI-2: ASA at discharge                   91%      79%
        QI-3: Beta blocker at admission          61%      79%
        QI-4: Beta blocker at discharge          79%      79%
        QI-5: ACEI at discharge                  73%      76%
        QI-6: Smoking cessation counseling       35%      37%
        Data: 135 AMI charts reviewed by CDAC and providers

    12. Compare abstraction results: HF
        Indicator                                                                                        CDAC   Provider
        QI-1: Appropriate use of ACEI at discharge                                                        87%      86%
        QI-2: Appropriate use of ACEI or ARB at discharge                                                 88%      87%
        QI-3: EF evaluated before or during admission for pts not admitted on ACEI/ARB                    73%      63%
        QI-4: Discharge on ACEI or documented reason for no ACEI Rx for pts with LVSD not admitted on ACEI/ARB   60%      59%
        Data: 90 HF charts reviewed by CDAC and providers

    13. Compare abstraction results: Pne
        Indicator                                    CDAC   Providers
        QI-1: Antibiotic within 8 hours               91%      96%
        QI-2: Antibiotic consistent with rec's        79%      88%
        QI-3: Blood cultures before antibiotics       91%      75%
        QI-5: Pneumococcal immunization screening     41%      47%
        Data: 155 Pne charts reviewed by CDAC and providers

    14. But agreement is low
        Heart Failure: 90 charts, 49 variables; kappa = 0.40, exact agreement = 84%
        Pneumonia: 155 charts, 8 measures; kappa = 0.52, exact agreement = 86%
        AMI: 135 charts, 12 measures; kappa = 0.46, exact agreement = 80%

    15. Method of further analysis
        Separated agreement on the denominator (was the patient eligible?) from agreement on the numerator (was the treatment received?)
        Used 2x2 tables. Example: AMI QI-1 denominator, ASA at admission
                          CDAC: No   CDAC: Yes
        Provider: No          58           1
        Provider: Yes         40          36
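
Working through the 2x2 table above with the standard formulas gives exact agreement of roughly 70% and a kappa near 0.43 for this single indicator; a short sketch of that computation (the cell naming is mine):

```python
def kappa_from_2x2(no_no, no_yes, yes_no, yes_yes):
    """Exact agreement and Cohen's kappa from a 2x2 Provider-vs-CDAC table.

    Cell order: (Provider No, CDAC No), (Provider No, CDAC Yes),
                (Provider Yes, CDAC No), (Provider Yes, CDAC Yes).
    """
    n = no_no + no_yes + yes_no + yes_yes
    observed = (no_no + yes_yes) / n
    provider_yes, cdac_yes = (yes_no + yes_yes) / n, (no_yes + yes_yes) / n
    expected = provider_yes * cdac_yes + (1 - provider_yes) * (1 - cdac_yes)
    return observed, (observed - expected) / (1 - expected)

# AMI QI-1 denominator table from the slide: 58, 1, 40, 36 (n = 135 charts).
agreement, kappa = kappa_from_2x2(58, 1, 40, 36)
print(round(agreement, 2), round(kappa, 2))   # roughly 0.70 and 0.43
```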

    16. Tuna, dolphin-safe
                          CDAC: No   CDAC: Yes
        Provider: No          58           1
        Provider: Yes         40          36
        (In the tuna-fishing analogy, the 40 cases the provider counted as eligible but CDAC excluded are "dolphins" caught in the provider's net.)

    17. Disagreements over denominator status
        Across multiple indicators:
        - AMI: CDAC and Provider disagreed on denominator status in 31% of cases; 28 of those 31 percentage points were "Provider Dolphins"
        - HF: disagreement in 13% of cases; 12 of those 13 points were "Provider Dolphins"
        - Pne: disagreement in 11% of cases; 4 of those 11 points were "Provider Dolphins"
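
A sketch of how this disagreement split can be read off a 2x2 table, using the interpretation that a "Provider Dolphin" is a case the provider placed in the denominator but CDAC did not (and vice versa for "CDAC Dolphins"); percentages are taken over all compared cases, which matches the slide 20 figures:

```python
def dolphin_rates(no_no, no_yes, yes_no, yes_yes):
    """Denominator disagreement split from a Provider-vs-CDAC 2x2 table.

    'Provider dolphins' are cases the provider counted as eligible but CDAC
    excluded (Provider Yes / CDAC No); 'CDAC dolphins' are the reverse.
    Rates are expressed as a share of all compared cases.
    """
    n = no_no + no_yes + yes_no + yes_yes
    return {
        "disagreement": (no_yes + yes_no) / n,
        "provider_dolphins": yes_no / n,
        "cdac_dolphins": no_yes / n,
    }

# Same AMI QI-1 table: about 30% disagreement, ~30% provider dolphins,
# ~1% CDAC dolphins (cf. the AMI-1 Den row on slide 20).
print({k: round(v, 2) for k, v in dolphin_rates(58, 1, 40, 36).items()})
```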

    18. Disagreements over numerator status
        Analyzed only cases where CDAC and the Provider agreed that the patient was eligible:
        - Pne: 14% of cases in disagreement
        - HF: 6% of cases in disagreement
        - AMI: 10% of cases in disagreement
        => Disagreement is not a huge problem for numerator decisions (plus, the N is smaller)

    19. Working hypothesis
        Ho: more exclusion rules (screening criteria) => more opportunities for error and disagreement
        Example: AMI QI-1, ASA at admission, has 13 exclusion rules. Exclude the case if:
        - transferred from another acute care hospital
        - transferred from another ER
        - UTD admission source
        - allergy to aspirin
        - bleeding on admission, etc.
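
To make the hypothesis concrete, here is a minimal sketch of denominator screening against exclusion rules, using a few of the AMI QI-1 exclusions listed above; the chart field names are hypothetical and only 5 of the 13 rules are shown:

```python
# Hypothetical chart record; field names are illustrative only.
EXCLUSION_RULES = [
    ("transferred from another acute care hospital", lambda c: c.get("transfer_from_acute_care")),
    ("transferred from another ER",                  lambda c: c.get("transfer_from_er")),
    ("UTD admission source",                         lambda c: c.get("admission_source") == "UTD"),
    ("allergy to aspirin",                           lambda c: c.get("aspirin_allergy")),
    ("bleeding on admission",                        lambda c: c.get("bleeding_on_admission")),
    # ... the full AMI QI-1 list had 13 such rules
]

def in_denominator(chart):
    """A chart can stay in the AMI QI-1 denominator only if no exclusion rule fires.

    Every additional rule is one more judgment call on which two abstractors
    can diverge, which is the mechanism the working hypothesis points at.
    """
    return all(not applies(chart) for _, applies in EXCLUSION_RULES)
```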

    20. Exclusion rules - denominator variables
        Indicator / variable              Number of exclusions   % "CDAC Dolphins"   % "Provider Dolphins"
        AMI-1 Den  ASA at admission                13                    1                    30
        AMI-2 Den  ASA at discharge                18                    2                    47
        AMI-3 Den  BB at admission                 15                    4                    22
        AMI-4 Den  BB at discharge                 17                    1                    33
        AMI-5 Den  ACEI at discharge               18                    3                    32
        AMI-6 Den  Smoking cessation                2                    6                     1
        HF-1 Den                                    5                    3                     8
        HF-2 Den                                    4                    0                     8
        HF-3 Den                                    6                    2                    13
        HF-4 Den                                   12                    0                    19
        Pne-1 Den                                   4                    9                     1
        Pne-2 Den                                   9                   10                     8
        Pne-3 Den                                   4                    4                     8
        Pne-5 Den                                   4                    5                     1

    21. Exclusion rules - numerator variables
        Indicator / variable              Number of exclusions   % "CDAC Dolphins"   % "Provider Dolphins"
        AMI-1 Num  ASA at admission                 2                    3                     6
        AMI-2 Num  ASA at discharge                 1                    2                     0
        AMI-3 Num  BB at admission                  3                    0                     0
        AMI-4 Num  BB at discharge                  1                    0                     4
        AMI-5 Num  ACEI at discharge                1                    0                    18
        AMI-6 Num  Smoking cessation                1                   11                    17
        HF-1 Num                                    9                    3                     4
        HF-2 Num                                   11                    2                     5
        HF-3 Num                                    1                    6                     3
        Pne-1 Num                                   2                    3                     7
        Pne-2 Num                                   8                    2                    10
        Pne-3 Num                                   4                   14                     1
        Pne-5 Num                                   2                    6                    11
        Note: HF-4 Num was omitted because it had only 5 cases eligible for numerator analysis
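
As a quick check of the working hypothesis (this calculation is mine, not from the presentation), the denominator rows on slide 20 show a strong positive relationship between the number of exclusions and the "Provider Dolphin" percentage, roughly 0.93 by Pearson's r:

```python
def pearson(xs, ys):
    """Plain Pearson correlation; enough to eyeball the slide-20 relationship."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Denominator rows from slide 20: number of exclusions vs. % "Provider Dolphins".
exclusions        = [13, 18, 15, 17, 18, 2, 5, 4, 6, 12, 4, 9, 4, 4]
provider_dolphins = [30, 47, 22, 33, 32, 1, 8, 8, 13, 19, 1, 8, 8, 1]
print(round(pearson(exclusions, provider_dolphins), 2))   # roughly 0.93
```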

    22. CDAC Dolphin Catch (chart)

    23. Provider Dolphin Catch (chart)

    24. Provider Dolphins, cont’d (chart)

    25. Hypothesis revisited
        Ho: more exclusion rules (screening criteria) => higher "Provider Dolphin catch" in determining patient eligibility for treatment (supported for denominator decisions)
        - No signs of such influence on the numerator status of a case
        - No evidence of impact on "CDAC Dolphin catch" (for either denominator or numerator status)

    26. Conclusions - 1
        There was no evidence to place the hospitals' integrity in doubt as far as self-abstracted data are concerned
        However, public reporting is a whole new ball of wax

    27. Conclusions - 2
        The CDAC-to-Provider agreement rate ran in the 80% to 86% range (but recall our heavy investment in training and abstractor support)
        It is likely to be lower without the upfront training and ongoing support

    28. Conclusions - 3
        The bulk of the disagreement was over denominator status
        - For AMI, almost 1/3 of decisions were in discord
        - For AMI and HF, the prevailing pattern is one of "Provider Dolphin catch"
        - No clear pattern for Pneumonia
        - "Provider Dolphin catch" increases as the number of exclusion criteria goes up

    29. Conclusions - 4 Disagreement over the numerator status of a case is less common than over the denominator status (eligibility for an indicator). Also, fewer cases qualify for the numerator

    30. Suggestions - 1
        Minimize the number of exclusion criteria; ideally, keep exclusion rules under 3
        6SOW AMI indicator                6SOW exclusions   7SOW exclusions
        AMI-1 Den  ASA at admission             13                 6
        AMI-2 Den  ASA at discharge             18                 8
        AMI-3 Den  BB at admission              15                10
        AMI-4 Den  BB at discharge              17                 5
        AMI-5 Den  ACEI at discharge            18                 9
        AMI-6 Den  Smoking cessation             2                 1

    31. Suggestions - 2
        Disregard the "Provider Dolphin catch" in the calculation of the hospital error rate
        This is a "productive mistake" as opposed to a "counter-productive mistake"
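
One way this suggestion could be put into practice, assuming per-decision comparison records are available; the function name and record layout are illustrative only:

```python
def hospital_error_rate(comparisons, forgive_provider_dolphins=True):
    """Share of compared denominator decisions where the hospital disagreed with CDAC.

    Each item is a (hospital_eligible, cdac_eligible) pair of booleans.
    With forgive_provider_dolphins=True, cases the hospital over-included
    (hospital Yes, CDAC No) are treated as 'productive mistakes' and are not
    counted against the hospital, per the suggestion on this slide.
    """
    errors = 0
    for hospital, cdac in comparisons:
        if hospital == cdac:
            continue
        if forgive_provider_dolphins and hospital and not cdac:
            continue
        errors += 1
    return errors / len(comparisons)
```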

    32. Roadblocks
        - No JCAHO mandate to prove accuracy
        - No QIO funding to train abstractors
        - Plan for hospital-level (not abstractor-level) tracking of accuracy
        - High turnover rate for abstractors

    33. Contact info Andrei Kuznetsov MissouriPRO 573-893-7900, ext. 163 akuznetsov.mopro@sdps.org
