Healthcare-Associated Infections (HAIs): Reporting and Validating Data across the Nation Tyler Whittington
Texas Information • Health and Safety Code, Chapter 98 • SSIs related to 10 procedures and CLABSI • Required to review reporting activities to ensure data are valid • Currently: 390 licensed Ambulatory Surgical Centers and 512 licensed hospitals in Texas
Goal of Report • Compile a comprehensive source of state-specific information related to HAI reporting and data validation • Qualitative surveys were chosen as the method to extract this information
Methods • Basic understanding of HAI reporting • “First State-Specific Healthcare-Associated Infections Summary Data Report” • Compiled a list of preliminary questions to learn about other states' mandates for HAI reporting • Conference calls with subject matter experts • Rachel Stricof • Mary Andrus • Becky Heinsohn and Karen Vallejo
Subject Matter Experts: Take-Home Points • Both numerator and denominator data are extremely important for calculating infection rates • It is imperative to understand the processes that facilities use to capture data • Central line days are one of the most inconsistently measured and reported components used to determine SIRs • Despite creating a sample size determination for auditing records, the bottom line is how many records can be audited in a given day • Administrative databases can be very valuable for providing monthly checks on data
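These points rest on two calculations worth making explicit: the device-associated infection rate per 1,000 central line days and the standardized infection ratio (SIR), i.e. observed infections divided by the number predicted from baseline data. A minimal sketch, with hypothetical counts:

```python
def clabsi_rate_per_1000_line_days(infections: int, central_line_days: int) -> float:
    """Device-associated infection rate: infections per 1,000 central line days."""
    return infections / central_line_days * 1000


def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR: observed infections divided by the number predicted from baseline data."""
    return observed / predicted


# Hypothetical unit: 6 CLABSIs over 4,200 central line days, 4.8 predicted infections
print(clabsi_rate_per_1000_line_days(6, 4200))  # ~1.43 per 1,000 line days
print(standardized_infection_ratio(6, 4.8))     # 1.25
```

Both calculations depend entirely on consistent denominator collection, which is why central line days being inconsistently measured is flagged above.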
State-Specific HAI Information • Taken from the CDC: http://www.cdc.gov/hai/stateplans/required-to-report-hai-NHSN.html • Compiled the following information into tables: • Contact information • Reporting status • Reporting indicators • Facilities required to report
Reporting Survey • Do you currently audit HAI reporting in your state? • If you do audit, may we contact you again later to discuss more specifics about this proposed audit methodology? • Do you plan to audit any of the reported HAI data in the next six months? • May we contact you again later to discuss more specifics about this proposed audit methodology? • If you do not plan to audit in the next six months, what led you to this decision? • If you are not auditing due to lack of resources, what resources would you need (e.g. more FTEs, staff with infection prevention experience, travel funds, etc.)? • How many FTEs do you currently have to implement mandatory reporting? • Did your state ever attempt to create a method for auditing facilities to validate reported data?
Reporting Survey Results • Rhode Island, Delaware, West Virginia and Arkansas: all pointed to limited resources as the reason behind the lack of an auditing plan • There is a need for improved funding and increased staff
Auditing Survey • How do you select facilities to be audited? • How do you select procedures to be audited? • How do you select medical records to be audited? • What is the timing cycle of the audit (do you visit every facility every year, every three years, or variably depending on findings)? • What is the ratio of auditors to facilities? • Is the audit performed blinded? • What is the actual number of charts you review per unit of time (e.g., per visit, per facility, per day, per year)? • What error rates have you found for specific variables?
Auditing Survey Results • Connecticut, Maryland, and Washington: CLABSIs • New York and South Carolina: CLABSIs and SSIs • NY: SSIs for hip, colon, and CABG procedures • SC: SSIs for hip, knee, CABG, and hysterectomy procedures in all facilities. Colon and abdominal hysterectomies in facilities with <200 beds • Pennsylvania: CLABSIs and CAUTIs
Auditing CLABSIs: Connecticut • One auditor reviews all positive blood cultures from every facility reporting CLABSIs to NHSN (~30) • Blinded audit: all medical record reviews occur from January 1st – April 30th • For 2008 audit: 35-day time period with 770 positive blood cultures and 476 septic events
Auditing CLABSIs: Maryland • Contracted with APIC • Five auditors complete blinded medical record reviews in all facilities reporting CLABSIs to NHSN (~45) • Review 5 charts in ICUs in the top and bottom 11 (25th percentile) facilities of their ranking list (based on CLABSI rates) and 4 charts in all other ICUs • Audit study time period: July 1, 2008 to June 30, 2009 • Chart audits completed from December 9, 2009 – January 8, 2010
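A minimal sketch of the chart-allocation rule described above (5 ICU charts for facilities ranked in the top or bottom 11 by CLABSI rate, 4 elsewhere); the facility ranking and IDs are hypothetical:

```python
def charts_to_audit(facilities_by_rate: list[str], facility: str, tail_size: int = 11) -> int:
    """Number of ICU charts to review for a facility.

    facilities_by_rate: facility IDs sorted by CLABSI rate (highest first).
    Facilities in the top or bottom `tail_size` of the ranking get 5 charts; others get 4.
    """
    rank = facilities_by_rate.index(facility)
    in_tail = rank < tail_size or rank >= len(facilities_by_rate) - tail_size
    return 5 if in_tail else 4


# Hypothetical ranking of ~45 facilities
ranking = [f"facility_{i:02d}" for i in range(45)]
print(charts_to_audit(ranking, "facility_03"))  # 5 (top 11 by rate)
print(charts_to_audit(ranking, "facility_20"))  # 4 (middle of the ranking)
```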
Auditing CLABSIs: New York • Complete medical record reviews for at least 90% of the facilities reporting CLABSIs to NHSN (~200) • Facilities with either high or low HAI rates are given priority for audit • Six auditors divided into regions of the state • 5 auditors each responsible for 35-39 facilities • 1 auditor responsible for 9 facilities in the capital region
Auditing CLABSIs: New York • Each facility submits: • Line list of NHSN CLABSIs • Laboratory list of positive ICU blood cultures • Randomly select patient records from the most recent ICU positive blood cultures • Currently, they select a total of 20 records to audit
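A minimal sketch of the record-selection step, assuming the laboratory list already covers the most recent reporting period; the record fields are hypothetical:

```python
import random


def select_audit_records(icu_positive_cultures: list[dict], n_records: int = 20, seed=None) -> list[dict]:
    """Randomly select patient records from the lab list of ICU positive blood cultures."""
    rng = random.Random(seed)
    return rng.sample(icu_positive_cultures, min(n_records, len(icu_positive_cultures)))


# Hypothetical laboratory list of positive ICU blood cultures
cultures = [{"patient_id": i, "culture_date": f"2010-01-{(i % 28) + 1:02d}"} for i in range(150)]
audit_sample = select_audit_records(cultures, n_records=20, seed=42)
print(len(audit_sample))  # 20 records drawn for review
```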
Auditing CLABSIs: Pennsylvania • Contracted with APIC: 4 CIC IP auditors • 12 facilities are selected to validate their CLABSI data • 8 records are selected at each facility, and the auditor is blinded to results
Auditing CLABSIs: South Carolina • Two auditors review records in 60 of the 75 facilities required to report • No specific methodology for selecting records to review • Attempt to review between 20 and 30 records at each facility
Auditing CLABSIs: Washington • Annual internal validation performed by all (62) facilities required to report • External validation done by state health department depending on results • Currently has two auditors to complete external validation • External Validation: • Select 40 records from most recent positive blood culture list • List of all CLABSI records (based on discharge data) within same time period • Review both lists and note if any record meets NHSN definition of CLABSI • Compare the lists to check for discrepancies
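A minimal sketch of the final comparison step: cross-check the records the reviewer judged to meet the NHSN CLABSI definition against the facility-reported (discharge-data-based) CLABSI list and flag discrepancies in both directions; the record identifiers are hypothetical:

```python
def compare_clabsi_lists(reviewed_meets_definition: set[str], reported_clabsis: set[str]) -> dict[str, set[str]]:
    """Compare reviewer findings with reported CLABSIs and flag discrepancies."""
    return {
        "under_reported": reviewed_meets_definition - reported_clabsis,  # met definition, not reported
        "over_reported": reported_clabsis - reviewed_meets_definition,   # reported, did not meet definition
        "concordant": reviewed_meets_definition & reported_clabsis,
    }


# Hypothetical record IDs
discrepancies = compare_clabsi_lists({"A12", "B07", "C33"}, {"A12", "D41"})
print(discrepancies["under_reported"])  # {'B07', 'C33'}
print(discrepancies["over_reported"])   # {'D41'}
```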
Auditing SSIs: New York • Treat each SSI indicator as a separate audit (hip, colon, CABG) • 9-18 medical records selected for each procedure (depending on volume of procedures done) • Case-control format • Cases: taken from NHSN • Controls: • Hip and colon surgeries from NY State Wide Planning and Research Cooperative System (SPARCS) not in NHSN • CABG surgeries from the Cardiac Surgical Reporting System (CSRS) that do not appear in NHSN
Auditing SSIs: South Carolina • Hip, knee, CABG, and abdominal hysterectomies in all hospitals, plus colon surgeries in facilities with fewer than 200 beds • Stratified random method to select records to audit • Review roughly 20-30 charts per facility • Variable depending on size of facility and volume of procedures performed
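A minimal sketch of one possible stratified random draw, treating procedures as strata and allocating charts roughly in proportion to procedure volume; the proportional allocation is an assumption for illustration, not South Carolina's documented method:

```python
import random


def stratified_chart_sample(charts_by_procedure: dict[str, list[str]], total_charts: int = 25,
                            seed=None) -> dict[str, list[str]]:
    """Draw charts from each procedure stratum roughly in proportion to its volume.

    Rounding means the combined sample may deviate from total_charts by a chart or two.
    """
    rng = random.Random(seed)
    total_volume = sum(len(c) for c in charts_by_procedure.values())
    sample = {}
    for procedure, charts in charts_by_procedure.items():
        n = max(1, round(total_charts * len(charts) / total_volume))
        sample[procedure] = rng.sample(charts, min(n, len(charts)))
    return sample


# Hypothetical chart IDs by procedure
charts = {"hip": [f"H{i}" for i in range(40)], "knee": [f"K{i}" for i in range(60)],
          "CABG": [f"C{i}" for i in range(25)], "hysterectomy": [f"Y{i}" for i in range(15)]}
print({k: len(v) for k, v in stratified_chart_sample(charts, 25, seed=1).items()})
# {'hip': 7, 'knee': 11, 'CABG': 4, 'hysterectomy': 3}
```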
Common Error Rates • Connecticut: improper understanding of CLABSI rules • minimum time period, patient transfer, and the rule requiring two or more blood cultures drawn on separate occasions
Common Error Rates • NY and SC: wound class, procedure duration, ASA score, and improper classification of surgeries as clean versus clean-contaminated
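A minimal sketch of how such variable-level error rates are typically tallied: compare the facility-reported value of each field against the auditor's value over the audited charts; the field names and values are illustrative:

```python
def variable_error_rates(audited: list[dict], reported: list[dict], fields: list[str]) -> dict[str, float]:
    """Fraction of audited records whose reported value disagrees with the auditor's value, per field.

    `audited` and `reported` are parallel lists of records for the same charts.
    """
    rates = {}
    for field in fields:
        errors = sum(1 for a, r in zip(audited, reported) if a.get(field) != r.get(field))
        rates[field] = errors / len(audited) if audited else 0.0
    return rates


# Hypothetical audit of three charts
auditor = [{"wound_class": "clean", "asa_score": 2},
           {"wound_class": "clean-contaminated", "asa_score": 3},
           {"wound_class": "clean", "asa_score": 2}]
facility = [{"wound_class": "clean", "asa_score": 2},
            {"wound_class": "clean", "asa_score": 3},
            {"wound_class": "clean", "asa_score": 3}]
print(variable_error_rates(auditor, facility, ["wound_class", "asa_score"]))
# ≈ {'wound_class': 0.33, 'asa_score': 0.33}
```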
Conclusion • Lessons can be learned from other states and their validation programs • However, Texas has a drastically larger number of facilities without increased funding • It is vital for Texas to create an efficient validation plan • Think in terms of long-term goals given limited funding