Data Quality Toolbox for Registrars
MCSS Workshop, December 9, 2003
Elaine Collins
Quality Data Toolbox • Artisan: Registrar • Medium: Computerized data • Raw materials: Medical information • Shaping tools: Knowledge, skills • Directions: Standards • Measuring tools: Editing "tools" • Final product: Cancer record • Goodness: Match to standards
Quality Data - Goodness • Accurate • Consistent • Complete • Timely • Maintain shape across transformation and transmission
Measuring Tools • Reabstracting studies • Structured queries and visual review • Text editing • EDITS • MCSS routine review
Exercises • MCSS reabstracting study – 2003 • Sites: Breast, Corpus uteri, Lung, Melanoma, Testis, Soft tissue sarcoma • 2000 diagnosis year • 12 facilities • Review of reported data – Structured query • Review of reported data – Text editing
Reabstracting Studies • Compares the original medical record with the reported cancer record • Considered the "gold standard" • Labor-intensive; all records used at initial abstracting may not be available; results can be biased by the reabstractor's training and skills
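A reabstracting comparison is often summarized as a field-by-field agreement rate between the reported abstract and the reabstracted "gold standard". The sketch below is illustrative only; the field names, codes, and records are hypothetical, not MCSS data.

```python
# Minimal sketch of a reabstracting comparison: field-by-field agreement
# between the originally reported record and the reabstracted record.
# Field names and values are hypothetical examples.

FIELDS = ["primary_site", "histology", "laterality", "grade"]

def agreement_rate(original: dict, reabstracted: dict) -> float:
    """Fraction of compared fields on which the two abstracts agree."""
    matches = sum(1 for f in FIELDS if original.get(f) == reabstracted.get(f))
    return matches / len(FIELDS)

original     = {"primary_site": "C50.4", "histology": "8500/3",
                "laterality": "1", "grade": "2"}
reabstracted = {"primary_site": "C50.4", "histology": "8500/3",
                "laterality": "2", "grade": "2"}

print(f"Agreement: {agreement_rate(original, reabstracted):.0%}")  # 75%
```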
Structured Queries • Compares coding across series of records sorted by selected characteristics • Useful for finding pattern discrepancies across many records • Manual process; some comparisons may be converted to automated edits
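As a hedged illustration of the manual process described above, a structured query can sort abstracts on selected characteristics so that discrepant coding patterns stand out during visual review. The record layout and accession numbers below are invented for the example.

```python
# Illustrative structured query: sort records by selected characteristics
# so a reviewer can scan down the list for pattern discrepancies.
# The record structure and values are hypothetical.

records = [
    {"acc_no": "2000-0012", "primary_site": "C34.1", "histology": "8070/3", "grade": "9"},
    {"acc_no": "2000-0044", "primary_site": "C34.1", "histology": "8070/3", "grade": "2"},
    {"acc_no": "2000-0107", "primary_site": "C34.1", "histology": "8070/3", "grade": "9"},
]

# Sort on the characteristics of interest, then print for visual review;
# a lone grade "2" among otherwise-unknown grades invites a second look.
for rec in sorted(records, key=lambda r: (r["primary_site"], r["histology"], r["grade"])):
    print(rec["acc_no"], rec["primary_site"], rec["histology"], rec["grade"])
```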
Text Editing • Compares text with coded values for individual records • Useful for immediately identifying coding problems • Manual process; most effective on completion of each individual case
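A text edit can be as simple as a keyword check of free text against a coded value at case completion. The laterality convention and pathology text below are assumptions made for the sketch.

```python
# Illustrative text edit: flag a record whose free text appears to
# contradict a coded value. The code convention and text are hypothetical.

record = {
    "laterality_code": "1",   # assume 1 = right for this example
    "path_text": "Left breast, modified radical mastectomy.",
}

# A simple check a registrar might run on completion of the case.
if record["laterality_code"] == "1" and "left" in record["path_text"].lower():
    print("Review: text mentions 'left' but laterality is coded right.")
```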
EDITS • Checks range validity for many fields and comparability of a few fields for individual records • Automated process; can be applied on completion of each record or on preparation of a batch report; warnings and overrides are alternatives to failures • Expansion of interfield edits requires careful logic
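The sketch below shows the two kinds of checks named above: a single-field range edit and an interfield edit that returns a warning rather than a hard failure. The valid-code set and the site/histology rule are illustrative assumptions, not actual EDITS metafile logic.

```python
# Sketch of EDITS-style checks: a range (single-field) edit plus an
# interfield edit, with "warning" as an alternative to outright failure.
# The code ranges and the site/histology pairing are assumptions.

def range_edit(value: str, valid: set) -> bool:
    """Single-field edit: is the code within its valid range?"""
    return value in valid

def interfield_edit(site: str, histology: str) -> str:
    """Interfield edit: flag an unusual site/histology combination."""
    if site.startswith("C50") and histology == "9590/3":  # rare pairing
        return "WARNING"   # reviewable; may be over-ridden
    return "PASS"

print(range_edit("2", {"1", "2", "3", "4", "9"}))   # True
print(interfield_edit("C50.4", "9590/3"))           # WARNING
```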
Edits Analysis • Edits to be included in the MCSS set • Edits in Hospital/Staging edit sets; C edits are included in the confidential data set • No text edits displayed • Criteria: valid codes/dates, alpha/numeric, timing, interfield comparisons, absolute conditions
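Of the criteria listed above, a timing edit is easiest to show concretely: the date of diagnosis should not follow the date of first treatment. The dates below are hypothetical.

```python
# Illustrative timing edit (one of the criteria above): diagnosis date
# should not come after the date of first treatment. Dates are made up.

from datetime import date

def timing_edit(dx_date: date, first_tx_date: date) -> str:
    return "PASS" if dx_date <= first_tx_date else "FAIL"

print(timing_edit(date(2000, 3, 15), date(2000, 4, 2)))   # PASS
print(timing_edit(date(2000, 5, 1), date(2000, 4, 2)))    # FAIL
```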
MCSS Review • Requests values for missing or unknown data; resolves conflicts between data items from multiple facilities and between data items updated by a single facility • Allows incorporation of information from multiple facilities • Review for a limited number of conditions
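MCSS review itself is a manual process, but one consolidation idea behind it can be sketched: prefer a known value over an "unknown" code, and send genuinely conflicting known values to review. The unknown-code convention and field are assumptions for the example.

```python
# Toy analogy for one consolidation rule: known values beat unknowns,
# and conflicting known values go to manual review. The code 9 = unknown
# convention is an assumption for this sketch.

UNKNOWN = "9"

def consolidate(values: list) -> str:
    known = {v for v in values if v != UNKNOWN}
    if len(known) == 1:
        return known.pop()
    if not known:
        return UNKNOWN
    return "REVIEW"   # facilities disagree; resolve manually

# Grade reported by three facilities for the same case:
print(consolidate(["9", "2", "9"]))   # "2"
print(consolidate(["2", "3"]))        # "REVIEW"
```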
Cancer Registrar – Resource for Quality Data • [Diagram: the registrar at the center of a network connecting data sources (medical record, physician, patient, facility system, facility staff, other registries) with standards, committees, and data users (ICD-O, COC, AJCC, SEER, NAACCR, central registry, protocols, quality monitors, NCDB, cancer control, CDC, cancer research, the public)]
Data Inputs • Patient data from facility systems • Medical record reports and notes • Pathology reports • Staging forms • Communication with physician offices • Communication with other registries • Communication with patients
Process Inputs • Registrar training, knowledge, skills • Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR • Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR • Medical literature – printed and online • Registry software data implementations
Sources of Error • Patient data from facility systems • Medical record reports and notes • Pathology reports • Staging forms • Communication with physician offices • Communication with other registries • Communication with patients
Sources of Error • Registrar training, knowledge, skills • Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR • Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR • Medical literature – printed and online • Registry software data implementations
Types of Errors • Missing/conflicting data • Shared data errors • Timing/coding errors • Standards and interpretations – ambiguities, omissions, confusions, contradictions • Discrepancies among local practice, central registry practice, and national standards
Software Implementations • Discrepancies between implementations and national standards • Lack of registrar knowledge/training on correspondence between registry and exported data • Logic errors in matching registry data to reporting formats • Conversion errors
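Conversion and export errors of the kind listed above can be caught with a round-trip check: export a record, read it back with the layout map, and compare against the registry values. The 5- and 6-character layout below is invented for the example and is not the actual NAACCR record layout.

```python
# Illustrative round-trip check for a fixed-width export: confirm the
# exported record reproduces the registry values. The layout is a
# hypothetical stand-in, not a real reporting format.

LAYOUT = [("primary_site", 0, 5), ("histology", 5, 11)]   # (field, start, end)

def export(rec: dict) -> str:
    return rec["primary_site"].ljust(5) + rec["histology"].ljust(6)

def check_round_trip(rec: dict) -> list:
    """Return the fields whose exported value differs from the registry value."""
    line = export(rec)
    return [f for f, s, e in LAYOUT if line[s:e].strip() != rec[f]]

rec = {"primary_site": "C50.4", "histology": "8500/3"}
print(check_round_trip(rec))   # [] means the export matches the registry
```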
AJCC Staging Dilemma • Are pathologic nodes required for pathologic stage grouping? • How do Minnesota registrars answer this question?
Collaborative Staging • Provides specific rules for coding known vs unknown staging elements • Accommodates “best” stage for AJCC stage assignment
AHIMA 75th Annual Conference, October 2003, Minneapolis: Coming Events • Data mining • ICD-10-CM • SNOMED • Natural language processing
AHIMA 75th Annual Conference, October 2003, Minneapolis: Challenges • What is our professional purpose? • How do we envision ourselves as professionals?
Foundation for Quality Data • Registrar’s commitment to registry purpose • Registrar’s knowledge, understanding of cancer data • Registrar’s management of communication technologies • Registrar’s advocacy for data use
SUMMARY • Consistent recording and reporting of quality cancer data requires commitment. • Routine and regular review of data patterns builds data knowledge and quality. • Passing EDITS assists but does not ensure data quality. • Data standards change; use the manuals. • Welcome Collaborative Stage.