CVSA and FMCSA SSDQ Performance Measures: Crash Timeliness and Inspection Timeliness (April 21-22, 2013, CVSA Spring Workshop)
Introduction • Candy Brown, Presenter: Timeliness Performance Measures and Reports • Kevin Berry, Presenter: Improvement Strategies
Agenda • Overview of Timeliness Performance Measures • Why Timeliness Matters • Training Objectives and Expected Outcomes • How State Ratings Are Determined • How to Interpret Data Quality Reports • When and How to Improve Data Quality
Overview of Timeliness Performance Measures • Crash Timeliness: the percentage of fatal and non-fatal crash records submitted to the Motor Carrier Management Information System (MCMIS) within 90 days of the crash event over a 12-month period • Inspection Timeliness: the percentage of inspection records submitted to MCMIS within 21 days of the inspection event over a 12-month period
State Safety Data Quality (SSDQ) Measures • Inspection measures: Driver Identification Evaluation, Vehicle Identification Evaluation, Record Completeness, VIN Accuracy, Timeliness, Accuracy • Crash measures: Driver Identification Evaluation, Vehicle Identification Evaluation, Consistency (Overriding Indicator), Record Completeness, Non-Fatal Completeness, Fatal Completeness, Timeliness, Accuracy • Together, these measures determine the Overall State Rating
Why Timeliness Matters The Safety Measurement System (SMS) Behavior Analysis and Safety Improvement Categories (BASICs) need timely data to target the right carriers for safety performance interventions. • The SMS weighs recent events more heavily; missing events mean that carriers’ SMS BASICs could be better or worse than they should be • Carriers may not be targeted for investigations • Inconsistent timeliness among States could skew the SMS results in favor of carriers operating in States that report late • SMS weights crashes by time of event: most recent 6 months = weight 3; 7-12 months = weight 2; 12-24 months = weight 1
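The effect of the time weights can be illustrated with a minimal sketch, assuming a simple month-based lookup that mirrors the weights listed above; the actual SMS methodology also applies severity weighting and normalization, so this is illustrative only.

```python
from datetime import date

def sms_time_weight(event_date: date, as_of: date) -> int:
    """Illustrative time weight, per the slide above: 3 for the most recent
    6 months, 2 for 7-12 months, 1 for 12-24 months, 0 once the event ages out."""
    age_months = (as_of.year - event_date.year) * 12 + (as_of.month - event_date.month)
    if age_months <= 6:
        return 3
    if age_months <= 12:
        return 2
    if age_months <= 24:
        return 1
    return 0

# A crash from 4 months ago carries three times the weight of one from 18 months ago,
# so a record reported late enough to miss an SMS run understates the carrier's score.
print(sms_time_weight(date(2012, 12, 15), as_of=date(2013, 4, 22)))  # -> 3
print(sms_time_weight(date(2011, 10, 15), as_of=date(2013, 4, 22)))  # -> 1
```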
Training Objectives • Explain the Crash and Inspection Timeliness performance measures • Explore Timeliness reports • Show how data collection and processing errors can affect Timeliness ratings • Identify FMCSA resources for improving data quality
Expected Outcomes • Understand Timeliness performance measure methodology • Interpret Timeliness rating results • Interpret Timeliness State Data Analysis Reports (SDAR) and custom reports/SAFETYNET Queries • Identify potential sources of collection and reporting issues • Identify FMCSA resources for improving data quality
Training: How State Ratings Are Determined
Methodology Crash Timeliness • Determines a crash rating (Good, Fair, Poor) based on the percent of records reported to MCMIS within 90 days of the crash event • 12-month time span • Evaluates fatal and non-fatal crash records Inspection Timeliness • Determines an inspection rating (Good, Fair, Poor) based on the percent of inspection records reported to MCMIS within 21 days of the inspection event • 12-month time span • Evaluates inspection records
Evaluation Period = Event Date Range • 12 months of MCMIS data • Based on event date, not upload date • “Rolling” 12-month period • Excludes the most recent 3 months
Ratings • Timeliness ratings are calculated every month • Results posted on the A&I Data Quality Website • Percent of Timely Records = Number of Records Reported On Time ÷ Total Number of Records Evaluated
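As a rough illustration of that calculation, the sketch below computes the percent of timely records from (event date, MCMIS upload date) pairs. The 90-day crash threshold (21 days for inspections), the 12-month event date range, and the exclusion of the most recent 3 months follow the slides; the record structure and the way the window is anchored to the snapshot date are assumptions.

```python
from datetime import date, timedelta

def percent_timely(records, snapshot: date, threshold_days: int = 90) -> float:
    """Percent of records uploaded to MCMIS within `threshold_days` of the event.

    `records` is a list of (event_date, upload_date) pairs. The evaluation
    period is a 12-month event date range that excludes roughly the 3 months
    before the MCMIS snapshot, per the slides above.
    """
    period_end = snapshot - timedelta(days=90)       # drop the most recent ~3 months
    period_start = period_end - timedelta(days=365)  # rolling 12-month event date range
    evaluated = [(e, u) for e, u in records if period_start <= e < period_end]
    if not evaluated:
        return 0.0
    on_time = sum(1 for e, u in evaluated if (u - e).days <= threshold_days)
    return 100.0 * on_time / len(evaluated)

# One crash uploaded 30 days after the event (on time), one uploaded 120 days after (late)
recs = [(date(2012, 5, 1), date(2012, 5, 31)),
        (date(2012, 8, 1), date(2012, 11, 29))]
print(percent_timely(recs, snapshot=date(2013, 3, 22)))  # -> 50.0
```

Using threshold_days=21 gives the corresponding Inspection Timeliness figure.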
How to Use Data Quality Reports • Three types of reports: (1) Rating Results, (2) State Data Analysis Reports, (3) Custom Reports & SAFETYNET Queries • What you can do with them: spot trends in reporting; identify how late records are when reported; monitor upload frequency to MCMIS; identify late records by jurisdiction
Monthly Rating Results (example, not actual data): event date range 1/1/2012 to 12/31/2012 (12 months of MCMIS data); MCMIS snapshot taken March 22, 2013
Crash Timeliness Ratings How to Interpret • Report displays the last 13 ratings in a bar chart and a table • Each rating is based on the percentage of timely records in MCMIS • Compares current and previous results to identify trends When to Act • Unusual or significant change in the percent or number of timely records • Slow decline in rating • Act even when the rating is Good
State Data Analysis Reports (SDAR): Crash Record Timeliness - Monthly Analysis How to Interpret • Details of the evaluation period • Timeliness of records by month of event • Trends in timeliness and counts When to Act • Downward trend in timeliness • Change in counts between months
SDAR (cont.): Crash Record Timeliness - Number of Days Between Record Uploads to MCMIS How to Interpret • Shows gaps of three days or more between uploads to MCMIS • Trends in timeliness and volume When to Act • Frequent instances of uploads more than three days apart • Significant change in volume
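A minimal sketch of the gap check described above, assuming a list of MCMIS upload dates pulled from the report; the three-day threshold comes from the slide, everything else is illustrative.

```python
from datetime import date

def upload_gaps(upload_dates, min_gap_days=3):
    """Return consecutive MCMIS upload dates that are min_gap_days or more apart."""
    dates = sorted(set(upload_dates))
    return [(earlier, later, (later - earlier).days)
            for earlier, later in zip(dates, dates[1:])
            if (later - earlier).days >= min_gap_days]

uploads = [date(2013, 1, 2), date(2013, 1, 3), date(2013, 1, 9), date(2013, 1, 10)]
print(upload_gaps(uploads))
# -> [(datetime.date(2013, 1, 3), datetime.date(2013, 1, 9), 6)]
```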
SDAR (cont.): Inspection Timeliness - Records Reported by Inspector How to Interpret • Sort by: Inspector ID; number or percentage of on-time and late records; total evaluated records When to Act • Inspectors with a high number or percentage of late records • Widespread distribution of late records (inspector numbers are hidden from view in the example)
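As an illustration of the by-inspector view, the sketch below tallies on-time versus late inspection records per inspector; the 21-day threshold is from the measure definition, while the record layout (inspector ID, inspection date, upload date) is a hypothetical stand-in for a SAFETYNET export.

```python
from collections import defaultdict
from datetime import date

def late_by_inspector(records, threshold_days=21):
    """records: iterable of (inspector_id, inspection_date, upload_date) tuples.
    Returns {inspector_id: {"on_time": n, "late": m}}, worst offenders first."""
    tally = defaultdict(lambda: {"on_time": 0, "late": 0})
    for inspector_id, inspection_date, upload_date in records:
        key = "late" if (upload_date - inspection_date).days > threshold_days else "on_time"
        tally[inspector_id][key] += 1
    return dict(sorted(tally.items(), key=lambda kv: kv[1]["late"], reverse=True))

records = [("INSP-01", date(2012, 6, 1), date(2012, 6, 10)),   # on time
           ("INSP-02", date(2012, 6, 1), date(2012, 7, 15)),   # late
           ("INSP-02", date(2012, 7, 1), date(2012, 9, 1))]    # late
print(late_by_inspector(records))
# -> {'INSP-02': {'on_time': 0, 'late': 2}, 'INSP-01': {'on_time': 1, 'late': 0}}
```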
Custom Reports and SAFETYNET Queries • Explore specific data quality issues, especially when there are late records or a change in counts: • Crash event date by upload date • Timeliness of records in each upload to MCMIS • Comparison of timeliness by agency • Create a SAFETYNET query, including: • List of records input late to SAFETYNET
Custom Reports: Crash Event Date by Upload Date
Custom Reports: Comparison of Timeliness by Agency
SAFETYNET Queries: Days from Event Date to SAFETYNET
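A minimal pandas sketch of the kind of query described on the last two slides, computing days from event date to SAFETYNET input and comparing timeliness by reporting agency; the column names and sample values are hypothetical.

```python
import pandas as pd

# Hypothetical SAFETYNET export: event date, date the record was input to SAFETYNET, agency
df = pd.DataFrame({
    "event_date":     pd.to_datetime(["2012-05-01", "2012-05-03", "2012-06-10"]),
    "safetynet_date": pd.to_datetime(["2012-05-20", "2012-08-15", "2012-06-20"]),
    "agency":         ["State Police", "City PD", "State Police"],
})

df["days_to_safetynet"] = (df["safetynet_date"] - df["event_date"]).dt.days
df["late"] = df["days_to_safetynet"] > 90   # crash threshold; use 21 for inspections

# Comparison of timeliness by agency: record count, late count, median days to SAFETYNET
summary = df.groupby("agency").agg(
    records=("late", "size"),
    late=("late", "sum"),
    median_days=("days_to_safetynet", "median"),
)
print(summary)
```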
Training: When and How to Improve Data Quality
Data Collection and Reporting Process Key to improving Crash and Inspection Timeliness: understand your State’s collection and reporting process. All crashes meeting FMCSA criteria must be reported on time. Process flow: Collect (Law Enforcement) → Select (State Organization) → Report (MCSAP Office)
Collect and Transfer Crash Reports Promptly Possible Actions by Law Enforcement • Ensure officers understand the importance of timeliness to FMCSA • Formal training • Feedback to individual officers and/or agencies • Ensure reports are transferred promptly • Prioritize FMCSA crash reports • Pay attention to system updates or changes in electronic collection Process flow: Collect Data at Scene → Review/Correct → Transfer
Process and Transfer Crash Reports Promptly Possible Actions at the State Crash Repository • Assess applicable procedures to ensure records are reviewed and promptly transferred to the MCSAP Office • Prioritize FMCSA crash reports for processing • Prioritize FMCSA crash reports for transfer to the MCSAP Office • Track crash reports sent back to the officer for correction • Pay attention to system updates or changes in electronic transfer to ensure records are not delayed Process flow: Receive (Transfer) → Review/Input → ID FMCSA Reportables → Forward Report
Process and Upload Crash Reports Promptly Possible Actions in the MCSAP Office • Identify and implement improvements for processing and uploading reports • Validate the number of reports received from the State crash repository • Track crash reports sent back to the officer for correction • Consider SMS weightings and Timeliness cut-offs when prioritizing backlogs • Address backlogs by adding or reassigning staff • Upload to MCMIS daily • Check activity logs daily for rejected records Process flow: Receive Forwarded Report → Review/Input to SAFETYNET → ID FMCSA Reportables → Upload to MCMIS
What to Do Next Interagency Coordination: How Does It Work in Your State? • State Police • State Crash Agency • Local Law Enforcement Agencies • MCSAP Office • Other State Agencies
Contacts Candy Brown SSDQ Measure Development and Analysis Candace.Brown@dot.gov 617-494-3856 Kevin Berry Technical Analyst Kevin.Berry@dot.gov 617-494-2857
Training Recap I am now able to: • Understand Timeliness performance measure methodology • Interpret Timeliness rating results • Interpret Timeliness SDAR • Identify potential sources of collection and reporting issues • Identify FMCSA resources for improving data quality