
Measuring the Performance of Criminal History Records Systems: The Records Quality Index


Presentation Transcript


  1. Measuring the Performance of Criminal History Records Systems: The Records Quality Index
     SEARCH Membership Meeting, Washington, DC, July 26, 2005
     by Gerard Ramker, Bureau of Justice Statistics; James Tien, Michael Cahn, Robin Neray, Structured Decisions Corporation

  2. Outline
     • Structured Decisions Corporation
     • RQI overview
     • Current RQI findings
     • Status of RQI research
     • Next RQI steps
     • Bureau of Justice Statistics
     • BJS performance measurement
     • NCHIP application process
     • Use of RQI by States
     • Open Discussion

  3. RQI Overview - Purpose
     • Since 1992, SDC, in collaboration with the states, has been the national evaluator of federally-funded criminal history records improvement programs
     • BJS asked SDC to develop a Records Quality Index (RQI) based on a small set of well-defined outcome and process measures to gauge the quality performance of criminal history records systems
     • Assess the progress of records quality at both the state and national levels
     • Pinpoint deficiencies and identify appropriate records improvement activities
     • Assist BJS in targeting specific problems and deficiencies in future NCHIP funding cycles

  4. RQI Overview – Measures Criteria (1 of 2)
     • Reflect chronological progress toward meeting the six common federal program goals identified in SDC's evaluation:
       • Provide required resources
       • Improve records quality
       • Improve reporting
       • Automate systems
       • Identify ineligible firearm purchasers
       • Identify disqualified civilian applicants
     • Consider as many of the following data quality attributes as possible:
       1. Understandable
       2. Measurable
       3. Available
       4. Robust
       5. Consistent
       6. Reliable
       7. Valid
       8. Stable
       9. Accurate
       10. Independent
       11. Complete (as a set)

  5. RQI Overview – Measures Criteria (2 of 2)
     • Request basic (i.e., raw) data from states; compute derived statistics (e.g., percentages, averages) from the basic data
     • Choose measures to facilitate regular collection of the basic underlying data
     • Lower bound is zero; no upper bound: like the Dow Jones Index, the RQI should be able to grow without limit

  6. RQI Overview - Structure (1 of 4)
     • The RQI for state s is given by: RQI(s) = K * O(s) * P2(s) / P1(s)
     • O(s) is the weighted sum of N outcome measures:
       O(s) = a_1*O_1(s) + a_2*O_2(s) + … + a_N*O_N(s), with 0 ≤ O_i(s) ≤ 1 for i = 1, 2, …, N
     • Currently, N = 10 and a_i = 1/N
     • O(s) is detailed on slides 14-15

  7. RQI Overview - Structure (2 of 4)
     • P1(s) is a normalized and censored process measure that reflects the average elapsed time between arrest and rendered final disposition for all arrests that can be linked to at least one disposition for state s
       • P1(s) > 0
       • P1(s) has no upper limit; thus RQI(s) is unbounded
       • P1(s) is detailed on slides 16-20
     • P2(s) reflects the 'cohort' completeness of records (e.g., the fraction of 1991 arrests that have posted final dispositions by the end of 1993)
       • P2(s) is detailed on slides 21-22
     • K = appropriate scaling factor = 100,000
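
A minimal Python sketch of the RQI structure defined on slides 6-7; the function name and arguments are illustrative assumptions, not part of the RQI specification, and the equal weights a_i = 1/N stated above are assumed.

```python
def rqi(outcome_scores, p1_days, p2_fraction, k=100_000):
    """Compute RQI(s) = K * O(s) * P2(s) / P1(s) for one state.

    outcome_scores -- the N outcome measures O_i(s), each in [0, 1] (currently N = 10)
    p1_days        -- normalized, censored average arrest-to-disposition time (> 0)
    p2_fraction    -- 'cohort' completeness of records, in [0, 1]
    k              -- scaling factor (100,000 per slide 7)
    """
    n = len(outcome_scores)
    o = sum(outcome_scores) / n          # weighted sum with equal weights a_i = 1/N
    return k * o * p2_fraction / p1_days
```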

  8. RQI Overview - Structure (3 of 4)
     • The national RQI (NRQI) is the weighted average of RQI(s) for the 50 states plus the District of Columbia, Guam and Puerto Rico:
       NRQI = w(1)*RQI(1) + w(2)*RQI(2) + … + w(53)*RQI(53)
     • w(s) = weight of state s = ratio of the number of criminal history records in state s to the number of records nationwide
     • Currently, total s = 53

  9. RQI Overview – Structure (4 of 4)
     Example of how to compute the NRQI. Assume that the 'nation' consists only of six states, i.e., s = {A, B, C, D, E, F}, and that the states' RQIs and weighting factors w(s) (each state's share of the records nationwide, in thousands) are:

       State   RQI     w(s)
       A       61.9    0.195
       B       49.8    0.096
       C       40.4    0.239
       D       54.5    0.145
       E       100.2   0.111
       F       92.7    0.215

     Then, the six-state NRQI would be given by:
       NRQI = 0.195*61.9 + 0.096*49.8 + 0.239*40.4 + 0.145*54.5 + 0.111*100.2 + 0.215*92.7 = 65.5
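
The same six-state example as a short, self-contained Python check (RQI and weight values taken directly from the slide; variable names are mine):

```python
# Six-state NRQI example: weights w(s) are each state's share of records nationwide.
weights = [0.195, 0.096, 0.239, 0.145, 0.111, 0.215]   # states A..F
rqis    = [61.9, 49.8, 40.4, 54.5, 100.2, 92.7]

nrqi = sum(w * r for w, r in zip(weights, rqis))
print(round(nrqi, 1))   # 65.5
```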

  10. RQI(s) Sensitivity to Changes in O(s), P1(s) and P2(s) (1 of 2)

  11. RQI(s) Sensitivity to Changes in O(s), P1(s) and P2(s) (2 of 2)
     • Since RQI(s) is directly proportional to O(s) and P2(s), a 20% increase in O(s) or P2(s) yields a 20% increase in RQI(s)
     • Since RQI(s) is inversely proportional to P1(s), a 20% decrease in P1(s) scales RQI(s) by 1/0.8 = 1.25, i.e., a 25% increase
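
A quick self-contained check of these proportionality claims, using arbitrary illustrative values for K, O(s), P1(s) and P2(s) (none of these numbers come from the RQI data):

```python
k, o, p1, p2 = 100_000, 0.5, 120.0, 0.8        # illustrative values only
base = k * o * p2 / p1                         # baseline RQI(s)

print(k * (1.2 * o) * p2 / p1 / base)          # 1.2  -> 20% increase in RQI(s)
print(k * o * (1.2 * p2) / p1 / base)          # 1.2  -> 20% increase in RQI(s)
print(k * o * p2 / (0.8 * p1) / base)          # 1.25 -> 25% increase in RQI(s)
```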

  12. RQI Measures Selection (1 of 2)
     • Extensive literature review
       • Virtually all published public-sector performance measures are budget-focused
     • Most valuable sources turned out to be:
       • SEARCH
       • REJIS
       • BJS (National Judicial Reporting Program)
       • NCSC (Court Statistics Project)

  13. RQI Measures Selection (2 of 2)
     • Most available measures are either not basic or not sufficiently pertinent
     • Working closely with pilot test states and BJS, SDC defined an initial set of outcome measures and refined them several times
     • Ultimately, SDC and BJS agreed upon 10 outcome measures (slides 14-15)
     • New outcome measures can be added (slide 30)
     • Scoring schema can be modified

  14. RQI Outcome Measures (1 of 2)

  15. RQI Outcome Measures (2 of 2)

  16. RQI Process Measures, P1(s) – Details (1 of 5)
     • Base the average time on all arrests occurring in a three-year period whose final dispositions are posted to the CCH in the same three-year period
       • P1(s) is an arrest-based process measure
       • P1(s) for year N reflects arrests occurring in the three years N-2, N-1, and N
       • e.g., P1(s) for 1997 = average elapsed time between arrest and rendered (and posted) final disposition for arrests occurring in 1995, 1996 and 1997
     • Not all states can separate felonies from misdemeanors
     • Calendar years are used for state-to-state consistency

  17. RQI Process Measures, P1(s) – Details (2 of 5)
     • Censor data at the end of the three-year period
       • For period-to-period consistency, treat an arrest whose disposition date either (i) occurred after the three-year period or (ii) is not in the data as though the disposition occurred on the last day of the three-year period
       • Data censoring is an appropriate statistical approach
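
A minimal sketch of the censoring rule just described, in Python; the function name and the date representation are assumptions for illustration, not the actual repository processing:

```python
from datetime import date

def censored_elapsed_days(arrest, disposition, window_end):
    """Days from arrest to final disposition, censored at the window end.

    An arrest whose disposition is missing or falls after the three-year
    window is treated as though it was disposed on the window's last day.
    """
    if disposition is None or disposition > window_end:
        disposition = window_end
    return (disposition - arrest).days

# e.g., for the 1995-1997 window used in the 'P1(s) for 1997' example:
end = date(1997, 12, 31)
print(censored_elapsed_days(date(1996, 5, 1), date(1996, 9, 15), end))  # 137
print(censored_elapsed_days(date(1996, 5, 1), None, end))               # censored: 609
```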

  18. RQI Process Measures, P1(s) – Details (3 of 5)
     Example: P1(s) for 1997

  19. RQI Process Measures, P1(s) – Details (4 of 5)
     • Normalize the average elapsed time
       • Pilot state CCH data demonstrate steady growth in arrest volumes over time; high volumes lead to court congestion, which adversely impacts raw times from arrest to disposition
       • Normalize (i.e., divide) average times by a volume-related factor to control for changes in arrest volume over time
       • Normalizing factor = ratio of the arrest volume in the measurement 3-year period to the arrest volume in the base 3-year period (i.e., 1991-1993)
       • Cannot use the volume in the measurement period alone as the factor, since it would unfairly penalize the small states
     • Example based on State A's CCH data:
       • 1997 average arrest-to-disposition time = 155 days
       • 1997 arrest volume = 41,982
       • 1993 arrest volume = 27,983
       • Normalizing factor = 41,982/27,983 = 1.5
       • For 1997, P1(A) = 155/1.5 = 103.3 days
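
The State A example restated as a short Python calculation (all figures come from the slide; variable names are mine):

```python
# Normalized P1 for 'State A', 1997
avg_arrest_to_dispo_days = 155          # 1997 average arrest-to-disposition time
arrest_volume_1997 = 41_982             # measurement-period arrest volume
arrest_volume_1993 = 27_983             # base-period arrest volume

normalizing_factor = arrest_volume_1997 / arrest_volume_1993
p1_state_a = avg_arrest_to_dispo_days / normalizing_factor
print(round(normalizing_factor, 1), round(p1_state_a, 1))   # 1.5 103.3
```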

  20. RQI Process Measures, P1(s) – Details (5 of 5)
     • SDC would like to measure average elapsed time from arrest to posting in the CCH
     • Current reality:
       • Reliable basic data on which to base time from disposition to posting are not generally available
       • Pilot state CCHs do not capture time of posting or time of receipt of disposition
     • In future:
       • Can use disposition date as a proxy for disposition posting date in states which have an automated interface between court and repository
       • In other states, can use a model-based approach (e.g., regression, simulation) to predict elapsed time from arrest to disposition posting

  21. RQI Process Measures, P2(s) – Details (1 of 2)
     • P2(s) is a disposition-based process measure
     • P2(s) for year N reflects the fraction of calendar year N-2 arrest records that have posted final dispositions by the end of calendar year N
       • e.g., P2(s) for 1997 = the fraction of 1995 arrests that have final dispositions posted at the repository by the end of 1997
     • Thus, P2(s) reflects the 'cohort' completeness of records
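
A minimal sketch of the cohort-completeness calculation just described; the two-field record layout mirrors the extract described on slide 23, but the function name and field handling are assumptions for illustration:

```python
from datetime import date

def p2_cohort_completeness(records, year):
    """Fraction of year N-2 arrests with a final disposition posted by the
    end of year N, where `records` is an iterable of
    (arrest_date, disposition_date_or_None) pairs.
    """
    cohort = [dispo for arrest, dispo in records if arrest.year == year - 2]
    cutoff = date(year, 12, 31)
    posted = sum(1 for dispo in cohort if dispo is not None and dispo <= cutoff)
    return posted / len(cohort) if cohort else 0.0

# e.g., P2 for 1997 looks at 1995 arrests and dispositions posted by 12/31/1997
```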

  22. RQI Process Measures, P2(s) – Details (2 of 2)
     Example: P2(s) for 1997

  23. RQI Process Measures – Required Data
     • State CCH records extract
       • All arrests from 1991-2001
       • If no automation on 1/1/91, initiate extract at date of automation
     • Two fields only:
       • Date of arrest
       • Date of final disposition, if available
     • Can accept charge record extract in lieu of arrest records (e.g., Delaware, Virginia)
     • No identifying data needed or requested
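
For illustration only, a loader for such a two-field extract might look like the sketch below; the CSV layout and column names are assumptions, since the slides specify only the two fields, not a file format:

```python
import csv
from datetime import datetime

def load_extract(path):
    """Read a two-field CCH extract into (arrest_date, disposition_date_or_None) pairs."""
    records = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):    # assumed headers: arrest_date, disposition_date
            arrest = datetime.strptime(row["arrest_date"], "%Y-%m-%d").date()
            dispo_raw = (row.get("disposition_date") or "").strip()
            dispo = datetime.strptime(dispo_raw, "%Y-%m-%d").date() if dispo_raw else None
            records.append((arrest, dispo))
    return records
```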

  24. Current RQI Findings – States’ Participation: Cycle 1

  25. Current RQI Findings (1 of 3)
     • RQI shown to be an effective gauge of the performance of state criminal history records systems
     • RQIs increased over time, as expected; results range from:

  26. Current RQI Findings (2 of 3)
     • Annualized percent RQI increase:
       • For 1993-1997 = 15.7%
       • For 1997-2001 = 15.3%
     • States with RQI = 0:
       • 1993: 11 states
       • 1997: 2 states
       • 2001: 0 states
     • For 1993-2001, annualized NRQI (national RQI) increase = 17.4%; this is a direct result of increases in individual state RQIs due to federally-funded criminal history records improvements

  27. Current RQI Findings: Individual State Results (3 of 3)
     Note: Detailed footnotes appear in the RQI Technical Report

  28. Current RQI Findings – Web Posting
     www.sdcorp.net/RQI/cycle1results.jsp

  29. Current RQI Findings – State Summaries
     Sample Summary - Georgia

  30. Status of RQI Research: Cycle 2 Enhancements (1 of 2)
     • New outcome measure – O_11(s)
     • O_11(s) will be appropriately integrated into RQI(s)

  31. Status of RQI Research: Cycle 2 Enhancements (2 of 2)
     • Filling in the time series gaps: requesting data for 1991-1992, 1994-1996, 1998-2000, 2002-2003
     • Asking for CCH extract from 1/1/95 through date of snapshot – may enable us to statistically estimate (otherwise unavailable) arrest and disposition posting times
     • Web site
       • Improved look and feel
       • Updated Director's Letter, enhanced FAQ, pop-up help, error checking
       • Printable PDF version of RQI data form for those wanting hard copy

  32. Status of RQI Research – ‘Cycle 2’ Data Collection Status as of July 25, 2005:

  33. Next RQI Steps (1 of 4)
     • Continue model-based efforts – e.g., simulating the records creation process; estimating disposition posting delays
     • Develop the State-Specific RQI Optimizer (SSRO) to assist states and BJS in identifying federally-fundable activities which yield the greatest improvement in RQI

  34. Next Steps (2 of 4)
     • Monitor and assess federally-funded state activities
     • Model RQI
     • Operationalize annual submission of RQI data
     • Calculate state and national RQIs for 2003-2006
     • Develop State-Specific RQI Optimizer (SSRO)

  35. Next RQI Steps (3 of 4)
     • SSRO approach:
       • A unique interactive, descriptive, Web-based tool allowing the states and BJS to perform a series of 'what-if' analyses
       • With each 'production run', the RQI is automatically recalculated to reflect the proposed records improvement activities

  36. Next RQI Steps (4 of 4)
     • In developing the SSRO, use SDC's database of records improvement activities to link activities to changes in RQI by mapping state activities to the respective underlying RQI measures
     • In 1992, SDC designed a scalable activity classification scheme to categorize federally-funded records improvement activities
     • Since then, SDC has collected, reviewed, stored, monitored and analyzed over 3,700 activities in 19 categories
     • SDC is currently analyzing statistical relationships between records improvement activities and improvements in RQI measures

  37. Contact Information
     Robin Neray; Michael Cahn
     Structured Decisions Corporation
     1105 Washington Street, Suite 1
     West Newton, MA 02465
     (617) 244-1662
     neray@sdcorp.net; cahn@sdcorp.net

     RQI Cycle 1 results and the RQI technical report can be found at:
     www.sdcorp.net/RQI/cycle1results.jsp
