Virginia State Police “Validating NIBRS Data: Some Methods & Procedures” Norman R. Westerberg, Ph.D.
Background:
• 1975 – Virginia began reporting under the UCR format.
• 2000 – Virginia transitioned from UCR to IBR (100% NIBRS).
• Currently:
  • 281 agencies report monthly.
  • 20 vendors operate within Virginia.
  • Reporting is web-based.
  • Reporting is mandated by statute.
Importance of these data
Official Crime Statistics:
• for Virginia,
• for independent jurisdictions.
If these data are incomplete or inaccurate, conclusions drawn from them will be wrong.
The General Process
Agency Crime Data → Vendor Software (transforms data to IBR-compatible crime data) → internet → FBI → data to criminal justice agencies, policymakers, public, etc.
If data are to be most useful, we have to assure: Data Quality
There are various methods of enhancing data quality.
- Method 1 - Detection and Distribution of Anomalies
• We send anomalies to agencies on a regular basis.
• Currently we have 25 different anomalies (using the national Program’s list as a basis).
• Sent at the end of each reporting quarter.
• Sent to the head of the agency.
• Sent through “hardcopy.”
• We rely on agencies to check & modify their data.
• In 2011, we mailed 12,741 anomalies.
An example of an anomaly: Aggravated assault may actually be a simple assault. “These are incidents of aggravated assaults (13A) with ‘none’ or ‘personal’ weapons and ‘none’ or ‘minor’ injury to victims. Most aggravated assaults involve a weapon (other than personal weapons) and usually result in some sort of major injury.”
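Where the rule is this concrete, it can be automated. Below is a minimal sketch of such a check in Python; the field names (offense_code, weapon, injury) and value strings are illustrative, not the actual NIBRS segment layout used by the Virginia program.

```python
def flag_suspect_aggravated_assaults(incidents):
    """Flag 13A offenses reported with 'none'/'personal' weapons and
    'none'/'minor' injury, since these may actually be simple assaults."""
    weak_weapons = {"none", "personal"}
    weak_injuries = {"none", "minor"}
    return [
        inc for inc in incidents
        if inc["offense_code"] == "13A"
        and inc["weapon"] in weak_weapons
        and inc["injury"] in weak_injuries
    ]

# Example: only the second incident is flagged for agency review.
incidents = [
    {"id": 1, "offense_code": "13A", "weapon": "knife", "injury": "major"},
    {"id": 2, "offense_code": "13A", "weapon": "personal", "injury": "minor"},
]
print(flag_suspect_aggravated_assaults(incidents))  # [{'id': 2, ...}]
```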
We really don’t know if:
• anomalies need to be changed,
• or, if they do, whether agencies change them,
• and whether they change them in a consistent manner.
- Method 2 - Tracking
• Currently, we track:
  • Offenses submitted as “bias motivated” or “hate crimes,”
  • Property valued at $100,000 or more.
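A minimal sketch of this selection step, assuming simple dict-based records; bias_motivation and property_value are illustrative field names, not the actual IBR layout:

```python
TRACK_THRESHOLD = 100_000  # property value that triggers tracking

def select_for_tracking(offenses):
    """Select offenses flagged as bias motivated or involving
    property valued at $100,000 or more."""
    tracked = []
    for off in offenses:
        if off.get("bias_motivation", "none") != "none":
            tracked.append(("bias", off))
        if off.get("property_value", 0) >= TRACK_THRESHOLD:
            tracked.append(("high_value", off))
    return tracked

offenses = [
    {"id": "A1", "bias_motivation": "anti-mental disability", "property_value": 500},
    {"id": "A2", "bias_motivation": "none", "property_value": 150_000},
]
print(select_for_tracking(offenses))  # [('bias', {...A1}), ('high_value', {...A2})]
```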
General Process of Tracking
• Produce the list on a quarterly basis.
• Email it to the agency’s IBR contact.
• Make sure the header contains complete directions.
• From responses, indicate whether each incident is “correct” or “incorrect.”
Process of Tracking, continued…
• If correct, no need to resubmit.
• If incorrect, record what the correct code should be.
• Follow up to make sure the incident is resubmitted correctly in the next submission.
• Re-contact the agency if necessary.
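The follow-up step can be modeled the same way. A hedged sketch of the bookkeeping, where all field and status names are illustrative: record each agency’s response, then verify that the next submission actually carries the fix.

```python
def needs_followup(tracked_item, next_submission):
    """Return True if an item the agency marked 'incorrect' was not
    resubmitted with the agreed corrected code."""
    if tracked_item["agency_response"] == "correct":
        return False  # correct as submitted: no resubmission needed
    resubmitted = next_submission.get(tracked_item["incident_id"])
    return resubmitted is None or resubmitted != tracked_item["corrected_code"]

item = {"incident_id": "X9", "agency_response": "incorrect", "corrected_code": "13B"}
print(needs_followup(item, {"X9": "13B"}))  # False: the fix arrived
print(needs_followup(item, {}))             # True: re-contact the agency
```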
• Major advantage: a higher quality of data.
• Major disadvantage: resource intensive.
Question: Is this necessary?
Bias Motivation (a.k.a. Hate Crimes)
[Chart: Bias Motivation Submissions, total = 220]
What are the similarities or patterns?
• Errors by same agencies:
  • Discussions with agencies,
  • Training within agency.
• Errors by same vendors:
  • Example: “Anti-Mental Disability” (of the 82 submitted that should have been ‘None’, 20 were Anti-Mental Disability).
Why? This one vendor handles 50% of Virginia’s submitting agencies.
So how did agencies do?
• 122 correct (55%); 98 not correct (45%).
• Of the 98 incorrectly identified as hate crimes, 7 were not changed for end-of-year reporting.
Tracking, continued… $ Property Submissions
• A total of 238 offenses had values of $100,000 or more:
  • 172 (72%) were correct;
  • 66 (28%) were incorrect.
• $ Value:
  • Correct: $42,772,070
  • Incorrect: $124,867,240 (if reported correctly: $3,361,700)
Or:
• If the state program had done nothing, $167,639,310 would have been reported.
• What should have been reported? $46,133,770.
• Were all modifications made? No; 9 offenses were not modified.
• Result: $1,283,763 over-reported.
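These figures reconcile as simple sums; a worked reconstruction using the three dollar amounts from the slides above:

```python
correct_as_submitted = 42_772_070     # value of the 172 correct offenses
incorrect_as_submitted = 124_867_240  # value of the 66 incorrect offenses
incorrect_true_value = 3_361_700      # what those 66 should have shown

# If the state program had done nothing:
print(correct_as_submitted + incorrect_as_submitted)  # 167639310, i.e. $167,639,310

# What should have been reported:
print(correct_as_submitted + incorrect_true_value)    # 46133770, i.e. $46,133,770
```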
What have we learned?
• No difference in agencies.
• No difference in vendor software.
• Data entry errors by agency personnel (see the sketch after this list):
  • $200,000 vs. $20,000
  • $120,000 vs. $12,000
  • $64,070,611 vs. $64
• Inability to check/verify “unfounded” cases.
• Recording the value of the actual item for some offenses (e.g., Damaged/Vandalized: recording the actual value of the item rather than the amount of loss).
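The magnitude errors above suggest an automated screen before values reach tracking. A crude sketch, where the $10M ceiling is an illustrative threshold rather than a program rule:

```python
def looks_like_entry_error(value, plausible_max=10_000_000):
    """Flag property values above a plausible ceiling for manual review,
    e.g. $64,070,611 keyed where $64 was meant."""
    return value >= plausible_max

for v in (20_000, 200_000, 64_070_611):
    print(v, looks_like_entry_error(v))
# 20000 False / 200000 False / 64070611 True
```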
- Method 3 - Independent Validation
This involves using non-traditional sources. WHY?
MURDER & NONNEGLIGENT MANSLAUGHTER
Independent Source(s)
• Medical Examiner
• Newspaper
• Internet
What are the possible results? Three different outcomes:
1) Reporting Agency & Medical Examiner have the same incident,
2) Reporting Agency has the incident, but the Medical Examiner does not,
3) Medical Examiner has the incident, but the Reporting Agency does not.
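A minimal sketch of this three-way comparison, assuming each source can be reduced to a set of comparable incident keys (e.g., decedent plus date of death); the matching key itself is an assumption, not the program’s documented method:

```python
def compare_sources(le_incidents, me_incidents):
    """Partition incidents into the three possible outcomes."""
    le, me = set(le_incidents), set(me_incidents)
    return {
        "both": le & me,     # outcome 1: both sources have the incident
        "le_only": le - me,  # outcome 2: agency has it, Medical Examiner does not
        "me_only": me - le,  # outcome 3: Medical Examiner has it, agency does not
    }

result = compare_sources({"A", "B", "C"}, {"B", "C", "D"})
print({k: sorted(v) for k, v in result.items()})
# {'both': ['B', 'C'], 'le_only': ['A'], 'me_only': ['D']}
```

With the counts reported later (271 matched, 30 LE-only, 19 ME-only), the error rate works out to (30 + 19) / (271 + 30 + 19) = 49 / 320 ≈ 15.3%.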
1) Reporting Agency & Medical Examiner have the same incident: 271.
2) Reporting agency has the incident, but the Medical Examiner does not…
• Could mean that the agency determined that it was a murder/nonnegligent manslaughter when the medical examiner determined otherwise.
• Generally, these should not have been classified as murders.
• Examples included:
  • 13A – attempted,
  • 90Z – conspiracy to commit (i.e., for hire),
  • 09B – negligent manslaughter,
  • charged with 09A for a drug overdose,
  • case open for investigation only,
  • counted by more than one agency,
  • MV accident.
3) Medical Examiner has the incident, but the agency does not…
• Could mean that the agency determined that it was not a murder/nonnegligent manslaughter when the medical examiner determined it was.
• Generally, these should have been classified as murder/nonnegligent manslaughter.
• Examples included:
  • Coded 13A and the victim died days or weeks later; agency failed to update.
  • Error in original submission; agency failed to resubmit.
  • Agency “behind” with data entry.
  • Oversight (common problem: case not closed).
What was the final result?
• 271 … LE & Medical Examiner data matched (i.e., keep).
• 30 … LE data should not have been 09A (i.e., subtract).
• 19 … LE data should have been 09A (i.e., add).
Or: 271 correctly submitted + 49 data entry errors = 320 total, so 49 / 320 = 15.3% error.
How successful were we? Of the 49 that needed to be modified, only 1 incident was not modified and resubmitted.
The Result… 2.16 homicides per 10,000 vs. 0.81 homicides per 10,000.
Implication of these findings…
Continued needs:
• Training,
• Additional methods to detect errors,
• Audits,
• Enforcement of sanctions,
• Etc.