
Virginia State Police



Presentation Transcript


  1. Virginia State Police “Validating NIBRS Data: Some Methods & Procedures” Norman R. Westerberg, Ph.D.

  2. Background: • 1975 – Virginia began reporting under the UCR format. • 2000 – Virginia transitioned from UCR to IBR (100% NIBRS). • Currently: • 281 agencies report monthly. • 20 vendors operate within Virginia. • Web-based. • Reporting is mandated by statute.

  3. Importance of these data: Official Crime Statistics • for Virginia, • for independent jurisdictions. If these data are incomplete or inaccurate … conclusions will be wrong.

  4. The General Process: Agency Crime Data → Vendor Software (transforms data into IBR-compatible crime data) → internet → FBI → data to criminal justice agencies, policymakers, the public, etc.

  5. If data are to be most useful, we have to assure: Data Quality. There are various methods of enhancing data quality.

  6. - Method 1 -Detection and Distribution of Anomalies • We send anomalies to agencies on a regular basis. • Currently we have 25 different anomalies (using the national Program’s list as a basis). • Sent at the end of each reporting quarter. • Sent to the head of the agency. • Sent through “hardcopy.” • We rely on agencies to check & modify their data. • In 2011, we mailed 12,741 anomalies.

  7. An example of an anomaly: Aggravated assault may actually be a simple assault. “These are incidents of aggravated assaults (13A) with ‘none’ or ‘personal’ weapons and ‘none’ or ‘minor’ injury to victims. Most aggravated assaults involve a weapon (other than personal weapons) and usually result in some sort of major injury.”
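The anomaly rule quoted above can be expressed as a mechanical check. A minimal sketch in Python — the field names and code values here are illustrative, not VSP's actual record layout or software:

```python
# Illustrative sketch: flag aggravated assaults (13A) whose weapon and
# injury codes suggest the incident may really be a simple assault.
def flag_weak_13a(offense):
    """Return True when a 13A offense has only none/personal weapons
    and none/minor victim injury -- a candidate anomaly."""
    weak_weapons = {"none", "personal"}
    weak_injuries = {"none", "minor"}
    return (offense["code"] == "13A"
            and offense["weapon"] in weak_weapons
            and offense["injury"] in weak_injuries)

incidents = [
    {"code": "13A", "weapon": "personal", "injury": "minor"},  # anomaly
    {"code": "13A", "weapon": "firearm", "injury": "major"},   # looks fine
    {"code": "13B", "weapon": "none", "injury": "none"},       # not 13A
]
anomalies = [i for i in incidents if flag_weak_13a(i)]
print(len(anomalies))  # 1
```

A quarterly run of checks like this over all submissions would produce the per-agency anomaly lists the slides describe.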

  8. How Effective is this Method?

  9. We really don’t know if: … anomalies need to be changed, … or if they do, whether agencies change them, … and whether they change them in a consistent manner.

  10. - Method 2 - Tracking • Currently, we track: • Offenses submitted as “bias motivated” or “hate crimes,” • Property valued from $100K to $1 million or more.

  11. General Process of Tracking • Produce list on a quarterly basis. • Email it to the IBR contact. • Make sure the header contains complete directions. • From the responses, record whether each incident is “correct” or “incorrect.”

  12. Process of Tracking continued….. • If correct, no need to resubmit. • If incorrect, record what the correct code should be. • Follow up to make sure incident is resubmitted correctly in next submission. • Re-contact agency if necessary.

  13. Major advantage: A higher quality of data. • Major disadvantage: Resource intensive. Question: Is this necessary?

  14. Bias Motivation (a.k.a. Hate Crimes) Bias Motivation Submissions: Total = 220

  15. [Chart: 988]

  16. [Chart: 82]

  17. What are the similarities or patterns? • Errors by the same agencies • Discussions with agencies, • Training within the agency. • Errors by the same vendors • Example: “Anti-Mental Disability” (of the 82 submitted that should have been ‘None,’ 20 were “Anti-Mental Disability”).

  18. Why? This one vendor handles 50% of Virginia’s submitting agencies.

  19. So how did agencies do? 122 correct (55%); 98 not correct (45%). Of the 98 identified incorrectly as hate crimes, 7 were not changed for end-of-year reporting.
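The counts on this slide line up with the total on slide 14; a quick arithmetic check (a sketch, not part of the original presentation):

```python
# Verify the hate-crime tracking breakdown against the slide's total.
correct, incorrect = 122, 98
total = correct + incorrect                     # 220, matching slide 14
pct_correct = round(100 * correct / total)      # 55 (%)
pct_incorrect = round(100 * incorrect / total)  # 45 (%)
print(total, pct_correct, pct_incorrect)        # 220 55 45
```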

  20. Number of Hate Crimes reported by year

  21. Tracking, continued … $ Property Submissions • A total of 238 offenses had values of $100,000 or more: • 172 (72%) were correct; • 66 (28%) were incorrect. • $ value: • Correct: $42,772,070 • Incorrect (as reported): $124,867,240 • If reported correctly: $3,361,700

  22. Or • If the state program had done nothing: $167,639,310 would have been reported. • What should have been reported? $46,133,770 • Were all modifications made? • No … 9 offenses were not modified. • Result: $1,283,763 over-reported.
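The dollar figures on the last two slides follow from simple sums of the reported components; a quick check (values taken from the slides):

```python
# Totals for tracked property submissions valued at $100,000 or more.
correct_value = 42_772_070           # 172 offenses reported correctly
incorrect_value = 124_867_240        # 66 offenses, value as reported
true_value_of_incorrect = 3_361_700  # what those 66 should have shown

# Had the state program done nothing, the sum of both components
# would have been reported; the corrected total is far smaller.
reported_if_unchecked = correct_value + incorrect_value     # 167,639,310
should_have_been = correct_value + true_value_of_incorrect  # 46,133,770
print(reported_if_unchecked, should_have_been)
```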

  23. What have we learned? • No difference in agencies. • No difference in vendor software. • Data entry errors by agency personnel: • $200,000 vs. $20,000 • $120,000 vs. $12,000 • $64,070,611 vs. $64

  24. Inability to check/verify “unfounded” cases. • Recording the value of the actual item for some offenses (e.g., Damaged/Vandalized: recording the actual value of the item rather than the amount of loss).

  25. - Method 3 - Independent Validation ... This involves using non-traditional sources. WHY?

  26. MURDER / NONNEGLIGENT MANSLAUGHTER

  27. Independent Source(s) • Medical Examiner • Newspaper • Internet

  28. What are the possible results? Three different outcomes: • Reporting Agency & Medical Examiner have same incident, • Reporting Agency has incident, but Medical Examiner does not, • Medical Examiner has incident, but Reporting Agency does not.
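The three outcomes map directly onto set operations over incident identifiers. A minimal sketch with made-up identifiers (not actual case numbers or the state's matching procedure):

```python
# Cross-validate reporting-agency homicide records against the
# Medical Examiner's records by comparing incident identifiers.
agency_cases = {"A-101", "A-102", "A-103"}
examiner_cases = {"A-101", "A-103", "A-104"}

matched = agency_cases & examiner_cases        # outcome 1: both have it
agency_only = agency_cases - examiner_cases    # outcome 2: agency only
examiner_only = examiner_cases - agency_cases  # outcome 3: ME only
print(sorted(matched), sorted(agency_only), sorted(examiner_only))
```

In practice the match would be done on names, dates, and locations rather than a shared identifier, which is why each mismatch on the following slides still needs manual review.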

  29. 1) Reporting Agency & Medical Examiner have the same incident: 271

  30. 2) Reporting agency has incident, but Medical Examiner does not … Could mean that the agency determined that it was a murder/nonnegligent manslaughter when the medical examiner determined otherwise. Generally, these should not have been classified as murders. Examples included: 13A – attempted; 90Z – conspiracy to commit (i.e., murder for hire); 09B – negligent manslaughter; OR charged with 09A for drug overdose; case open for investigation only; counted by more than one agency; MV accident.

  31. 3) Medical Examiner has incident, but agency does not … • Could mean that the agency determined that it was not a murder/nonnegligent manslaughter when the medical examiner determined it was. • Generally, these should have been classified as murder/nonnegligent manslaughter. • Examples included: • Code of 13A and the victim died days or weeks later … agency failed to update. • Error in the original submission … agency failed to resubmit. • Agency “behind” with data entry. • Oversight (common problem: case not closed).

  32. What was the final result? • 271 … LE & Medical Examiner data matched (i.e., keep). • 30 … LE data should not have been 09A (i.e., subtract). • 19 … LE data should have been 09A (i.e., add). OR: 271 correctly submitted + 49 data-entry errors = 320. OR: 49 / 320 = 15.3% (error rate)
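The error-rate arithmetic on this slide can be reproduced directly:

```python
# Reproduce the slide's 09A error-rate calculation.
matched = 271   # LE and Medical Examiner data agree
subtract = 30   # LE reported 09A but should not have
add = 19        # should have been 09A but was missed

errors = subtract + add   # 49 data-entry errors
total = matched + errors  # 320 records in total
error_rate_pct = round(100 * errors / total, 1)
print(errors, total, error_rate_pct)  # 49 320 15.3
```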

  33. How successful were we? Of the 49 that needed to be modified…… ………only 1 incident was not modified and resubmitted.

  34. Is this the end to the story?

  35. The Result … 2.16 homicides per 10,000 vs. 0.81 homicides per 10,000

  36. Implication of these findings ….. • Continued needs: • Training, • Additional Methods to detect errors, • Audits, • Enforcement of sanctions • Etc.

  37. Questions?
