DRAFT: CONFIDENTIAL RODS LAB DO NOT DISTRIBUTE

Case Studies of the Real-time Outbreak and Disease Surveillance (RODS) System and the National Retail Data Monitor

Case Study #X: <Outbreak/Event Name>, <Outbreak/Event Year>
Author1, Author2, Author3, …
Date
<Insert graphic from outbreak>

RODS System and NRDM development supported by Pennsylvania Bioinformatics grant ME-107, the Agency for Healthcare Research and Quality, the Defense Advanced Research Projects Agency, the National Science Foundation, the Department of Homeland Security, the Alfred P. Sloan Foundation, the National Library of Medicine, the state departments of health in Indiana, Nevada, New York, Ohio, Utah, and Washington State, and the Passaic Valley Water Commission.
Executive Summary
• <START DATE>
• In OTC monitoring, …
• In ED chief complaint monitoring, …
• In routine public health surveillance, …
• Detection algorithms sounded alerts from the X signal on ___ (N.B. detection algorithms with alerting do not currently run on OTC data).
• Retrospective studies showed that the X algorithm would have had the earliest detection for this outbreak, detecting it on <insert date> at a threshold yielding one false alarm every eight weeks.
• The X algorithm would have the best detection at one false alarm every eight weeks.
• The data source with the strongest signal was <xxx>, with <xx> alarms.
• The data source with the earliest signal was <xxx>; it signaled on <Date>.
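The "one false alarm every eight weeks" benchmark above ties an alerting threshold to an expected false-alarm interval. A minimal sketch of that translation, assuming (purely for illustration; this is not how RODS calibrates its algorithms) that the daily detection statistic is approximately standard normal under baseline, no-outbreak conditions:

```python
from statistics import NormalDist

def threshold_for_false_alarm_interval(days_between_false_alarms: float) -> float:
    """One false alarm every N days implies a daily false-alarm
    probability of 1/N, i.e. alerting at the (1 - 1/N) quantile of the
    null distribution of the daily statistic (assumed standard normal)."""
    p_daily = 1.0 / days_between_false_alarms
    return NormalDist().inv_cdf(1.0 - p_daily)

# A false alarm every eight weeks (56 days) corresponds to a
# z-score threshold of roughly 2.1 under this assumption.
z = threshold_for_false_alarm_interval(56)
```

Tightening the false-alarm rate (a longer interval between false alarms) raises the threshold and, in general, delays detection; that trade-off is why the executive summary fixes the false-alarm rate before comparing algorithms.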
Contents
• Outbreak description
• Chronology of outbreak and surveillance events
• Epi curves
• Which data should we monitor?
• Dates of alerts from detection algorithms
• Which detection algorithm works best?
• "Cost" and effort of investigation
• Implications for surveillance
I. Outbreak Description
(If this report is not about a false alarm: organism, source, size, demographics, etc.; attach epi curves if available.)
• Location: <state, county>
• Organism: <xxx>
• Source (if relevant): <xxx>
• Estimated size: <xxx>
• Demographics: <xxx>
• Dates: Beginning: <xxx> End: <xxx>
• Epi curves (see Section III, Epi Curves)
II. Chronology of Outbreak and Surveillance Events
(What you noted first and second, and what you did, with dates.)
III. Epi Curves
INDEX FOR NEXT SLIDES
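An epi curve is a tabulation of case counts by date of symptom onset, including the zero-count days. A minimal sketch of that tabulation (the function name and the use of integer day offsets are illustrative assumptions, not part of the RODS tooling):

```python
from collections import Counter

def epi_curve(onset_days):
    """Tabulate case counts by day of symptom onset -- the data behind
    an epidemic curve. onset_days is a list of integer day offsets;
    days with no cases are filled in with zero so the curve's gaps
    are visible when plotted."""
    counts = Counter(onset_days)
    first, last = min(counts), max(counts)
    return [(day, counts.get(day, 0)) for day in range(first, last + 1)]

# Four cases: two with onset on day 1, one on day 2, one on day 4.
curve = epi_curve([1, 1, 2, 4])
```

The shape of the resulting curve (point source vs. propagated spread) is what the epi-curve slides in this section are meant to convey.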
IV. What Data Should We Monitor?
(The following slides should show the plots with the strongest and earliest signal.)
INDEX FOR NEXT SLIDES
• <SLIDE TITLE> <DATE>
V. Dates of Alerts from Detection Algorithms
(Attach spatial scans and maps; describe any statistical detection levels, i.e., p-values or thresholds, that you used or reacted to.)
• <DATES> <DESCRIPTION>
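For authors documenting which thresholds an algorithm crossed and when, a generic one-sided CUSUM over daily counts is a useful mental model: CUSUM-style detectors were in common use in syndromic surveillance of this period. This is an illustrative sketch only, not the RODS implementation; the reference value k and decision interval h below are conventional textbook defaults:

```python
def cusum_alerts(counts, baseline_mean, baseline_sd, k=0.5, h=4.0):
    """Return the indices (days) where a one-sided CUSUM on
    standardized daily counts crosses the decision threshold h.

    k (reference value) and h (decision interval) are in
    standard-deviation units; both are conventional defaults here,
    not RODS-specific settings."""
    s, alert_days = 0.0, []
    for day, count in enumerate(counts):
        z = (count - baseline_mean) / baseline_sd
        s = max(0.0, s + z - k)  # accumulate only upward deviations
        if s > h:
            alert_days.append(day)
            s = 0.0  # restart the statistic after an alert
    return alert_days

# Ten baseline days (~10 visits/day), then a sustained jump to 30:
alerts = cusum_alerts([10] * 10 + [30] * 5, baseline_mean=10, baseline_sd=3)
```

Reporting the day of the first threshold crossing, together with the threshold (or equivalent p-value) in force, is what this section's <DATES> <DESCRIPTION> entries should capture.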
VI. Which Detection Algorithm Works Best?
• <title of first slide on retrospective studies>
• <title of second slide on retrospective studies>
• …
(Do not worry about this section unless you have done such studies; it will be completed by the RODS Lab.)
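Retrospective comparisons of the kind this section summarizes typically hold the false-alarm rate fixed and compare detection delay. A minimal sketch of the two quantities involved (the function names are hypothetical helpers for illustration, not RODS Lab code):

```python
def detection_delay(alert_days, outbreak_start):
    """Days from outbreak start to the first alert on or after it,
    or None if the algorithm never signals during the outbreak."""
    hits = [day for day in alert_days if day >= outbreak_start]
    return hits[0] - outbreak_start if hits else None

def false_alarms(alert_days, outbreak_start):
    """Alerts sounded before the outbreak began, i.e. false alarms."""
    return [day for day in alert_days if day < outbreak_start]

# An algorithm that alerted on days 3, 12, and 15 against an
# outbreak starting on day 10: one false alarm, two-day delay.
delay = detection_delay([3, 12, 15], outbreak_start=10)
spurious = false_alarms([3, 12, 15], outbreak_start=10)
```

The algorithm with the shortest delay at the agreed false-alarm rate (e.g., the eight-week benchmark used in the executive summary) is the one this section would report as best.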
VII. "Cost" and Effort of Investigation
(Describe what you did in response to OTC/ED data or alerts, e.g., phone calls to EDs or to the NRDM, with estimated time and effort.)
VIII. Implications for Surveillance