
Evaluation of Public Health Surveillance Systems



Presentation Transcript


  1. Evaluation of Public Health Surveillance Systems CDC/CSTE Applied Epidemiology Fellowship Program Orientation 2009 Sam Groseclose, DVM, MPH Division of STD Prevention, NCHHSTP, CCID sgroseclose@cdc.gov Phone: 404-639-6494

  2. Objectives • Review steps in organizing & conducting a surveillance system evaluation • Describe surveillance system attributes that should be assessed or measured • Describe how evaluation of a surveillance system for outbreak detection differs from one for individual case detection

  3. “Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in public health action to reduce morbidity and mortality and to improve health.” • CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001;50 (No. RR-13)

  4. “Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in public health action to reduce morbidity and mortality and to improve health.” • CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001;50 (No. RR-13)

  5. Why evaluate a surveillance system? • Are objectives being met? • Is outcome under surveillance still of public health importance? • Is monitoring efficient & effective? • Are objectives still relevant?

  6. When to evaluate a surveillance system? Response to changes in… • Priorities • Information needs • Epidemiology • Diagnostic procedures • Clinical practices • Data sources

  7. Rubella incidence – historic lows • 50% of rubella infections – asymptomatic • Is endemic transmission interrupted in the U.S.? Averhoff et al. CID 2006

  8. How would you assess adequacy of rubella surveillance?

  9. Evaluation methods: • Survey: state & local health department • Rubella & measles surveillance practices, e.g., # measles outbreak investigations • Survey: state & local public health labs • Lab testing practices • Enhanced evaluation: CA, NYC, US-Mexico Border ID Surveillance Project • Sentinel surveillance: HMO-based

  10. Are measles or rubella investigations occurring? Is confirmatory lab testing being conducted? Averhoff et al CID 2006

  11. Which jurisdictions are conducting rubella investigations? Averhoff et al CID 2006

  12. Conclusions: • No new cases found → sufficient sensitivity of surveillance • Rubella surveillance “rides the coattails” of measles and other rash-illness surveillance, enhancing sensitivity. Averhoff et al. CID 2006

  13. Evaluation allows proactive response to new demands. • New epidemiologic findings → revision of case definitions? • Data source allows monitoring of additional health-related events? • Need greater timeliness or efficiency? → use of new information technology • Increasing access to e-data → protection of patient privacy, data confidentiality, & system security • Other…

  14. CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems, 2001

  15. CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems, 2001 Based on: • CDC’s Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11) – under revision • CDC’s Guidelines for evaluating surveillance systems. MMWR 1988;37(No. S-5) Addressed: • Need for integrating surveillance & health information systems • Increasing relevance of informatics: establishing data standards, electronically exchanging health data, facilitating response to emerging health threats

  16. Examples of other guidance on public health surveillance monitoring & evaluation

  17. Tasks in CDC’s updated guidelines • Engage stakeholders • Describe system • Focus evaluation design • Gather evidence of system’s performance • State conclusions & make recommendations • Ensure use of findings & share lessons learned

  18. Task A. Engage stakeholders • Who are the system stakeholders? • Which ones should be involved? • Scope, level, & form of stakeholder involvement will vary • Influence design? • Provide data? • Aid interpretation? • Implement recommendations?

  19. Stakeholder identification & engagement • Ask your supervisor • Who is funding the system? • Who uses information derived from system? • Does political/organizational environment allow them to influence evaluation? How to engage? • Interview – develop questions ahead of time • Survey – more structured, more stakeholders, more relevant if they are ‘active’

  20. Task B. Describe system • Public health importance • Purpose, objectives, & operation • Planned use of data • Case definition • Population under surveillance • Legal authority • System flow chart • Roles & responsibilities • Inputs & outputs • Resources

  21. Public health importance: Should this event be under surveillance? • Indices of frequency or burden • Case count? • Incidence rate? • Summary measures of population health status • Disability-adjusted life-years? • Indices of severity • Case-fatality rate? • Hospitalization rate? • Associated disparities or inequities? • Preventability?
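The frequency and severity indices above can be computed directly. A minimal sketch, using invented case counts, population size, and deaths (not real data):

```python
# Illustrative calculation of common surveillance burden/severity indices.
# All numbers below are hypothetical example values.

def incidence_rate(new_cases: int, population: int, per: int = 100_000) -> float:
    """New cases per `per` population over the surveillance period."""
    return new_cases / population * per

def case_fatality_rate(deaths: int, cases: int) -> float:
    """Proportion of cases that were fatal, expressed as a percentage."""
    return deaths / cases * 100

print(incidence_rate(450, 2_500_000))   # 18.0 cases per 100,000
print(case_fatality_rate(9, 450))       # 2.0 %
```

Comparing such indices across conditions (or against disability-adjusted life-years) is one way to argue whether an event merits continued surveillance.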

  22. Public health importance: Information sources? • Subject matter experts • Surveillance & research data • Literature review • Other…

  23. Surveillance system purpose Why does the system exist? Example: To monitor “X health condition” in “Population under surveillance”

  24. Surveillance system objectives How are the data to be used for public health action? • Monitor burden or trends • Identify populations at increased risk • Support early detection • Inform risk management & decision-making • Evaluate interventions or policy • Other…

  25. Example: Objectives for Australian National Notifiable Diseases Surveillance System (NNDSS) Miller et al. Commun Dis Intell, 2004

  26. Example: Australian NNDSS processes Miller et al. Commun Dis Intell, 2004

  27. Example: Legislative authority, Australian NNDSS • No legislative requirement for states and territories to send notifiable disease data to the Commonwealth. Miller et al. Commun Dis Intell, 2004

  28. Resources • Direct costs • Person-time per year • IT hardware/software • Travel • Training • Indirect costs • Follow-up diagnostic lab testing • Case management • Outbreak response • Prevention benefits/costs from societal perspective • Cost of missing outbreaks • Productivity losses averted

  29. Example: • Resources • Direct costs only • Cost by system phase Kirkwood et al. J Public Health Manag Pract, 2007
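A direct-cost accounting like the one above can be sketched as a tally by system phase. The phase names and dollar figures below are hypothetical, not taken from Kirkwood et al.:

```python
# Direct-cost tally by surveillance system phase.
# Phase names and amounts are made-up illustration values.

direct_costs = {
    "reporting":     {"person_time": 42_000, "it_hardware": 6_000},
    "analysis":      {"person_time": 18_000, "it_software": 3_500},
    "dissemination": {"person_time": 9_000, "travel": 2_000, "training": 1_500},
}

for phase, items in direct_costs.items():
    print(f"{phase}: ${sum(items.values()):,}")

total = sum(sum(items.values()) for items in direct_costs.values())
print(f"total direct cost: ${total:,}")
```

Breaking costs out by phase makes it easier to see which step of the system (reporting, analysis, dissemination) would be affected by a recommended modification.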

  30. Task C. Focus evaluation design • Specific purpose of the evaluation • CSTE fellowship only? • Public health objectives? • Response to health system reform? • Stakeholders’ input (Task A) • Identify questions that will be answered • How will the information generated be used? • Can you define relative performance standards or metrics for attributes a priori? • What’s acceptable?

  31. Task D. Gather evidence of system’s performance • Usefulness? What actions are taken based on data from the system? Does it meet system objectives? • System attributes: simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive (PVP/PPV), representativeness, timeliness, stability

  32. Task E. State conclusions and make recommendations • Conclusions • Important public health problem? • System’s objectives met? • Recommendations • Modification/continuation? • Consider interdependencies between system costs & attributes • Ethical obligations • Surveillance being conducted responsibly?

  33. Example: Evaluation conclusions Jhung et al. Medical Care, 2007

  34. Task F. Ensure use of findings and share lessons learned • Deliberate effort to use results & disseminate findings? • Prior discussion of response to potentially negative findings? • Prior plan to implement recommendations based on findings? • Strategies for communicating findings? • Tailor content & method to relevant audience(s)

  35. “The reason for collecting, analyzing and disseminating information on a disease is to control that disease. Collection and analysis should not be allowed to consume resources if action does not follow.” Foege WH et al. Int J Epidemiol 1976 Similarly, evaluation findings should be applied to improve surveillance.

  36. Task D. Gather evidence of system’s performance • Usefulness? What actions are taken based on data from the system? Does it meet system objectives? • System attributes: simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive (PVP/PPV), representativeness, timeliness, stability

  37. Example: Usefulness from public health system perspective Miller et al. Communicable Disease Intelligence, 2004

  38. Example: Usefulness from external stakeholder perspective Miller et al. Communicable Disease Intelligence, 2004

  39. Have your surveillance efforts resulted in any of these outcomes? WHO/CDS/CSR/LYO/2004.15

  40. Task D. Gather evidence of system’s performance • Usefulness? What actions are taken based on data from the system? Does it meet system objectives? • System attributes: simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive (PVP/PPV), representativeness, timeliness, stability

  41. Timeliness • Different scales based on outcome & action • Meningococcal meningitis >> cancer • If timeliness is critical: • Active surveillance • Acquire electronic records • Encourage telephone reports on suspicion • Educate clinicians and lab staff • Review as frequently as the data arrive • Remove barriers to prompt reporting • Adjust investment to importance

  42. When measuring timeliness, specify the types of dates used and the intervals measured. Jajosky RA et al. BMC Public Health 2004
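Following that advice, a timeliness measurement should name both the dates and the interval. A minimal sketch, assuming the interval of interest is symptom onset to receipt of the report at the health department (the dates are invented examples):

```python
# Reporting-delay distribution for a small set of hypothetical cases.
# Interval measured: symptom onset date -> date report received.
from datetime import date
from statistics import median

cases = [  # (onset date, report-received date)
    (date(2009, 3, 1),  date(2009, 3, 5)),
    (date(2009, 3, 2),  date(2009, 3, 12)),
    (date(2009, 3, 10), date(2009, 3, 13)),
]

delays = [(received - onset).days for onset, received in cases]
print(f"median reporting delay: {median(delays)} days")  # 4 days
```

Swapping in different date pairs (e.g., specimen collection to lab report, or diagnosis to notification) measures a different interval, which is why the dates used must always be specified.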

  43. Source: CDC. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. MMWR 2004; 53(No. RR-5).

  44. Assessment of timeliness of a web-based notifiable disease reporting system by local health departments Conclusion: “relatively complete and timely” Recommended future use of test result date (vs. specimen collection date) Vogt et al. J Public Health Manag Pract 2006

  45. Sensitivity & PVP of system

  46. Sensitivity • Affected by: • Case detection process • Case reporting process • Sometimes referred to as completeness of reporting • If reporting is representative & consistent, a surveillance system may perform well with moderate sensitivity

  47. Sensitivity for individual cases • High sensitivity means you miss few cases • To improve sensitivity: • Broaden case definition • Encourage reporting on suspicion • Active surveillance • Acquire electronic records • Audit sources for completeness • Remove barriers • Adjust investment to importance • Tradeoff with positive predictive value
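The sensitivity/PVP tradeoff above can be made concrete with a 2×2 comparison of the system against an independent "gold standard" case-finding effort. The counts below are hypothetical:

```python
# Sensitivity and predictive value positive (PVP/PPV) of case reporting,
# from a comparison against independent gold-standard case finding.
# All counts are hypothetical illustration values.

reported_true  = 80   # true cases captured by the system
missed_true    = 20   # true cases the system missed
reported_false = 10   # reported events that were not true cases

sensitivity = reported_true / (reported_true + missed_true)
ppv         = reported_true / (reported_true + reported_false)

print(f"sensitivity: {sensitivity:.0%}")  # 80%
print(f"PVP:         {ppv:.1%}")          # 88.9%
```

Broadening the case definition typically raises `reported_true` but also `reported_false`, increasing sensitivity at the cost of PVP, which is the tradeoff the slide refers to.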
