
Evaluation of surveillance systems


Presentation Transcript


  1. Evaluation of surveillance systems Preben Aavitsland

  2. Surveillance Surveillance is the ongoing systematic collection, collation, analysis and interpretation of data; and the dissemination of information (to those who need to know) in order that action may be taken Information for action!

  3. The surveillance loop • [Loop diagram] Event (health care system) → Reporting → Data (surveillance centre) → Analysis, interpretation → Information → Feedback, recommendations → Action (health care system)

  4. Importance of evaluation • Quality • Often neglected • Basis for improvements • Obligation • Does the system deliver? • Credibility of public health service • Learning process • EPIET training objective • ”Do not create one until you have evaluated one”

  5. General framework • A. Engagement of stakeholders • B. Evaluation objective • C. System description • D. System performance • E. Conclusions and recommendations • F. Communication

  6. A. Engagement of stakeholders

  7. Stakeholders • The ”owners” and the ”customers” • Users of surveillance system information • Public health workers • Government • Data providers • Clinicians • etc. • Steering group? • A condition for change

  8. B. Evaluation objective

  9. Objective and methods • Specific purpose • Scope of evaluation • Methods • Document studies • Interviews • Direct observations • Special studies

  10. C. System description

  11. C. System description • 1 Public health rationale (why?) • 2 Objectives (what?) • 3 Operations (how?) • 4 Resources (how much?) • Extreme learning value!!!!

  12. 1. Rationale for surveillance • The disease: • Severity • Frequency • Communicability • International obligations • Costs • Preventability • Society: • Public and mass media interest • Will to prevent • Availability of data

  13. 2. Objectives of system • Documented? • If not = trouble • SMART? • Specific • Measurable • Action oriented: [information] in order to [action] • Realistic • Time frame specified

  14. Possible objectives of surveillance • Detect outbreaks • Monitor trends (by time, place, person) • towards a control objective • as programme performance • as intervention evaluation • Estimate future disease impact • Collect cases for further studies • ... in order to [action]

  15. Objectives ”To have a continuous overview of the spread of the disease in Norway in order to target preventive measures and plan resource needs.”

  16. 3. Operations of system • Health events under surveillance • Type of event: exposure -> infection -> disease / outbreaks -> outcome • Case definitions • Legal framework • Organisational framework • Components • Flow chart • Description

  17. The surveillance loop • [Loop diagram, repeated from slide 3] Event (health care system) → Reporting → Data (surveillance centre) → Analysis, interpretation → Information → Feedback, recommendations → Action (health care system)

  18. Flowchart

  19. Components of system • Population under surveillance • Period of data collection • Type of information collected • Data source • Data transfer • Data management and storage • Data analysis: how often, by whom, how • Dissemination: how often, to whom, how • Confidentiality, security
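As a hedged illustration of the "data analysis" component above (the line-list fields, file layout and weekly aggregation are assumptions made for this sketch, not taken from the presentation), a surveillance centre might tabulate reported cases by ISO week:

```python
# Minimal sketch: count reported cases per ISO week from a hypothetical
# line-list extract. Field names ("case_id", "report_date", "county") are
# invented for illustration.
import csv
import io
from collections import Counter
from datetime import date

line_list = io.StringIO(
    "case_id,report_date,county\n"
    "1,2023-01-03,Oslo\n"
    "2,2023-01-05,Viken\n"
    "3,2023-01-12,Oslo\n"
)

cases_per_week = Counter()
for row in csv.DictReader(line_list):
    year, month, day = map(int, row["report_date"].split("-"))
    iso_year, iso_week, _ = date(year, month, day).isocalendar()
    cases_per_week[(iso_year, iso_week)] += 1

for (iso_year, iso_week), n in sorted(cases_per_week.items()):
    print(f"{iso_year}-W{iso_week:02d}: {n} case(s)")
```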

  20. 4. Resources for system operation • Funding sources • Personnel time (= €) • Other costs • Training • Mail • Forms • Computers • ...

  21. Annual resource needs

  22. D. System performance

  23. System performance • Does it work? System attributes: • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Positive predictive value • Representativeness • Timeliness • Stability • Is it useful? Use of information: • Users • Actions taken • Link to objectives

  24. Data quality • Completeness: • Proportion of blank / unknown responses • Simple counting • Validity (true data?): • Comparison • Records inspection • Patient interviews • ...
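A minimal numeric sketch of the "simple counting" approach to completeness (the helper function, field names and example records below are assumptions, not part of the presentation):

```python
# Completeness of a field = proportion of case reports with a usable
# (non-blank, non-"unknown") value. Records and field names are invented.
def field_completeness(records, field, missing_values=("", "unknown")):
    usable = sum(
        1 for r in records
        if str(r.get(field, "")).strip().lower() not in missing_values
    )
    return usable / len(records) if records else 0.0

reports = [
    {"age": "34", "country_of_infection": "Norway"},
    {"age": "", "country_of_infection": "unknown"},
    {"age": "57", "country_of_infection": "Thailand"},
]
print(field_completeness(reports, "age"))                   # ~0.67
print(field_completeness(reports, "country_of_infection"))  # ~0.67
```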

  25. Completeness of information

  26. Sensitivity • = reported true cases / total true cases • = proportion of true cases detected
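A small worked example of this ratio, together with the positive predictive value listed among the attributes on slide 23 (the counts below are invented for illustration):

```python
# Illustrative counts only -- not data from the presentation.
reported_true_cases = 80    # true cases that were notified to the system
total_true_cases = 100      # all true cases (e.g. estimated from a register)
total_reported_cases = 90   # all notifications, including false reports

sensitivity = reported_true_cases / total_true_cases  # proportion of true cases detected
ppv = reported_true_cases / total_reported_cases      # proportion of reports that are true cases

print(f"Sensitivity: {sensitivity:.0%}")  # 80%
print(f"PPV: {ppv:.0%}")                  # 89%
```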

  27. Sensitivity versus specificity • The tiered system: confirmed, probable, possible

  28. Measuring sensitivity • Find total true cases from other data sources • medical records • disease registers • special studies • Capture-recapture study
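To make the capture-recapture option concrete, here is a hedged sketch using the two-source Chapman estimator (the sources and all numbers are invented; the presentation does not prescribe a particular estimator):

```python
# Two-source capture-recapture (Chapman estimator); all figures are invented.
def chapman_estimate(n1, n2, m):
    """Estimated total true cases from two overlapping, incomplete sources.

    n1: cases in source 1 (e.g. notifications)
    n2: cases in source 2 (e.g. a laboratory register)
    m:  cases found in both sources
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

notified, lab_register, in_both = 80, 70, 60
total = chapman_estimate(notified, lab_register, in_both)
print(f"Estimated true cases: {total:.0f}")                     # ~93
print(f"Sensitivity of notifications: {notified / total:.0%}")  # ~86%
```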

  29. [Diagram: steps from exposure to report] Exposed → Infected → Symptoms → Seek medical attention → Clinical specimen → Pos. specimen → Report

  30. Special studies of sensitivity • 2500 patients with new hepatitis A or B tested (1995-2000) • no unreported HIV-cases • 70 000 pregnant women tested annually • 3-8 undiagnosed HIV-cases (immigrants)

  31. Timeliness
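Timeliness is typically assessed as the delay between steps in the reporting chain; below is a minimal sketch (dates and field names are invented) summarising the delay from symptom onset to receipt of the notification:

```python
# Invented example: median delay from symptom onset to receipt of notification.
from datetime import date
from statistics import median

cases = [
    {"onset": date(2023, 1, 2), "received": date(2023, 1, 9)},
    {"onset": date(2023, 1, 4), "received": date(2023, 1, 8)},
    {"onset": date(2023, 1, 6), "received": date(2023, 1, 20)},
]

delays_days = [(c["received"] - c["onset"]).days for c in cases]
print(f"Median reporting delay: {median(delays_days)} days")  # 7 days
```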

  32. Usefulness • [Surveillance loop diagram] Event (health care system) → Data (surveillance centre) → Information → Action

  33. Meeting objectives? • Was information produced? • Trends • Outbreaks • Future impact • Cases for further studies • Was information used, and by whom? • Actions: list • Consequences: list

  34. Usefulness • Ex 1 (mid 1990s): • Information: Aid workers infected in Africa • Action: Revision of recruitment policy • Ex 2 (1999): • Information: Men infected in Thailand • Action: Publication -> mass media interest = public health warning

  35. E. Conclusions and recommendations

  36. Conclusions • Proper rationale? • Attributes • Balance of attributes and costs • Fulfilling objectives? • Recommendations • Continue • Revise: specify • Stop

  37. F. Communication

  38. Communicating findings • To stakeholders • To data providers • To public health community • Report • Conference presentation • Scientific article

  39. Scientific publication • Introduction • Evaluation objective (B) • Material and methods • Methods of evaluation (B) • Results • System description (C) • System performance (D) • Discussion • Sources of error and bias • Conclusions and recommendations (E) • Acknowledgments • Stakeholders (A)

  40. Literature • CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001; 50 (RR-13): 1-35 • WHO. Protocol for the evaluation of epidemiological surveillance systems. WHO/EMC/DIS/97.2. • Romaguera RA, German RR, Klaucke DN. Evaluating public health surveillance. In: Teutsch SM, Churchill RE, eds. Principles and practice of public health surveillance, 2nd ed. New York: Oxford University Press, 2000.
