Practical Tools for Measuring and Improving the Quality of Public Health Preparedness Christopher Nelson, Ph.D. Michael Seid, Ph.D. Julia E. Aledort, Ph.D. David J. Dausey, Ph.D.
Acknowledgements • William Raub, Project Officer, HHS • Lara Lamprecht, HHS • Jeffrey Wasserman, Co-PI, RAND • Nicole Lurie, Co-PI, RAND
Project Overview • RAND has worked on a variety of related projects with HHS for the past 3 years • The overarching theme of these projects has been to develop resources and analyses that describe and enhance key aspects of state and local public health preparedness
Measuring Public Health Preparedness • Accountability • Quality Improvement • Evaluation • Standardization
Federal Policy and Local Quality Improvement in Public Health An Assessment of CDC’s Cooperative Agreement Guidance Christopher Nelson, Ph.D.
Motivation • CDC’s Cooperative Agreement on Public Health Preparedness and Response for Bioterrorism is the main federal funding vehicle for PHEP • The agreement includes guidance and performance measures • The guidance has been developed without the benefit of a broad conceptual framework
Objectives • Assess strengths and weaknesses in current and recent guidance as a tool for supporting state and local quality improvement • Develop a framework and recommendations to guide improvements in guidance
Conceptual Framework Guided Analysis • Evaluation of current system requires characterization of the system • Three main policy instruments • Designed to influence grantee activities and PHEP
Research Methods • Methods • 10 site visits to state health departments and a small number of local health departments (interviews, collection of materials) • Interviews with federal officials, national groups, and other stakeholders • Review of literature on federal guidance and on measuring preparedness in other disciplines • Considerable changes from 2004 to 2005 guidance • Greater focus on capabilities vs. infrastructure • Performance standards target more “downstream” outputs • Fewer performance goals (34 vs. 186) • Report focuses on 2005 materials, but also draws upon earlier materials where doing so provides important lessons learned
Overview • Overall evaluation • Standards • Assessments • Feedback/consequences • Grantee use of data
Overall Assessment of New Guidance • Most respondents applauded the changes represented in the 2005 guidance • More outcomes-oriented • Provides more flexibility about how to reach standards • Addresses concerns about one-size-fits-all approaches “The new guidance is better. It is more outcomes oriented and allows us to tailor an approach for [my state].” “Before they [CDC] were setting the standard and you had to hit it, whether you needed it or not.” • Also seems more congruent with current state of knowledge about PHEP
Standards • Frequent changes in standards create transaction costs and limit the standards’ utility as planning tools • Some changes respond to a changing knowledge base • But considerable change in format and categorization • Timelines are unrealistic • Encourages haphazard investments
Assessments • Grantees generally pleased with move toward exercises and drills in 2005 guidance, but concerns remain • On the front end, exercises and drills are often customized to local improvement needs. Can they yield comparative judgments about preparedness? • On the back end, methods for evaluating exercises and drills are not well standardized (e.g., the problem of “when to stop the stopwatch”) • Lack of clarity about how to aggregate local preparedness data • Potential source of inconsistency
Feedback and Consequences • Inconsistent experience with CDC feedback on performance reports • “I think someone reads it in Atlanta, but I get very little feedback.” • Unclear consequences associated with performance create uncertainty for grantees • Generally, states don’t think that funding will be cut as a consequence of poor performance • But some believe that with decreasing funding there will be more competition for scarce resources • Concerns about incentives for maintaining capabilities • Will the funding stop once a standard is met?
Grantee Use of Guidance • Release of guidance often comes too late to influence early planning processes • Generally, little evidence that grantees use information gathered for performance reports for improvement purposes
Examples of Strategies for Improving CDC Guidance • Develop norms about changes in guidance • Well-publicized, periodic top-to-bottom review • Between reviews, require demonstration of compelling need to make changes • Publicize changes ahead of time for review and comment (cf. NPRM model) • Develop a small number of standardized drills to facilitate comparisons across jurisdictions and over time • Develop wider range of low-cost exercise and drill formats • Small-scale timed drills (allow element of surprise?) • “Embedded” assessments • Clarify incentives associated with assessments
Quality Improvement: Implications for Public Health Emergency Preparedness Michael Seid, Ph.D.
Motivation • ‘Preparedness’ is difficult to measure, but improvement is necessary • Quality Improvement (QI) has been useful in other sectors • QI may be a useful tool for improving preparedness
Objectives • Develop framework for QI in PHEP • Identify examples of QI in PHEP • Identify barriers and facilitators of QI for PHEP • Make recommendations for QI for PHEP
What is QI? • Core concepts • Emphasis on systems • Product or outcome focus • Data driven • QI focuses on efforts to reduce unwarranted variability • QI efforts must be ongoing, rather than one-offs • Four elements – all necessary • Performance Goals • Measures • QI practices • Feedback/reporting
Implementing QI practices is iterative Source: Institute for Healthcare Improvement, http://www.ihi.org/IHI/Topics/Improvement/ImprovementMethods/HowToImprove/rampsofchange.htm
QI Barriers and Facilitators • Skills • Technical, managerial, strategic • Organizational Culture • Openness, collaboration, learning from mistakes • Leadership • Incentives • Organization and individual • Financial and nonfinancial
Applying QI to PHEP: Conceptual Framework • A preparedness ‘production’ system: • Capability-building processes: develop policies and plans; assure a competent workforce • Surveillance and detection processes: monitor health of the community; diagnose and investigate • Response processes: inform, educate, empower people; mobilize community partnerships; link to providers and assure care; enforce public health laws • Outcomes/recovery: morbidity, mortality, psychological, social, economic
Methods, Sample • Methods • Sample identification, screening • Site visits • Data analysis • Extract themes from detailed interview notes • Sample • 90 HDs nominated, 9 chosen for site visits • 5 local HDs: 1 large urban area, 3 medium, 1 small • 4 state HDs: 1 centrally organized, 2 decentralized, 1 mixed • 3 of 9 HDs on the West Coast, 2 in the Midwest, 2 in the South, and 2 in the Northeast
Results (1) • Performance goals • Performance goals are being used • Issues with definition, relevance, consistency • Performance goals sometimes structured correctly • Measures • Concerns about relevance of measures • Measurement not pervasive, nor documented • Examples of measures for routine processes • Examples of measures for rare/response processes
Results (2) • QI strategies • Area with most difficulty • Examples • Cyclical improvement (PDSA) • Collaborative • Incident Command System • Role of state vs local HD • Feedback and reporting • Few examples of process change as a result of measurement • What happens after the ‘after action report’? • Some promising starts
Results (3) • Barriers and Facilitators • Organizational culture and leadership are key • Incentives often a problem • Resources: lack of time, money, and staff
Summary • While no site was fully engaged in QI, components of QI existed at every site • Implementing QI strategies was the most difficult QI element • Promising practices exist and suggest that QI for PHEP is possible • Leadership and culture are key facilitators • Lack of resources is a problem, but the bigger problem is lack of incentives
Recommendations • QI can be applied to all parts of PH, not just PHEP • QI must be incorporated into daily work to avoid preparedness burnout • QI training is necessary • States should facilitate local QI via measurement, training, collaborative-building • Expand cyclical QI approaches to more processes • Inject ICS into ongoing work • Systematize what happens after the after-action report • The right incentives are key
Facilitated Look-Backs: A New Quality Improvement Tool for Management of Routine Annual & Pandemic Influenza Julia E. Aledort, Ph.D.
Motivation • Pandemic influenza public health preparedness challenged by the relative infrequency of pandemics • Routine annual influenza provides important opportunities for state public health agencies (SPHAs) to learn from direct experience • Lessons from 2004-2005 influenza vaccine shortage may be relevant for pandemic influenza preparedness
Objective • Develop a tool for state health departments that enables them to: • Regularly revisit & evaluate routine annual influenza management with key community stakeholders • Systematically institutionalize knowledge from one influenza season to the next • Continually improve public health response to routine annual influenza • Identify & incorporate lessons into preparedness activities for pandemic influenza & other public health emergencies
What is a “Look-Back”? • Convenes SPHA leaders, key staff & community stakeholders after each influenza season • Facilitates candid, “no-fault” systems-level discussion of annual influenza management • Reviews past real-world events & critically examines how participants responded • Key events that unfolded during the past influenza season • Key decisions that were made by stakeholders • How decisions were perceived & acted upon by others • Draws on practical experience & a broad range of perspectives to inform future responses
Look-Back Operating Assumptions • Annual influenza activities offer recurrent lessons for some aspects of pandemic influenza & other public health preparedness activities • No one person can represent all perspectives about the past • Systems-level analyses can identify opportunities to learn from experience • Organizational learning is not complete without improvement plans that specify responsibilities for change
Look-Back Pilot Tests • Designed & piloted with three SPHAs in different US regions between June & August 2005 • Focused on topics identified in collaboration with each SPHA • Involved 10-25 participants, including SPHA departments, healthcare partners, other community stakeholders • Lasted 3 to 5 hours • Resulted in After Action Reports (AARs) and initial Improvement Plans
Look-Back Discussion Topics 1. Organizational Structure of Decisionmaking 2. Influenza Surveillance 3. Vaccine Procurement and Distribution 4. Routine Annual Influenza Vaccination Campaigns 5. Vaccine Administration 6. Priority Groups & Implications of Changing Priorities 7. Non-Pharmacological & Public Health Strategies 8. Communication 9. Unanticipated Events
Core Discussion Questions • What are activities, roles & responsibilities during annual influenza season? • What are specific issues that emerged last year? • What went well & are past successes sufficiently institutionalized? • What specific problems emerged? • What might have been done differently? • What should be done differently in the future? • What are lessons for an influenza pandemic?
Look-Back Participants • SPHA Officials & Staff: state health director; emergency management coordinator; immunization program director; pandemic influenza coordinator; communicable disease control/disease investigation director; quality improvement coordinator; state epidemiologist; public health nurse; communications specialist/public information officer • Other Community Partners: district or LPHA staff; hospital representatives; nursing home & LTC representatives; professional medical organizations; managed care organizations; insurers; commercial enterprises offering influenza vaccine to the public; pharmacies; minority community leadership representatives
Design Issues & Implementation Challenges • Advance planning & investigation allow for customized Look-Backs • Facilitator objectivity and independence are critical • It is a challenge to produce effective & broadly agreed-upon AARs • AARs can generate valuable dialogue if they are broadly disseminated & reviewed by individuals not typically involved with annual influenza activities
Examples of Lessons Learned • Leveraging state emergency management resources & infrastructure may facilitate emergency response by “traditional” state public health agencies • Communication is of paramount importance • Broad-based coalitions & public-private partnerships may mitigate vaccine distribution & administration challenges
Conclusions • A Look-Back is a relatively simple, effective quality improvement tool • Look-Backs can be used to assess recent past events & identify issues relevant to future annual & pandemic influenza • Adoption & implementation of Look-Backs capitalize on annual influenza to better prepare for a pandemic • Document & formalize learning from successes & problems • Encourage follow-through on lessons learned • Reinforce the role of public health during annual & pandemic influenza
Common Themes and Lessons Learned from Designing and Conducting Tabletop Exercises to Assess Public Health Preparedness David J. Dausey, Ph.D.
Motivation • The US government has made substantial investments to enhance the nation’s ability to respond to bioterrorism and other public health emergencies. • Funding and mandates for federal preparedness programs have led health departments throughout the country to implement exercise programs. • The development and conduct of exercises to test and assess preparedness is now considered the responsibility of all health departments.
Objective • To summarize insights that the RAND Corporation has gained about public health preparedness and the process of developing, conducting, and evaluating tabletop exercises in collaboration with state and local health departments in every region of the United States.
What Are Tabletop Exercises? • HSEEP defines a spectrum of exercise types, from discussion-based formats used for planning and training (seminars, workshops, tabletops) to operations-based formats that test capabilities (drills, functional exercises, full-scale exercises) Source: www.hseep.dhs.gov
Exercise Development Process • Goal: Enhance Public Health Preparedness 1. Understand existing plans, actors, and system 2. Meet with key actors and leaders; clarify exercise goals 3. Develop a draft tabletop exercise 4. Review exercise with stakeholders 5. Revise the exercise 6. Conduct exercise 7. Create after action report 8. Develop action plan 9. Make improvements
Exercise Development Requires Multiple Levels of Expertise and Experience • Public health infrastructure • Public health preparedness • Public health methods • Exercise performance measurement • Subject matter expertise • Exercise development • Exercise facilitation • Exercise customization
Tabletop Pilot Tests • Convenience sample of 30 local public health agencies in 13 different states across the continental US • Majority were located in urban areas that served populations of less than 1 million residents • Conducted from 2003 to 2006 • 1 exercise per site
Design Concerns • Competing desires for realistic scenarios and logistical feasibility • What are the objectives of an exercise? • What is the nature and scope of the exercise scenario? • Who should attend? • How should the exercise be facilitated?