Is It Working Yet? Evaluating and Creating Policy Changes for Complex Initiatives Dr. Karen Linkins, IBHP, Tides Center Dr. Benjamin Miller, University of Colorado Dr. Lynda Frost, Hogg Foundation for Mental Health Dr. Becky Hayes Boober, Maine Health Access Foundation
Session Objectives • 1) Become familiar with strategies to evaluate a complex health initiative; • 2) Explore strategies for advocating with policy makers; • 3) Understand how to use data related to quality health care interventions to create compelling messages; • 4) Gain insights on policy development and leveraging; and • 5) Share lessons learned and practical tools.
Mark Friedman's Performance Accountability Measures • How much did we do? • How well did we do it? • Is anyone better off? • Friedman, M. (2005). Trying hard is not good enough: How to produce measurable improvements for customers and communities. FPSI Publishing.
Performance Accountability Questions • Who are our “customers”? • How can we measure if our “customers” are better off? • How can we measure if we are delivering services well? • How are we doing on the most important of these measures? • Who are the partners who have a role to play in doing better? • What works to do better, including no-cost and low-cost ideas? • What do we propose to do?
“Customer” Satisfaction • Did we treat you well? • Did we help you with your problem?
Evaluation Focus • Performance Accountability Questions • Population Accountability Questions
What is the story behind these data? What are the stories that can influence policy?
Fighting fragmentation at the level of innovation: Advancing the field of integrated primary care Benjamin F. Miller, PsyD Director of the Office of Integrated Healthcare Research and Policy Department of Family Medicine University of Colorado Denver School of Medicine
The problem(s) Sometimes in the face of innovation we lose sight of our ultimate goal: to change healthcare. • We focus on the problems rather than recognizing what is working. • We focus on meeting immediate needs (e.g., financial) rather than planning for long-term success. • We slip into “protective mode” and forget why we started the innovation to begin with. • We stop seeing the other innovators around us and focus on ourselves rather than the larger community or field.
But first: the state of the field
[Slide visual: “Brilliance” repeated in separate, disconnected boxes, depicting many brilliant innovations working in isolation]
Fragmentation as a Parallel Process • What we do (models) • What data we collect (clinical) • What we call ourselves (integrated) • What we need for sustainability (money) • Who we talk to (ourselves) • What we want (change)
What’s the problem? • Measuring integrated mental health (what is that exactly?) • There is no gold standard “tool” • Consistency across sites (e.g., documenting mental health diagnosis) • The evidence is lacking and the field is in need of knowledge around the “elements” • HUGE scope • Financial sustainability (or the business case)
Mental Health Presentation Range (continuum)
Mental health and substance use presentations, from least to most complex: • Medical issues with psychosocial barriers to care • Medical issues requiring behavioral or psychological intervention • Mental and physical health multimorbidity • Severe mental illness and/or substance abuse
Example targeted service responses across the same range: • Psychosocial support services • Behavior change education & evidence-based treatments • Mental health treatment plan • Coordination of mental and physical health treatment plans • Full coordination with specialty care
What’s the problem? – the money issue • Two pots of money • Workarounds are often viewed as the solution • We don’t know what we don’t know (but we think we know what we don’t know) • Turf wars and bad feelings
What we need to consolidate (or integrate) • Clinical data • Language • Financial data • What we measure • How we track and measure what we do • Better community connections and state-to-state connections (and collaborations) • Shared and consistent evaluation plans for integration projects
Case study: Studying time
Miller, B. F., Teevan, B., et al. (2011). "The importance of time in treating mental health in primary care." Families, Systems, & Health, 29(2), 144-145.
What can be tracked and learned • Time spent with patients • Time spent with other providers • Assigning monetary amounts to time (and/or patient volume) • Assessing changes in time and volume • Assessing value and outcomes • Learning which patients use more time and benefit most from integrated initiatives (see the sketch below)
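As a minimal sketch of how these time measures might be operationalized, the pandas snippet below aggregates provider time per patient and attaches a dollar value. The visit log, provider categories, and per-minute rates are all hypothetical placeholders, not figures from the Miller et al. study.

```python
# Minimal sketch (hypothetical data): aggregating provider time per patient
# and assigning a monetary value, per the time-tracking measures above.
import pandas as pd

# Hypothetical visit log: one row per patient contact.
visits = pd.DataFrame({
    "patient_id": ["A", "A", "B", "B", "C"],
    "provider":   ["physician", "behavioral_health", "physician",
                   "behavioral_health", "physician"],
    "minutes":    [15, 30, 10, 45, 20],
})

# Assumed (illustrative) cost per provider-minute; real rates would come
# from local salary or billing data.
RATE_PER_MINUTE = {"physician": 4.00, "behavioral_health": 1.50}

visits["cost"] = visits["minutes"] * visits["provider"].map(RATE_PER_MINUTE)

# Time and estimated cost per patient, split by provider type.
summary = visits.pivot_table(index="patient_id", columns="provider",
                             values=["minutes", "cost"], aggfunc="sum",
                             fill_value=0)
print(summary)
```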
Case study: Studying screening
Phillips, R. L., Miller, B. F., et al. (2011). "Better integration of mental health care improves depression screening and treatment in primary care." American Family Physician, 84(9), 980.
What can be tracked and learned • Number of patients identified • Number of patients treated • Number of patients who improve from treatment • Comparing rates of identification to rates of diagnosis (accuracy) • Using screening tools repeatedly for treatment tracking
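A minimal sketch of how the screening measures above could be turned into rates; every count below is fabricated for illustration and does not come from the Phillips et al. article.

```python
# Illustrative counts only (not study data): converting the screening
# measures above into simple rates along the care cascade.
screened          = 1200   # patients given a depression screen
screened_positive = 180    # patients identified by the screen
diagnosed         = 150    # positives confirmed by clinical diagnosis
treated           = 130    # diagnosed patients who began treatment
improved          = 85     # treated patients whose follow-up scores improved

identification_rate = screened_positive / screened
confirmation_rate   = diagnosed / screened_positive  # screen-to-diagnosis accuracy
treatment_rate      = treated / diagnosed
improvement_rate    = improved / treated

print(f"Identified: {identification_rate:.1%} of those screened")
print(f"Confirmed:  {confirmation_rate:.1%} of positives diagnosed")
print(f"Treated:    {treatment_rate:.1%} of those diagnosed")
print(f"Improved:   {improvement_rate:.1%} of those treated")
```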
In summary: we must (non-negotiable?) • Be heard • Know what policy solutions can help lead to sustainability (including financial sustainability) • Begin to collect some of the same data • Make sure our data are entered into the medical record in a way that allows them to be extracted • Have an entity that can pull it all together • Be compelling, be accurate, be timely
Thank you
Benjamin.miller@ucdenver.edu | @miller7 | occupyhealthcare.net
Evaluating Complex Initiatives: Lessons Learned for Sustaining Change and Influencing Policy Karen W. Linkins, PhD Project Director Integrated Behavioral Health Project Tides Center
Systems Change: Key Goal of Complex Initiatives “Change is disturbing when it is done to us, but exhilarating when it is done by us” (Rosabeth Moss Kanter, Professor, Harvard Business School) • Many different definitions of systems change exist, but they share common elements: policies and practices, resources, relationships, power and decision-making, values, attitudes, skills, governance, and supportive policies and reforms. • Systems change is dynamic, developmental, non-linear, and complex. • The target of change is the system, not the individual.
Definition of Systems Change • Systems change is defined as changes in organizational culture and in policies and procedures within and across organizations that enhance or streamline access and reduce or eliminate barriers to needed services for target populations.
What does sustainable systems change look like in integrated care? • Changes that endure beyond the funded project that lead to any or all of the following: • Increased Access • Improved Quality • Enhanced Efficiency • Increased Consumer Empowerment
Factors in Designing Evaluations of Complex, Systems Change Initiatives • Stakeholder interests • Initiative goals, including desired outcomes and impacts • How findings will be used, e.g.: • Educate policy makers • Disseminate best practices • Change local systems and policies • Support sustainability plans and garner new funding sources • Available resources for the evaluation
Different stakeholders are interested in different outcomes • Providers: Individual patient outcomes, panel management • Clinics/Clinic Systems: Population health management, administrative metrics (e.g., cycle times, provider productivity, patient and provider satisfaction), billing, culture change • Policy Makers: Cost and other administrative metrics • Community: Prevention, community health and wellness, healthy behaviors, consumer engagement • Foundations: Alignment with strategic priorities, return on investment, grantee accountability
CDC Evaluation Framework • Step 1: Engage stakeholders • Step 2: Describe the program • Step 3: Focus the evaluation design • Step 4: Gather credible evidence • Step 5: Justify conclusions • Step 6: Ensure use and share lessons learned
Key questions to Guide Evaluation Design (CDC) • What will be evaluated? (program, context) • What aspects of the program will be considered in assessing program performance? • What standards (i.e., type or level of performance) must be reached for the program to be considered successful? • What evidence will be used to indicate how the program has performed? • How will the lessons learned be used to improve public health effectiveness?
Evaluation Design Considerations • Design types: experimental, quasi-experimental, and observational. • No single design is best in all circumstances. • Design and methods should be matched to the interests of targeted stakeholders (e.g., foundation, grantees, policymakers).
Considerations (cont.) • Design drives what counts as evidence, how data are gathered, what claims can be made, who needs to be involved, and what data management systems are needed. • Mixed method designs are most effective because each method has biases and limitations. • During the course of an evaluation, methods might need to be revised or modified.
Challenges and Threats to Evaluating Complex Initiatives • Complex initiatives require significant investments of time, resources, and energy to create common ground for change. • Programs often become so focused on immediate implementation issues (client “fixes”) that the long-term vision for systems change becomes lost or deferred. • Balancing the funder’s need for accountability and rigor in reporting with developing and maintaining authentic relationships with grantees.
Challenges and Threats (cont.) • Data collection must be relevant. • Data should not be collected unless they are shared and fed back to those responsible for collection. • Evaluation should be clearly connected to longer-term outcomes; failure to do so limits buy-in, understanding, and accountability to the process.
Case Example: Integrated Care Initiative Initiative Goals: Create a more responsive and integrated system of care to increase access and reduce costs for individuals with co-morbid conditions (mental health and chronic conditions) • Patient-focused • Address patients’ needs, improve health outcomes • Reduce reliance on ED resources for care that is more effectively provided in less costly, community-based settings • System-focused • Reduce ED volume and diversion time, and avoidable inpatient use • Encourage financing and policies that promote coordinated, cross-system, multidisciplinary care and integration of services
Stakeholders Influencing the Evaluation Process • Foundations: Project Officers/Program Staff, Policy Staff, Evaluation Staff, Oversight Group • Program Office • Evaluation Team • Grantees & Collaboratives: Community-Based Organizations; Hospitals; Public Health, Housing/Homeless Programs, Mental Health, Substance Abuse, Medi-Cal, Criminal Justice
Evaluation Design • Participatory approach • Three phases of the evaluation • Planning • Implementation Process • Outcomes and Promising Practices (“What Works”) • Multi-level, pre-post design
Frequent Users Initiative Logic Model (TP = Target Population)
Interventions: • Planning Grants • Implementation Grants (e.g., Intensive Case Management): structure, intensity • Other activities: meetings/convenings, other activities • Broader FUI Initiative: policy papers, other activities
Intermediate Outcomes/Changes: • Enrolled TP Clients: outcomes, service utilization, costs • Organizations: policies and practices, data systems, MOUs, changes in services • County System: data systems, financing, collaborations, new services, restructuring • State Level: laws and regulations, budget and financing
Long Range Impacts: • Service Delivery Change (Client-based: compare enrolled clients & TP at the beginning and end of the grant period, on utilization and cost; System-based: MIS analysis of changes in patterns of service utilization and costs system-wide) • Broad Systems Change: county, state
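As one illustration of the client-based, pre-post analysis named in this logic model, the sketch below compares ED visits and costs for enrolled clients at the beginning and end of a grant period. All client IDs, visit counts, and dollar amounts are invented for the example, not initiative data.

```python
# Minimal sketch (fabricated numbers): client-based pre-post comparison of
# ED utilization and cost for enrolled clients, per the logic model above.
import pandas as pd

clients = pd.DataFrame({
    "client_id":      [1, 2, 3, 4],
    "ed_visits_pre":  [12, 8, 15, 6],   # grant-period start
    "ed_visits_post": [4, 5, 7, 2],     # grant-period end
    "ed_cost_pre":    [14400, 9600, 18000, 7200],
    "ed_cost_post":   [4800, 6000, 8400, 2400],
})

visit_change = clients["ed_visits_post"].sum() - clients["ed_visits_pre"].sum()
cost_change  = clients["ed_cost_post"].sum() - clients["ed_cost_pre"].sum()

print(f"Total ED visits: {clients['ed_visits_pre'].sum()} -> "
      f"{clients['ed_visits_post'].sum()} ({visit_change:+d})")
print(f"Total ED cost:   ${clients['ed_cost_pre'].sum():,} -> "
      f"${clients['ed_cost_post'].sum():,} ({cost_change:+,})")
```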
Evaluation Outcomes Measures • Cost and utilization (ED, inpatient and other systems as available) • Clinical measures of health and functioning • Stability (e.g., income and insurance enrollment) • Service intensity (frequency and duration) • Strength of partnerships and collaborations • Policy and systems change (evidence of improved coordination, streamlined access, permanent policy changes to address/eliminate barriers)
Evaluation Challenges • Participatory orientation • Balancing research rigor with “what’s reasonable and feasible”: selecting outcome measures and data collection strategies that matched capacity and didn’t overburden staff • Developing and maintaining meaningful stakeholder participation (ongoing communication) • Establishing and maintaining the trust of programs to ensure buy-in and data integrity • Defining and operationalizing multi-level outcomes • Ensuring evaluation findings aligned with, and were relevant to, the information needs of various stakeholders at the “right time”
Evaluation Challenges (cont.) • Client-centered interventions: the challenge of programs/models balancing individual client “fixes” vs. permanent programmatic and systems change • Data accuracy and consistency • Data availability and linkage capability • Mismatch of foundation and grantee goals: foundations wanted systems and policy change, but funded local interventions
Despite the Challenges . . . • Findings were compelling and rigorous enough to use for policy development (Medicaid Waiver and other legislation). • The combination of quantitative and cost data, along with qualitative process and outcome data, created a strong, policy-relevant story of sustainable systems change.
Data stories can influence public policy. Lynda Frost Director of Planning and Programs Hogg Foundation for Mental Health
Background Research and Evidence-Based Practices • 20 years of research on the collaborative care model framed a grant program on integrated healthcare • A large conference highlighted research and grantees’ work • Grantees engaged in advocacy around reimbursement and other issues • Evaluation of the grant program gathered state-specific outcome data and identified barriers to implementation