What is needed to accelerate progress? Where have we succeeded, and why?

Partnering for Improvement across Research, Practice and Policy: The Case of Implementation Research in Health. Brian S. Mittman, PhD, Director, VA Center for Implementation Practice and Research Support, VA Greater Los Angeles Healthcare System. 16 August 2011.


Presentation Transcript


  1. Partnering for Improvement across Research, Practice and Policy: The Case of Implementation Research in Health. Brian S. Mittman, PhD, Director, VA Center for Implementation Practice and Research Support, VA Greater Los Angeles Healthcare System. 16 August 2011

  2. Increased activity, investment and success in implementation and implementation research are critical for achievement of key societal goals
  • in child welfare
  • in education
  • in criminal justice
  • in health and health care
  • etc.

  3. Contributions from implementation research (in the form of useful insights, guidance and effective strategies) have been modest, at best

  4. What is needed to accelerate progress?
  • Where have we succeeded, and why?
  • Where are we still struggling? Why?
  • What can we do as researchers, policy leaders, practice leaders to strengthen our field and enhance our success?

  5. Assessing the state of the field of implementation: implementation in health and health care. Why health and health care?
  • Extensive history (changing physician behavior, quality improvement, implementation science)
  • Considerable international investment (despite a small percentage of the NIH budget)
  • Societal and policy interest (waves of reform)
  • Practice interest and demand
  • Notable successes (large projects, entire systems, e.g., US Dept. of Veterans Affairs)
  • Dedicated journals and special issues, conferences

  6. Basis and sources for my observations
  • VA Quality Enhancement Research Initiative (QUERI)
  • Implementation Science
  • NIH Special Emphasis Panel, Dissemination and Implementation Research in Health (2006-2010)
  • NIH Conference on the Science of Dissemination and Implementation (2007-2011)
  • Institute of Medicine (IoM) Forum on the Science of Quality Improvement and Implementation (2007-2008)
  • Additional conferences, reviews, advisory groups

  7. Outline of themes, observations
  • Features of target evidence, practices and innovations
  • Composition of research programs, pipelines, portfolios
  • Features of implementation trials
  • Additional features of research
  • Policy and practice leader roles and responsibilities

  8. Features of evidence, best practices, innovations
  • Efficacy vs. effectiveness research
  • External validity, generalizability, transferability
  • Practical/pragmatic clinical/behavioral trials
  • Designing for dissemination: “reach” goals vs. near-term goals
  • Partnership research, PBRNs

  9. Composition of research programs, portfolios and sequences of research activities: Basic science → Pre-clinical research (T-1) → Clinical efficacy research → Clinical effectiveness research → Variations research (quality/implementation gap documentation and quality gap diagnosis) → Implementation trials (Phases 1 to 4)

  10. Health research pipeline, major phases [pipeline diagram: Basic Science → Pre-Clinical/Translational Research → Clinical Efficacy and Effectiveness Studies → Implementation Research → Improved Health Processes and Outcomes, with parallel Health Behavior and Health Services research tracks]

  11. Health research pipeline, implementation-related phases [diagram elements: clinical efficacy studies; effectiveness studies; practice guidelines; documenting and diagnosing gaps; observational studies of implementation; interventional implementation studies (Phase 1 pilot projects, Phase 2 small-scale efficacy trials, Phase 3 large-scale effectiveness trials, Phase 4 “post-marketing” monitoring and refinement); Health Behavior and Health Services research tracks]

  12. Pipeline and portfolio gaps
  • Absence of quality/implementation gap documentation and diagnosis; “empirical treatment”
  • Observational studies (natural experiments) vs. experimental/interventional studies (artificial implementation initiatives) (see next slides)
  • Implementation efficacy trials not preceded by pilot studies, local diagnosis, formative evaluation
  • Efficacy-oriented implementation trials not followed by effectiveness, sustainability, scale-up/spread studies (see next slides)

  13. Composition of research programs, portfolios and sequences of research activities
  • Accelerating a 20-30 year process
  • Hybrid studies combining phases
  • Simulation, other modeling approaches

  14. Features of implementation trials: efficacy vs. effectiveness; local implementation vs. scale-up. Artificial features of local implementation trials:
  • Research team technical assistance, consultation
  • Research team-employed, supervised staff
  • Research grant funding for IT, staff, training
  • Hawthorne effects
  • Selective sampling
  Effectiveness, observational, sustainability, scale-up/spread studies lack these features.

  15. Features of implementation trials: “selective” vs. “biased” sampling. The representative sampling paradox in implementation research:
  • Estimating implementation success in 3 years vs. now
  • Sampling to represent future vs. current conditions
  Implementation phenomena are different → implementation research approaches, designs and methods must differ.

  16. Studying complex social interventions. Implementation strategies and programs are complex social interventions characterized by:
  • Variability and heterogeneity of program (intervention) content across time and place
  • Heterogeneity of program implementation across time and place
  • Strong contextual influences (leadership, culture, experience/capacity, staff/budget sufficiency), variability and heterogeneity of context across time and place
  • Weak main effects (other than for robust programs)
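
[Editor's note: as a hedged, illustrative sketch that is not part of the original slides, the properties listed in slide 16 can be expressed in a standard two-level moderation model; all symbols below are assumptions introduced purely for illustration.]

$$Y_{ij} = \beta_0 + \beta_1 T_{ij} + \beta_2 C_j + \beta_3 \,(T_{ij} \times C_j) + u_j + \varepsilon_{ij}$$

Here $Y_{ij}$ is the outcome for unit $i$ in site $j$, $T_{ij}$ indicates exposure to the implementation strategy, $C_j$ is a contextual factor such as leadership support, $u_j$ is a site-level random effect, and $\varepsilon_{ij}$ is residual error. In this reading, “weak main effects” corresponds to a small $\beta_1$, while “strong contextual influences” and “heterogeneity across time and place” correspond to large $\beta_2$, $\beta_3$ and site-level variance, which is why a single pooled effect size can be an uninformative summary of such interventions.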

  17. Studying complex social interventions
  • Robust CSIs (complex social interventions) are amenable to RCTs that estimate mean effect sizes (and the strength of a small number of contextual influences)
  • We prefer to study robust CSIs because “that’s where the light is”
  • The value and applicability of methods for estimating “effectiveness” decreases as the magnitude of contextual influences increases, as the heterogeneity and variability of programs and settings increase, and as the main effect size decreases

  18. Studying complex social interventions: What is our goal? Two very different questions:
  1. Does it work? Is it “effective”? Should it be approved? Included in the formulary? Should I use it?
  2. How, why, when and where does it work? How should I use it? How do I make it work?
  For many/most implementation strategies, Q1 is meaningless.

  19. Developing insights and guidance for implementation
  • How do I choose an appropriate implementation strategy given my context?
  • How do I implement (deploy) that strategy to increase effectiveness?
  • How do I adapt and customize that strategy to increase effectiveness (initially and over time)?
  • How do I modify/manage the organization or setting to increase effectiveness (initially and over time)?
  • How, why, when and where does it work?

  20. Developing insights and guidance for implementation: selecting research approaches, designs and methods
  • Trials facilitate effectiveness estimates; observational studies facilitate study of barriers, facilitators, mechanisms, mediators, moderators
  • Process evaluation can develop insights into mechanisms
  • Theory-based evaluation, realistic evaluation and related approaches from program evaluation offer additional value
  • Guidance in selecting, applying and further developing these approaches is needed
  • Implementation strategies exist on a continuum; research approaches should be matched to their features

  21. Fidelity vs. adaptation
  • Complex social interventions can be adapted and customized to increase effectiveness. They should be adapted and customized.
  • Complex organizations can be managed to increase effectiveness. They should be managed.
  • Implementation research should generate guidance for implementation as a process, in addition to (and often instead of) producing effect size estimates to guide one-time selection decisions.

  22. Other features of implementation research
  • Incomplete documentation and reporting: implementation program details, process evaluation, mechanisms (mediators, moderators), contextual factors
  • Insufficient use of theory; relabeling and re-inventing theories and frameworks
  • Insufficient attention to prior research and to synthesis: frameworks, theories, mechanisms, observational study insights, etc., in addition to trial effect sizes
  • Poor portfolio management: imbalance in funding
  • Heterogeneous, diverse terminology

  23. Other features of implementation research (cont.)
  • Consensus regarding measures (of outcomes, contextual factors, mediators/moderators)
  • Collections of validated instruments
  • Consensus regarding methods (e.g., standards for process evaluation, qualitative research)

  24. Other challenges to implementation & research
  • Multi-level, multi-factor barriers, facilitators, mechanisms
  • Need for multi-faceted strategies and continuous attention (vs. “that didn’t work; let’s try something else” and the quest for “the answer”)
  • Leaders’ (and researchers’) limited influence over key levels
  • Requirement for full engagement and partnerships

  25. Guidance for partnership research
  • Partnerships are needed to reach informed decisions regarding research priorities, to design and develop clinical efficacy and effectiveness studies, to design and conduct implementation studies
  • Partnership research approaches are numerous and diverse; we lack guidance in selecting or developing hybrid approaches:
    • community-based participatory research
    • practice-based research networks
    • action research
    • engaged scholarship
    • community engagement

  26. Policy and practice leader roles and responsibilities
  • Manage actively and appropriately (plan, decide, monitor, refine)
  • Invite, facilitate, partner in research
  • Advocate for greater investment in research
  • Document research experiences
  • “No new programs or directives without accompanying implementation guidance”
  • “No new programs without accompanying evaluation”

  27. Contributions from implementation research (in the form of useful insights, guidance and effective strategies) have been modest, at best. Much of the implementation research we conduct is pursuing the wrong questions and generating the wrong types of output.

  28. More effective implementation requires insights, guidance and tacit knowledge rather than evidence, answers to specific questions and explicit knowledge. Implementation research must evolve to generate these insights and tacit knowledge, and it must determine how best to convey this knowledge to researchers and to practice and policy makers.
