The Shifting Context for Dissemination & Evaluation in Translational Research
If We Want More Evidence-Based Practice, We Need More Practice-Based Evidence
UCSF Translation-2 Course, Oct 30, 2008
Lawrence W. Green, University of California, San Francisco
NIH Roadmap Initiative: Translating Discoveries into Health*
The roadmap less traveled?**
"The Roadmap identifies the most compelling opportunities in three arenas: new pathways to discovery, research teams of the future, and reengineering the clinical research enterprise" (Zerhouni, p. 63).*
*Zerhouni E. Science 2003, Oct 3;302(5642):63-72.
**Green LW. Am J Prev Med 2007;33(2):137-38, after K. Grumbach.
"Blue Highways" on the NIH Roadmap*
• Program evaluation, CQI, policy analysis, technical assistance (TA) in EBP adaptation
• Systems research
*Westfall JM, et al. JAMA 2007;297:403-406.
Where Have All the Data Gone? Long Time Passing…
"It takes 17 years to turn 14 per cent of original research to the benefit of patient care" (Balas, 1995).
[Diagram: the pipeline from original research through submission, acceptance, publication, bibliographic databases, reviews/guidelines/textbooks, and implementation, with time lags at each stage (from 0.3 to 9.3 years, 6.0-13.0 years in databases) and losses along the way to negative results (35%, 50%), lack of numbers and design issues (18%, 46%), and inconsistent indexing. Sources: Poyer 1982; Antman 1992; Kumar 1992; Poynard 1985; Dickersin 1987; Koren 1989.]
The Pipeline Fallacy of Producing & Vetting Research to Get Evidence-Based Practice*
[Diagram: the 17-year odyssey from priorities for research funding, through peer review of grants, publication priorities & peer review, and research synthesis, to guidelines for evidence-based practice, shaped along the way by academic appointment, promotion, & tenure criteria and by the evidence-based medicine movement. At the practice end: funding; patient needs and demands; local practice circumstances; professional discretion; credibility & fit of the evidence. When uptake fails: blame the practitioner, or blame dissemination.]
*Based on Green LW. From research to "best practices" in other settings and populations. Am J Health Behavior 25:165-178, April-May 2001. Full text: www.ajhb.org/25-3.htm
[Cartoon: "The research indicates that we really should do something with all this research." Keywords: diffusion, adoption, quality, EBP, "bridging the gap."]
5 Ways of Making Research More Relevant for Practice
• Making research more theory-based
• Setting research & evaluation priorities
• Making research findings actionable, usable, relevant (to whom?)
• Disseminating & translating (adapting) research to local circumstances, cultures, and personnel
• Making evidence more practice-based
Filling the Chasm, as Conceived by the U.S. Translation Agency*
[Diagram: "Practice is here; we want it to be here," bridged by innovation, adoption, implementation, education, diffusion, and TRIP. Reminiscent of the "fallacy of the empty vessel" from early health education.]
*Carolyn Clancy. Agency for Healthcare Research & Quality, 2003.
RFA (PAR) from NIH, 2006
• Applications to "identify, develop, and refine effective and efficient methods, structures, and strategies that test models to disseminate and implement research-tested health behavior change interventions and evidence-based prevention, early detection, diagnostic, treatment, and quality of life improvement services into public health and clinical practice settings."
• Two problems with these framings of the issue:
• Are the "research-tested interventions" adequate?
• Are they appropriate to other settings and populations?
To illustrate the first problem:
Canadian Cancer Society RFP for a Review to Answer 4 Questions
1. Are group counseling programs for smoking cessation effective?
2. If so, what is the optimal content of the sessions?
3. What is the optimum number and frequency of sessions that should be offered?
4. What are the characteristics of the most effective facilitators?
University of Waterloo Results*
• A comprehensive literature review of over 40 years of published and unpublished studies
• Deficiencies in purpose, design, and reporting
• The research could answer only the first of the 4 questions: group programs for smoking cessation are effective
*Manske SR, Miller S, Moyer C, Phaneuf MR, Cameron RC. Best practice in group-based smoking cessation: Results of a literature review. AJHP 18:409-23, 2004.
Evidence-Based Medicine and Patient-Centered Medicine*
[Diagram: three nested areas of information bearing on patient choice.]
A. "Good evidence": the area where there is currently good evidence-based information of importance to patients in making choices.
B. Potential for "good evidence": information of importance to patient choice that is potentially of evidence-based type.
C. Information of potential importance to patients in making health care choices, including information that is not even potentially of evidence-based type.
*After A.L. Cochrane, from T. Hope. Evidence-based patient choice and the doctor-patient relationship. In But Will It Work, Doctor? Kings Fund, London, 1997, 20-24.
Issues for Evidence-Based Practice and Translating Research to Practice
• Making practice more theory-based
• Setting research priorities
• Making research findings actionable, usable, relevant within settings
• Translating research from outside to local circumstances, cultures, personnel
• Making evidence more practice-based
Priority-Setting for Health Research*
[Diagram: a continuum from basic research at the molecular level (NIH: clinical investigations, clinical trials) through T1 and T2 translation to applied research & development at the population level (CDC: program evaluation, community & statewide effectiveness trials, surveillance, PBRNs, CQI, demonstration & education research), spanning knowledge acquisition, knowledge transfer, knowledge translation, and knowledge validation.]
*Green LW, Popovic T, et al. CDC Futures Workgroup on Research. Atlanta, 2004.
The Internal Validity Drift of Health Sciences Evidence: "Lost in Translation"
• Evidence-based medicine movement taken to scale in general practice & health promotion
• The peer-review preferences for experimental control and certainty of causation
• The publishing preferences for RCTs and positive results
• The limitations of print space driving out richer description of interventions, protocols, procedural lessons, subgroup variations
• But a more "natural" type of practice-based evidence has greater influence on multi-level program planning, practice & policy…
Change in Per Capita Cigarette Consumption: California & Massachusetts vs. Other 48 States, 1984-1996
[Chart: percent reduction in per capita cigarette consumption (y-axis from 5 to -25 percent) over the periods 1984-1988, 1990-1992, and 1992-1996, comparing the other 48 states, California, and Massachusetts.]
Issues for Evidence-Based Practice and Translating Research to Practice
• Making practice more theory-based
• Setting research priorities
• Participatory research to make findings actionable, usable, relevant within settings
• Translating research from outside to local circumstances, cultures, personnel
• Making evidence more practice-based
Some Benefits of Participatory Research in Practice-Based Evidence
• Results are relevant to the interests, circumstances, and needs of those who would apply them
• Results are more immediately actionable in local situations for people and/or practitioners
• Generalizable findings are more credible to people, practitioners, and policy makers elsewhere because they were generated in partnership with people like themselves
• Helps to reframe issues from the health behavior of individuals to encompass system and structural issues
Green LW, Mercer SL. Am J Public Health Dec. 2001.
Definition and Standards of Participatory Research for Health*
• Systematic investigation…
• Actively involving people in a co-learning process…
• For the purpose of action conducive to health
• Not just involving people more intensively as subjects of research or evaluation
*Green, George, Daniel, et al. Participatory Research… Ottawa: Royal Society of Canada, 1997. www.lgreen.net/guidelines.html
The Lenses of Scientists, Health Professionals and Lay People
[Diagram: contrasting the professional/scientific lens and the layperson's lens across objective and subjective indicators of health.]
Issues for Evidence-Based Practice and Translating Research to Practice
• Making practice more theory-based
• Setting research priorities
• Making research findings actionable, usable, relevant: participatory research
• Translating research to local cultures & circumstances: external validity & "fidelity" vs. adaptation
• Making evidence more practice-based
Building Policy and Practice from Evidence + Theory
• Not starting with theories and looking for problems on which to test them, but starting with problems and looking for theories to help us solve them*
• Evidence on solutions generalizes to other circumstances, settings, & populations in the form of either replication or theory
• Replication is limited by the infinite number of context-population combinations
• "In theory, theory and practice are the same thing. In practice they're not." (Jan L.A. van de Snepscheut)
• "All models are wrong. Some are useful." (George Box)
*Green LW. Public health asks of systems science… Amer J Public Health 96, March 2006.
"Fidelity" vs. Adaptation*
• Researchers test an intervention for its efficacy
• A rigorous test (efficacy) qualifies it for official lists of "evidence-based practices" and guidelines
• Practitioners try to incorporate it into their programs in other populations and circumstances
• Poor fit produces failure of the program
• Practitioners are blamed for not implementing with "fidelity"
• Now buy the producers' training program
*Green LW, Glasgow RE, …external validity… Evaluation & the Health Professions, Mar. 2006.
Efficacy vs. Effectiveness
• Efficacy: the tested impact of an intervention under highly controlled circumstances.
• Effectiveness: the tested impact of an intervention under more normal circumstances (relatively less controlled; real-time, "typical" setting, population, and conditions).
• Broad program evaluation: the tested impact of a blended set of interventions on larger systems and populations. "Natural experiments" with minimal control, maximum variability.
The Trade-offs
• Efficacy maximizes internal validity, i.e., the degree to which one can conclude with confidence that the intervention caused the result.
• Effectiveness maximizes external validity,* i.e., the degree to which one can generalize from the test to other times, places, or populations.
• Program evaluation maximizes reality testing in particular settings, with the combination of interventions at multiple levels required for public health effect.
*Green LW, Glasgow RE, …external validity… Evaluation & the Health Professions, Mar. 2006.
Issues for Evidence-Based Practice and Translating Research to Practice
• Blending evidence-based practice with theory-based practice
• Setting research priorities
• Making research findings actionable, usable, relevant: participatory research
• Translating research to local circumstances
• Making evidence more practice-based: the centrality of evaluation and continuous quality improvement research
Mediating and Moderating Variables
[Diagram: an intervention or program affects the outcome variable(s) through one or more mediators; moderators condition both the intervention-to-mediator and the mediator-to-outcome paths.]
Green & Kreuter, Health Program Planning: An Educational and Ecological Approach. 4th ed. New York: McGraw-Hill, 2005. Green & Glasgow, E&HP, 2006.
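A minimal sketch of the mediator/moderator distinction, using simulated data (not from the slides; the variable names and effect sizes are illustrative assumptions): the intervention raises a mediator (say, patient knowledge), the mediator raises the outcome, and a binary moderator (say, a supportive vs. unsupportive practice setting) scales how strongly the intervention moves the mediator.

```python
import random

random.seed(0)

def simulate(n=10_000):
    """Toy data: intervention -> mediator -> outcome, with a moderator
    that strengthens the intervention's effect on the mediator."""
    rows = []
    for _ in range(n):
        intervention = random.choice([0, 1])
        moderator = random.choice([0, 1])  # e.g., supportive setting = 1
        # Mediator responds to the intervention, twice as strongly when moderator = 1
        mediator = (1 + moderator) * intervention + random.gauss(0, 1)
        # Outcome depends only on the mediator: the causal chain runs through it
        outcome = 2 * mediator + random.gauss(0, 1)
        rows.append((intervention, moderator, outcome))
    return rows

def mean_outcome(rows, intervention, moderator):
    vals = [o for i, m, o in rows if i == intervention and m == moderator]
    return sum(vals) / len(vals)

rows = simulate()
# The intervention's effect on the outcome differs by moderator stratum,
# even though the outcome is driven entirely by the mediator.
effect_low = mean_outcome(rows, 1, 0) - mean_outcome(rows, 0, 0)   # roughly 2
effect_high = mean_outcome(rows, 1, 1) - mean_outcome(rows, 0, 1)  # roughly 4
```

The point of the sketch: a mediator sits on the causal chain between program and outcome, while a moderator changes the size of the effect without transmitting it, which is why effect estimates must be reported by stratum rather than pooled.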
Aligning Evidence with (and Deriving It from) Practice: Matching, Mapping, Pooling and Patching*
• Matching ecological levels of a system or community with evidence of efficacy for interventions at those levels
• Mapping theory to the causal chain to fill gaps in the evidence for effectiveness of interventions
• Pooling experience to blend interventions to fill gaps in evidence for the effectiveness of programs in similar situations
• Patching pooled interventions with indigenous wisdom and professional judgment about plausible interventions to fill gaps in the program for the specific population
*Green & Kreuter, Health Program Planning: An Educational and Ecological Approach. 4th ed. NY: McGraw-Hill, 2005, Chapter 5. Green & Glasgow, 2006.
3 Conceptualizations of the Gap Between Research & Practice
• Practitioners need to receive the lessons of research and put them into practice.
• Research and practice are entirely separate disciplines, and each must develop its own answers to its own problems.
• Research and practice have complementary perspectives and skills that need to be used together to address the real need: collaborative knowledge production.
• Add to this the need to include the patient's perspective. Whose perspective prevails?
Van de Ven A, Johnson P. Knowledge for theory and practice. Academy of Management Review. 2006;31(4).
The Bridge (not the Pipeline) from Research to Practice and Back*
• If we want more evidence-based practice, we need more practice-based evidence.
• Practitioners and policy-makers are important in shaping the research questions.
• Practitioners and their organizations represent the structural links (and barriers) to addressing the important determinants of health behavior at each level. Engage them not as passive recipients, but as partners…
*Green LW. From research to "best practices" in other settings and populations. Am J Health Behavior 25:165-178, April-May 2001. Full text: www.ajhb.org/25-3.htm
The Vision for Translation 2
A future in which we would not need to ask how to get more evidence-based practice, but rather:
• How to sustain the engagement of students, practitioners, patients, and communities in a participatory process of practice-based research and program evaluation?
• How to adapt the "best practices" guidelines through best processes of collecting data to diagnose the biopsychosocial needs of their patients and communities…
Translation 2 Vision (expanded)
• How to match the proposed evidence-based interventions to those needs, filling gaps in the evidence-based interventions with the use of theory, mutual consultation, and prospective testing of complementary interventions?
• The cumulative, building-block tradition of evidence-based medicine from RCTs would be complemented by a parallel strengthening and support of a tradition of participatory research and evaluation conducted in practice settings.
6 Conclusions (Remedies)
1. Adapt the research funding priorities
2. Adapt publication criteria
3. Adapt the criteria for inclusion and weighting of studies in systematic reviews and research syntheses
4. Adapt the derivation and qualification of practice guidelines from the systematic reviews
5. Adapt the academic promotion and tenure criteria and the weights given to community- & practice-based research
6. Adapt the research training of students and fellows in methods of practice-based and participatory research