Promoting Science-based Approaches: Bridging Research and Practice by Integrating Research to Practice Models and Community-Centered Models (ISF)
Abraham Wandersman
wandersman@sc.edu
U. of Connecticut, April 2010
MAKING A DIFFERENCE • HOW DO WE GET THERE?
THE 2015 TARGET DATE FOR ELIMINATING SUFFERING AND DEATH DUE TO CANCER:
Dr. von Eschenbach: I believe we are at what I call a strategic inflection in biology, which means we're at a point of unprecedented growth in three key areas related to cancer research: knowledge, technology, and resources. The integration of growth in these three sectors provides an opportunity for exponential progress. To achieve this progress, we must set a clear direction and focus our efforts into a cohesive strategy.
The goal of eliminating suffering and death due to cancer provides this focus. It does not mean "curing" cancer but, rather, it means that we will eliminate many cancers and control the others, so that people can live with -- not die from -- cancer. We can do this by 2015, but we must reach for it. We owe it to cancer patients around the world -- and their families -- to meet this challenge. (BenchMarks, May 16, 2003)
Healthy People 2010 Objectives • Target: 1.0 new case per 100,000 persons. • Baseline: 19.5 cases of AIDS per 100,000 persons aged 13 years and older in 1998. Data are estimated; adjusted for delays in reporting. • Target setting method: Better than the best. • Data source: HIV/AIDS Surveillance System, CDC, NCHSTP.
In 2007, there were 42,495 new cases of HIV/AIDS diagnosed in adults and adolescents.
Expanding Research and Evaluation Designs…for QII Carolyn M. Clancy, MD Director, AHRQ September 13, 2005
THE RESEARCH-TO-PRACTICE PIPELINE: LOSSES AND TIME LAGS
• Original research: 18% lost to negative results (Dickersin, 1987)
• Submission: 46% lost to negative results; 0.5 year (Kumar, 1992; Koren, 1989)
• Acceptance: 0.6 year (Kumar, 1992)
• Publication: 35% lost for lack of numbers; 0.3 year (Poyer, 1982; Balas, 1995)
• Bibliographic databases and expert opinion: 50% lost to inconsistent indexing; 6.0-13.0 years (Antman, 1992; Poynard, 1985)
• Reviews, guidelines, textbooks: 9.3 years
• Implementation
It takes 17 years to turn 14 percent of original research to the benefit of patient care.
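The slide's figures can be combined in a back-of-the-envelope way. Below is a minimal Python sketch, assuming the quoted losses compound multiplicatively and taking the lower bound of the 6.0-13.0 year indexing lag; the stage-to-loss mapping here is one reading of the slide, not part of the original figure:

```python
# Each stage: (name, lag in years, fraction of work lost at that stage).
# Percentages and lags are the ones quoted on the slide; which loss
# belongs to which stage is an assumption.
stages = [
    ("original research -> submission",          0.5, 0.18),
    ("submission -> acceptance",                 0.6, 0.46),
    ("acceptance -> publication",                0.3, 0.35),
    ("publication -> databases (6.0-13.0 yrs)",  6.0, 0.50),
    ("databases -> reviews/guidelines/texts",    9.3, 0.00),
]

surviving, total_lag = 1.0, 0.0
for name, lag, loss in stages:
    surviving *= (1.0 - loss)   # compound the attrition multiplicatively
    total_lag += lag            # accumulate the time lag

print(f"~{total_lag:.1f} years, ~{surviving:.0%} of original research survives")
```

Summing the lags gives roughly 16.7 years and multiplying the survival fractions gives roughly 14%, consistent with the slide's headline figure.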
Treatments Thought to Work but Shown Ineffective • Sulphuric acid for scurvy • Leeches for almost anything • Insulin for schizophrenia • Vitamin K for myocardial infarction • HRT to prevent cardiovascular disease • Flecainide for ventricular tachycardia • Routine blood tests prior to surgery • ABMT for late stage Breast CA BMJ February 28 2004; 324:474-5.
THE GAP BETWEEN SCIENCE AND PRACTICE • IN THE DOCTOR’S OFFICE
OVERALL, 54.9% RECEIVED RECOMMENDED CARE (Asch et al., NEJM, 2006)
POSSIBLE SOLUTION • THE VA MEDICAL SYSTEM ACHIEVES 67% RECOMMENDED CARE. THE SYSTEM HAS ELECTRONIC MEDICAL RECORDS, DECISION-SUPPORT TOOLS, AUTOMATED ORDER ENTRY, ROUTINE MEASUREMENT AND REPORTING ON QUALITY, AND INCENTIVES FOR PERFORMANCE.
As Yogi Berra supposedly said, "In theory there is no difference between theory and practice, but in practice there is."
* What is the dominant scientific paradigm for developing research evidence and disseminating it?
* What is the responsibility of the practitioner to deliver evidence-based interventions and what is their capacity to do so?
* What is the responsibility of funders to promote the science of evidence-based interventions and to promote the practice of effective interventions in our communities?
How can evaluation help providers, local CBOs and coalitions, health districts, and state agencies reach results-based accountability?
Two Routes to Getting To Outcomes (GTO): A) Bridging Science and Practice; B) Empowerment Evaluation
CLOSING THE GREAT DIVIDE: Research to Practice • Practice to Research
Feedback Loop
1. Identify the problem or disorder(s) and review information to determine its extent.
2. With an emphasis on risk and protective factors, review relevant information, both from fields outside prevention and from existing preventive intervention research programs.
3. Design, conduct, and analyze pilot studies and confirmatory and replication trials of the preventive intervention program.
4. Design, conduct, and analyze large-scale trials of the preventive intervention program.
5. Facilitate large-scale implementation and ongoing evaluation of the preventive intervention program in the community.
FIGURE 1.1 The preventive intervention research cycle. Preventive intervention research is represented in boxes three and four. Note that although information from many different fields in health research, represented in the first and second boxes, is necessary to the cycle depicted here, it is the review of this information, rather than the original studies, that is considered part of the preventive intervention research cycle. Likewise, for the fifth box, it is the facilitation by the investigator of the shift from research project to community service program with ongoing evaluation, rather than the service program itself, that is part of the cycle. Although only one feedback loop is represented here, the exchange of knowledge among researchers and between researchers and community practitioners occurs throughout the cycle.
Gates Foundation example • Preventive Intervention: vaccine/drug, mechanism, syringes, physician, health system • Support System: medical schools, government funding
From Research to “Best Practices” in Other Settings and Populations (Larry Green, American Journal of Health Behavior, 2001) • Process • Control • Self-Evaluation • Tailoring Process and New Technology • Synthesizing Research
Getting to Outcomes: 1) Needs/Resources 2) Goals 3) Best Practice 4) Fit 5) Capacities 6) Plan 7) Process Evaluation 8) Outcome Evaluation 9) CQI 10) Sustain
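Because GTO is an ordered accountability process, it can be sketched as a simple checklist a program team walks through in sequence. The step names below come from the slide; the helper function and everything else is a hypothetical illustration:

```python
# The ten Getting To Outcomes (GTO) accountability steps, in order.
GTO_STEPS = [
    "Needs/Resources", "Goals", "Best Practice", "Fit", "Capacities",
    "Plan", "Process Evaluation", "Outcome Evaluation", "CQI", "Sustain",
]

def next_step(completed):
    """Return the first GTO step not yet completed, or None if all done."""
    for step in GTO_STEPS:
        if step not in completed:
            return step
    return None

# A team that has assessed needs and set goals is ready for step 3.
print(next_step({"Needs/Resources", "Goals"}))  # prints: Best Practice
```

The point of the sketch is simply that each step gates the next: a program should not, for example, pick a best-practice intervention (step 3) before its needs and goals (steps 1-2) are established.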
“Prevention Science”
• Intervention: basic research → efficacy → effectiveness → services research
• Prevention Support System (funders): training, technical assistance, funding
• Practice (community organizational systems): 1) schools 2) health agencies 3) community coalitions
• Green characteristics: 1) process 2) control 3) self-evaluation 4) tailoring process and new technology 5) synthesizing research
Interactive Systems Framework (ISF)
• Putting It Into Practice: Prevention Delivery System (general capacity use; innovation-specific capacity use)
• Supporting the Work: Prevention Support System (general capacity building; innovation-specific capacity building)
• Distilling the Information: Prevention Synthesis & Translation System (synthesis; translation)
• Surrounding influences: existing research and theory, macro policy, climate, funding
ROUTE B: EMPOWERMENT EVALUATION
Figure 2. Overview of the development of a community coalition.
• FORMATION: A lead agency forms an ad hoc committee of community leaders, which forms a COALITION with committees (criminal justice, grassroots/neighborhood, business, education, religion, media, parents, youth, health) and conducts a needs assessment, resulting in a comprehensive community plan.
• IMPLEMENTATION: Chairpersons consolidate the work of the individual committees, resulting in plan implementation.
• MAINTENANCE: Resulting in OUTCOMES, impact on community health indicators.
Table 1. Evaluation of MPA by Developmental Phases, Ecological Levels, and Stages of Readiness
Shoot → Aim → Ready: Implement before you Plan → No Results
Ready → Aim → Shoot (close): Plan → Implement → CQI
Ready → Aim → Shoot (hit): Plan → Implement → Results
Empowerment Evaluation: An evaluation approach that aims to increase the probability of achieving program success by:
• Providing program stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program, and
• Mainstreaming evaluation as part of the planning and management of the program/organization.