27 June 2005 Towards Systematic Reviews That Inform Healthcare Management and Policymaking AcademyHealth Annual Research Meeting Boston, MA, USA John N. Lavis, MD, PhD Associate Professor and Canada Research Chair in Knowledge Transfer and Uptake McMaster University
Acknowledgements • Co-investigators • Huw Davies, University of St. Andrews • Andy Oxman, Norwegian HSR Centre • Jean-Louis Denis, Université de Montréal • Karen Golden-Biddle, University of Alberta • Ewan Ferlie, University of London • Funders • Canadian Health Services Research Foundation • NHS Service & Delivery Organization R&D Program
Overview • Background • Research objective • Study design and population studied • Principal findings • Conclusions and implications
Background • Healthcare managers and policymakers face many questions that research evidence can help answer • Finding effective (and cost-effective) solutions to the most burdensome health problems • Fitting these solutions into health systems (i.e., governance, financial, and delivery arrangements) • Bringing about change in health systems
Background (2) • Systematic reviews of research evidence • Reduce the likelihood that managers and policymakers will be misled by research (by being more systematic and transparent in the identification, selection, appraisal and synthesis of studies) • Increase confidence among managers and policymakers about what can be expected from an intervention (by increasing the number of units for study)
Background (3) • Systematic reviews of research evidence (2) • Allow managers, civil servants and political staff to focus on appraising the local applicability of systematic reviews and on collecting and synthesizing other types of evidence, such as evidence about political acceptability and feasibility – i.e., allow them to focus on the apex of the research knowledge pyramid while doing the rest of their jobs • Allow for more constructive contestation of research evidence by stakeholders
Background (4) • The research knowledge pyramid (from apex to base) • Actionable messages • Systematic reviews of research • Individual studies, articles, and reports • Basic, theoretical and methodological innovations
Research Objective • To identify ways to improve the usefulness of systematic reviews for healthcare managers and policymakers that could then be evaluated prospectively, by exploring: • Nature of decision-making and approach to research evidence • Types of questions asked • How research evidence is assessed • How much value is placed on recommendations • Optimal presentation of research evidence
Study Design and Population Studied • Study design • Systematic review of studies of decision-making by healthcare managers and policymakers • Interviews with a purposive sample of healthcare managers and policymakers in Canada and the United Kingdom (N=29) • Websites of research funders, producers/purveyors of research, and journals that include healthcare managers and policymakers among their target audiences (N=45)
Study Design and Population Studied • Population studied • Purposive sample of healthcare managers (or the senior staff of associations that seek to inform managers) in Ontario and England and healthcare policymakers in the Canadian federal and Ontario provincial governments and the United Kingdom government • Study participants were almost always drawn from the top ranks of their respective organizations (healthcare managers), departments (civil servants), or offices (political advisors)
Principal Findings • Systematic review • Individual-level interactions between researchers and healthcare policymakers increased the prospects for research use • Timing and timeliness increased (and poor timing or lack of timeliness decreased) the prospects for research use • Individuals’ negative attitudes towards research evidence decreased the prospects for research use • Individuals’ lack of skills and expertise decreased the prospects for research use
Principal Findings (2) • Interviews • Most do not highly value systematic reviews as an information source • Many have used systematic reviews to address many different types of questions • Some identified that they would benefit from having contextual factors highlighted in order to inform assessments of a review’s local applicability • All would value information about the benefits, harms (or risks), and costs of interventions, the uncertainty associated with estimates, and variation in estimates by subgroup
Principal Findings (3) • Interviews (2) • Disagree about whether researchers should provide recommendations • Almost all would value reports presented using something like a 1:3:25 format • Some identified that they would value systematic reviews being made more readily available for retrieval when they are needed
Principal Findings (4) • Website review • Attributes of the context in which the research was conducted were rarely provided • Recommendations were often provided • Reports using a graded-entry format (e.g., 1:3:25) were rare
Conclusions and Implications • Provisional answers to question 1 lead us to argue for • Thinking broadly about healthcare managers and policymakers as target audiences • Demonstrating to them the value of systematic reviews • Engaging them in the production and adaptation of systematic reviews • Building their capacity to identify quality-appraised sources of systematic reviews and to appraise their local applicability
Conclusions and Implications (2) • Provisional answers to question 2 lead us to argue for • Producing reviews that address a broad array of questions • Provisional answers to question 3 lead us to argue for • Making available an online source of all types of quality-appraised reviews • Identifying the benefits, harms (or risks) and costs of interventions, highlighting uncertainty, and describing any differential effects by sub-group • Identifying contextual factors that may affect assessments of local applicability
Conclusions and Implications (3) • Provisional answers to question 4 lead us to argue for • Not providing recommendations • Avoiding the use of jargon • Provisional answers to question 5 lead us to argue for • Producing user-friendly “front ends” for reviews (e.g., one page of take-home messages and a three-page executive summary) to facilitate rapid assessments of the relevance of a review and, when the review is deemed highly relevant, more graded entry into the full details of the review
Conclusions and Implications (4) • Researchers could make three changes to how they produce and update systematic reviews • Involve healthcare managers and policymakers in posing questions, reviewing approach, and interpreting results • For systematic reviews about “what works,” identify the benefits and harms (or risks) of interventions, highlight uncertainty, and describe any differential effects by sub-group • Identify contextual factors that may affect assessments of local applicability
Conclusions and Implications (5) • Research funders could support three types of local adaptation processes • Develop more user-friendly “front ends” for reviews • Add local value to systematic reviews about “what works” by describing the benefits, harms (or risks), and costs that can reasonably be expected locally, and to any type of systematic review by using locally applicable language • Make user-friendly “front ends” of systematic reviews available through an online database that can be linked to the full reviews through other sources, such as The Cochrane Library
References • Lavis JN, Davies HTO, Oxman A, Denis J-L, Golden-Biddle K, Ferlie E. Towards systematic reviews that inform healthcare management and policymaking. Journal of Health Services Research and Policy; in press • Lavis JN, Becerra Posada F, Haines A, Osei E. Use of research to inform public policymaking. The Lancet 2004; 364:1615-1621
Contact Information • John N. Lavis • lavisj@mcmaster.ca • Program in Policy Decision-Making, McMaster University • www.researchtopolicy.ca