
The logic and practice of research synthesis — 8th July 2005


Presentation Transcript


  1. Methods for Research Synthesis The logic and practice of research synthesis — 8th July 2005 — David Gough, EPPI-Centre, Social Science Research Unit, Institute of Education, University of London

  2. Today’s seminar • Welcome and introduction • Purpose, principles and potential • Small group work: exploring selected review tools and procedures • Ways forward

  3. Aims of this seminar • have a critical understanding of the value of systematic methods for synthesizing research evidence; • have an understanding of the importance of reviews being question-led; • have discussed the diversity of approaches to synthesis and identified principles and decision points central to all; • have considered the potential for systematic approaches to your own review work.

  4. Social Science Research Unit Five streams of work: • Childhood Studies • Evaluation of Social Interventions • Sexual Health, Reproduction and Social Exclusion • Evidence for Policy and Practice Information and Co-ordinating Centre • Perspectives, Participation and Research http://www.ioe.ac.uk/ssru/

  5. EPPI-Centre personnel David Gough (Director), Ann Oakley (Founding Director) Angela Harden and Sandy Oliver (Co-directors HP & PH) Jo Garcia (User involvement, Education) Jon Ashton Mukdarut Bangpan Jackie Barry Ginny Brunton Jeff Brunton Helen Burchett Esther Coren Kelly Dickson Adam Fletcher Zoe Garrett Nicholas Houghton Josephine Kavanagh Eva Moran Mark Newman Chloe Powell Rebecca Rees Abigail Rowe Katy Sutcliffe James Thomas Carole Torgerson Alex Trouton Helen Tucker Lisa Underwood

  6. EPPI-Centre vision and mission statement Developing and promoting participatory and user-friendly systematic reviews addressing important questions in different domains of policy, practice and research in the public interest

  7. EPPI-Centre external Review Groups (RGs) and teams • 16 funded by the Department for Education and Skills (DfES) on different aspects of teaching and learning (some with joint funding – e.g. Nuffield Foundation, NUT (HEFCE), ESRC TLRP) • 8 funded by the Teacher Training Agency on different aspects of teacher education • 1 on transitions of students with disabilities from school to work, funded by a body based in Colorado in the US • 1 in higher education funded by the Learning and Skills Development Agency • New collaborations with SCIE and DWP

  8. EPPI-Centre ‘in-house’ reviews • The effect of travel modes on children’s mental health, cognitive, social development (DETR) • The effects of personal development profiling for improving student learning (LTSN) • Support for pupils with emotional and behavioural difficulties (NFER) • Effect of secondary school size on outcomes (DfES) • Series of reviews on health promotion and public health (HP & PH) topics funded primarily by the Department of Health (DH) (see next slide)

  9. Health promotion & public health review series • 1993–1996: Young people and smoking; Older people and accidents; Young people and sexual health; Men who have sex with men (MSM) and sexual health • 1997–1999: Workplace health promotion; Peer-delivered health promotion for young people • 1999–2001: Young people and mental health, physical activity, healthy eating • 2001–2004: Children and physical activity, healthy eating; HIV health promotion for MSM • 2004–2007: Incentives for young people; Active transport; Teenage pregnancy and parenting; Risk behaviour and accidental injury

  10. Purpose, principles and potential • Why use systematic approaches to reviewing? • What are the key principles of systematic research synthesis (SRS)?

  11. Imaginary scenario • You’re at a PTA AGM. The head teacher is speaking about homework. In the room are teachers, parents (one of whom is also a researcher) and school governors. • Someone says “I’ve read a literature review that says homework is bad for children”. • What might it be useful to know about this review? • Nature of variation between summaries of research

  12. Key features of SRS • Explicit research question • Explicit and transparent method: protocol up front • A piece of research: methods to avoid bias • Explicit reporting: accountable, replicable, updateable • Judgement of study quality and relevance: for question • Relevant: need for involvement of research users

  13. Variation between reviews Six reviews of older people and accident prevention: • Total studies included: 137 • Common to at least two reviews: 33 • Common to all six reviews: 2 • Treated consistently in all reviews: 1 From Oliver S, Peersman G, Harden A and Oakley A (1999) Discrepancies in findings from effectiveness reviews: the case of health promotion for older people in accident and injury prevention. Health Education Journal 58: 66–77.
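Overlap figures like those reported by Oliver et al. can be computed mechanically once each review's list of included studies is known. A minimal sketch using Python sets; the review names and study IDs below are invented for illustration:

```python
# Hypothetical illustration of measuring overlap between the study
# lists of several reviews. Review names and study IDs are invented.
from itertools import combinations

reviews = {
    "review_a": {"s1", "s2", "s3", "s4"},
    "review_b": {"s2", "s3", "s5"},
    "review_c": {"s3", "s6"},
}

# Total distinct studies included across all reviews
all_studies = set().union(*reviews.values())

# Studies common to at least two reviews
in_two_or_more = {
    s
    for a, b in combinations(reviews.values(), 2)
    for s in a & b
}

# Studies common to every review
in_all = set.intersection(*reviews.values())

print(len(all_studies))        # 6
print(sorted(in_two_or_more))  # ['s2', 's3']
print(sorted(in_all))          # ['s3']
```

With real data the sets would be lists of study identifiers extracted from each review's included-studies table, and the same three intersections reproduce the kind of table shown on the slide.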

  14. How some findings evade us • Profusion of published and unpublished material • Much is hidden: only about 50% of abstracts presented at conferences are later published in full • Constraints of our own expertise and disciplinary affiliations

  15. Some evasion is systematic • Some types of studies are harder to find than others* • E.g. statistically significant, ‘positive’ results are more likely to be: • Published, and published rapidly • Published in English • Published more than once • Cited by others • *Work cited in Egger M et al (2003) How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health Technology Assessment 7(1).

  16. Are systematic searches important? Some limited evidence from health research about how this affects estimates of effect: • Unpublished trials show less beneficial effects than published trials • Non-English-language trials and non-Medline-indexed trials tend to show larger treatment effects • Trials that are difficult to locate tend to be smaller and of lower methodological quality *Egger et al (2003) op. cit.

  17. Different searches find different studies (1) How are decisions made about the entry of people aged 65+ to care services? Taylor B, Dempster M and Donnelly M (2003) Hidden gems: systematically searching electronic databases for research publications for social work and social care. British Journal of Social Work 33: 423–429.

  18. Different searches find different studies (2) *Adapted from: Harden A, Peersman G, Oliver S, Oakley A (1999) Identifying primary research on electronic databases to inform decision-making in health promotion: the case of sexual health promotion. Health Education Journal 58: 290–301.

  19. A matter of degree? Discuss… Table 1. Systematic and traditional reviews compared. Petticrew M (2001) Systematic reviews from astronomy to zoology: myths and misconceptions. BMJ 322: 98–101.

  20. What is a systematic review? A systematic review develops an evidence-based statement in a particular area of interest. It is a piece of research in its own right using explicit and transparent methods following a standard set of procedures. This means it can be replicated. It judges the quality and relevance of evidence for a given question.

  21. Research evidence for policy and practice • “… policy makers and practitioners who intervene in the lives of other people not infrequently do more harm than good” Chalmers I (2003) Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up to date, replicable evaluation. Paper commissioned for the Annals of the American Academy of Political and Social Science.

  22. ‘Reduce the Risk’ campaign in the early 1990s in the UK “The risk of cot death is reduced if babies are NOT put on the tummy to sleep. Place your baby on the back to sleep… Healthy babies placed on their backs are not more likely to choke.”

  23. Research for practice “Teaching is not at present a research-based profession” Hargreaves D (1996) Teaching as a research-based profession: possibilities and prospects. Teacher Training Agency (TTA) Annual Lecture. London: TTA. Procedural (craft/apprenticeship) vs. declarative (e.g. research-based) knowledge

  24. Access to research “..(which is) presented in a form or medium which is largely inaccessible to a non-academic audience; and lack(s) interpretation for a policy-making or practitioner audience”. Hillage J, Pearson R, Anderson A, Tamkin P (1998) Excellence in Research in Schools. London: Department for Education and Employment/Institute of Employment Studies.

  25. Public debate “We are, through the media, as ordinary citizens, confronted daily with controversy and debate across a whole spectrum of public policy issues. But typically, we have no access to any form of a systematic ‘evidence base’ - and therefore no means of participating in the debate in a mature and informed manner”. AFM Smith (1996) Mad cows and ecstasy: chance and choice in an evidence-based society. Journal of the Royal Statistical Society 159(3): 367-383.

  26. Facilitating the use of research for policy and practice • By preparing systematic reviews of the results of relevant, reliable research • By undertaking additional research, IF systematic reviews of existing research show that this is needed

  27. Key stages and variation • What key stages do the different models of systematic research synthesis (SRS) have in common? • How do the different models differ? • In what ways do review questions influence review methods?

  28. Key decision-making stages in SRS Form review team Formulate review question and develop protocol Search for and screen studies (search strategy) Describe studies (systematic map of research) Assess study quality (and relevance) Synthesise findings (answering review question) Communicate and engage

  29. Reviewing as a team activity • Using more than one reviewer at key decision-making stages • Independent work, followed by consensus • Quality assurance of samples • Research user participation • In setting question to increase relevance • Could mean service users, practitioners, policy makers…

  30. Formulating the review question • Provides framework for all other stages of review • Specify/clarify conceptual framework • Type of question ? • Study types ? • Settings / populations / phenomena ? • Issues to consider • Why is the review needed ? • Who will use the review ?

  31. Developing the review protocol • Why do we want one? • Protocol contains • Review question • Definitions of key concepts • Explicit inclusion criteria • Details of review methods for each stage • Details of who’s involved/ how to participate • Guidance on developing protocols is available • EPPI-Centre, Cochrane, Campbell

  32. What is a systematic search? • Thoughtfully planned • Carefully executed • Accurately recorded • Made explicit • Open to scrutiny • Can be replicated and extended

  33. Systematic and comprehensive searching • Based on conceptual framework and resultant inclusion and exclusion criteria • Techniques of searching and retrieval aim to maximise the yield of relevant studies while minimizing the number not relevant to the review • Searching and screening stages are generally kept separate • Both stages are documented

  34. Sensitivity and specificity • Sensitivity • Expresses ability to locate all studies of interest • Calculated as the number of relevant studies located as a proportion of all that exist • Low sensitivity means that many relevant studies missed by search • Specificity • Expresses the accuracy of the search strategy in identifying studies of interest • Calculated as the proportion of the total number of studies identified by search which are deemed ‘relevant’ • Low specificity means that the search identified many studies that were not relevant to the review. TRADE OFF BETWEEN SENSITIVITY AND SPECIFICITY
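The two measures on the slide can be expressed as simple proportions. A minimal sketch with invented retrieval numbers; note that what the slide calls specificity (relevant hits as a proportion of all hits) is the measure usually termed precision in information retrieval:

```python
# Sketch of the two search-performance measures as defined on the
# slide. All numbers are invented for illustration.
def sensitivity(relevant_found: int, relevant_existing: int) -> float:
    """Proportion of all existing relevant studies that the search located."""
    return relevant_found / relevant_existing

def specificity(relevant_found: int, total_found: int) -> float:
    """Proportion of the studies retrieved by the search that are relevant."""
    return relevant_found / total_found

# A deliberately broad search: 2000 records retrieved,
# locating 90 of the 100 relevant studies assumed to exist.
print(sensitivity(90, 100))   # 0.9 — few relevant studies missed
print(specificity(90, 2000))  # 0.045 — much irrelevant material to screen
```

The example makes the trade-off on the slide concrete: widening the search raises sensitivity but typically lowers specificity, leaving more irrelevant records to screen by hand.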

  35. Screening then describing studies • Screen to check that studies found through the search strategy do meet the inclusion criteria • Describe the studies = systematic map

  36. Systematic maps and systematic synthesis • Map: What has been done? • maps out research activity (e.g. a broader question with multiple designs) • provides context for synthesis • research designs in primary studies are part of that context • allows potential for narrowing the question (inclusion criteria) • Synthesis: What is known from what has been done?

  37. Quality assessment of studies • To make (and record) judgements about study quality • To inform the synthesis • Only use findings from studies judged to be of high quality to inform recommendations for policy or practice • Qualify findings (‘quality framework’) • To examine differences in findings according to quality

  38. Methods for synthesis Tend to build on techniques used in primary research • Statistical meta-analysis • E.g. numerical effect sizes are combined to create a pooled and weighted overall effect size • Narrative ‘empirical’ • Conceptual framework used to construct and organise a narrative to combine the evidence from each study (evidence could be numerical or textual) • Conceptual • E.g. Qualitative techniques used to organise and combine conceptual or textual data
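The statistical meta-analysis bullet can be illustrated with standard fixed-effect inverse-variance pooling, in which each study's effect size is weighted by the inverse of its variance. A minimal sketch; the effect sizes and standard errors below are invented:

```python
# Minimal sketch of fixed-effect inverse-variance meta-analysis:
# combine study effect sizes into a pooled, weighted overall effect.
# All study data are invented for illustration.
from math import sqrt

def pooled_effect(effects, std_errors):
    """Pool effect sizes, weighting each study by 1 / SE^2."""
    weights = [1 / se**2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = sqrt(1 / sum(weights))  # SE of the pooled estimate
    return pooled, pooled_se

effects = [0.30, 0.10, 0.25]      # e.g. standardised mean differences
std_errors = [0.10, 0.15, 0.20]   # larger SE = smaller study = less weight

est, se = pooled_effect(effects, std_errors)
print(round(est, 3), round(se, 3))
```

Precise studies (small standard errors) dominate the pooled estimate, which is why the result sits closest to the first study's effect size; random-effects models extend this by adding a between-study variance term.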

  39. Dimensions of difference in synthesis models • Review/research questions (impact, process, need) • Research designs considered relevant • Types of data - numerical or textual • Quality assessment of different designs • Breadth of designs and study focus • Variation in contribution of each study to the systematic synthesis

  40. Review research questions • What are the questions that users want answered? • What do people want? (Needs assessment) • What works? (Impact/effectiveness) • Why/how does it work? (Process/explanation) • What is happening? (Implementation levels) • Others…? • Different research designs are best for addressing different questions (fit for purpose)

  41. Organisations supporting systematic reviewing • The Cochrane Collaboration – healthcare interventions www.cochrane.org • The Campbell Collaboration – interventions in social work, criminal justice and education www.campbellcollaboration.org • The EPPI-Centre – full range of empirical research in education (all phases), health promotion, social care and employment www.eppi.ioe.ac.uk

  42. Quality-assessment methods SYNTHESIS 1: Quantitative (trials) • Provision of pre- and post-data on outcomes • Provision of data on all outcomes measured • Employment of an equivalent control/comparison group • Resulted in ‘high’, ‘medium’ and ‘not sound’/‘low’ trials SYNTHESIS 2: Qualitative (‘views’) • Quality of reporting (5 items) • Sufficiency of strategies for reliability/validity (4 items) • Extent to which study findings were rooted in children’s own perspectives (3 items)

  43. Ways forward • What else do you want to know about the principles and methods discussed today? • If planning to use some SRS methods, what issues (drivers, resources, methodology) might be important?

  44. Scoping, mapping and reviewing in-depth • Can select from these approaches for a systematic approach to the following kinds of review: • What is the size of the literature? • What kinds of literature exist? • What can this literature tell us reliably?

  45. Examples of other systematic review sources • The ESRC UK Centre for Evidence Based Policy and Practice Evidence Network • http://www.evidencenetwork.org/ • The NHS Centre for Reviews and Dissemination, University of York: Database of Abstracts of Reviews of Effectiveness (DARE) • http://agatha.york.ac.uk/ • Bibliographic databases (try combining your topic with ‘meta-analysis’ or ‘systematic review’ or ‘research synthesis’)

  46. EPPI-Centre Dissemination • Web site for all EPPI-Centre reviews: http://eppi.ioe.ac.uk/ • Research Evidence in Education Library (REEL) • Web-page accessibility • Review Group details • Protocols • Reviews and user perspectives • Primary data underlying reviews • Bibliographic information • Detailed descriptions and quality assessment
