Evaluating the rigour and validity of research – doing critical appraisal Plenary 11 Carol.Davies@warwick.ac.uk 2008
Objectives • To summarise what critical appraisal is • To understand the evidence presented, including the relationship between study design & research findings, using critical appraisal tools • Practical: selecting the right critical appraisal tool
What is critical appraisal? • Definition: the process of systematically examining research evidence to assess its validity, results and relevance before using it to inform a decision • Critical Appraisal Skills Programme, Institute of Health Sciences, Oxford • Part of evidence-based medicine (EBM), allowing us to make sense of research evidence so that practice is aligned with the 'best' evidence, alongside: • clinical experience • values-based medicine
Advantages & disadvantages of critical appraisal • Advantages • a systematic way of assessing the validity, results & usefulness of research • contributes to improving practice (quality) • encourages objective assessment of information • the skills are not difficult to develop • Disadvantages • time consuming • does not always give easy answers, or the answers you hoped to find • dispiriting if 'good' evidence is lacking, i.e. little or poor research has been done • BUT… you can all do it with the right tools & guidance
Checklists for reviewers • Very useful • maintain consistency of approach • the questions asked depend on the type of study design • Sources: www.shef.ac.uk/scharr www.phru.nhs.uk/casp/critical_appraisal_tools.htm
Levels of evidence • Levels of evidence, best to worst: 1. Systematic review of RCTs; individual RCT 2. Systematic review of cohort studies; individual cohort study 3. Systematic review of case-control studies; individual case-control study 4. Case series (plus poorly designed cohort and case-control studies) 5. Expert opinion without explicit critical appraisal • NB. Grades of evidence
Sampling • Diagram: whole population (blue) • research sample drawn from the whole population (pink)
How many types of statistical errors are possible when analysing study data? • One • Two • Three • Four
The two major types of statistical error about the effectiveness of interventions are… • Type I errors: false positives (concluding an intervention works when it does not) • Type II errors: false negatives (concluding an intervention does not work when it does)
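(Illustrative aside, not part of the original slides.) A minimal Python simulation can make the two error types concrete: with no true difference between arms, roughly 5% of trials will still come out "significant" at p < 0.05 (Type I), while a small trial will often miss a genuine effect (Type II). The sample size, effect size and number of simulated trials below are invented purely for illustration.

```python
# Minimal sketch (not from the slides): simulating Type I and Type II error
# rates with a two-sample t-test. Sample size, effect size and the number of
# simulated trials are arbitrary assumptions chosen only for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n_trials, n_per_arm, alpha = 2000, 30, 0.05

def trial_is_significant(true_effect):
    """Simulate one two-arm trial and test control vs treatment at level alpha."""
    control = rng.normal(0.0, 1.0, n_per_arm)
    treatment = rng.normal(true_effect, 1.0, n_per_arm)
    return stats.ttest_ind(control, treatment).pvalue < alpha

# Type I error: no true effect, yet the test is "significant" (~5% of trials).
false_positives = sum(trial_is_significant(0.0) for _ in range(n_trials))
# Type II error: a real effect (0.5 SD) missed by this small trial.
missed_effects = sum(not trial_is_significant(0.5) for _ in range(n_trials))

print(f"Type I error rate  ~ {false_positives / n_trials:.2f}")  # close to alpha
print(f"Type II error rate ~ {missed_effects / n_trials:.2f}")   # limited power
```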
What kind of errors can't study designs overcome? • Poor sampling • Wrong user calculations • Random • Lack of data
Reading any paper: Basic screening questions • Was it a clearly focused question? PICO • population • intervention • comparison • outcomes (avoid "effective/effectiveness" or "improve") • Was it the right type of study design to address the question?
A good question? • Question: Does the evidence support the use of ACE inhibitors to reduce the decline in renal function for patients with non-diabetic kidney disease? • How could the wording of the question be improved? • Ideally, what kind of studies would we look for to answer the question?
Continuing to read the paper • Subject numbers, assignment & follow-up • Groups comparable? Treated equally? • Blinding? Why might this not happen? • Interpretation of findings • size of effect. Why does this matter? • precision of the estimate of treatment effect (CI) • Generalisability. What does this mean?
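(Illustrative aside, not from the slides.) "Precision of the estimate" is reflected in the width of the confidence interval: larger trials give narrower intervals around the same treatment effect. The sketch below uses invented event counts and the standard Wald (normal-approximation) interval for a risk difference.

```python
# Minimal sketch (not from the slides): a 95% Wald confidence interval for a
# risk difference. The event counts below are invented purely to show how
# larger trials give narrower (more precise) intervals.
import math

def risk_difference_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Risk difference (treatment minus control) with a 95% confidence interval."""
    p_t, p_c = events_t / n_t, events_c / n_c
    rd = p_t - p_c
    se = math.sqrt(p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
    return rd, rd - z * se, rd + z * se

# Same underlying risks (20% vs 30%), very different precision:
print(risk_difference_ci(10, 50, 15, 50))        # small trial: wide interval
print(risk_difference_ci(200, 1000, 300, 1000))  # large trial: narrow interval
```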
Why do we group subjects into 'intention to treat' categories for analysis? • It's more precise • It's convention • It's more representative of real life • It's just a trick – lies, more lies & statistics
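(Illustrative aside, not from the slides; all counts are invented.) Intention-to-treat analysis keeps every patient in the arm they were randomised to, preserving the comparability created by randomisation and mirroring real-life use. A per-protocol analysis that quietly drops patients who stopped treatment can make the intervention look much better than it is, as this toy sketch shows.

```python
# Minimal sketch (not from the slides): intention-to-treat (ITT) vs
# per-protocol analysis on invented data. 'crossed_over' marks patients
# randomised to treatment who did not actually complete it.
patients = (
    [{"arm": "treatment", "crossed_over": False, "recovered": True}] * 60
    + [{"arm": "treatment", "crossed_over": True, "recovered": False}] * 40
    + [{"arm": "control", "crossed_over": False, "recovered": True}] * 50
    + [{"arm": "control", "crossed_over": False, "recovered": False}] * 50
)

def recovery_rate(group):
    return sum(p["recovered"] for p in group) / len(group) if group else 0.0

# ITT: analyse everyone in the arm they were randomised to.
itt = recovery_rate([p for p in patients if p["arm"] == "treatment"])
# Per-protocol: quietly drop patients who did not complete treatment.
per_protocol = recovery_rate(
    [p for p in patients if p["arm"] == "treatment" and not p["crossed_over"]]
)
control = recovery_rate([p for p in patients if p["arm"] == "control"])

print(f"Control recovery:       {control:.0%}")       # 50%
print(f"ITT treatment recovery: {itt:.0%}")           # 60% - real-world estimate
print(f"Per-protocol recovery:  {per_protocol:.0%}")  # 100% - misleadingly rosy
```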
Critical appraisal • What are we looking for in particular study designs?
CA of Systematic reviews • Assessing the quality of the question posed still matters. Why? • Did the authors look for the right kind of studies/papers? What does this mean?
Why does looking at all appropriate sources of evidence matter? Select all that apply • To avoid missing important papers • If it's a systematic review, we need all the evidence • If conclusions are to be useful to others • To be up to date
CA of Randomised controlled trials • Data collection the same in all arms? Why? • Deviation from planned treatment. Why does this matter? • Clinical impact/significance. What supports this?
What main factor ensures results are applicable locally? • Similar service configuration • Similar local population • Study carried out in UK • Other similar research studies
CA of Guidelines • Were important recent developments/research included? Why is this important? • Peer review. Why?
Example: CA of Guidelines • Student question: How effective is implementing the NICE guideline for determining appropriate recall intervals between routine dental examinations, within the NHS contract, in improving outcomes (oral health & cost-effective oral health) for patients, dentists, taxpayers and PCTs, compared with a 6/12 (six-monthly) recall interval?
Summary • The relationship between PICO & study design is important • Assessing the quality of studies & the justification of outcomes is the cornerstone of critical appraisal • Use of checklists provides consistency of approach & ensures relevant questions are asked • Different study designs need different questions to be asked, so we use different critical appraisal checklists
Selecting the right CA tool No 1 • Systematic review • RCT • Cohort • Case control/case series • Guideline • Qualitative study
Selecting the right CA tool No 2 • Systematic review • Cohort • Case control/case series • Guideline • Qualitative study
Selecting the right CA tool No 3 • Systematic review • RCT • Cohort • Case control/case series • Guideline • Qualitative study
Selecting the right CA tool No 4 • Systematic review • RCT • Cohort • Case series/case control • Guideline • Qualitative study
Thank you for your input!