
Quality appraisal of qualitative research



Presentation Transcript


  1. Quality appraisal of qualitative research. Karin Hannes, Centre for Methodology of Educational Research

  2. Agenda: Quality appraisal • What is it? • Should we appraise quality? • Are there any criteria that compare to the basic quality criteria for quantitative studies? • Are there any techniques to diminish quality threats? • What are the different stages in a critical appraisal exercise? • How do different appraisal instruments compare to each other? • How to use and report a critical appraisal outcome?

  3. Quality appraisal: what is it? • “the process of systematically examining research evidence to assess its validity, results and relevance before using it to inform a decision” • Hill, A., Spittlehouse, C., 2003. What is critical appraisal? Evidence Based Medicine, 3(2), 1-8. Available from: http://www.evidence-based-medicine.co.uk/ebmfiles/WhatisCriticalAppraisal.pdf

  4. Quality appraisal: Should we? A review of published QES • 21 papers did not describe appraisal of candidate studies • 6 explicitly mentioned not conducting formal appraisal of studies • 5 papers did a critical appraisal, but did not use a formal checklist • 7 described modifying existing instruments • 1 used an existing instrument without modification. Dixon-Woods M, Booth A, Sutton AJ. Synthesizing qualitative research: a review of published reports. Qual Res 2007;7:375.

  5. Qualitative research is subject to the same criteria as quantitative research. Validity, reliability and generalisability should be addressed in critical appraisal. Quality appraisal: Should we?

  6. Quality appraisal: Should we? • Adjust the tools. • Qualitative research is in need of a set of criteria specifically designed for it.

  7. Put an end to criteriology. Do not try to fit something that in the end stifles the creative aspects of qualitative research. Quality appraisal: Should we?

  8. Use criteria as guides to good practice. They should not be used as rigid requirements in appraising papers. Quality appraisal: Should we?

  9. Quality appraisal: Should we? • Interpretivists (idealism) • Bias: Subjectivity is used actively and creatively throughout the research process • Validity: there are multiple ways of understanding reality • Reliability: Researchers report information and readers discern the patterns identified and verify interpretations. The more you appraise, the more it stifles creativity. The main inclusion criterion is relevance! • Realists/pragmatists • Bias: Researcher bias affects trustworthiness, or validity • Validity: the emphasis is on striving for truth in being adequate, accurate, credible • Reliability: Steps to establish it should be built into the research process to affirm researchers’ observations. The more you appraise, the smaller the chance of ending up with flawed results. The main inclusion criterion is quality!

  10. Quality appraisal: Should we? I appraise! Note: I might have been substantially brainwashed by the ‘risk of bias’ discourse, beyond my personal control.

  11. Quality appraisal: Basic criteria • Carrying out ethical research • Importance of the research • Clarity and coherence of the report • Use of appropriate and rigorous methods • Importance of reflexivity or attending to researcher bias • Importance of establishing validity or credibility • Importance of verification or reliability. Divergent perspectives, linked to research paradigms. Cohen DJ, Crabtree BF. Evaluative criteria for qualitative research in health care: controversies and recommendations. Annals of Family Medicine 2008;6(4), Jul/Aug.

  12. Quality appraisal: Basic criteria

  13. Quality appraisal: Basic criteria

  14. Quality appraisal: different stages • Critical appraisal involves • (i) filtering against minimum criteria, involving adequacy of reporting detail (see the sketch below) • Limit the type of qualitative studies to be included to empirical studies with a description of the sampling strategy, data collection procedures and the type of data analysis considered. • Exclude: descriptive papers, editorials, opinion papers • (ii) technical rigour of the study elements indicating methodological soundness • (iii) paradigmatic sufficiency, referring to researchers’ responsiveness to data and theoretical consistency
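The filtering stage lends itself to a simple, mechanical check. Below is a minimal Python sketch of stage (i), assuming hypothetical record fields such as study_type, sampling_strategy, data_collection and data_analysis; these names are illustrative placeholders, not part of any published appraisal instrument.

```python
# Stage (i) sketch: filter candidate studies against minimum reporting criteria.
# Field names are hypothetical placeholders for whatever a review team extracts.

EXCLUDED_TYPES = {"descriptive paper", "editorial", "opinion paper"}
REQUIRED_FIELDS = ("sampling_strategy", "data_collection", "data_analysis")

def passes_minimum_criteria(study: dict) -> bool:
    """True if the study is empirical and reports the three minimum elements."""
    if study.get("study_type", "").lower() in EXCLUDED_TYPES:
        return False
    return all(study.get(field) for field in REQUIRED_FIELDS)

candidates = [
    {"id": "S01", "study_type": "empirical", "sampling_strategy": "purposive",
     "data_collection": "semi-structured interviews", "data_analysis": "thematic analysis"},
    {"id": "S02", "study_type": "editorial"},
]

print([s["id"] for s in candidates if passes_minimum_criteria(s)])  # ['S01']
```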

  15. Technical appraisal stage • Use an appraisal instrument to look for indications of methodological soundness in a study, in order to determine the degree of confidence in the researcher’s competence to conduct research following established norms. • Needs a general understanding of qualitative criteria. THE CHECKLIST APPROACH
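As a concrete, though simplified, illustration of the checklist approach, the sketch below tallies one reviewer's yes/no/unclear answers across a set of appraisal questions. The questions are paraphrased placeholders, not the actual wording of CASP, JBI or ETQS items.

```python
# Checklist-approach sketch: summarise one reviewer's answers for a study.
# The checklist items are generic paraphrases, not an existing instrument.
from collections import Counter

CHECKLIST = [
    "Is there congruity between the methodology and the research question?",
    "Are the data collection methods appropriate?",
    "Is the researcher's own influence on the study addressed (reflexivity)?",
    "Are the conclusions grounded in the data?",
]

def summarise(answers: dict) -> Counter:
    """answers maps each checklist question to 'yes', 'no' or 'unclear'."""
    return Counter(answers.get(question, "unclear") for question in CHECKLIST)

reviewer_answers = {CHECKLIST[0]: "yes", CHECKLIST[1]: "yes", CHECKLIST[3]: "yes"}
print(summarise(reviewer_answers))  # Counter({'yes': 3, 'unclear': 1})
```

A tally like this supports, but does not replace, the overall judgement described on the next slide.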

  16. Theoretical appraisal stage • Use a subsequent paradigmatic approach to judgement, which refers to an evaluation of methodological coherence between theory and methodology/methods, to evaluate the quality and rationale of the decisions made. • Needs a more in-depth understanding of qualitative research. THE OVERALL JUDGEMENT APPROACH

  17. Validity in Qualitative Research: a comparative analysis of 3 online appraisal instruments’ ability to evaluate validity. Hannes, Lockwood & Pearson (2010), Qualitative Health Research. Which criteria are used? Focus on validity (Maxwell, 1992)? What is the extent to which appraisal instruments evaluate validity?

  18. Which criteria are used to evaluate the quality of a study? • Selection of appraisal instruments: • Used in recently published QES (2005-2008) • Available online and ready to use • Broadly applicable to different qualitative research designs • Developed and supported by an organisation/institute/consortium • Three instruments fit the inclusion criteria: • Joanna Briggs Institute Tool • Critical Appraisal Skills Programme Tool • Evaluation Tool for Qualitative Studies • To facilitate comparison: • Criteria grouped under 11 headings • Cross-comparison of the criteria (see the sketch below)
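To make the cross-comparison step tangible, here is a small sketch that marks which instruments cover which grouped headings. The headings and coverage pattern are invented for illustration and are not the published comparison from Hannes, Lockwood & Pearson (2010).

```python
# Cross-comparison sketch: which instrument addresses which grouped heading.
# Coverage values below are invented placeholders, not real findings.
INSTRUMENTS = ["JBI", "CASP", "ETQS"]

coverage = {            # heading -> set of instruments with a matching criterion
    "Ethics":        {"JBI", "CASP", "ETQS"},
    "Reflexivity":   {"JBI", "ETQS"},
    "Context":       {"ETQS"},
    "Data analysis": {"JBI", "CASP", "ETQS"},
}

print(f"{'Heading':<15}" + "".join(f"{name:>6}" for name in INSTRUMENTS))
for heading, tools in coverage.items():
    cells = "".join(f"{'x' if name in tools else '-':>6}" for name in INSTRUMENTS)
    print(f"{heading:<15}{cells}")
```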

  19. Which criteria are used to evaluate the quality of a study?

  20. Validity as the main criterion • In evaluating methodological quality we need to know • whether the set of arguments or the conclusion derived from a study necessarily follows from the premises. • whether it is well grounded in logic or truth. • whether it accurately reflects the concepts, the ideas that it is intended to measure. • Main focus should be VALIDITY • Main question should be: What is the extent to which the different instruments establish validity?

  21. Validity as the main criterion

  22. What is the extent to which the different instruments establish validity?

  23. What is the extent to which the different instruments establish validity? • The most commonly used instrument, CASP, is the least sensitive to aspects of validity. It addresses neither interpretive nor theoretical validity, nor context as a criterion. • Statements that have no clear link to excerpts are at risk of not being grounded in the data. • The theoretical position and the role of the researcher have a direct impact on the interpretation of the findings. • We need to select our critical appraisal instrument with care.

  24. What is the extent to which the different instruments establish validity? Critical note: • Checklists only capture what has been reported. • In evaluating validity at the end of a study (post hoc), rather than focusing on processes of verification during the study, we run the risk of missing serious threats to validity until it is too late to correct them. • Basic qualitative researchers • should be motivated to adopt techniques that improve validity • should be guided in how to report qualitative research in order to facilitate critical appraisal

  25. Quality appraisal: How to use and report a critical appraisal outcome? • To include or exclude a study • *Authors may choose to give more weight to certain criteria and use this in their final judgment. • ** H/L = High/Low • *** For studies that are clearly on the verge between inclusion and exclusion, a judgement should be made and discussed with potential co-reviewers. • **** Authors should include a motivation for exclusion for those cases where judgments are being made. Only high-quality studies are included. The potential risk is that valuable insights are excluded from the synthesis.

  26. Quality appraisal: How to use and report a critical appraisal outcome? • To give more weight to studies that scored high on quality • All valuable insights remain included. • It might be complex to report on the findings of the synthesis given the ‘subgroups’ of studies. • No fixed parameters currently exist to determine the weight of qualitative studies. • Reviewers choosing this approach need to evaluate which methodological flaws have a substantial impact on the findings presented.

  27. Quality appraisal: How to use and report a critical appraisal outcome? To describe what has been observed without excluding any studies • All potentially valuable insights remain included, because the worth of individual studies might only become recognisable at the point of synthesis rather than in the phase of appraisal.

  28. Quality appraisal: How to use and report a critical appraisal outcome? • There is value in all of these approaches. However, in line with current Cochrane and Campbell policy, I recommend the first two approaches, emphasizing the methodological soundness of studies rather than their contribution to science in general. • Guidelines: • Reviewers need to clarify how the outcome of their critical appraisal exercise is used with respect to the presentation of their findings. • Both recommended approaches could benefit from a sensitivity analysis evaluating what happens to the findings of a study when low or high quality studies are removed (see the sketch below). • The convention of using at least two researchers for the quality assessment process is a useful legacy from quantitative-based review processes; not so much for inter-rater consistency purposes but, at the very least, to open up the data to a broader range of possible interpretations.
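The recommended sensitivity analysis can be prototyped very simply: re-run the synthesis with and without the low-quality studies and compare which findings survive. In the sketch below the study IDs, quality labels and "findings" are hypothetical, and the pooling step is deliberately naive.

```python
# Sensitivity-analysis sketch: compare pooled findings with and without
# low-quality studies. All data are invented for illustration.
studies = {
    "S01": {"quality": "high", "findings": {"stigma", "access barriers"}},
    "S02": {"quality": "low",  "findings": {"stigma", "cost"}},
    "S03": {"quality": "high", "findings": {"access barriers"}},
}

def pooled_findings(study_ids):
    """Naive 'synthesis': union of all findings reported by the selected studies."""
    result = set()
    for study_id in study_ids:
        result |= studies[study_id]["findings"]
    return result

all_ids = list(studies)
high_only = [sid for sid in all_ids if studies[sid]["quality"] == "high"]

print("All studies:       ", sorted(pooled_findings(all_ids)))
print("High quality only: ", sorted(pooled_findings(high_only)))
# Findings that disappear when low-quality studies are removed ('cost' here)
# deserve explicit discussion in the review report.
```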

  29. Quality appraisal: How to use and report a critical appraisal outcome? • Critical note: • Fatal flaws may not be easily distinguished into simple binary judgements: it may be necessary to take a holistic view of a study that recognises the importance of context and what was feasible in that context. • The skill in critical appraisal lies not in identifying problems, but in identifying errors that are large enough to affect how the result of the study should be interpreted. • We need to balance assessment against the weight of a message: ‘signal to noise ratio’. Booth A. Cochrane or cock-eyed? How should we conduct systematic reviews of qualitative research? Qualitative EBP Conference, Coventry University, May 14-16, 2001. Petticrew M, Roberts H (2006, p. 128). Systematic Reviews in the Social Sciences: A Practical Guide. Oxford: Blackwell Publishing.

  30. Critical appraisal of qualitative research Background information: • Cochrane Qualitative Methods Group Guidance – Critical Appraisal Chapter (contains many more examples of critical appraisal checklists)
