
Quarterly Updates: Evidence Based Practice For Public Health Nurses Session III

This presentation explores the importance of critical appraisal of evidence in evidence-based practice for public health nursing. It covers the meaning of critical appraisal, levels of evidence, and methods used to appraise evidence. The objective is to develop an understanding of how to assess the strength and quality of scientific evidence for effective nursing practice decisions.


Presentation Transcript


  1. Quarterly Updates: Evidence Based Practice For Public Health Nurses, Session III. Public Health Nursing Section, Evidence Based Practice Committee

  2. EVIDENCE BASED PUBLIC HEALTH NURSING PRACTICE: Clinical Expertise/Judgment • Client Values/Preferences • Knowledge/Research

  3. http://www.youtube.com/watch?v=2BzCXDZ1NRk

  4. “One of the greatest discoveries a man makes, one of his great surprises, is to find he can do what he was afraid he couldn’t do.” Henry Ford

  5. Objectives • Develop an understanding of the meaning of critical appraisal of evidence • Describe how levels of evidence are used in appraisal of evidence • Explore how other methods (e.g., statistics) can be used to appraise evidence • Apply the process for appraising evidence to public health nursing

  6. Evidence Based Practice • Uses highest quality of knowledge in providing care to produce the greatest impact on health status and health care

  7. Critical Appraisal of Evidence • Key characteristic of evidence based practice • Core skill needed to use evidence to support nursing practice decisions

  8. Critical Appraisal of Evidence • Ensures relevance and transferability of evidence from the search to the specific population for whom the care will be provided

  9. Critical Appraisal of Evidence Defined • 1) Assessing the strength of the scientific evidence • 2) Evaluating the research for its quality and applicability to health care decision making

  10. 1) Strength of Evidence • Grading of strength of evidence should incorporate: • Quality • The extent to which bias was minimized (internal validity) • Quantity • The magnitude of effect, number of studies, and sample size or power • Consistency • The extent to which similar and different study designs report similar findings

  11. 1) Strength of Evidence • Evidence exists on a continuum of rigor • Amount of research attention or maturity of science varies, therefore evidence varies • Type of research design reflects the strength of the evidence – known as levels of evidence Stevens & Ledbetter, 2000

  12. Levels of Evidence • Ranking as to how well the evidence informs clinical interventions • The stronger the level of evidence, the greater the confidence that the probability of applying the evidence in practice will be effective • Levels of evidence are based on research design Stevens & Ledbetter, 2000

  13. Levels of Evidence • Experts have developed a number of taxonomies to rate strength of evidence • Most are organized around research designs

  14. Levels of Evidence • National Guidelines Clearinghouse • Ia Evidence obtained from meta-analysis or systematic review of randomized controlled trials • Ib Evidence obtained from at least one randomized controlled trial • IIa Evidence obtained from at least one well-designed controlled study without randomization • IIb Evidence obtained from at least one other type of well-designed quasi-experimental study, without randomization • III Evidence obtained from well-designed non-experimental descriptive studies, such as comparative studies, correlation studies, and case studies • IV Evidence obtained from expert committee reports or opinions and/or clinical experiences of respected authorities

  15. Levels of Evidence • “Rating System for the Hierarchy of Evidence” • Level I: Evidence from a systematic review or meta-analysis of all relevant randomized controlled trials (RCTs), or evidence-based clinical practice guidelines based on systematic reviews of RCTs • Level II: Evidence obtained from at least one well-designed RCT • Level III: Evidence obtained from well-designed controlled trials without randomization (quasi-experimental) • Level IV: Evidence from well-designed case-control and cohort studies (studies of prognosis) • Level V: Evidence from systematic reviews of descriptive and qualitative studies • Level VI: Evidence from a single descriptive or qualitative study • Level VII: Evidence from the opinion of authorities and/or reports of expert committees (Melnyk & Fineout-Overholt, 2005)

  16. Levels of Evidence • RATING SYSTEM FOR LEVELS OF EVIDENCE • Type of evidence • I. Meta-analysis or comprehensive systematic review of multiple experimental research studies (Cochrane, National Guidelines Clearinghouse (AHRQ), The Joanna Briggs Institute, other groups) • II. Well-designed experimental study • III. Well-designed quasi-experimental study (non-randomized controlled, single group pre-post design, cohort, time series (one group of subjects over time), matched case-controlled studies (two or more groups are matched on certain variables)) • IV. Well-designed non-experimental study (correlational or comparative descriptive studies, case study design, qualitative studies) • V. Clinical examples and expert opinion (textbooks, non-research journal articles, verbal report, non-research based professional standards/guidelines/group article) • Strength of evidence • A. Type I evidence or consistent findings from multiple studies from levels II, III, or IV. • B. Multiple studies with evidence types II, III, or IV that are generally consistent. • C. Multiple studies with evidence types II, III, or IV that are inconsistent. • D. Limited research evidence or one type II study only. • E. Type IV or V evidence only. Adapted from Joanna Briggs Institute and AHCPR; Eilers & Heerman, 2005
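
A rating scheme like the one on slide 16 can be encoded as a simple data structure so that appraised articles are tagged consistently. The following is a minimal, illustrative sketch only: the level descriptions follow the slide, but the names LEVELS and grade_strength and the consistency flag are hypothetical choices, not part of any cited taxonomy or library.

```python
# Illustrative sketch: encode the slide-16 rating system as plain data so
# a reviewer can tag appraised studies and derive a rough strength grade.

LEVELS = {
    "I": "Meta-analysis or comprehensive systematic review of experimental studies",
    "II": "Well-designed experimental study",
    "III": "Well-designed quasi-experimental study",
    "IV": "Well-designed non-experimental study",
    "V": "Clinical examples and expert opinion",
}

def grade_strength(levels_found, consistent=True):
    """Rough strength grade (A-E) following the slide-16 scheme.
    `consistent` is the reviewer's own judgment of whether findings agree."""
    if "I" in levels_found:
        return "A"                       # Type I evidence present
    mid = [lv for lv in levels_found if lv in ("II", "III", "IV")]
    if len(mid) >= 2:
        # Multiple II-IV studies: generally consistent -> B, inconsistent -> C
        # (fully consistent findings would also support an A grade).
        return "B" if consistent else "C"
    if mid == ["II"]:
        return "D"                       # one type II study only
    return "E"                           # type IV or V evidence only

print(grade_strength(["II", "III", "III"]))  # -> "B"
```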

  17. The U.S. Preventive Services Task Force (2008)

  18. Systematic Reviews • Provides state-of-the-science conclusions about evidence supporting benefits and risks of a given healthcare practice (Stevens, 2001) • Most powerful and useful evidence available • Refers to a summary that uses a rigorous scientific approach to combine results from a body of original research studies into a clinically meaningful whole

  19. Meta-Analysis • Statistical approach to synthesizing the results of a number of studies – summarizes results of all studies included in the review • Produces a larger sample size and thus greater power to determine the true magnitude of an effect; yields a summary statistic
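
As a concrete illustration of the "summary statistic" a meta-analysis produces, here is a minimal sketch of fixed-effect, inverse-variance pooling. This is one common approach rather than anything prescribed by the presentation, and the function name pooled_effect and the study effects and standard errors are made up for illustration.

```python
import math

# Minimal sketch of a fixed-effect (inverse-variance) meta-analysis:
# each study's effect estimate is weighted by 1/SE^2, then combined into
# a single pooled estimate with its own standard error and 95% CI.

def pooled_effect(effects, std_errors):
    """Combine per-study effect estimates, weighting each by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical per-study effects (e.g., mean differences) and standard errors
effects = [0.30, 0.45, 0.25]
std_errors = [0.12, 0.20, 0.15]
print(pooled_effect(effects, std_errors))
```

Because the pooled standard error shrinks as studies are added, the combined estimate illustrates the slide's point about larger effective sample size and greater power.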

  20. Randomized Controlled Trial • Experimental studies are the gold standard of research design (randomization of participants to treatment and control, rigorous methods used to minimize bias) • Provides the most valid, dependable research conclusions about the clinical effectiveness of an intervention and establishing cause and effect • Allows us to say with a high degree of certainty that the intervention we used was the cause of the outcome
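
To make the contrast with the quasi-experimental designs on the next slide concrete, here is a minimal sketch of simple 1:1 random allocation, the step that defines an RCT. The participant IDs, the function name randomize, and the fixed seed are hypothetical and used only so the example runs reproducibly; real trials use more elaborate allocation schemes (e.g., blocking and allocation concealment).

```python
import random

# Sketch of simple 1:1 random allocation to treatment and control arms.

def randomize(participants, seed=None):
    """Shuffle the participant list and split it evenly into two arms."""
    rng = random.Random(seed)      # seeded only so this example is reproducible
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

arms = randomize([f"P{i:03d}" for i in range(1, 21)], seed=42)
print(arms["treatment"])
print(arms["control"])
```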

  21. Quasi-Experimental • Differs from RCTs only in that participants are NOT randomized to treatment and control groups

  22. Non-Experimental • Cohort – participants are studied over time; the study population shares common characteristics • Case-Control – studies that address questions about harm or causation; investigates why some people develop a disease or behave the way they do versus others who do not • Descriptive – main objective is to describe some phenomena • Qualitative – "any kind of research that produces findings not arrived at by means of statistical procedures or other means of quantification" (Strauss and Corbin, 1990, p. 17)

  23. Clinical Examples & Expert Opinion • Expert Opinion – arriving at a value judgment that incorporates the main information available on the subject as well as previous experiences • Clinical Examples – the “5 rights”

  24. 2) Evaluating Quality & Applicability • What are the results? • Are the results valid? • Can the results be applied to the targeted population and/or public health practice and intervention?

  25. What are the results? • Were the results similar from study to study (if systematic review or meta-analysis)? • What are the overall results? • How precise were the results? • Can a causal relationship be inferred from the data?

  26. Are the Results Valid? • Does this article explicitly address our public health question? • Was the search for our article detailed and exhaustive? Is it likely that important, relevant studies were missed? • Does the study selected appear to be of high methodological quality? • Do you feel the study selected is reproducible?

  27. Is the Evidence Applicable? • How can the results be interpreted and applied to public health practice and intervention? • Are study subjects similar to clients to whom care is to be delivered? • Were all important outcomes considered? • Are the benefits worth the costs and potential risks?

  28. Other Methods Used to Appraise Evidence • Fineout-Overholt, E., & Melnyk, B.M. (2004). Evaluation of studies of prognosis. Evidence Based Nursing, 7, 4-8. • Melnyk, B.M. (2003). Finding and appraising systematic reviews of clinical interventions: Critical skills for evidence-based practice. Pediatric Nursing, 29(2), 147-149. • Melnyk, B.M., & Fineout-Overholt, E. (2005). Rapid critical appraisal of randomized controlled trials: An essential skill for evidence-based practice. Pediatric Nursing, 31(1), 50-52. • Melnyk, B.M., & Fineout-Overholt, E. (2002). Key steps in implementing evidence-based practice: Asking compelling, searchable questions and searching for the best evidence. Pediatric Nursing, 22(3), 262-266.

  29. Other Methods Used to Appraise Evidence • Statistical evaluation, for example calculating effect size • Effect size measures the magnitude or strength of the treatment or intervention effect (how well the intervention worked in the group that received the intervention vs. the group that did not) • Small, medium, and large effects are designated as .2, .5, and .8 respectively • Several formulas to use depending on the statistical analysis used (e.g., t-tests); see the sketch below • Thalheimer, W., & Cook, S. (2002, August). How to calculate effect sizes from published research articles: A simplified methodology. Retrieved April 29, 2009 from http://www.work-learning.com/white_papers/effect_sizes/Effect_Sizes_pdf5.pdf
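
The slide notes that the formula depends on the analysis used; the sketch below shows one common case, Cohen's d for two independent groups (a standardized mean difference using the pooled standard deviation). The function name cohens_d and the outcome scores are made up for illustration.

```python
import math

# Cohen's d: standardized mean difference between two independent groups.
# Conventional benchmarks: ~0.2 small, ~0.5 medium, ~0.8 large.

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical outcome scores for an intervention group and a comparison group
intervention = [78, 82, 75, 90, 85, 88, 79, 84]
control = [70, 74, 68, 80, 72, 77, 69, 75]
print(round(cohens_d(intervention, control), 2))  # d >= ~0.8 counts as a large effect
```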

  30. Other Methods Used to Appraise Evidence • AGREE instrument (AGREE Collaboration, London) • http://www.agreecollaboration.org/instrument/ • AGREE is an international collaboration of researchers and policy makers who seek to improve the quality and effectiveness of clinical practice guidelines by establishing a shared framework for their development, reporting and assessment; that shared framework is the AGREE instrument (Appraisal of Guidelines for Research & Evaluation). • CATmaker (Centre for Evidence Based Medicine, Oxford, U.K.) • http://www.cebm.net/index.aspx?o=1216 • A software tool that helps you create Critically Appraised Topics (CATs) for articles found when searching for evidence (free download) • RAPid (Joanna Briggs Institute, University of Adelaide, Australia) • http://www.joannabriggs.edu.au/services/rapid.php • Online critical appraisal of evidence training program • RAP maker – a program to appraise a study, its methods, findings and applicability; RAP maker facilitates construction of a final report, which may then be submitted online to the RAPid library for independent critique and then uploaded for worldwide access.

  31. Search evidence-rich resources first

  32. EBP Rich Resources for P/CHN • Cochrane review http://www.cochrane.org/reviews/ • DARE – Database of Abstracts of Reviews of Effectiveness http://www.mrw.interscience.wiley.com/cochrane/cochrane_cldare_articles_fs.html

  33. Agency for Healthcare Research and Quality (AHRQ) • National Guidelines Clearinghouse www.guidelines.gov • Guide to Clinical Preventive Services (2008) http://www.ahrq.gov/clinic/pocketgd.htm • AHRQ Evidence Reports www.ahrq.gov/clinic/epcix.htm

  34. EBP Rich Resources for P/CHN • Guide to Community Preventive Services • http://www.thecommunityguide.org/index.html

  35. Centers for Disease Control & Prevention • CDC for Public Health Professionals http://www.cdc.gov/CDCForYou/public_health_professionals.html

  36. Association of State and Territorial Health Officials • Evidence Based Practice • http://www.astho.org/?template=evidence_based_ph_practice.html

  37. National Association of City and County Public Health Officials • The Model Practices Database • http://www.naccho.org/topics/modelpractices/ • http://archive.naccho.org/modelPractices/ • Online searchable collection of practices across public health areas. • Allows you to benefit from colleagues' experiences, to learn what works, and to ensure that resources are used wisely on effective programs that have been implemented with good results. • The database features practices in the following areas: • Community Health • Environmental Health • Public Health Infrastructure • Emergency Preparedness

  38. EBP Rich Resources for P/CHN • Health Services/Technology Assessment Text (HSTAT) • http://hstat.nlm.nih.gov • Searchable collection of large, full-text practice guidelines, technology assessments and health information

  39. EBP Rich Resources for P/CHN • Health Policy Guide • http://www.healthpolicyguide.org/ • evidence-based policies to improve the public’s health • 150 policy topics to support advocacy and decision making at the state and local levels

  40. EBP Rich Resources for P/CHN • http://guides.nursinglibrary.yale.edu/content.php?pid=14371&sid=96991 • The National Institute for Health and Clinical Excellence (NICE) is an independent organisation responsible for providing national guidance on promoting good health and preventing and treating ill health.

  41. Evidence Based Public Health Nursing • http://www.uic.edu/depts/lib/projects/ebphn/ • http://www.uic.edu/depts/lib/projects/ebphn/modulesmain.html • http://www.uic.edu/nursing/aphne/

  42. EBP Rich Resources for P/CHN

  43. Application Exercise • PICO QUESTION: • For the 4-year-old pre-K age group, are there fewer injection site complications with giving the immunizations in the thigh as compared to giving the immunizations in the arm?

  44. Cochrane Review • Tinnion O, Hanlon M. Acellular vaccines for preventing whooping cough in children. Cochrane Database of Systematic Reviews 1999, Issue 2. Art. No.: CD001478. DOI: 10.1002/14651858.CD001478.pub2 • “…Differences in trial design precluded pooling of the efficacy data and results should be interpreted with caution. Most systemic and local adverse events were significantly less common with acellular than with whole cell pertussis vaccines….” • Emailed page to print off

  45. National Guidelines Clearinghouse • 1) General recommendations on immunization: recommendations of the Advisory Committee on Immunization Practices (ACIP). 2) Update: recommendations from the Advisory Committee on Immunization Practices (ACIP) regarding administration of combination MMRV vaccine. http://www.guidelines.gov/summary/summary.aspx?doc_id=12325&nbr=006390&string=vaccine+AND+administration+AND+site+AND+route

  46. National Guidelines Clearinghouse • Injection Route and Injection Site • With the exception of Bacillus Calmette-Guerin (BCG) vaccine, injectable vaccines are administered by the intramuscular and subcutaneous route. The method of administration of injectable vaccines is determined, in part, by the presence of adjuvants in some vaccines. The term adjuvant refers to a vaccine component distinct from the antigen that enhances the immune response to the antigen. The majority of vaccines containing an adjuvant (e.g., DTaP, DT, Td, Tdap, PCV, Hib, HepA, HepB, and HPV) should be injected into a muscle because administration subcutaneously or intradermally can cause local irritation, induration, skin discoloration, inflammation, and granuloma formation.

  47. National Guidelines Clearinghouse • Routes of administration are recommended by the manufacturer for each immunobiologic. Deviation from the recommended route of administration might reduce vaccine efficacy or increase local adverse reactions.

  48. CDC: Advisory Committee on Immunization Practices • Route • Administering a vaccine by the recommended route is imperative. Deviation from the recommended route of administration might reduce vaccine efficacy or increase the risk of local reactions. (p. D5)

  49. CDC: Advisory Committee on Immunization Practices • Site • Although there are several IM injection sites on the body, the recommended IM sites for vaccine administration are the vastus lateralis muscle (anterolateral thigh) and the deltoid muscle (upper arm). The site depends on the age of the individual and the degree of muscle development. • The usual sites for subcutaneous vaccine administration are the thigh (for infants <12 months of age) and the upper outer triceps of the arm (for persons >12 months of age). If necessary, the upper outer triceps area can be used to administer subcutaneous injections to infants.
