Evidence-based Education: Can We Get There From Here? • Ronnie Detrich, Wing Institute • Association for Behavior Analysis International Evidence-based Education Conference • September 6, 2008
Why Do We Need Evidence-based Education? From a university in the U.S.
Acknowledgements • Randy Keyworth, Jack States, Tom Critchfield, Tim Slocum, Mark Shriver, Teri Lewis-Palmer, Karen Hager, Janet Twyman, Hill Walker, Susan Wilczynski
Why Evidence-based Education? • Federal policy emphasizes scientifically based instruction. • No Child Left Behind • Over 100 references to scientifically based instruction. • Individuals with Disabilities Education Improvement Act • Pre-service and professional development should prepare educators to implement scientifically based instructional practices.
Why Evidence-based Education? • Professional organizations began validating interventions as evidence-based: • Mid 1990’s • Society for the Study of School Psychology • American Psychological Association • More recently • What Works Clearinghouse (Institute for Education Science) • Campbell Collaboration • Coalition for Evidence-based Policy • National Autism Center
Why Evidence-based Education? • Most professional organizations have ethical guidelines emphasizing services are based on scientific knowledge. • American Psychological Association • Psychologists’ work is based on the established scientific and professional knowledge of the discipline. • National Association of School Psychologists • … direct and indirect service methods that the profession considers to be responsible, research-based practice. • The Behavior Analyst Certification Board • The behavior analyst always has the responsibility to recommend scientifically supported, most effective treatment procedures.
What is Evidence-based Practice? • At its core the EBP movement is a consumer protection movement. • It is not about science per se. • It is a policy to use science for the benefit of consumers. • “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….”(Fixsen, 2008)
What is Evidence-based Practice? • Evidence-based practice has its roots in medicine. • Movement has spread across major disciplines in human services: • Psychology • School Psychology • Social Work • Speech Pathology • Occupational Therapy
What Is Evidence-based Practice? • EBP is a decision-making approach that places emphasis on evidence to: • guide decisions about which interventions to use; • evaluate the effects of an intervention. • Decisions integrate three overlapping components (Sackett et al., 2000): the best available evidence, professional judgment, and client values.
What is Evidence-based Education? • The term "evidence-based" has become ubiquitous in the last decade. • There is no consensus about what it means. • At issue is what counts as evidence. • Federal definition emphasizes experimental methods. • Preference for randomized trials. • Definition has been criticized as being positivistic.
What Counts as Evidence? • Ultimately, this depends on the question being asked. • Even behavior analysis allows for qualitative evidence (social validity measures). • In EBP the goal is to identify causal relations between interventions and outcomes. • Experimental methods do this best.
What Counts as Evidence? • Even if we accept causal demonstrations to be evidence, we have no consensus. • Randomized Clinical Trials (RCTs) have become the "gold standard." • There is controversy about the status of single subject designs. • Most frequently criticized on the basis of external validity.
How Are Evidence-based Interventions Identified? • Identification is more than finding a study to support an intervention. • Identification involves distilling a body of knowledge to determine the strength of evidence.
How Are Evidence-based Interventions Identified? • Distillation requires standards of evidence for reviewing the literature. • Standards specify: • the quantity of evidence • the quality of evidence
Continua of Evidence (Janet Twyman, 2007) • Two continua: the quality of the evidence and the quantity of the evidence. • Quality runs from personal observation, general consensus, and expert opinion, through uncontrolled studies and well-conducted clinical studies, to semi-randomized trials, single case designs with repeated systematic measures, single case replication (direct and parametric), meta-analysis (systematic review), and the current "gold standard," the high-quality randomized controlled trial. • Quantity runs from a single study, through various investigations, to convergent evidence. • A threshold of evidence marks the point above which an intervention is considered evidence-based.
How Are Evidence-based Interventions Identified? • Two approaches to validating interventions • Threshold approach: • Evidence must be of a specific quantity and quality before an intervention is considered evidence-based. • Hierarchy of evidence approach: • Strength of evidence falls along a continuum with each level having differential standards.
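To make the two approaches concrete, here is a minimal Python sketch (mine, not the presentation's); the cutoff of two high-quality randomized trials and the hierarchy labels are illustrative assumptions, not any review body's actual standards.

```python
# Illustrative sketch only: the cutoffs and labels below are hypothetical.

def threshold_rating(studies):
    """Binary verdict: evidence-based only if the body of evidence clears
    a fixed bar (here, at least 2 high-quality randomized trials)."""
    strong = [s for s in studies
              if s["design"] == "randomized trial" and s["quality"] == "high"]
    return "evidence-based" if len(strong) >= 2 else "not evidence-based"

def hierarchy_rating(studies):
    """Graded verdict: strength of evidence falls along a continuum."""
    designs = {s["design"] for s in studies if s["quality"] == "high"}
    if "randomized trial" in designs:
        return "strong evidence"
    if "single case replication" in designs:
        return "moderate evidence"
    if "uncontrolled study" in designs:
        return "emerging evidence"
    return "insufficient evidence"

# A body of evidence built entirely from replicated single case designs:
studies = [
    {"design": "single case replication", "quality": "high"},
    {"design": "single case replication", "quality": "high"},
    {"design": "uncontrolled study", "quality": "low"},
]
print(threshold_rating(studies))   # -> not evidence-based
print(hierarchy_rating(studies))   # -> moderate evidence
```

The same body of evidence fails the threshold rule yet earns a graded rating under the hierarchy, which is exactly the kind of discrepancy the next slide describes.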
How Are Evidence-based Interventions Identified? • There are no agreed upon standards. • It is possible for an intervention to be evidence-based using one set of standards and to fail to meet evidence standards using an alternative set. • Difficult for consumers and decision makers to sort out the competing claims about what is evidence-based.
Evidence-based Intervention: Assessed vs. Actual Effectiveness • Assessed effective and actually effective: true positive. • Assessed effective but actually ineffective: false positive (most likely with the hierarchy approach). • Assessed ineffective but actually effective: false negative (most likely with the threshold approach). • Assessed ineffective and actually ineffective: true negative.
Choosing Between False Positives and False Negatives • At this stage, it is better to have more false positives than false negatives. • False negatives: effective interventions are not selected for implementation, so we are less likely to ever determine that they are actually effective. • False positives: progress monitoring will identify the interventions that are not effective.
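A small simulation (again mine, with invented probabilities) illustrates why this asymmetry matters: adopted interventions are progress-monitored, so false positives can be caught and dropped, whereas false negatives are never implemented and so are never re-examined.

```python
import random

random.seed(1)

# Hypothetical pool of candidate interventions; 40% are truly effective.
interventions = [{"effective": random.random() < 0.4} for _ in range(1000)]

def validate(iv, false_positive_rate, false_negative_rate):
    """Simulate a validation verdict with the given error rates."""
    if iv["effective"]:
        return random.random() > false_negative_rate   # a miss is a false negative
    return random.random() < false_positive_rate        # a pass is a false positive

def effective_in_use(fp_rate, fn_rate):
    """Count truly effective interventions that end up in sustained use.
    Adopted interventions are progress-monitored, so the ineffective ones
    (the false positives) are assumed to be detected and dropped."""
    adopted = [iv for iv in interventions if validate(iv, fp_rate, fn_rate)]
    return sum(iv["effective"] for iv in adopted)

# Strict, threshold-like standard: few false positives, many false negatives.
print("strict standard :", effective_in_use(fp_rate=0.05, fn_rate=0.50))
# Lenient, hierarchy-like standard: more false positives, few false negatives.
print("lenient standard:", effective_in_use(fp_rate=0.30, fn_rate=0.05))
```

Under these made-up error rates, the more permissive standard ends up with roughly twice as many genuinely effective interventions in sustained use, because progress monitoring absorbs the cost of its extra false positives.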
Why Do We Need Evidence-based Education? • Kazdin (2000) identified 550 named interventions for children and adolescents. • A very small number of these interventions have been empirically evaluated. • Of those that have been evaluated, the large majority are behavioral or cognitive-behavioral. • Evidence-based interventions are less likely to be used than interventions for which there is no evidence, or for which the evidence indicates a lack of impact.
Research to Practice: Evidence-based Education Roadmap • Research → Replicability → Practice
Efficacy Research (What Works?) • Primary concern is demonstrations of causal relations. • Rigorous experimental control so that threats to internal validity are minimized. • Not always easy to translate immediately to practice.
Behavior Analysis and Efficacy • Behavior Analysis: the emphasis on rigorous experimental control has resulted in many important contributions to education. • Systematic, explicit teaching methods. • Widespread use of reinforcement systems.
Research to Practice: Evidence-based Education Roadmap • Research → Replicability → Practice
Effectiveness Research (When Does It Work?) • Evaluates the robustness of an intervention when "taken to scale" and implemented in more typical practice settings. • Answers questions related to the external validity or generalizability of effects. • Effect sizes are typically smaller than in efficacy research. • Efficacy and effectiveness fall on a continuum.
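Because this slide hinges on effect size, here is a minimal sketch of the standardized mean difference (Cohen's d); the outcome scores are invented solely to mimic an efficacy trial and an effectiveness trial of the same intervention.

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference with a pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_sd = (((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Invented outcome scores (e.g., percent of intervals on-task).
efficacy_treatment      = [82, 65, 90, 74, 88, 70]   # tightly controlled study
efficacy_control        = [60, 72, 55, 68, 58, 66]
effectiveness_treatment = [70, 58, 80, 62, 75, 66]   # typical-setting study
effectiveness_control   = [63, 55, 74, 58, 70, 61]

# d comes out noticeably smaller for the effectiveness-style data.
print("efficacy d     :", round(cohens_d(efficacy_treatment, efficacy_control), 2))
print("effectiveness d:", round(cohens_d(effectiveness_treatment, effectiveness_control), 2))
```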
Behavior Analysis and Effectiveness Research • Behavior Analysis has not generally concerned itself with external validity questions. • Emphasizes generality of behavioral principles. • Has not resulted in the type of research that answers the “actuarial” questions asked by effectiveness research. • What percent of population of students will benefit from a specific program? • Which students will benefit?
Research to Practice Issues • The lag time from efficacy research to effectiveness research to dissemination is 10-20 years. (Hoagwood, Burns & Weisz, 2002) • Only 4 of 10 Blueprint Violence Prevention programs had the capacity to disseminate to 10+ sites in a year. (Elliott & Mihalic, 2004)
Good Behavior Game: Efficacy • First efficacy study: fourth grade classroom (Barrish, Saunders, & Wolf, 1969) • Subsequent replications across: • Settings (the Sudan, library, sheltered workshop) • Students (general education, special education, 2nd grade, 5th grade, 6th grade, adults with developmental disabilities) • Behaviors (on-task, off-task, disruptive, work productivity) • All efficacy studies were single subject designs.
Good Behavior Game: Effectiveness • Series of effectiveness studies by Kellam et al. examining it as a prevention program: • Special issue of Drug and Alcohol Dependence (2008) • Exposure to the GBG in 1st and 2nd grade reduced young adults' risk of: • drug/alcohol abuse • smoking • anti-social personality disorder • subsequent use of school-based services • suicidal ideation and attempts • All studies were RCTs.
Good Behavior Game: Validation • The Coalition for Evidence-based Policy reviewed the literature on the Good Behavior Game: • Determined it was evidence-based. • The review included only those studies that were RCTs. • All single subject research was ignored.
A Consumer Perspective: One Year Follow-up “…you should give them more good behavior game. Keep on doing what’s good.”
Research to Practice: Evidence-based Education Roadmap • Research → Replicability → Sustainability → Practice
Implementation (How Do We Make It Work?) • "Identifying evidence-based interventions is one thing, implementing them is an entirely different thing." (Dean Fixsen, 2008) • The primary challenge is how to place an intervention within a specific context. • Until implementation questions are answered, the ultimate promise of evidence-based education will go unfulfilled.
Implementation Is Fundamental • 80% of initiatives ended within 2 years. • 90% of initiatives ended within 4 years. • Data from the Center for Comprehensive School Reform.
Behavior Analysis and Implementation • Service delivery in behavior analysis is a mediated model. • Requires behavior analysts to address many of the issues of implementation for each project. • We have not systematically attended to many of these issues, especially at large scale. • What organizational features are necessary to support evidence-based interventions? • How do we modify an intervention so it fits local contingencies without diminishing effectiveness?
Research to Practice: Evidence-based Education Roadmap • Research → Replicability → Sustainability → Practice
Progress Monitoring (Is It Working?) • Research guides us to the interventions that are most likely to work. • Generalizing from a research base to a specific instance requires a leap of faith; our confidence is always less than 1.0. • Progress monitoring assures that an intervention is actually effective in a given setting (practice-based evidence).
Behavior Analysis and Progress Monitoring • Progress monitoring is the sine qua non of applied behavior analysis. • It is not applied behavior analysis if data are not collected and reviewed. • Behavior analysis has made enormous contributions to the direct measurement of behavior. • Represents the best example of practice-based evidence about evidence-based practices.
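As one purely illustrative sketch of practice-based evidence, the function below computes the percentage of non-overlapping data (PND), a common summary of single case progress-monitoring data; the baseline and intervention values, and the decision rule, are assumptions for demonstration.

```python
def percent_non_overlapping(baseline, intervention):
    """PND for a behavior expected to increase: the share of intervention-phase
    data points that exceed the highest baseline point."""
    ceiling = max(baseline)
    above = sum(1 for x in intervention if x > ceiling)
    return 100.0 * above / len(intervention)

# Invented weekly measures, e.g., percent of intervals on-task.
baseline     = [35, 42, 38, 40, 37]
intervention = [48, 55, 62, 58, 66, 71, 68]

pnd = percent_non_overlapping(baseline, intervention)
print(f"PND = {pnd:.0f}%")   # 100% here: every intervention point beats the best baseline point

# One commonly cited rule of thumb treats PND below ~70% as a weak effect.
if pnd < 70:
    print("Weak effect in this setting: revisit the intervention or its implementation.")
```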
Research to Practice: Evidence-based Education Roadmap • Research → Replicability → Sustainability → Practice
Similarities and Differences Between Behavior Analysis and Evidence-based Practice • Shared: data-based decision making; the assumption that science produces the best outcomes for consumers. • Unit of analysis: populations (evidence-based practice) vs. the individual (behavior analysis). • Source of evidence: systematic reviews (evidence-based practice) vs. experiments (behavior analysis). • What the practitioner must know: how to implement effectively (evidence-based practice) vs. the laws of behavior and how to apply them (behavior analysis).
A Prevention Model for Evidence-based Education (academic and behavioral systems) • Universal interventions (80-90% of students): all settings, all students; preventive, proactive. • Targeted group interventions (5-10% of students): some students (at-risk); high efficiency; rapid response. • Intensive, individual interventions (1-5% of students): individual students; assessment-based; high intensity; intense, durable procedures.
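A quick arithmetic sketch, assuming a hypothetical enrollment of 500 students, shows what the model's percentages imply for planning.

```python
# Hypothetical school of 500 students; the percentages come from the model above
# and describe the share of students whose needs are expected to be met at each tier.
enrollment = 500
tiers = {
    "universal (all students)":      (0.80, 0.90),
    "targeted group (at-risk)":      (0.05, 0.10),
    "intensive, individual support": (0.01, 0.05),
}
for tier, (low, high) in tiers.items():
    print(f"{tier:30s}: ~{round(low * enrollment)}-{round(high * enrollment)} students")
```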
Can We Get There From Here? • Behavior analysis has a great deal to contribute to the discussion about the most effective educational interventions. • The current emphasis on RCT puts behavior analysis in a difficult position. • If we are to have maximum impact on the field of education then we must change our behavior. • “If you are not at the table, then you are on the menu.” (Cathy Watkins, 2008)
Can We Get There From Here? • We should begin to conduct RCTs. • If we have robust interventions, they will fare well in RCTs. • RCTs are well suited to answer actuarial questions. • Decision makers are concerned with these actuarial questions. • "How big a bang will I get for my buck?"
Can We Get There From Here? Sidman, The Behavior Analyst, 2006: "To make the general contributions of which our science is capable, behavior analysts will have to use methods of wider generality, in the sense they affect many people at the same time, or within a short time, without our being concerned about any particular members of the relevant population."
Can We Get There? • We should not abandon rigorous single subject research. • Expand our repertoire to include other methods to answer different types of questions. • Engage in a social influence process to assure that single subject designs (SSDs) are included in evidence standards. • Especially critical in the special education context.