Comparing Empirically Supported Treatments and Evidence-Based Practice Brian C. Chu, Ph.D. Rutgers University, GSAPP Email: BrianChu@rci.Rutgers.edu
What is Evidence-Based Treatment? How do you define EBT? • Guidelines: • Often endorsements of an official government agency • Consensus Statements • Consensus summaries of the evidence by experts in the field • Merit Behavioral Care Corp (1997): “Consistent with national standards, the Medical Affairs Committee of MBC endorses clinical practice guidelines.” • Consumer/practitioner preferences • Research-based conclusions
Empirically-supported Treatments • APA Task Force on Psychological Intervention Guidelines • 1995 Task Force on the Promotion and Dissemination of Psychological Procedures (Div 12; Clinical Psychology) • Goals: • Charged with identifying the state of the science regarding clinical interventions • Consider issues in the dissemination of psychological treatments of known efficacy • Background: • Evidence-based medicine (Sackett et al., 1997) • (a) patient care can be enhanced by the acquisition and use of up-to-date empirical knowledge • (b) it is difficult for clinicians to keep up with newly emerging information relevant to practice • (c) but if they do not, their knowledge and clinical performance will deteriorate • (d) so, clinicians need summaries of evidence and the ability to access that information.
1995 Task Force Products • Task Force (1995); Chambless, 1996; 1998 • 1995: Established criteria • 1995: Identified a preliminary list of 25 ESTs • 1998: List had grown to 71 treatments • 1999: Div 12 took full ownership of maintaining the list (APA declined) • Maintains an ongoing list and information center: http://www.apa.org/divisions/div12/cppi.html • Additional Task Forces • Life-span perspective: Spirito (1999; J Pediatric Psych) • Youth: Lonigan (1998; J Clinical Child Psych) • Div 12: Nathan & Gorman (1998; A Guide to Treatments that Work) • Adult, child, marital, family therapy: Kendall & Chambless (1998, JCCP) • Elderly: Gatz (1998; J. Mental Health & Aging)
Criteria for Empirically-Supported Treatments (Div 12 Task Force; see Chambless & Ollendick, 2001) • Well-established Treatments • At least two good between-group design experiments must demonstrate efficacy: • Tx superior to pill or psychological placebo, OR • Tx equivalent to an already established treatment, OR • A large series of single-case design experiments with: • Use of good experimental design, and • Comparison of the intervention to another treatment • Other criteria: • A treatment manual must have been used • Characteristics of the sample must have been clearly delineated • Effects must be demonstrated by at least 2 different investigators or teams
Criteria for Empirically-Supported Treatments (Cont.) • Probably Efficacious Treatments • Two experiments show the Tx is superior to a waitlist (WL) control, OR • Meet all criteria of “Well-established” but demonstrated by only 1 research team, OR • A small series of single-case design experiments • Experimental Treatments: • Treatments not yet tested • Treatments not yet meeting criteria
Concerns with Results from 1995 Task Force • Sample (patient) definition relies on DSM? • Emphasis on Randomized Clinical Trials (RCTs) • Treatments on EST list are mostly behavioral • Emphasis on Treatment Manuals • Insufficient evidence of how ESTs transport to traditional clinical settings: • Efficacy vs. Effectiveness
Efficacy vs. Effectiveness • Efficacy • Assessment of outcomes in more controlled settings • Weisz et al. (1995; “Bridging the Gap”; JCCP) • Research • Clients recruited, homogeneous, narrow problem focus • Lab or school settings • Therapists receive specialized and intensive training • Clinic • Clients are naturally referred (more severe, more complicated), heterogeneous • Clinic or hospital settings • Therapists have large caseloads and rarely receive specialized training • Effectiveness: • Do research findings generalize… • To ordinary clinical settings? • To ordinary clients?
Punctuated Continuum between Efficacy and Effectiveness (Chorpita, 2003)
Evidence-Based “Practice” • 2005 APA Presidential Task Force on Evidence-Based Practice • Goals: • To integrate science and practice, but consider the full range of evidence that policy-makers must consider • Background: • Also aligned with evidence-based medicine (Sackett et al., 1997) • “the conscientious, explicit, and judicious use of current best evidence in making decisions about care of individual patients.” • Advocates for improved patient outcomes by informing clinical practice with relevant research • Efficacy vs. Effectiveness vs. Clinical Utility: • Efficacy: the systematic and scientific evaluation of whether a treatment works. • Effectiveness: the extent to which treatment effects extend to natural clinic settings with natural client populations. • Clinical utility: the applicability, feasibility, and usefulness of the intervention in the local or specific setting where it is offered.
2005 Task Force Products • Position statements: • APA Task Force (2005; American Psychologist) • Report of the 2005 Presidential Task Force on EBP • Definitions: Evidence-Based Practice in Psychology (EBPP) • Integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences • To promote effective psychological practice and enhance public health by applying empirically supported principles of psychological assessment, case formulation, therapeutic relationship, and intervention • Relation between EST and EBPP • EBPP is more comprehensive • EST starts with a treatment and asks whether it works for a certain diagnosis or problem. • EBPP starts with the patient and asks what research evidence applies. • EBPP articulates a decision-making process for integrating multiple streams of research evidence • Pros and Cons of Each? • How do you define Clinical Expertise?
Establishing Efficacy and Effectiveness • Hierarchical order for types of research evidence in their contribution to conclusions about efficacy • In ascending order of value: • Clinical opinion and observation (insufficient on its own) • Case studies • Quasi-experimental studies • Controlled single-case design studies • Uncontrolled effectiveness studies (with benchmarking) • Quasi-experimental effectiveness studies • Randomized controlled efficacy studies • Randomized controlled effectiveness studies • RCTs are the most stringent way to evaluate treatment efficacy because they most effectively rule out threats to internal validity in a single experiment.
Best Research Evidence • Each research design is suited to different types of questions: • Clinical observation & basic science • Sources of innovations and hypotheses • Qualitative research • To describe subjective experiences • Systematic case studies • When aggregated, can compare individual patients with others • Single-case experimental designs • To establish causal relationships in the context of an individual • Public health and ethnographic research • To track availability, utilization, and acceptance of services
Best Research Evidence (Cont.) • Process-outcome studies • To identify mediators and mechanisms of change • Studies of interventions in naturalistic settings • To assess the ecological validity of treatments • Randomized Clinical Trials (and logical equivalents) • To draw causal inferences about the effects of interventions • Meta-analysis • To synthesize results from multiple studies, test hypotheses, and estimate effect sizes
The Role of Clinical Expertise • Clinical Expertise: Definition? • 2005 TF: Competence attained by psychologists through education, training, and experience that results in effective practice • Role of Clinical Expertise? • Necessary for identifying and integrating the best research evidence with clinical data (e.g., information obtained from the patient) • Components of Clinical Expertise: • Assessment, diagnostic judgment, case formulation, treatment planning • Clinical decision making, treatment implementation, and monitoring of patient progress • Interpersonal expertise • Continual self-reflection and acquisition of skills • Appropriate evaluation and use of research evidence in both basic and applied psychological science • Understanding the influence of individual and cultural differences on treatment • Seeking available resources (consultation, adjunctive services) as needed • Having a cogent rationale for clinical strategies
Why is integrating evidence into practice important? • Individual client/patient level • Maximize clinical benefits • Individual therapist/clinician level • Accountability: Use a treatment that is effective and cost-effective • Makes your job as a clinician easier • Makes use of research that has already been done • Narrows the choices you need to make • Ethics • Is it malpractice if you don’t use EBTs? • 1994: Hawaii settled a class-action lawsuit to ensure that the mental health needs of students receiving public education would be met (Felix Consent Decree)
Why would you use EBTs (Cont.)? • Why, if you were in charge of a clinic (clinic level)? • Keeps the clinic competitive • Reimbursable • Why, if you are making recommendations to your colleagues (psychological field level)? • Establishes psychology as part of the health care field • Justification for funding • Local or national policy level • Ensures better health care for the population
How could you as an individual therapist integrate Science and Practice? • Stay current with the EST literature • Implement treatments with empirical support • APA Div 12 maintains an ongoing list and information center: http://www.apa.org/divisions/div12/cppi.html • Adopt a scientist-practitioner approach • Hypothesis-testing approach • Monitor outcomes throughout treatment