The Survival of Adventure Therapy: Profession at a Crossroads
Adventure Therapy Best Practices Conference, Durham, NH
Michael Gass, Department of Kinesiology, University of New Hampshire
Adventure Tx field: Field of “opportunistic existence”
• 1901 - Tent therapy at Manhattan State Hospital East to isolate TB patients from other patients
• 1980s - Rapid growth of challenge courses and training in adolescent psychiatric hospitals (connected to the rise of inpatient psychiatry) (e.g., Charter Hospitals) (Michael Stratton fund!)
• 1990s - Expansion in the number and size of wilderness therapy programs (e.g., formation of OBHIC in 1999) (connected to the fall of inpatient psychiatry)
Opportunistic growth of current wilderness therapy: Thanks to HMOs and Prozac? Past factors
• Demise of psychiatric facilities due to the strength of insurance and pharmaceutical companies (AT symptom: cutting down the trees and telephone poles of “experiential therapy” challenge courses overnight)
• Reacting to costs and sheer greed, insurance companies restricted length of stay to the point that psychiatric hospitals became strictly short-term, palliative treatments for acutely suicidal patients (Santa, 2007)
• Treatment focus shifted to the neurotransmitter level - the introduction of Prozac, a mood-altering drug with fewer side effects that increases the serotonin available at the synapse (Santa, 2007)
HMOs and Prozac “leftovers”: Then what?
• Many adolescents did not respond well when community-based, wraparound models were replaced by ample meds and straightforward social skills training
• The failure of these treatment programs moved the field from an “era of desperation” to an “era of integrated continuity of care,” with a rapid increase in wilderness therapy programs (1995 to present)
• 18,000 young people in NATSAP programs alone in 2005 (Santa, 2007)
AEE’s investment: Parallel process of accreditation
• 1990s - Expansion in the number and size of wilderness therapy programs (e.g., formation of OBHIC in 1999) (connected to the fall of inpatient psychiatry)
• 1990 deaths of Michelle Sutton and Kristin Chase; 1994 death of Aaron Bacon
• 1990s - Williamson & Gass produce the first AEE Accreditation standards, with subsequent editions
• 1993 - First adventure program accredited
Evolution of AEE Accreditation
• Deaths, access, and fear of the loss of self-governance led to awareness and a “call to save the field”
• AEE & 24 sponsoring programs accelerated the decision-making process with money and huge personal investment by many
• Make/force the “bottom 20% of programs” to become better
• Preparation - went the humanistic, sharing, “painless” route
• Great spinoffs (e.g., TAPG ethical guidelines, books), but a case of the “top 20% of programs” getting better
• Action - maintenance??? Belief in the common good may only take us so far
Winds of change during the first week of March, 2007
• SAMHSA website, next version
• Front page of the Education section of the NY Times: “In War Over Teaching Reading, a U.S.-Local Clash”
• NICE guidelines in Great Britain: “…what it would take for outdoor therapy to be on NICE guidelines and thought the task of this is incredibly daunting, but as a vision for outdoor therapy/outdoors. It is what we should be aspiring to - even if it feels like it is another few decades away! It is a serious agenda that we cannot hide from - the future of sustaining practice is partly hinged on this.” Kaye Richards, 03/07/07
Changing Paradigm
Leaving the open-ended, opportunistic paradigm for…
an evidence-based “choice paradigm”
“Choice of Drug” paradigm: What do you choose?
• Scientifically based evidence backing the effectiveness of a drug with proven results, or a drug that has shown no effectiveness?
• A drug that costs $400 or one that costs $1,000?
• A drug that is the same no matter where you take it or who gives it to you, or one that does/may change with administration?
“Choice of Drug” paradigm: You choose…
• One with documented, unbiased evidence, with multiple tests done by different researchers
• One that is cost effective (and that you can afford)
• One with fidelity - one that does not change with who administers it to you
• Welcome to the “crossroads”: embrace the evidence-based practice paradigm as our source of flourishing and the way we choose to practice, or ????
Call to moral compass
• “Because that isn’t where it is at; it’s back in the city, back in downtown St. Louis, back in Los Angeles. The final test is whether your experience with the sacred in nature enables you to cope more effectively with the problems of man (sic). If it doesn’t enable you to cope more effectively with the problems…then when that happens by my scale of value it’s failed.”
• Unsoeld, W. (1974). “Spiritual Values of the Wilderness,” keynote address, AEE, Estes Park: “Why don’t we stay in the wilderness?”
Report card on AT EBP movement • Novel, fresh ideas were funded • Rocky Kimball “comeback” response • Lack of judging paradigms in education and mental health professions • Actions of TAPG??…. • No longer is this the case….
Effects on other approaches/programs: Search for the actual “truth” or “outcomes” of well-designed and effective programs
• David Barlow’s (2004) landmark APA article:
• In the 1990s, large amounts of money were invested, with little supporting evidence, in programs addressing youth and adult violence that simply didn’t work
• In some cases these intervention programs created more harm than no program at all
Examples of well-known, ineffective programs
• The 1990s saw the emergence of ineffective but popular programs:
• (1) Gun buyback programs - two-thirds of the guns turned in did not work, and almost all of the people turning in guns had another gun at home
• (2) Bootcamp programs - failed to produce any difference in juvenile recidivism rates compared to standard probation programs, but were four times as expensive
Ineffective programs continued
• (3) DARE programs - the traditional 5th-grade program failed to decrease drug use despite the fact that by 1998 it was used in 48% of American schools with an annual budget of over $700 million (Greenwood, 2006)
• (4) Scared Straight programs - inculcated youth more directly into a criminal lifestyle, actually leading to increases in crime by participating youth and requiring $203 in corrective programming to address and undo every dollar originally spent on programming
Evidence means more than outcomes: cost-effectiveness measures (e.g., taxes)
• With programs that work, can you show a “bottom line” net gain?
• And deliver consistent, quality programs?
• Dr. Steve Aos, WSIPP http://www.wsipp.wa.gov/default.asp
March 2004 predictions on the success of AT in 2010
(1) …have documented research on what tx does and does not do
(2) …have stronger risk management systems, particularly when screening for which program for which client
(3) …have indicators (e.g., best practices, accreditation) of what quality programs are and what they are not
(4) …match external organizations’ and government “value systems”
March 2004 predictions on the success of AT in 2010 (cont.)
(5) …adapt to be more applicable for the changing demographics of America
(6) …become more recognizable to the public, being differentiated from other applications of adventure programming
(7) …understand where programming “fits” along a client’s continuity of care
Building the Adventure Programming Research House
• Many parts to building a house
• Each serves critical needs (e.g., basement - foundation)
• One purpose of the house roof is to protect all of the elements of the house from the oncoming storm
Elements of building a roof • Need to build the right roof for the right conditions to produce the desired results • Need the right equipment • At the end of the project, the housing inspector will come to examine the roof and let you know if it meets established code or not
“Legacy” of previous “roof builders” in AE research
• One-shot efforts
• Quantoid takeover?
• Keith’s 2006 SEER analysis (3 of 14 in JEE)
• Limited number in AP in education and therapy
• Self-concept based
• Limited power
• Overgeneralization
• Need not only to demonstrate effectiveness, but to differentiate from other findings
Legacy example
• Countering the wilderness camping/group home research findings of Jones, Lowe, and Risler (2004):
• BMtA participants showed significantly less recidivism over a three-year period than participants in OTP and YDC programs
• Cost savings of over $150,000 per participant
Thirteen elements of EBP AP Research
(1) Matches a form of evidence-based research evaluation (e.g., Blueprints, DOE, SAMHSA)
(2) Provides case studies or clinical samples
• Illustrates actual clinical examples (especially in time-series designs)
• Actual examples protecting client-identifying characteristics
(3) Experimental design
• RCT
• Quasi-experimental with appropriate comparison group(s) and equal n’s
• Watch for violations of test assumptions
(4) Benefit-cost analysis
• Benefits and costs combined into a ratio
• Compared to other programs
• Understandable to clients in terms of savings (e.g., Aos)
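As an illustration of the benefit-cost ratios used in WSIPP-style analyses, a minimal sketch follows. All dollar figures and program names are hypothetical placeholders, not real program data:

```python
# Illustrative benefit-cost comparison (hypothetical numbers only).

def benefit_cost_ratio(benefits_per_participant, costs_per_participant):
    """Dollars of benefit produced per dollar of program cost."""
    return benefits_per_participant / costs_per_participant

# Hypothetical programs: name -> (benefits, costs) per participant
programs = {
    "Program A": (24_000, 8_000),   # $3 returned per $1 spent
    "Program B": (10_000, 12_500),  # $0.80 returned per $1 spent
}

for name, (benefits, costs) in programs.items():
    ratio = benefit_cost_ratio(benefits, costs)
    net_gain = benefits - costs  # the "bottom line" net gain per participant
    print(f"{name}: ratio = {ratio:.2f}, net gain = ${net_gain:,}")
```

A ratio above 1.0 (net gain above $0) is the kind of “bottom line” evidence funders and taxpayers can act on.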
(5) Results reporting
• Significance testing
• Effect sizes
• Benefit-cost analysis
• Other meaningful reporting structures (e.g., survival curves)
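Effect sizes report how large a treatment difference is, not merely whether it is significant. A minimal sketch of Cohen’s d (the pooled-SD standardized mean difference); the scores below are invented for illustration:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: standardized mean difference using the pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical post-test scores for a treatment and a comparison group
treatment = [14, 16, 15, 18, 17, 16]
control = [12, 13, 11, 14, 12, 13]
print(f"d = {cohens_d(treatment, control):.2f}")
```

By common convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 large.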
Kaplan-Meier survival curve analysis for Legacy participants over three (3) years (under review)
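A Kaplan-Meier curve estimates, at each follow-up time, the proportion of participants who have not yet experienced the event (e.g., recidivism), while handling participants who drop out of observation (censoring). A minimal sketch of the estimator; the example data are invented:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times:  follow-up time for each participant (e.g., months until
            recidivism, or until observation ended)
    events: 1 = event occurred at that time, 0 = censored (lost to follow-up)
    Returns a list of (time, survival_probability) steps."""
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        # events at this time, counted while these participants are still at risk
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d:
            survival *= 1 - d / at_risk
            curve.append((t, survival))
        at_risk -= sum(1 for ti in times if ti == t)  # remove events + censored
    return curve

# Hypothetical 8-participant cohort, times in months
curve = kaplan_meier([6, 6, 12, 18, 24, 24, 30, 36],
                     [1, 0, 1, 1, 0, 1, 0, 0])
for t, s in curve:
    print(f"month {t}: {s:.1%} event-free")
```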
(6) Training models
• Clear
• Uniform
• Tested
• Methods of validating/certifying/licensing adherence to the model
(7) Power of research design - NAROPA
• Power calculation (1 − β)
• N - increase your n
• A - relax your alpha level (.10, not .05)
• R - stronger reliability
• O - one-tailed (or directional) test
• P - potency of treatment is increased
• A - analysis strategy enhanced
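A rough sketch of how four of the NAROPA levers (N, A, O, P) raise statistical power. This uses a normal approximation rather than an exact noncentral-t calculation, and all numbers are illustrative; the reliability (R) and analysis-strategy (A) levers act indirectly by shrinking error variance and so are not modeled here:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def norm_ppf(p):
    """Inverse of norm_cdf by bisection (adequate for illustration)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def power_two_sample(d, n_per_group, alpha=0.05, one_tailed=False):
    """Approximate power (1 - beta) of a two-group comparison
    with standardized effect size d, via a normal approximation."""
    z_crit = norm_ppf(1 - (alpha if one_tailed else alpha / 2))
    noncentrality = d * math.sqrt(n_per_group / 2)
    return 1 - norm_cdf(z_crit - noncentrality)

base = power_two_sample(d=0.5, n_per_group=30)
print(f"baseline (d=0.5, n=30, alpha=.05, two-tailed): {base:.2f}")
print(f"N - double n to 60:        {power_two_sample(0.5, 60):.2f}")
print(f"A - relax alpha to .10:    {power_two_sample(0.5, 30, alpha=0.10):.2f}")
print(f"O - one-tailed test:       {power_two_sample(0.5, 30, one_tailed=True):.2f}")
print(f"P - stronger effect d=0.8: {power_two_sample(0.8, 30):.2f}")
```

Each lever raises power above the baseline; underpowered studies are one reason so much earlier AE research could not demonstrate effectiveness.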
(8) Proper instrumentation
• The “highest value” in the population being analyzed
• Possesses well-established and high levels of validity and reliability
• Appropriate for the client group
• Strong levels of objectivity
(9) Cultural variability and sensitivity
• Treatment accounts for differences in SES, gender, language, intellectual abilities, and cultural characteristics
(10) Treatment/intervention fidelity
• Clear treatment manual available documenting well-defined and previously tested treatment/intervention practices
• Testing procedures in place to verify maintenance of intervention procedures
(11) Background literature support
• Building off of at least two highly similar control-group studies or a large series of single-case study designs (e.g., more than 30)
(12) Replication
• Treatment program has been replicated at different sites and with different populations
(13) Length of treatment effectiveness
• “Gold standard” of one year
• Greater length possible?
• If not one year, then six months? 30 days?
What do we have to do to change the AP field in EBP research?
(1) Get people in programs interested in the value of EBP at the level they’re at
(2) Get on lists
(3) Defend aggressively against poor research
(4) Learn from the AEE Accreditation Program
What do we have to do to change the AP field in EBP research? (cont.)
(5) Grow our own and see them as long-term investments, knowing there will be “attrition” along the way
(6) Attract external researchers to conduct “informed and powerful” research on adventure programs
(7) Funding
What do we have to do to change the AP field in EBP research? (cont.)
(8) Motivate the field to follow:
• increased awareness
• enhanced and rapid decision making WILL happen
• preparations will not be a choice
• set up a supportive and renewing action cycle
What do we have to do to change the AP field in EBP research? (cont.)
(9) Create “teams of success”: researchers, funders, programmers
(10) Follow up on current efforts
What’s being done? • CORE established by AEE • REAP as the “roof builders” • REAP Conference in Santa Fe on March 19-21, 2008 • Data archive funded by NATSAP
NATSAP Data Archive
• Collect aggregate outcome information helping NATSAP communicate the nature and quality of member programs’ work to stakeholders (Achenbach Youth Self-Report and Child Behavior Checklist)
• Allow programs to access clinically relevant assessment information for treatment planning purposes
• Allow programs access to their own de-identified aggregate information for quality improvement
• Organize the data into an accessible archival database for additional research projects
What stage of “buy-in” for EBR by adventure programs are you in?
• Awareness stage - don’t know what it is, unaware of the benefits or the controls dictated by EBP
• Decision-making stage - weigh pros and cons, but remain vague about actually making changes or choosing the pro side
• Preparation stage - make a decision to implement this process, generated by a “value added” approach of sorts, from a desire to have a more effective program or for financial reasons
• Action stage - partner support structure in place to aid continuation