Evaluation Design Challenges on the East Coast: A Tale of Two Drug Courts American Evaluation Association November 3, 2006 Kimberly Pukstas, PhD
Research Team • Jodi Brekhus, MS • Shannon Carey, PhD • Dave Crumpton, MPA • Michael Finigan, PhD • Bob Linhares, MA • Juliette Mackin, PhD • Kimberly Pukstas, PhD • Judy Weller, BS
Research Aims Two juvenile drug courts in Maryland were selected: • Process Evaluation – What are the policies and procedures? • Outcome Evaluation – What were the results? • Cost Evaluation – What were the costs?
Study Sites • Drug Court #1: Inner-City Setting • Began 1997 • Capacity: 200 • Marijuana • 95% Male • 99% African American • Drug Court #2: Suburban Setting • Began 2000 • Capacity: 30 • Marijuana • 75% Male • 88% Caucasian
Early Concerns • Drug Court #1 – lack of variability among participants • Drug Court #2 – small sample sizes • Drug Court #2 – younger drug court program may still be in flux
Process Evaluation • Drug Court #1: 11 key stakeholders interviewed • 1 focus group held with parents (n=3) • 2 focus groups held with juveniles (n=10) • Drug Court #2: 10 key stakeholders interviewed • 1 focus group held with parents (n=5) • 1 focus group held with juveniles (n=12) • 1 individual interview with discharged youth
Process Evaluation Challenges • Overall challenges were similar for both drug courts • Program staff busy, distracted • Difficulties recruiting discharged youth • Difficulties recruiting parents (court #1) • Participants in different stages attending same focus groups • Location – need privacy, comfort
Process Evaluation Solutions • Increase timeline for process evaluation • Improve communication – point person • Allow for phone interviews when focus groups are not possible • Create more meaningful incentives for participants • Delegate more responsibilities to regional employees
Process Evaluation Results • Drug Court #1: Program had undergone recent challenges • Frustration apparent, but also a strong commitment to the program • Less adherence to the 10 key components • Drug Court #2: Staff had anxiety about the evaluation • Strong staff commitment to the program observed • Greater adherence to the 10 key components
Outcome Evaluation • Drug Court #1: Lotus (?) program • Hard-copy reports • Untrained staff • 6 variables consistently entered • Drug Court #2: Access program • Password-protected CD • Highly trained staff • Over 125 variables consistently entered
Outcome Evaluation • Drug Court #1: Sex, Race • Name, DOB • ??? • ??? • ??? • ??? • Start Date, Status • Drug Court #2: Demographics • Personal Identifiers • Baseline Health/MH • School/Employment • AOD History • Family History • Program Participation
Outcome Evaluation Solutions Drug Court #1 • Arrange Site Visit • Obtain Paper Case Files • Create Electronic Database for Research Purposes • Transfer Data from Paper Files to Electronic database
New Concerns Drug Court #1 • No Central MIS System • Lots of Missing Records • Key Variables Missing • Several Different Forms in Use • Inconsistent Data Entry • Program is in Period of Upheaval • Program Functioning Far Below Capacity
Evaluation Solutions Drug Court #1 • Prioritize Process Evaluation • Recommend TA for program staff • Develop a corrective action plan • Postpone Outcome & Cost Evaluation
Outcome Evaluation Drug Court #2 – Results Searched: • Drug Court Database • Juvenile Justice Database • Adult Criminal Justice Database • Treatment Database • Local Detention Center Data
Outcome Evaluation: Results [results chart not reproduced] * p<.05
Cost Evaluation Drug Court #2 • More than 25 interviews conducted • 21 transactions identified • Transactions include: drug court appearances, case management, drug testing, individual & group sessions, mental health evaluations, educational counseling
Cost Results Drug Court #2 • Average program cost per drug court participant is $11,689 ($41 per person per day) • Average cost of criminal justice system outcomes in the year following program involvement was 60% less for drug court participants than for the comparison group ($3,409 vs. $8,481)
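The cost figures above are internally consistent, and a quick back-of-envelope check makes the relationships explicit. The sketch below uses only the numbers reported on the slide; the implied ~285-day program length is a derived figure, not one stated in the source.

```python
# Back-of-envelope check of the Drug Court #2 cost figures from the slide.
# All dollar amounts come from the slide itself; implied_days is derived.

avg_program_cost = 11_689   # average program cost per participant ($)
cost_per_day = 41           # reported cost per person per day ($)

# Implied average length of program participation, in days
implied_days = avg_program_cost / cost_per_day
print(round(implied_days))  # -> 285

# Criminal justice outcome costs in the year following the program
drug_court_outcome_cost = 3_409   # drug court participants ($)
comparison_outcome_cost = 8_481   # comparison group ($)

savings_pct = (comparison_outcome_cost - drug_court_outcome_cost) / comparison_outcome_cost
print(round(savings_pct * 100))   # -> 60, matching the "60% less" figure
```

This confirms that the per-day rate and per-participant total imply roughly nine and a half months of program involvement, and that the reported 60% reduction follows directly from the two outcome-cost averages.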
Lessons Learned • Expect the unexpected!!! • Capacity does not equal sample size • Be prepared to alter your evaluation strategy • Missing data are still an interesting research finding • Make the most of the data that are available
Questions? • Email Dr. Kimberly Pukstas: pukstas@npcresearch.com • Final Reports are available: www.npcresearch.com