California Probation Parole and Correctional Association 77th Annual Training Conference SB81/ROCK: What Makes Evaluation So Important? September 18, 2008 Prepared by Urban Strategies Council Evaluation Team Steve Spiker, Director of Research Bill Heiser, Research and Program Associate Junious Williams, CEO www.urbanstrategies.org
Outline • Presenter Biographies • The Council • What is Program Evaluation? • Why Evaluate? • The ROCK Program • Evaluation Outcomes
Urban Strategies Council • Non-profit founded in 1987 in Oakland, California • Mission is the elimination of persistent poverty by building vibrant, healthy communities • Operating Programs: • Economic Opportunity • Education Excellence • Community Safety & Justice • Support Programs: • Research & Technology • Community Capacity Building
Community Safety & Justice Current Projects • Alameda County Reentry Network (www.acreentry.org) • Reentry Health Gap Survey • Oakland Community Policing • Oakland Crime & Homicide Analysis • Alameda County Violence Prevention Initiative • Community Service Gateway Reentry Project Past Projects • Reentry Health Task Force • Richmond Violent Crime Analysis
What is Program Evaluation? • A formalized approach to studying processes and impacts • Includes both quantitative and qualitative methods • In recent years, evaluation has focused more on utility, relevance, and practicality and less on scientific validity • Focus points: • What do you need to know? • How can it be applied?
What is Program Evaluation? • 5 dimensions: • Needs Assessment (Why do an evaluation? What must we know?) • Program Theory (How does the program work?) • Process Analysis (Formative Evaluation) • Impact Analysis (Were the goals met?) • Cost-Benefit & Cost-Effectiveness Analysis (Is what we did scalable beyond a pilot? See the formula below.)
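For the cost-effectiveness dimension, one common way to summarize the comparison (a generic formulation, not one specific to the SB81/ROCK evaluation) is the incremental cost-effectiveness ratio:

$$\mathrm{ICER} = \frac{C_{\text{program}} - C_{\text{comparison}}}{E_{\text{program}} - E_{\text{comparison}}}$$

where \(C\) is the total cost of each condition and \(E\) is the measured effect, such as the number of probationers who do not return to state prison.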
Why Evaluate? • Program improvement • Comparison between programs • Demonstrate measurable impact of new programs • Decision making: are there particular groups who will benefit most? • Independent verification of program suitability and success
Evaluation Considerations • Why are we thinking of an evaluation? • Who is the evaluation for? (Audience) • What type of information will we need? (Data) • Goals, processes or outcomes? • Evaluators can help improve program design • Evaluators can assist in building skills, knowledge and abilities of staff • Include them as part of your project team, not as outsiders
Evaluation Issues • Are you asking the right questions? • There may be multiple “right” answers • Is there a broader context you need to consider? • Need to include stumbling blocks and failures! • Consider assumptions in model/theory • How well can we generalize our results?
When to hire an evaluator? • Ideally at the proposal-writing stage! • Preferably before you start providing services • Never wait until the first annual reports are due!
SB81/ROCK Evaluation • Formative Evaluation • Data Collection & Instrument Design • Data Analysis • Causation & Recommendations
Population Characteristics • Total probationers aged 18-24 in Oakland/Hayward: 264 • 228 Male (86%), 36 Female (14%)
SB81/ROCK Evaluation • Performance goals to be measured: • Enrollment in recommended programs and services; • Participation in the required or recommended services; • Acquisition of the intended knowledge or skills; • Completion of the program; and • Achieving the intended purpose of the program such as GED, employment, cognitive behavioral change, effective anger management, reduced and more effectively managed conflict, etc.
Formative Evaluation • 6 Months in length • Refining program definitions, hypotheses and procedures • Finalize specific outcome and results measures • Clarify and validate selection procedures • Baseline data analysis
Process Documentation • In order to evaluate a new program, we must know (i.e., document) the following: • Procedures and processes: what happens in certain situations • Referrals: how, when, and why they are made • Theory of change: why each part will have an impact • Contextual and environmental conditions affecting program implementation and outcomes
Data Collection • Full probation population demographics • LS/CMI assessment scores • Referrals: when, by whom, why, and how well they worked • New arrests and convictions • Probation violations and revocations • Program removal: moving out of county, death • Surveys from pre/post cognitive behavioral classes • Interviews with DPOs on program implementation and processes
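To make the data collection plan concrete, the sketch below shows one way the measures listed above might be organized per probationer for analysis. The field names are illustrative assumptions, not the actual SB81/ROCK instruments or variable names.

```python
# Hypothetical per-probationer record for the measures listed on this slide.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional, Tuple

@dataclass
class ProbationerRecord:
    probationer_id: str
    age: int
    gender: str
    city: str                                   # e.g., "Oakland" or "Hayward"
    ls_cmi_score: Optional[int] = None          # LS/CMI assessment score
    # Each referral: (date made, program referred to, referred by, reason)
    referrals: List[Tuple[date, str, str, str]] = field(default_factory=list)
    new_arrests: int = 0
    new_convictions: int = 0
    probation_violations: int = 0
    revocations: int = 0
    removal_reason: Optional[str] = None        # e.g., "moved out of county"
    pre_class_survey_score: Optional[float] = None   # before cognitive behavioral class
    post_class_survey_score: Optional[float] = None  # after cognitive behavioral class
```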
Evaluation Outcomes • Causation: are the outcomes for the target group statistically different from those in the comparison group? • Can we definitively say that this model of services reduces recidivism to state prison? • If yes, then how well does this model suit the general probation population in Alameda County (or the State)?
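As a concrete illustration of the causation question, a simple way to test whether a binary outcome such as return to state prison differs between the target and comparison groups is a chi-square test of independence. The sketch below is a minimal, hypothetical example with placeholder counts, not the actual SB81/ROCK analysis.

```python
# A minimal sketch of comparing a binary outcome (returned to state prison
# vs. did not) between the target and comparison groups.
# The counts are placeholders, not actual SB81/ROCK data.
from scipy.stats import chi2_contingency

# Each row: [returned to state prison, did not return]
target_group     = [12, 88]   # placeholder counts for program participants
comparison_group = [25, 75]   # placeholder counts for the comparison group

chi2, p_value, dof, expected = chi2_contingency([target_group, comparison_group])

print(f"chi-square = {chi2:.2f}, p-value = {p_value:.3f}")
if p_value < 0.05:
    print("The difference between groups is statistically significant at the 5% level.")
else:
    print("No statistically significant difference was detected.")
```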
Follow Up • For more information on: • The ROCK evaluation • Other evaluation services • Mapping services or spatial analysis • Analytical services • Program design Please contact us: Steve Spiker, Research Director steves@urbanstrategies.org or 510-893-2404