National Youth Court Center Evaluation Workshop • Janeen Buck and Jeffrey Butts • October 23, 2002 • National Youth Court Seminar on Funding and Evaluating Teen Courts - Indianapolis, IN
Introductions • Name • Title, program name • Home town (where you grew up) • One thing you want to learn today • Best TV series of all time (no repeats!)
Seminar Outline • Morning Sessions • 1.0 Evaluation Overview • 2.0 Logic Models • 3.0 Evaluation Approaches/Research Designs • Afternoon Sessions • 4.0 Findings from the OJJDP Evaluation of Teen Courts • 5.0 Data Collection and Measurement • 6.0 Selecting an Evaluator • NYCC Evaluations
1.1 Evaluation – What Is It? “Systematic method for collecting, analyzing, and using information to answer basic questions about a program” - The Program Manager’s Guide to Evaluation, Administration on Children, Youth and Families (ACYF), U.S. Department of Health and Human Services (http://www.acf.dhhs.gov/programs/core/pubs_reports/prog_mgr.html)
1.2 Why Evaluate? • Answer questions about effectiveness • Improve program operations • Compete for funds • Determine how to allocate resources • Accountability
1.3 For Whom? • Key stakeholders - those who share an interest or “stake” in the program and its success • Identify key stakeholders and include them in the evaluation process
1.4 When to Evaluate? • The “what” and “why” are known • Purpose is clear • Information exists • Resources available • Sufficient degree of program readiness
2.0 Logic Models • Background Factors – client factors (demographics, family, school, peer influences, pro-social attitudes, delinquency, substance abuse) and neighborhood factors • Intervention Factors – program components and activities: what does the program actually do? How? When? Where? To whom? • Mediating Factors – factors not directly related to the intervention, but which could influence outcomes (local context, unanticipated program effects, unexpected client behavior) • Outcomes – the intended result: 1. intermediate, 2. long-term
2.0 Logic Models – Group Exercise #1 • Background Factors • Intervention Factors • Mediating Factors • Outcomes
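(Not part of the original slides.) For participants who want to capture their worksheet electronically during Group Exercise #1, a minimal Python sketch that stores a logic model under the four column headings above; every entry is a hypothetical placeholder based on the teen court example, not program data.

```python
# Minimal sketch: a logic model captured as a plain data structure.
# The four top-level keys mirror the worksheet columns; the entries
# are hypothetical placeholders for a teen court program.
logic_model = {
    "background_factors": [
        "Client demographics", "Family", "School", "Peer influences",
        "Pro-social attitudes", "Delinquency", "Substance abuse",
        "Neighborhood factors",
    ],
    "intervention_factors": [
        "Peer jury hearings",       # what the program actually does
        "Community service hours",  # how / when / where / to whom
    ],
    "mediating_factors": [
        "Local context", "Unanticipated program effects",
        "Unexpected client behavior",
    ],
    "outcomes": {
        "intermediate": ["Completion of sanctions"],
        "long_term": ["Reduced recidivism"],
    },
}

# Print the model column by column, as it would appear on the worksheet.
for column, entries in logic_model.items():
    print(column.replace("_", " ").title())
    items = entries if isinstance(entries, list) else [
        f"{name}: {', '.join(values)}" for name, values in entries.items()
    ]
    for item in items:
        print("  -", item)
```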
3.0 Evaluation Strategies • Process • Performance Measurement • Impact • Cost Analysis
3.1 Process Evaluations • Focus: how programs evolve and operate • Advantages: • narrative of development • lessons learned • rich context • Disadvantages: • anecdotal evidence of effectiveness
3.2 Cost Analysis • Focus: assessing the costs (real/abstract) • Advantages: • Yields information of interest to funders • Disadvantages: • Difficult, time-intensive • Results may be difficult to interpret and apply
3.3 Performance Measurement • Focus: regular feedback informs improvement and fosters accountability • Advantages: • Early identification of problems; facilitates improvements • Disadvantages: • Can be burdensome if too many indicators are used
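(Not part of the original slides.) To illustrate how routine case records can feed a small set of performance indicators, a minimal Python sketch; the case records, field names, and indicator choices are hypothetical, not drawn from the OJJDP evaluation.

```python
# Minimal sketch: two performance indicators computed from hypothetical
# case records. Each record notes whether the youth completed their
# sanctions and how many days elapsed from referral to hearing.
cases = [
    {"completed_sanctions": True,  "days_referral_to_hearing": 21},
    {"completed_sanctions": True,  "days_referral_to_hearing": 35},
    {"completed_sanctions": False, "days_referral_to_hearing": 48},
    {"completed_sanctions": True,  "days_referral_to_hearing": 14},
]

completion_rate = sum(c["completed_sanctions"] for c in cases) / len(cases)
avg_days = sum(c["days_referral_to_hearing"] for c in cases) / len(cases)

print(f"Sanction completion rate: {completion_rate:.0%}")
print(f"Average days from referral to hearing: {avg_days:.1f}")
```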
3.4 Impact Evaluation • Experimental • Quasi-Experimental • Non-experimental
3.4a Experimental Designs • The “gold standard” • Random assignment (treatment/control groups) • Causality can be attributed to the program • Advantages: • definitive findings • Drawbacks: • impossible in many program settings • ethical considerations • expensive
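(Not part of the original slides.) To make random assignment concrete, a minimal Python sketch that splits a hypothetical pool of referred youth into treatment (teen court) and control (traditional processing) groups; the participant IDs and group sizes are invented for illustration.

```python
import random

# Minimal sketch of random assignment to treatment and control groups.
# The participant IDs are hypothetical placeholders.
participants = [f"youth_{i:03d}" for i in range(1, 41)]

random.seed(2002)            # fixed seed so the assignment is reproducible
random.shuffle(participants)

midpoint = len(participants) // 2
treatment_group = participants[:midpoint]   # referred to teen court
control_group = participants[midpoint:]     # traditional processing

print(f"Treatment: {len(treatment_group)} youth, Control: {len(control_group)} youth")
```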
3.4b Quasi-Experimental Designs • Comparison groups, but no random assignment • Advantages: • detect change in outcomes between two groups • Drawbacks: • can’t attribute outcomes to the intervention alone
3.4c Non-Experimental Designs • No comparison group (e.g., repeated measurements, staggered start-stop) • Advantages: • can detect changes related to the intervention • Disadvantages: • can’t attribute change to the intervention alone