Formative Evaluation of the Defending Childhood Initiative Michael Rempel, Melissa Labriola, Rachel Swaner, Kathryn Ford, and Julia Kohn Center for Court Innovation Peter Jaffe and Marcie Campbell Centre for Research on Violence Against Women & Children David Wolfe Centre for Addiction and Mental Health Presented at the Defending Childhood Initiative Grantee Meeting, Washington, D.C., January 25, 2011
Organization of This Presentation • Evaluation 101 • The Defending Childhood Formative Evaluation • Evaluability Assessment
Evaluation 101 • Process Evaluation • Impact Evaluation • Cost Analysis • Action Research
Process Evaluation • Definition: Describes planning process, operations, and outcomes for project participants • Components: • Qualitative • Quantitative • Fidelity Analysis
Process Evaluation (continued) • Qualitative: Description of: • Project goals and objectives • Planning – team members, needs, decisions, challenges • Operations – all elements of the final project model • Quantitative: Data on: • Participant baseline characteristics and service needs • Treatment dosage (e.g., days/sessions attended of each service) • Community outreach (schools, workshops, public events, etc.) • Fidelity Analysis: Did practices mirror the intended model?
Process Evaluation (continued) • Formative Evaluation: • Focus on planning process or early operations • More qualitative than quantitative • Participatory Evaluation: engages project planners or staff in defining evaluation scope and content
Impact Evaluation • Definition: Tests project impact in achieving its goals; virtually always requires a comparison condition • Experiment: random assignment to conditions • Quasi-Experiment: naturally occurring comparison: • Pre-Post: before vs. after a project started • Contemporaneous: e.g., not enrolled for logistic reasons • Comparison Site: nearby neighborhood/jurisdiction • Non-Experiment: not valid: • Completers versus Dropouts • Participants Only: Before vs. After Participation
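The experimental design above rests on random assignment to conditions. As a minimal sketch (the function name, group labels, and seed are hypothetical, not part of the evaluation's actual design):

```python
import random


def assign_conditions(participant_ids, seed=42):
    """Randomly assign each participant to a treatment or control group.

    A fixed seed makes the assignment reproducible for audit purposes;
    a real trial would document its randomization procedure explicitly.
    """
    rng = random.Random(seed)
    return {pid: rng.choice(["treatment", "control"]) for pid in participant_ids}


groups = assign_conditions(range(10))
```

Because every participant has an equal chance of landing in either group, the two groups should differ only by chance, which is what makes the experimental comparison valid where completers-vs.-dropouts comparisons are not.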
Other Types of Evaluation • Cost Analysis: often of great interest to policymakers • Action Research: • Provides immediate and useful feedback about everyday program operations and performance • Minimal or no technical research expertise required • Typically involves tracking simple performance indicators with forms, spreadsheets, or simple databases (e.g., Access)
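The action-research tallies described above (simple performance indicators kept in forms, spreadsheets, or a small database) can be illustrated with a short sketch; the row fields and indicator names below are hypothetical examples, not the initiative's actual measures:

```python
from collections import Counter

# Hypothetical indicator log: one row per recorded program activity,
# the kind of data a site might keep in a spreadsheet or Access table.
rows = [
    {"month": "2011-01", "indicator": "sessions_attended"},
    {"month": "2011-01", "indicator": "workshops_held"},
    {"month": "2011-02", "indicator": "sessions_attended"},
]


def summarize(rows):
    """Tally each performance indicator across all logged rows."""
    return Counter(r["indicator"] for r in rows)


counts = summarize(rows)
```

No statistical expertise is needed for this kind of tracking, which is the point of action research: quick, routinely updated counts that give staff immediate feedback on operations.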
This Evaluation • Phase One: Formative (process) evaluation only • Phase Two: Process, impact, and cost evaluation, focusing on four sites
Looking Ahead: Challenges for Phase Two • How to conduct an impact evaluation: • On a “package of strategies”? • On public awareness strategies not targeted at a specific program participant group (vs. a specific comparison group)? • On strategies whose effects may not be seen for years? • Solution: combine rigorous quantitative analysis, comparison condition(s), and alternative methods (rich observation, case studies, focus groups, etc.)
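When pre-post and comparison-site designs are combined, the usual estimator is the difference-in-differences: the change in the treated site minus the change in the comparison site. A minimal sketch, with made-up numbers (e.g., incident rates per 1,000 children):

```python
def difference_in_differences(t_pre, t_post, c_pre, c_post):
    """Estimate program impact as the treated site's pre-to-post change
    minus the comparison site's change over the same period, netting out
    trends that affected both sites."""
    return (t_post - t_pre) - (c_post - c_pre)


# Treated site fell from 40 to 30; comparison site fell from 38 to 36.
effect = difference_in_differences(t_pre=40.0, t_post=30.0, c_pre=38.0, c_post=36.0)
# Estimated effect: -8.0 (a 10-point drop, of which 2 points reflect the shared trend)
```

This addresses one of the challenges above: a plain before-vs.-after comparison in the treated site alone would credit the program with the full 10-point drop, including change that happened everywhere.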
Goals of the Evaluation • Implement participatory research process • Conduct formative evaluation • Identify outcomes and perform data assessment • Produce evaluability assessments and Phase II evaluation design
Ecological Framework • Level of the Ecological Framework: • Societal Level • Community Level • School Level • Inter-/Intrapersonal Level • Type of Strategy: • Prevention • Intervention • Public Awareness
Goal One: Participatory Research Process • Literature Overview: Engage sites with relevant findings concerning the prevalence, effects, and existing strategies to address CEV • Mapping Goals and Strategies: Understand each site’s process of identifying goals, strategies, and outcomes • Logic Model: Develop comprehensive logic model for each site, linking goals to strategies to desired outcomes, through an iterative, consultative process
Goal Two: Formative Evaluation • Multi-Agency Collaboration: Document persons, agencies, roles and management of each site’s initiative • Problems and Needs: Detail each site’s assessment of the local CEV problem and current unmet needs • Policies and Strategies: Provide rigorous account of each site’s prevention, intervention, and/or public awareness strategies (across the Ecological Framework) • Barriers: Describe each site’s barriers and resulting problem-solving methods or policy modifications
Goal Three: Outcomes and Data Assessment • Outcome Identification: Finalize site-specific and cross-site outcomes and performance indicators relating to chosen prevention, intervention, and public awareness strategies • Data Assessment: Assess existing information systems and future data collection needs in each site
Goal Four: Deliverables to NIJ • Eight Evaluability Assessments • Four-Site Phase II Evaluation Design
Evaluability Assessments Outline • Project Summary • Data • Evaluability
I. Project Summary • Local CEV Problem • CEV rates/problems • Status quo resources and assets • Status quo gaps and service needs • Site Defending Childhood Initiative • Structure of the initiative/collaborative • Logic Model • Description of strategies and scale
II. Data • For each intermediate goal: • Data needs • Existing solutions • Planned solutions
III. Evaluability • Strengths, weaknesses, opportunities, and threats: • Collaboration • Policy formalization • Volume (by year) • Local research capacity • Evidence-based practices • Sustainability and additional resources • Data capacity and gaps • Comparison conditions