Update on the Next Generation of GEAR UP Evaluation Studies
2010 NCCEP/GEAR UP Capacity Building February Meeting
Two Topics (Update)
• Background: Why Did We Take This Approach to Evaluation?
• The Project in the Context of New Directions: Update on ED Plans
NCES Data on Percent of High School Students with at least one parent with a BA Degree (Data from NLSY and ELS:2002)
Current National Activities: Priorities
• Increasing the high school graduation rate, particularly in the "dropout factories" that have especially low rates
• Implementing early warning systems at the middle school level
• Using college pathways as a dropout prevention strategy
• Supporting efforts to turn around low-performing high schools
• Using data on college-going and retention rates to improve high school performance
• Improving college matriculation and completion rates
• Supporting efforts to enable more students to attain at least an associate's degree, and improving community colleges
• Providing incentives for IHEs to increase their enrollment of low-income students
• Providing incentives for IHEs to keep their costs down
GEAR UP First Evaluation
• Evaluations mandated in the 1998 authorization
• Westat began a 10-year evaluation soon after the program was authorized, following the cohort of 2000-01 7th graders
• Quasi-experimental design with a matched comparison group
• Study analyses and reporting will be concluding next year; Middle School report in 2007
• High School descriptive report in 2010
Challenges and Issues from the First Evaluation
• Long period to obtain results; study attrition
• Serious threats to validity (spillover; treatment/comparison non-equivalencies)
• A "black box" study that does not tell us about specific practices
• Lack of a strong focus on program improvement
Cross-Study Evaluation Issues
• Selection effects: participation in a voluntary program reflects an interaction between motivation and opportunity
• Difficulty separating federal services from other services; contamination of the comparison group
• Difficulty linking specific services or practices to impacts (the "black box" problem), though most studies made efforts to do so
• Grantee engagement: grantees were critical of accuracy and questioned the value of evaluation and the resources it takes away from service delivery, yet were also deeply committed to program improvement and evaluation methods
• Length of time: follow-ups take time
• Duration and intensity: more services over a longer period were associated with larger effect sizes, though selection effects complicate interpretation
Lessons Learned, Not Just from GEAR UP but from Other ED Evaluations
• Pay attention to sampling and non-sampling errors; evaluate the evaluations
• We must ask: is the state of the art of evaluation, as we practice it, really capable of serving as the basis for differentiating budget allocations?
• Zero-sum issues: how to get around the "zero-sum" game in which projects compete with each other for scarce resources and are tempted to game the system
• Stakeholder involvement is critical for program improvement; partnership; focus on how programs can work together and contribute to shared goals (formative assessment)
• Is non-punitive accountability possible?
• A "strength finder" approach as opposed to focusing on deficits: it is possible to work from a program's strengths rather than its weaknesses
Current Models in the Evaluation Profession
• Partnership: engage practitioners; formative assessment; evaluation as a tool for improvement; using data to improve
• Utilization-focused: user and client focus; continuous improvement
• Standards-based (feasible, accurate, ethical, transparent, useful)
• Ex ante: theory of change; what impact can reasonably be expected given the intervention and given the system; multiple methods
• Systems theory: role in contributing to the whole; interactions
• Complexity theory: changing conditions require adaptation; a move away from the purely summative; the same situation is never repeated exactly; rapid feedback for adaptation and innovation
New Approach to Be Taken
• Design next-generation studies that will contribute to program improvement
• Rigorous, and based on evaluation and statistical standards
• Responsive to Congress and to accountability efforts such as PART
• Build capacity for projects to be learning organizations, responsive to the needs of practitioners and of the students served
What Would We Like the Next Generation of GEAR UP Evaluations to Look Like?
• Partnership (key stakeholders)
• Standards-based (feasible, accurate, useful, ethical)
• Practice (grounded in understanding)
• Reflection (analysis and synthesis of information from multiple sources; formative assessment)
• Innovation/improvement (start and end)
Standards
• Useful: to interested parties (Congress, policy makers, practitioners, and ultimately the students the program is intended to benefit)
• Accurate: scientific rigor; evidence-based
• Feasible: possible to implement the study
• Proper: meets ethical standards for evaluation work and is consistent with IRB requirements
Rigorous Evaluation Design Requirements
• A counterfactual/comparison
• A representative sample (external validity)
• Treatment and control or comparison groups that are equivalent on dimensions related to outcomes
• Treatment and control/comparison groups that are treated equally except for the intervention of interest
• Treatment and control conditions that are mutually exclusive with regard to the intervention
HEOA Reauthorization of July 2008: Requirements Reflected in Future Studies
• GEAR UP reporting must include separate analyses of:
• The implementation of the scholarship component
• The methods used to comply with matching requirements
TRIO HEOA Evaluation Amendments
• Prohibit ED from requiring projects to deliberately recruit more students than they would normally serve and then deny them services for study purposes
• Call for working with the applicable institutions' IRBs
• Call for rigorous studies focused on program improvement and on who can benefit most
GEAR UP Priority Areas
• Promising interventions: bridging from 8th to 9th grade, with a focus on increasing 9th-grade success
• Supplemental support for math achievement
• Graduating college-ready students who do not need remediation; first-year success
GEAR UP Design Work
• How can we develop designs that are useful, rigorous, feasible, and proper?
• What would such designs look like?
• How should they best be structured for implementation?
• How can we make the best use of the evaluation funds provided by Congress?
Where Are We in the Process?
• Met with GEAR UP Project Directors five times between February 2008 and February 2010
• RTI background work compendium (profiles, systematic review, focus groups, expert papers): summer 2010
• Initiated a planning awards solicitation for GEAR UP projects; 44 awards made
• Implementation awards: summer 2010
Contact Info
• margaret.cahalan@ed.gov
• jim.maxwell@ed.gov
• Sandra.furey@ed.gov