Evaluation in Education: 'new' approaches, different perspectives, design challenges
Camilla Nevill, Head of Evaluation, Education Endowment Foundation
24th January 2017
camilla.nevill@eefoundation.org.uk | www.educationendowmentfoundation.org.uk | @EducEndowFoundn
Introduction
• The EEF is an independent charity dedicated to breaking the link between family income and educational achievement.
• In 2011 the Education Endowment Foundation was set up by the Sutton Trust, as lead charity, in partnership with the Impetus Trust. The EEF is funded by a Department for Education grant of £125m and will spend over £220m over its fifteen-year lifespan.
• In 2013, the EEF was named, with the Sutton Trust, as the government-designated 'What Works' centre for improving education outcomes for school-aged children.
The EEF: our approach
Two aims:
1. Break the link between family income and school attainment
2. Build the evidence base on the most promising ways of closing the attainment gap
The Teaching and Learning Toolkit
• A meta-analysis of education research
• Contains c.10,000 studies
• Cost, impact and security are included to aid comparison
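For readers unfamiliar with meta-analysis, the sketch below shows the generic idea behind pooling effect sizes across studies: inverse-variance weighting, so more precise studies count for more. This is an illustration only, not the Toolkit's actual methodology, and all study values are invented.

```python
# Illustrative fixed-effect meta-analysis: pool standardised effect sizes
# by inverse-variance weighting. A generic sketch, not the Toolkit's
# actual pipeline; the effect sizes and variances below are invented.
import math

studies = [
    # (effect size d, variance of d) -- hypothetical values
    (0.20, 0.010),
    (0.35, 0.025),
    (0.10, 0.008),
]

weights = [1.0 / var for _, var in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} "
      f"to {pooled + 1.96*se:.2f})")
```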
EEF, March 2016
• 7,500 schools currently participating in projects
• 750,000 pupils currently involved in EEF projects
• 133 projects funded to date
• £82m funding awarded to date
• £220m estimated spend over the lifetime of the EEF
• 26 independent evaluation teams
• 66 published reports
New approach, different perspectives, design challenges
• Design with the end user in mind
• There is no one right answer – communicate and compromise
New approach: rigorous, independent evaluations
The EEF evaluates the projects it funds through:
• Independent evaluation
• Longitudinal outcomes
• Impact and process evaluations
• A robust counterfactual (RCTs)
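A 'robust counterfactual' means the impact estimate is the difference in outcomes between randomised treatment and control groups. As a minimal sketch, with invented test scores rather than EEF data, the standardised mean difference (Hedges' g) typical of education trials can be computed like this:

```python
# Minimal sketch of the impact estimate behind an RCT: the difference in
# mean outcomes between treatment and control, standardised by the pooled
# standard deviation (Hedges' g). Scores below are invented.
import statistics

treatment = [24, 31, 28, 35, 22, 30, 27, 33]  # hypothetical test scores
control   = [23, 26, 25, 29, 21, 27, 24, 28]

def hedges_g(t, c):
    nt, nc = len(t), len(c)
    # Pooled standard deviation across the two groups
    sp = (((nt - 1) * statistics.variance(t) +
           (nc - 1) * statistics.variance(c)) / (nt + nc - 2)) ** 0.5
    d = (statistics.mean(t) - statistics.mean(c)) / sp
    # Small-sample correction factor
    return d * (1 - 3 / (4 * (nt + nc) - 9))

print(f"Effect size (Hedges' g): {hedges_g(treatment, control):.2f}")
```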
Education v other fields: how does this compare to evaluation in your field?
Main messages
• Design with the end user in mind
• There is no right answer – communicate and compromise
Process for appointing evaluators
1. Grants team identify projects; 1st Grants Committee shortlist.
2. Evaluation teams receive one-page project descriptions.
3. Teams submit a two-page expression of interest (EoI).
4. Teams chosen to submit a full proposal; teams submit an eight-page proposal.
5. 2nd Grants Committee shortlist; teams chosen to evaluate projects.
6. First set-up meeting with evaluation team, project team and EEF: share understanding of the intervention logic; decide overall design, timeline, sample size and control group condition; developer (and evaluator) budgets set.
7. Second set-up meeting with evaluation team, project team and EEF: finalise the evaluation design; decide on eligibility criteria, details of the protocol, and process evaluation measures linked to the logic model.
Different perspectives
The set-up meeting brings together three parties, each with different priorities:
• EEF: useful results; quick results; keep costs down
• Evaluator: publications; funding to do research; personal interests
• Developer: funding to deliver the programme; demonstrate impact; good relationships with schools; publications?
Design challenges: Improving Working Memory
• Teaching memory strategies by playing computer games
• For 5-year-olds struggling at maths
• Delivered by Teaching Assistants
• Developed by Oxford University educational psychologists
• Evidence of improvement in WM from two small (30 and 150 children) controlled studies
Design challenges: how many arms?
• Working Memory (WM)
• WM blended with maths
• Matched-time maths support
• Business as usual (BAU)
Design challenges: when would you randomise?
The intervention's logic model: school recruited → identify pupils (bottom 1/3 in maths) → identify TAs and a link teacher → one-day training for TAs by Oxford University → deliver the programme (10 hours: computer games for 5 hours, plus 1-to-1 support of 20–30 minutes at a time for a total of 5 hours) → improved working memory → maths attainment.
Design challenges
Overlaying the evaluation on the logic model: randomisation slots in once schools are recruited and the pupils (bottom 1/3) and TAs have been identified. Each stage then has a matching measure:
• One-day training for TAs (Oxford University) and programme delivery (10 hours: computer games for 5 hours; 1-to-1 support for 20–30 minutes at a time, 5 hours in total) → delivery log, plus survey, observations and interviews
• Improved working memory → WM test
• Maths attainment → maths test
A sketch of the randomisation step follows.
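As one illustration of that step — assuming pupil-level assignment blocked by school, which is an assumption of this sketch rather than a detail given in the slides — allocation to the four arms could look like:

```python
# Illustrative sketch of pupil-level randomisation into the four arms,
# blocked by school so each school contributes pupils to every arm.
# Pupil IDs and the fixed seed are invented for reproducibility.
import random

ARMS = ["WM", "WM + maths", "Matched-time maths", "BAU"]

def randomise(pupils_by_school, seed=2017):
    rng = random.Random(seed)
    allocation = {}
    for school, pupils in pupils_by_school.items():
        shuffled = pupils[:]
        rng.shuffle(shuffled)
        for i, pupil in enumerate(shuffled):
            allocation[pupil] = ARMS[i % len(ARMS)]  # balanced within school
    return allocation

example = {"school_A": ["p1", "p2", "p3", "p4"],
           "school_B": ["p5", "p6", "p7", "p8"]}
for pupil, arm in randomise(example).items():
    print(pupil, "->", arm)
```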
Design challenges: Catch Up Numeracy
• For 4 to 11-year-olds struggling at maths
• Delivered by Teaching Assistants
• 10 modules of tailored support
• Flexible delivery model (no fixed length)
• Evidence from an EEF pupil-randomised efficacy trial
Design challenges: what control group would you use?
Design challenges: Catch Up Numeracy trial design
150 schools recruited → each identifies TAs and ~8 children in years 3–5 who are behind in maths → schools randomised:
• 75 schools, 600 children: flexible Catch Up delivery model
• 75 schools, 600 children: business-as-usual control group
→ follow-up maths test. A power sketch for this cluster design follows.
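Because whole schools are randomised here, statistical power depends mainly on the number of schools and the intra-cluster correlation (ICC), not just the pupil count. The back-of-envelope sketch below uses a standard minimum detectable effect size (MDES) approximation; the 150 schools and ~8 pupils per school come from the design above, but the ICC of 0.15 and the power assumptions are invented for illustration.

```python
# Back-of-envelope MDES for a two-arm cluster-randomised trial:
# 150 schools split 75/75, ~8 pupils per school. The ICC (rho) is an
# assumed value, not a figure from the trial.
import math

def mdes(n_clusters, n_per_cluster, rho, multiplier=2.8, p=0.5):
    # multiplier ~2.8 corresponds to 80% power at a 5% two-sided alpha;
    # p is the proportion of clusters assigned to treatment
    var = (rho / (p * (1 - p) * n_clusters)
           + (1 - rho) / (p * (1 - p) * n_clusters * n_per_cluster))
    return multiplier * math.sqrt(var)

print(f"MDES: {mdes(150, 8, rho=0.15):.2f} standard deviations")
# -> roughly 0.23 SD under these assumptions
```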
Problems with interpretation
• What if we see no effect of Catch Up, and the control group gets lots more support?
• What if we see a big effect of Catch Up, and the control group has received lots less support?
Design challenges
• Boarding school: children in need, at risk of going into care; referred by Local Authorities
• Teenage Sleep: changing school start times to later; positive effects from US trials (8am start v 11am start)
Main messages (and sub-messages)
• Design with the end user in mind
  – Test the right intervention
  – Make sure your comparison is relevant
  – Measure implementation and cost
• There is no right answer – communicate and compromise
  – Use the logic model to understand the intervention
  – Pre-specify the interpretation to aid decision making
  – Not all interventions can be randomised
Thank you
camilla.nevill@eefoundation.org.uk | www.educationendowmentfoundation.org.uk | @EducEndowFoundn
Measuring the security of trials
• Summary of the security of evaluation findings
• 'Padlocks' developed in consultation with evaluators
• Five categories, combined to create an overall rating (see the sketch below)
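The slides do not spell out how the five categories are combined, so the sketch below is hypothetical: the category names and the weakest-link rule are assumptions chosen purely to illustrate the idea of rolling several quality ratings into one padlock score.

```python
# Hypothetical sketch of combining category ratings into an overall
# 'padlock' score. The five category names and the rule (overall rating
# capped by the weakest category) are assumptions for illustration only.
categories = {
    "design": 5,
    "power": 4,
    "attrition": 3,
    "balance": 4,
    "threats_to_validity": 4,
}

overall = min(categories.values())  # weakest link caps the rating
print(f"Overall security: {overall} padlock(s) out of 5")
```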