Using evidence to improve young lives
Policymaking and evidence: how can we improve the fit?
Sonia Sodha, Head of Policy & Strategy, The Social Research Unit at Dartington
Who we are
• Small research-based charity of around 20 staff, based in Dartington, Devon and London
• Our mission:
  • Understanding what works in improving children and young people’s outcomes
  • Helping people – central government, local commissioners, trusts and foundations, service providers – apply that evidence in the real world
• We seek to fill the gap between evidence as it is produced and something that people can use in real-world decisions.
Many different stakeholders influence how and whether evidence gets used
[Diagram: evidence sits at the centre, surrounded by the stakeholders who shape how it is used – central government, local government, trusts & foundations, service providers, private investors, practitioners and consumers of public services]
Who are the ‘policymakers’?
• Central government – decisions about systems
• Commissioners – deciding how money gets spent on what services
• Service providers – deciding how to structure services
• Practitioners – interacting with service users
What is evidence?
• Evidence of impact – whether we know something works
• Evidence about needs – what’s the issue services are trying to address?
• Evidence about what affects educational outcomes
• Evidence about how people feel about a service
What are the barriers to greater use of evidence in decision-making?
1. Political and institutional incentives
2. The ‘what works’ test is not a simple one
1. How do we tell if something has impact?
Our ‘what works’ standards of evidence have four dimensions:
• Intervention ‘specificity’
• Public service ‘system readiness’
• Evaluation quality
• Impact
The first two concern what’s being evaluated – design and implementation; the latter two concern the evaluation of impact.
But these are standards designed to apply to programmes rather than to other features of public service design, like system reform, policy, practice and processes.
2. It’s not easy to prove impact – so the evidence base is limited
[Diagram: a spectrum running from innovation to impact, progressing from a theory of change through pre/post outcomes monitoring to experimental evaluation]
It is not an either/or between innovation and evidence.
3. Evidence of impact is not enough
Problem counts across Renfrewshire, 9-18 yrs (total 9-18 population: n = 15,409):
• 53% of children with no problems (n = 8,166)
• 41% of children with a few problems (n = 6,318)
• 6% of children with many problems (n = 925)
Reach of systems: combined service contact, 9-18 yrs (total 9-18 population of Renfrewshire: n = 15,409)
7% of children are involved in at least one system (n = 1,079). Of these system-involved children:
• 38% have no problems (5% of the total ‘no problems’ population)
• 50% have a few problems (9% of the total ‘few problems’ population)
• 12% have many problems (14% of the total ‘many problems’ population)
86% of those with many problems have no service contact.
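A quick cross-check helps to see how the two sets of percentages fit together: a share of the system-involved group can be converted into a share of the corresponding problem-count group. The short Python sketch below is not from the slides; it simply recomputes the quoted figures (approximately, since the slide percentages are rounded).

# Minimal sanity check of the Renfrewshire figures quoted above.
system_involved = 1_079  # the 7% of 9-18 year olds involved in at least one system

groups = {
    # group: (n in the total 9-18 population, share of the system-involved group)
    "no problems":   (8_166, 0.38),
    "few problems":  (6_318, 0.50),
    "many problems": (925,   0.12),
}

for name, (n_group, share_of_involved) in groups.items():
    involved = share_of_involved * system_involved  # children in this group with service contact
    print(f"{name}: ~{involved:.0f} system-involved children "
          f"= {involved / n_group:.0%} of that group")

# Prints roughly 5%, 9% and 14% respectively, matching the slide; and
# 100% - 14% ≈ 86% of children with many problems have no service contact.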
4. How do we replicate and scale?
Three challenges to replicating and scaling what works:
• Does what works in one context work in another?
• How do you replicate what makes a programme or practice effective again and again?
• Evidence-based programmes are one vehicle, but they will have limited reach – how would you scale an evidence-based practice?
Evidence rarely gives the black-and-white answers policymakers expect
What are the barriers to greater use of evidence in decision-making?
1. Political and institutional incentives
2. The ‘what works’ test is not a simple one
3. The link between policy and practice is distal
What can we do to break down some of the barriers?
1. Depoliticisation?
Are there ways of depoliticising some decisions and lengthening time horizons? Can politicians and policymakers build in safeguards, e.g. the Education Endowment Foundation?
2. System reform
Can systems be reformed to better promote use of evidence in policy? Important levers:
• Accountability
• Funding: subsidy, matched funding for local budgets
• Funding of knowledge generation and dissemination – better tying the two together
What can we do to break down some of the barriers?
1. Depoliticisation?
2. System reform
3. Embedding understanding of evidence into professional development
4. Commissioning better evaluations that contribute to learning