Researching and evaluating equity initiatives
Evaluating WP Initiatives: Overcoming the Challenges
Open University, 28 February 2019
Annette Hayton, University of Bath
Accountability, context & impact
Research has increased understanding of the reasons for low participation and attainment among under-represented groups, but it has been:
• Descriptive – not focussed on making a difference
• Often not disseminated to practitioners or policy makers
• Rarely used to inform planning, evaluation and monitoring
Monitoring for the OfS, senior management teams (SMTs) and Government has focussed on:
• value for money
• demonstrating the effectiveness of WP interventions
Practitioner research and evaluation has focussed on:
• the successful delivery of activities
• reporting to the OfS, funders and SMTs
Accountability, context & impact
BUT efforts to ensure accountability, 'rigour' and comparability often result in simplistic approaches to evaluation based on medical models, which lose sight of the underlying reasons for inequalities and of the context and complexity of successful interventions.
There is pressure to demonstrate the effectiveness of interventions and value for money – not unreasonable!
Accountability, context & impact
Picciotto warns against the 'lure of the medical model': 'Experimental black boxes are poorly suited to the evaluation of complicated or complex programmes in unstable environments' (Picciotto, 2012: 223).
Even in medicine, '… a field called translational science has been invented to concentrate on bridging laboratory findings with clinical experience' (Fendler, 2016).
'RCTs [are] often premised on students having a problem or "symptoms" that require treatment … these students are pathologised first by naming their problem (often expressed in terms that match the solutions at hand) and then by being treated with an intervention by some external agency or person' (Gale, 2017: 4).
Accountability, context & impact
How can we assess the effectiveness of interventions?
Morrison argues that ''what works' is a matter of judgement rather than data, and that this judgement is imbued with moral and ethical concerns' (Morrison, 2001: 79).
Copestake argues for measurement based on the notion of 'reasonableness', involving a range of stakeholders: 'This falls short of scientific certainty, but in complex situations it is often as much as we can hope for … to aim higher may be counterproductive in terms of cost, timeliness and policy relevance' (Copestake, 2014: 417).
A decade-long debate within the development evaluation community reached an uneasy consensus that a mixed-methods approach was required (Picciotto, 2012: 215–16).
Accountability, context & impact
'Widening participation work is, or at least should be, based on the personal … in which young people are enabled to make choices and decisions, develop strategies and goals, plan for their futures, and are motivated, inspired and empowered' (Hayton and Stevenson, 2018).
Nygaard and Belluigi (2011) argue that decontextualized approaches to evaluating learning and teaching are rooted in a static conception of learning. More creative and flexible pedagogies are required, together with a contextualized model of evaluation that 'stress[es] that relations between individual and fellow students, teachers, administration are determined by context'.
Mixed methods
• Monitoring
• Progression outcomes
• Impact of activities
• Process evaluation
Understanding the challenge: theories of change
• Important for the sector to move beyond descriptive research to action
• BUT theories of change as currently presented are too simplistic and linear
An effective theory of change involves:
• Understanding the processes involved in bringing about change
• Defining your interventions
• Determining the impact of your work
Effective theory of change
• Aims for interventions informed by theory, research and practice
• Interventions that reflect the aims
• Appropriate methods used to generate useful data
• Evidence to demonstrate impact and inform practice and theory
The NERUPI Framework
Designed to maximise the impact of Widening Participation interventions by providing:
• a robust theoretical and evidence-based rationale for the types of intervention that are designed and delivered
• clear aims and learning outcomes for interventions, which enable more strategic and reflexive design and delivery
• an integrated evaluation process across multiple interventions to improve data quality, effectiveness and impact
Key theoretical influences
• Nancy Fraser on social justice
• Sen and Walker's concepts of capability
• Yosso's concept of community cultural wealth
• Identities and possible/future selves
• Young and Maton's ideas of knowledge
• Critical pedagogies
Making a difference: praxis
• Theory and academic research – quantitative and qualitative
• Practice
• Policy
Praxis: 'reflection and action directed at the structures to be transformed' (Paulo Freire, 1968)
Why do some students do well?
• Economic capital
• Social capital – who you know
• Cultural capital – what you know
(Pierre Bourdieu)
'Resource differences and collective efforts and investments made or not within families become translated into individual "ability" …' (Ball, 2010: 162)
Action research reflective cycle for WP
ANALYSIS: theory – OfS policy – local context – data – knowledge
PLANNING: aims – targeting – interventions – evaluation strategy – logistics
ACTION: deliver the interventions
COLLECT DATA: monitoring – tracking – related statistics – process – impact
ANALYSIS: the cycle repeats
NERUPI Framework
• A set of aims and objectives for interventions informed by theory, research and practice
• Can encompass specific intervention-based aims
• A common language for planning and reporting
• Choice of appropriate methods according to the context of the intervention
• Evidence to demonstrate impact and inform practice and theory
Find out more: www.nerupi.co.uk
NERUPI Members Event, 11 March 2019: The Capability Approach – Beyond the Deficit Model for Student Success
NERUPI Open Event, 14 June 2019: Introduction to the NERUPI Framework
References
• Copestake, J. (2014) 'Credible impact evaluation in complex contexts: Confirmatory and exploratory approaches'. Evaluation, 20 (4), 412–27.
• Fendler, L. (2016) 'Ethical implications of validity-vs-reliability trade-offs in educational research'. Ethics and Education, 11 (2), 214–29.
• Fraser, N. (2003) 'Social justice in the age of identity politics: Redistribution, recognition, and participation'. In Fraser, N. and Honneth, A. Redistribution or Recognition? A political-philosophical exchange. Trans. Golb, J., Ingram, J. and Wilke, C. London: Verso, 7–109.
• Freire, P. (1972) Pedagogy of the Oppressed. Trans. Ramos, M.B. Harmondsworth: Penguin Books.
• Gale, T. (2017) 'What's not to like about RCTs in education?'. In Childs, A. and Menter, I. (eds) Mobilising Teacher Researchers: Challenging educational inequality. London: Routledge, 207–23.
• Hayton, A. and Bengry-Howell, A. (2016) 'Theory, evaluation, and practice in widening participation: A framework approach to assessing impact'. London Review of Education, 14 (3), 41–53.
• Morrison, K. (2001) 'Randomised controlled trials for evidence-based education: Some problems in judging "what works"'. Evaluation and Research in Education, 15 (2), 69–83.
• Nygaard, C. and Belluigi, D.Z. (2011) 'A proposed methodology for contextualised evaluation in higher education'. Assessment and Evaluation in Higher Education, 36 (6).
• Picciotto, R. (2012) 'Experimentalism and development evaluation: Will the bubble burst?'. Evaluation, 18 (2), 213–29.