
Making the links between research, evaluation, evidence and impact

Join Professor Jacqueline Stevenson from Sheffield Hallam University as she explores the meaning and purpose of evaluation, what can be measured about the impact of widening participation (WP), and her personal tensions in practice. Revisit Annette Hayton and Dr Andrew Bengry-Howell's framework for evaluating the impact of university-led outreach activities.


Presentation Transcript


  1. Making the links between research, evaluation, evidence and impact. Professor Jacqueline Stevenson, Sheffield Hallam University, Jacqueline.stevenson@shu.ac.uk

  2. Overview
  • Bring us back full circle to the meaning and purpose of evaluation
  • Explore what can be measured and claimed about the impact of WP
  • Explore my personal tensions in practice
  • Revisit Annette Hayton and Dr Andrew Bengry-Howell's framework to evaluate the impact of university-led outreach activities

  3. The three lives of Jacqueline Stevenson
  • Evaluation Manager
  • Researcher for refugee access to HE projects
  • Sociologist of Education and Professor of Education Research

  4. 1. The Evaluation Manager
  • Tasked with evaluating multiple Aimhigher and other funded projects
  • Role as an evaluation consultant
  • 2002: working in a research and evaluation centre in a UK university; policy-related, short-term, mostly qualitative work - focus groups or semi-structured interviews
  • Structure, role and relationships: control, neutrality, objectivity
  • Not indifferent to the research or research participants, but worked to limit, not embrace, my involvement in the research process

  5. ‘The contract researcher is permanently engaged in deploying her/his self to create intimate relationships which by their very nature are 'meaningful', before moving on to a new project with a new set of colleagues and research 'subjects'. 'The project' constitutes its own bounded social world within which meaning is constructed, and CRS [contract research staff] are required to parcel that 'meaningfulness' up and leave it, and to re-create themselves anew in another arena. They have constantly to negotiate a series of beginnings and endings.’ (Goode, 2006, para 1.2)

  6. I was happy doing this... Evaluation indicators
  • Process indicators - dimensions concerned with actions and what needs to be done to achieve outcomes, e.g. ways of doing admissions or giving out IAG (information, advice and guidance); experiences offered, e.g. outreach events
  • Output indicators - the tangible and intangible products that result from project activities, e.g. numbers of completed personal statements
  • Outcome indicators - the observed effects of the output-specific results, e.g. increased numbers of applications to HE; changes in confidence
  • Impact indicators - attainment of higher-level strategic goals; sustained long-term changes, e.g. changes in the student demographic profile of HE
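To make the four levels concrete, here is a minimal sketch, in Python, of how this indicator hierarchy might be recorded for a single outreach project. All names and figures are hypothetical illustrations, not drawn from the talk:

```python
from dataclasses import dataclass, field


@dataclass
class IndicatorSet:
    """The four indicator levels for one WP project (illustrative only)."""
    process: list[str] = field(default_factory=list)   # actions taken to achieve outcomes
    outputs: list[str] = field(default_factory=list)   # tangible/intangible products of activities
    outcomes: list[str] = field(default_factory=list)  # observed effects of those outputs
    impact: list[str] = field(default_factory=list)    # sustained, strategic-level change


# Hypothetical record for a summer-school outreach programme
summer_school = IndicatorSet(
    process=["campus visit days delivered", "IAG sessions run"],
    outputs=["120 completed personal statements"],
    outcomes=["more HE applications from participants", "self-reported confidence gains"],
    impact=["shift in the student demographic profile of the institution"],
)
```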

  7. Impact evaluation
  • Impact evaluation is structured to answer the question: “how would outcomes have changed if the intervention had not been undertaken?”
  • It therefore ideally requires a counterfactual analysis, i.e. “a comparison between what actually happened and what would have happened in the absence of the intervention”
  • In other words, impact evaluation assesses the changes (both intended and unintended) that can be directly attributed to a particular intervention
  • It is consequently different from outcome evaluation, which examines whether targets have been achieved
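As a rough illustration of that counterfactual logic (a sketch only, not a method the talk prescribes), the simplest estimate substitutes a comparison group for the unobservable "what would have happened". All figures below are invented:

```python
def naive_impact_estimate(participant_rate: float, comparison_rate: float) -> float:
    """Estimated impact = observed outcome minus a counterfactual proxy.

    The comparison group stands in for what would have happened without
    the intervention; this only approximates a true counterfactual if the
    two groups are genuinely comparable, which WP evaluation rarely ensures.
    """
    return participant_rate - comparison_rate


# Hypothetical HE progression rates: outreach participants vs. a matched group
estimated = naive_impact_estimate(participant_rate=0.42, comparison_rate=0.35)
print(f"Estimated impact: {estimated:+.0%} progression")  # Estimated impact: +7% progression
```

By contrast, an outcome evaluation would simply report the 42% against a target, with no comparison group at all.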

  8. Measuring or estimating impact?
  • In its truest sense, impact measurement would require using an independent evaluator, establishing control groups, measuring changes over extended periods of time, etc.
  • However, evaluation of WP activity is not a rigorous social science experiment - it involves:
    • Multiple inputs from multiple agencies/individuals/sources
    • No possibility of undertaking a 'blind' control
  • Therefore determining an explicit causal relationship between a particular activity/intervention and a specific outcome is very difficult/impossible
  • Estimating impact can indicate that it is probable/possible/plausible that a particular intervention(s) contributed to a particular outcome

  9. 2. The researcher
  • First interview with a refugee (Mohammed, a 33-year-old Egyptian)
  • Interviews with 65+ refugees and asylum seekers
  • Completely new challenges: when and where should I collect data; what data should I collect; what methodological approaches should I adopt; how should I analyse my data; how should I use and disseminate my findings?
  • More fundamentally, my epistemological, methodological and ethical approaches to evaluation and research were completely transformed - refugees as data, as headlines
  • Needed to find a methodology; storytelling research found me: 'The universe is made of stories, not of atoms' (Muriel Rukeyser, 1968, 'The Speed of Darkness')
  • Transformation charted in my field notes

  10. Forced to reconsider my approach
  • Evaluating WP using the processes referred to at the beginning can drive who, what, how, when and why WP activities are undertaken - the tail wagging the dog?
  • So this raises questions about:
    • The place of 'research' in evaluation
    • How we can theorise from what we find out, and how theory can drive our evaluation
    • How we can link theory with practice
    • What about praxis?

  11. Today: the (meta)reflexive researcher
  • Disclaimed neutral observer status
  • I try to be brave: allow for self-vulnerability, emotion, challenge
  • Recognised the place of ethics-in-practice, not ethics-in-process
  • Repositioned and relinquished power
  • Accepted and listened to my internal conversations
  • Accepted personal responsibility
  • But I am still asked to run those workshops...

  12. So... from a personal perspective
  • This series has shown how rich qualitative data can offer important insights; we need spaces to have critical conversations
  • Just because we can't prove something works doesn't mean we shouldn't do it
  • We need to focus on the micro-successes
  • And I would argue that qualitative research needs to continue to explore the nuances and complexities of WP activities
