
Evaluating WP initiatives: Overcoming the Challenges

This session explores the challenges faced in evaluating access and participation initiatives in higher education and provides guidance on effective evaluation strategies. Topics include OfS expectations, access and participation standards of evidence, and enhancing outreach evaluation.



Presentation Transcript


  1. Evaluating WP initiatives: Overcoming the Challenges. Evidence and evaluation. Richard Shiner & Joanne Moore, Office for Students (@officestudents). University of Exeter, 28 February 2019.

  2. Session plan
  • OfS strategy and expectations
  • Evidence and evaluation strategy, effective practice guidance and evaluation self-assessment
  • Access and participation standards of evidence
  • Over to you: enhancing evaluation of outreach exercise
  • The 'understanding effective evaluation of outreach' project: the experience of providers
  • Discussion

  3. Quick update on OfS
  • A new approach to access and participation, including:
    • Ambitious plans to eliminate equality gaps
    • Reforms to access and participation (A&P) plans
    • Higher expectations on evaluation practice and evidence generation
  • Section 29 guidance: what to include in an A&P plan
  • A new strategy for evidence and evaluation
  • Section 35 guidance: effective practice guidance
  • Evidence and Impact Exchange
  • Evaluation self-assessment tool
  • Standards of evidence for A&P
  • Guidance on evaluating outreach

  4. OfS regulatory guidance on evaluation
  A plan must include:
  • a description of a robust and credible evaluation strategy which demonstrates continuous improvement in practice for the duration of the plan
  • a description of the mechanisms in place to enable the outcomes of evaluation to influence practice.
  The OfS expects this to include:
  • an evaluation strategy, informed by a provider's self-assessment of their approach to evaluation
  • information about how the provider will evaluate the impact of those areas where they are investing heavily
  • a description of how the provider uses evidence and evaluation findings to inform programme design targeting underrepresented groups for whom the largest gaps in access, success and progression have been identified.

  5. OfS regulatory guidance on evaluation (continued)
  A plan may include:
  • an overview of findings from completed self-assessment tools
  • a description of how a provider has used the OfS financial support evaluation toolkit
  • a description of collaboration between providers (for example, sharing good practice or developing evaluation centres)
  • a description of the process a provider has, or plans to have, in place to share findings both internally and externally.

  6. Evidence and evaluation strategy
  Evidence and evaluation are used effectively by the Office for Students and higher education providers to drive improvements in access and participation.

  7. OfS effective practice guidance
  • A more strategic approach to using evidence and evaluation, e.g. logic chains and theory of change (see the sketch below)
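The slide names logic chains and theory of change without unpacking them. As a rough illustration only (not from the presentation; the programme and every entry are invented), a logic chain links what a provider puts in and does to the outcomes it expects, which gives an evaluation something concrete to measure at each step. A minimal sketch in Python:

```python
# Hypothetical logic chain for an invented outreach programme, assuming the
# common inputs -> activities -> outputs -> outcomes -> impact structure.
# All entries are illustrative, not OfS guidance.
logic_chain = {
    "inputs": ["staff time", "A&P funding", "school partnerships"],
    "activities": ["campus summer school", "one-to-one mentoring"],
    "outputs": ["120 Year 12 students attend", "6 mentoring sessions each"],
    "intermediate_outcomes": [
        "raised awareness of HE options",
        "increased confidence to apply",
    ],
    "impact": ["higher application and entry rates to HE"],
}

# An evaluation plan can then attach a measure to each link in the chain,
# rather than jumping straight from activity to long-term impact.
for stage, items in logic_chain.items():
    print(f"{stage}: {', '.join(items)}")
```

Making each link explicit is also what lets a narrative evaluation state which step of the chain a given piece of evidence supports.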

  8. Access and participation standards of evidence
  • Aim: to promote understanding of the standards of evidence and a more rigorous approach to undertaking and using impact evaluation, in order to improve the effectiveness of access and participation programmes.
  • Content: sets out the standards of evidence and discusses how higher education providers can strengthen their evidence. It gives guidance on what type of evaluation to aim for, ways to strengthen the evidence, and the claims that can be made from different types of evidence.
  • Audience: aimed at senior managers and decision-makers in providers, and practitioners with a remit for evaluation and reporting.

  9. Implications for practice
  • Type 1 (Narrative) is expected as a minimum for all types of A&P activity.
  • Type 2 (Empirical) is expected for long-term or multi-activity interventions and resource-intensive activities.
  • Type 3 (Causal) is not an expectation, but is recommended for certain costly, innovative or pilot interventions where the expertise and resources exist for an experimental or quasi-experimental design.
  Key things to consider:
  • Proportionality of evaluation to spend
  • The existing evidence base
  • Utility of findings (e.g. what you want to prove with your evaluation)

  10. Over to you: Enhancing evaluation of outreach
  Here are some examples of approaches to evaluating different types of outreach. Questions:
  • What type of evaluation is this?
  • Why?
  • What's good (if anything) about the approach?
  • How could the approach be improved?

  11. Feedback on the examples
  • Evidence supporting a Type 1 evaluation:
    • An evidence base for what you are doing
    • A well-articulated conceptual framework which describes how your activities will lead to the intended outcomes, and the processes involved
  • Evidence supporting a Type 2 evaluation:
    • Able to demonstrate a change above and beyond what might otherwise have occurred
    • Drawn from different research traditions and evaluation approaches
  • Evidence supporting a Type 3 evaluation:
    • A research design that establishes the extent to which observed results are caused by the intervention
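To make the Type 2/Type 3 distinction concrete, here is a minimal sketch (not part of the presentation; the data and the outcome measure are invented) contrasting a simple pre/post change, which supports a Type 2 claim, with a difference-in-differences estimate against a comparison group, one common quasi-experimental route to a Type 3 causal claim:

```python
# Hypothetical illustration only: made-up pre/post outcome scores for an
# outreach cohort and a matched comparison group (e.g. self-reported
# likelihood of applying to HE on a 1-10 scale).
participants_pre = [4.1, 5.0, 3.8, 4.6, 5.2]
participants_post = [5.9, 6.4, 5.1, 6.0, 6.8]
comparison_pre = [4.3, 4.8, 4.0, 4.5, 5.1]
comparison_post = [4.9, 5.3, 4.4, 5.0, 5.6]

def mean(xs):
    return sum(xs) / len(xs)

# Type 2 (empirical) evidence: change over time in the treated group alone.
pre_post_change = mean(participants_post) - mean(participants_pre)

# A Type 3 (causal) claim needs a counterfactual. Difference-in-differences
# subtracts the change the comparison group experienced anyway, attributing
# the remainder to the intervention (assuming the two groups would
# otherwise have moved in parallel).
comparison_change = mean(comparison_post) - mean(comparison_pre)
did_estimate = pre_post_change - comparison_change

print(f"Pre/post change (Type 2 claim): {pre_post_change:+.2f}")
print(f"Difference-in-differences (Type 3 claim): {did_estimate:+.2f}")
```

The causal claim rests on the parallel-trends assumption stated in the comments; defending that assumption is where the expertise and resources mentioned on slide 9 come in.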

  12. The 'understanding effective evaluation of outreach' project
  Method:
  • Collaborative work with partner universities and third-sector partners
  • Development of guidance and case studies of evaluation practices
  Partners:
  • Research phase: University of Plymouth; Royal Northern College of Music; University of Liverpool; Coachbright; Brightside; The Access Project; The Sutton Trust; Loughborough University; University of Exeter
  • Self-assessment road-testing phase: MMU; Open University; SOAS; Aston University; LIPA; Bishop Grosseteste; University of Birmingham; London School of Management Education; UCEM; University of Liverpool

  13. Key findings
  • Structural issues: lines of responsibility, application of resources for evaluation, and systems for data and tracking
  • Standards of evaluation need to be applied flexibly
  • A range of different types of outcome measure is in use: more attention is needed to the links between intermediate indicators and long-term progression outcomes
  • Student tracking processes are a particularly important building block for future outreach impact evaluation studies
  • A key issue is applying appropriate expertise in evaluation techniques and data analysis

  14. Recommendations for providers
  • A senior-level WP evaluation contact
  • A culture of evaluation within institutions
  • A regular cycle of project and programme review
  • Clearly articulated and measurable short-, medium- and long-term outcome measures
  • An identified skills base/expertise for undertaking and commissioning evaluation
  • Collaborative partnerships to share expertise
  • Mechanisms to enable evaluation results to influence practice internally and externally

  15. Dimensions of the self-assessment tool

  16. Key principles of evaluation self-assessment
  • An open and honest approach, in order to identify where the approach can be strengthened
  • Requires judgments about what evaluation is most appropriate in each context
  • Collaborative (different teams and individuals involved)
  • Seeking to embed evaluation at different stages of the project/programme planning cycle
  • Use evaluation not just to 'prove' but to 'improve'

  17. Road-testing the self-assessment process
  • The process of self-assessment was largely helpful in supporting organisational development
  • Subjectivity is involved (results are not comparable between providers)
  • Self-assessment should be part of a cycle, repeated every few years
  • There are issues about the content/coverage of the tool
  • Responsibilities are not necessarily 'joined up' across the student lifecycle
  • Providers work in a wide range of contexts, with mixed use of the standards
  • This is challenging work and support is needed

  18. Discussion
  • Questions/queries?
  • Is self-assessment useful?
  • What are the opportunities and constraints?
  • Do you have any suggestions for improving how provider self-assessment of evaluation and evidence is supported?
