
Presentation Transcript


  1. To prove, to improve or to learn? Lessons for evaluating technical, practical and vocational educational initiatives from an analysis of STEM evaluations. Edge Research Conference 2012, Friday November 16th

  2. Aims: Drawing on an analysis of evaluations in STEM education we will: • identify some of the problems with such evaluations; • examine some potential solutions for the future; • and consider the applicability of these lessons to wider technical, practical and vocational learning (TPVL).

  3. Introduction: CSE and CEIR, Science and Innovation Observatory. Priorities: • Research, intelligence, evaluation, polemics • Informing and influencing • Independent body • Reports, think-pieces, associates

  4. Background - what is STEM and how does it relate to TPVL? • In the UK, STEM policy developed from SET for Success (Roberts, 2002) • STEM Framework launched 2007-8 • Development of a set of 11 'Action Programmes' • Each of these contains further projects, many of which were evaluated; these form the bulk of our analysis • Technical and vocational routes were part of some Action Programmes, but the main focus was academic

  5. What do we know about STEM evaluations? Analysis of 20 STEM evaluations: • 13 evaluations of projects, activities or programmes • 4 event evaluations • 2 evaluations of organisations • 1 CPD evaluation

  6. Examined: • Aims • Timings • Methods • Evaluation models • Use of prior evidence • Results and outcomes • Impact on policy and practice • Limitations • Contribution to knowledge

  7. Key points from the review • Evaluation aims were not always explicitly stated. • Timings do not always appear to match the purposes of the initiative being evaluated. • Robust counterfactuals were rarely used. • Explicit evaluation models were used in only a small number of cases. • Reviews of literature, policy or similar initiatives were not usually presented.
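  To make the counterfactual point concrete, here is a minimal sketch (hypothetical figures and variable names, added for exposition rather than drawn from any of the reviewed evaluations) contrasting a naive pre/post gain with a difference-in-differences estimate that nets out the change a comparison group experienced anyway:

```python
# Minimal sketch of a counterfactual comparison (hypothetical figures).
# A naive pre/post gain attributes all change to the intervention;
# a difference-in-differences estimate subtracts the change the
# comparison group experienced over the same period.

# Mean attitude scores (e.g. on a 1-5 scale) before and after.
intervention_pre, intervention_post = 3.1, 3.8
comparison_pre, comparison_post = 3.0, 3.4

naive_gain = intervention_post - intervention_pre        # 0.7
counterfactual_gain = comparison_post - comparison_pre   # 0.4
did_estimate = naive_gain - counterfactual_gain          # 0.3

print(f"Naive pre/post gain:       {naive_gain:.1f}")
print(f"Difference-in-differences: {did_estimate:.1f}")
```

  Without the comparison group, the evaluation would credit the intervention with the full 0.7 gain; the counterfactual suggests less than half of that is attributable to it.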

  8. Key points from the review continued • Negative results were not usually presented in the same depth as positive results. • Few evaluations looked to make recommendations beyond the project at hand. • Evaluations tended not to make explicit their limitations. • Contributing to a developing STEM knowledge base is very rare in the evaluations we looked at. • Conclusion: The potential for learning from these evaluations is severely limited.

  9. Linked key point: the purposes of evaluation • Controlling: to understand whether the project is going to plan • Proving: to understand if the project is achieving what was intended • Improving: to understand how to modify the initiative to make it work better • Learning: to provide transferable insights that help build a body of knowledge beyond the project at hand

  10. Responses 1: A single evaluation framework? • e.g. Stake (1996), Stufflebeam (2002), Cronbach (1982) • Each of these organises the focuses of evaluation into three broad areas: • context [antecedent, context, unit of focus/setting]; • process [transaction, input/process, treatment]; and • outcome [outcome, product, observations/outcomes].

  11. Responses 1: Guskey? • No support for this idea: • A single approach could not be designed that would be appropriate to the aims of every STEM project or evaluation. • A multiplicity of approaches allows greater fit, flexibility and creativity, and hence is more likely to lead to transferable learning.

  12. Responses 2: Theory-based approaches There are a number of well-established 'theory-based' approaches, e.g. realist evaluation and Theory of Change. These develop hypotheses about the social world and test them using a variety of means; this is close to the scientific method.

  13. EXAMPLE - Interventions aimed at directly improving students' attitudes to STEM subjects • EXAMPLE THEORY - using interesting, innovative opportunities to learn improves attitudes to STEM, and hence learning outcomes and interest in STEM careers (e.g. After-school Science and Engineering Clubs; Engineering Education Scheme)
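  As a rough illustration of how such a theory can be tested (entirely hypothetical data and names, added here for exposition and not taken from the talk), the sketch below checks the two causal links separately: whether club attendance goes with attitude gains, and whether attitude gains go with career interest:

```python
# Illustrative test of the example theory (hypothetical data):
# (1) does club attendance go with attitude gains, and
# (2) do attitude gains go with interest in STEM careers?
from statistics import mean

# (attended_club, attitude_gain, career_interest on 1-5) per student
students = [
    (True, 0.8, 4), (True, 0.5, 4), (True, 0.6, 5), (True, 0.2, 3),
    (False, 0.3, 3), (False, 0.1, 2), (False, 0.4, 3), (False, 0.0, 2),
]

# Link 1: attendance -> attitude gain
attendee_gain = mean(g for attended, g, _ in students if attended)
other_gain = mean(g for attended, g, _ in students if not attended)
print(f"Mean attitude gain (attendees):     {attendee_gain:.2f}")
print(f"Mean attitude gain (non-attendees): {other_gain:.2f}")

# Link 2: attitude gain -> career interest
high_gain = mean(c for _, g, c in students if g >= 0.5)
low_gain = mean(c for _, g, c in students if g < 0.5)
print(f"Mean career interest (gain >= 0.5): {high_gain:.1f}")
print(f"Mean career interest (gain <  0.5): {low_gain:.1f}")
```

  The point of a theory-based design is that each link in the chain is examined, so a failed intervention can be traced to the link that broke rather than simply recorded as "no impact".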

  14. Next steps for STEM: • Development of effective use of theory-based approaches to evaluation. • Systematic mining of current evaluation and research to develop a bedrock of evidence on the theoretical bases for initiatives, and their effectiveness in various contexts. • A commitment to using and building the evidence base through evaluation and research.

  15. Next steps for technical, practical and vocational learning (TPVL)? Questions: • Is there evidence of a similar lack of impact of evaluations in relation to TPVL? • What analysis needs to be done to help answer this question? • What needs to be done in TPVL to improve evaluation, and to what extent do the prescriptions in this paper for STEM evaluation apply to TPVL?

  16. Want to get involved? Contact us: Mike Coldwell m.r.coldwell@shu.ac.uk 0114 225 6054 Ken Mannion k.mannion@shu.ac.uk www.scienceobservatory.org.uk
