
Benchmarking in e-learning: an overview



  1. Benchmarking in e-learning: an overview Professor Paul Bacsich Matic Media Ltd and Middlesex University, UK

  2. The Menu • UK e-learning – why listen? • Benchmarking overview • Pick & Mix system • MBS case study • Conclusions

  3. Myself • Consultant to several UK agencies & universities • Adjunct professor at Middlesex University • Global Campus and School of Computing Science • Open University for 25 years • One of the former Directors of UK eUniversities, which aimed to be a global provider of e-learning • Current work includes developing a global benchmarking methodology for e-learning • Already piloted at Manchester Business School • Presented at an EU conference in Brussels, and in Colombia, at ALT-C, to the HEA and at Sydney Uni

  4. Not in my talk! • Costs of e-learning (Activity Based Costing) • Competitor analysis of e-learning providers • What went wrong with UK eUniversities? • Two volumes of reports (35 chapters, over 2500 pages) and a research overview are available • More soon

  5. Why listen to the UK? • UK has many years' experience of quality management in universities, via various organisations • Latest is the Quality Assurance Agency for Higher Ed • This has guidelines for quality in e-learning • UK has substantial experience in distance learning and e-learning, including global delivery • Benchmarking is part of Higher Ed policy for e-learning (HE Academy, under way now) • UK-Australian collaboration/co-funding on a number of issues, including the e-Framework

  6. The main UK agencies • QAA – for quality • JISC – for support of ICT in universities • UKERNA – runs the JANET high-speed network across the UK • Higher Education Academy (HEA) – for pedagogy • Some smaller agencies: • Leadership Foundation for HE (LFHE) • Observatory on Borderless HE (OBHE)

  7. Quality Assurance Agency UK • Covers all four UK home nations • “Code of practice for the assurance of academic quality and standards in higher education” • See www.qaa.ac.uk/academicinfrastructure/codeOfPractice/ • Not much on pedagogy – this is left to the discretion of the academic • Only 1 private uni in UK, over 100 public ones

  8. QAA in e-learning • “Collaborative provision and flexible and distributed learning (including e-learning)” • September 2004 • BUT • Some feel it says too little, others do not want to be restricted • It came too late – 4 years? • No international comparisons (whereas research assessment has them)

  9. Pedagogy • Higher Education Academy • “works with universities and colleges, discipline groups, individual staff and organisations to help them deliver the best possible learning experience for all students” • Runs Subject Centres for each subject • Advising on e-learning since early 2005 • Slowish progress

  10. JISC and JANET • Joint Information Systems Committee • ICT for universities and colleges (not schools) • England, Scotland, Wales, N Ireland • JANET is the UK's national academic and research network • JISC funds JANET via the UKERNA company

  11. In the UK, universities compete – and now in e-learning • Universities want to judge how well they are doing in e-learning • Funding agencies and the public want to know • But universities don’t want to tell if they are doing badly! • And universities (like people) are not good at judging themselves

  12. Benchmarking • Like Activity Based Costing (ABC), it has been around for many years • Unlike ABC, but like BPR, quality, excellence, etc., no one is now sure what it means…

  13. Back to Basics (Xerox) “a process of self-evaluation and self-improvement through the systematic and collaborative comparison of practice [process] and performance [metrics, KPIs] with competitors [or comparators], in order to identify one's own strengths and weaknesses, and learn how to adapt and improve as conditions change”

  14. Benchmarking (in Universities) • There are several reports that will tell you how to do benchmarking in general • Higher Education Academy (UK) • Learning and Skills Development Agency (UK) • Department of Education, Training and Youth Affairs (Australia)/Sydney Uni

  15. Benchmarking in e-Learning There are few published reports on approaches • My surveys and proposals • http://www.alt.ac.uk/altc2005/timetable/files/527/Benchmark_overview.doc • E-Learning Maturity Model (NZ) – Marshall • NUTN/Hezel emerging work (Jan 2006?) • Work by OECD and OBHE • National Learning Network (UK) – colleges • So far unpublished work (Sydney/OU and ACODE)

  16. Best Practice in e-Learning • There are a few reports (US): • APQC/SHEEO Study 1998 (US) • IHEP “Quality on the Line” 2000 (US) • And several projects (EU): • BENVIC • SEEQUEL • Swiss Virtual Campus @ Lugano: MINE (adapting the IHEP work for EU) • E-xcellence (EADTU and others)

  17. Benchmarking e-learning A global “synthesis” incorporating the work that has been done elsewhere

  18. Focus of my work • Focussed purely on e-learning • But not tied to any particular style (e.g. DL) • Oriented to institutions past the “a few projects” stage • Suitable for desk research as well as “invasive” studies • Suitable for single- and multi-institution studies • Started work in Jan 2005; already piloted at Manchester Business School against 12 competitors world-wide

  19. Processes or Outputs? • Outputs: measure first (can be done by desk research) • Processes: later (best done in clubs or invasive studies) • Inputs: not of much interest to students, but of course of great interest to funders

  20. Metrics or Bureaucratic • Use a 6-point scale • 5 from Likert plus 1 more for “excellence” • Backed up by metrics where possible • Also contextualised by narrative • Some issues of judging “best practice”; judging “better practice” is easier • e.g. VLE convergence • Some criteria are rather “criteria bundles”
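A minimal sketch of how one such scored criterion might be represented, in Python. The talk specifies only the scale itself (Likert 1–5 plus a sixth level for “excellence”), backed by metrics where possible and contextualised by narrative; the CriterionScore class and all field names below are illustrative assumptions, not part of the published methodology.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record for one scored criterion: a level on the 6-point
# scale (1-5 from Likert, 6 = excellence), an optional backing metric,
# and a contextual narrative. Names are illustrative, not from Pick & Mix.
@dataclass
class CriterionScore:
    criterion: str                  # e.g. "Training" or "Accessibility"
    level: int                      # 1-5 from Likert, 6 = excellence
    metric: Optional[float] = None  # supporting metric, where one exists
    narrative: str = ""             # contextual narrative

    def __post_init__(self) -> None:
        if not 1 <= self.level <= 6:
            raise ValueError("level must be on the 6-point scale (1-6)")

# Example: a "Training" score backed by a metric and a short narrative
score = CriterionScore(
    criterion="Training",
    level=4,
    metric=0.85,  # e.g. proportion of staff who attended VLE training
    narrative="University-wide programme, monitored and incentivised.",
)
```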

  21. Other Decisions • Explicit (otherwise you are not trying) • Independent or collaborative • Internal or external • Horizontal • focus on processes across whole institution • but can look at individual projects, missions and departments to get “range of scores”

  22. How Many Benchmarks? • It is like ABC: how many activities? • Answer: Not 5, not 500 • Better answer: Well under 100 • Composite some criteria together • Remove any not specific to e-learning • Be careful about any which are not provably critical success factors • Institutions may wish to add specific ones to monitor their objectives and KPIs.
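The slide advises compositing some criteria together but leaves the arithmetic open. Below is a sketch of one plausible rule, a rounded mean of sub-criterion levels; the function name and the rule itself are assumptions (min() would be a stricter alternative, letting no weak sub-criterion hide).

```python
# Hypothetical compositing rule: collapse several sub-criterion levels
# (each 1-6) into one composite level via a rounded mean.
def composite_level(sub_levels: list[int]) -> int:
    """Combine sub-criterion levels on the 6-point scale into one level."""
    return round(sum(sub_levels) / len(sub_levels))

# Three sub-criteria scored 3, 4 and 5 composite to level 4
assert composite_level([3, 4, 5]) == 4
```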

  23. How Many do Others Have? • LSDA (UK) has 14 – but colleges • IHEP (US) has 24 – but old • APQC/SHEEO (US) had 14 – but older • EMM (NZ) has 43 – but some are being merged and some are outside core e-learning area • OECD has many but several are “taxonomic” not critical success factors

  24. Pick and Mix System • Based on survey of “best of breed” ideas • 6-point scale (Likert + excellence) • Backed up by narrative and metrics • 18 core criteria (e-learning specific) • Can easily add more in same vein for local needs • Output and student-oriented aspects covered • Focussed on critical success factors • Methodology-agnostic • Requires no long training course to understand • But must know and be undogmatic about e-learning
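A sketch of that extension point in Python: a fixed core list to which an institution appends its own criteria. Only “Adoption phase”, “Training” and “Accessibility” are criteria actually named in this talk; the placeholder comment and the example local criterion are assumptions.

```python
# The 18 core criteria are e-learning specific; institutions can add more
# "in the same vein" to monitor their own objectives and KPIs.
CORE_CRITERIA = [
    "Adoption phase",
    "Training",
    "Accessibility",
    # ... the remaining 15 core criteria
]

def criteria_for(local_extras: list[str]) -> list[str]:
    """Core Pick & Mix criteria plus an institution's own additions."""
    return CORE_CRITERIA + [c for c in local_extras if c not in CORE_CRITERIA]

# e.g. a school that also wants to track a local KPI
criteria = criteria_for(["Alumni IT provision"])  # hypothetical local criterion
```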

  25. “Adoption phase” (Rogers) • Innovators only • Early adopters taking it up • Early adopters adopted; early majority taking it up • Early majority adopted; late majority taking it up • All taken up except laggards, who are now taking it up (or retiring or leaving) • First wave embedded, second wave under way (e.g. m-learning after e-learning)
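Read as the level descriptors for an “Adoption phase” criterion, the six bullets map naturally onto the 6-point scale. A sketch of that mapping in Python; numbering the descriptors 1–6 in the order shown is my assumption, since the slide does not number them.

```python
# The six bullets of slide 25, treated as the level descriptors of the
# "Adoption phase" criterion, keyed by their assumed position on the scale.
ADOPTION_PHASE_LEVELS = {
    1: "Innovators only",
    2: "Early adopters taking it up",
    3: "Early adopters adopted; early majority taking it up",
    4: "Early majority adopted; late majority taking it up",
    5: "All taken up except laggards, who are now taking it up",
    6: "First wave embedded, second wave under way (e.g. m-learning)",
}

def adoption_level_description(level: int) -> str:
    """Return the descriptor for a given point on the 6-point scale."""
    return ADOPTION_PHASE_LEVELS[level]
```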

  26. “Training” • No systematic training for e-learning • Some systematic training, e.g. in some projects and departments • U-wide training programme but little monitoring of attendance or encouragement to go • U-wide training programme, monitored and incentivised • All staff trained in VLE use, training appropriate to job type – and retrained when needed • Staff increasingly keep themselves up to date in a “just in time, just for me” fashion except in situations of discontinuous change

  27. “Accessibility” • e-learning material and services are not accessible • Much e-learning material and most services conform to minimum standards of accessibility • Almost all e-learning material and services conform to minimum standards of accessibility • All e-learning material and services conform to at least minimum standards of accessibility, much to higher standards • e-learning material and services are accessible, and key components validated by external agencies • Strong evidence of conformance with the letter & spirit of accessibility in all countries where students study • Too aspirational, too international, too regulated?

  28. Case Study: Jan-Apr 2005 Manchester Business School within Manchester U (done by Matic Media Ltd)

  29. Methodology • Externally-focussed (an internal study is under way now) • Looked at 12 “comparator” business schools (2 UK, 10 non-UK) – no time to discuss • Focus on speedy desk research (Web+DB) • Focus on criteria susceptible to that • plus “narratives of good practice” • Aim: to learn lessons for MBS

  30. A few MBS conclusions • Numeric (not so interesting) – “taxonomic” • Tabular (see next slide) • Lots of case study narrative (but structured) • Top-level conclusions include: • Saturation wireless networks universal • e-Portfolios used at the “sandstone level” • Alumni get the same IT systems as students

  31. Work in progress • Presentation at UK HEA Town Meeting • Discussions with EU projects on “quality” and “excellence” • Implications of report on UKeU Committee for Academic Quality (in e-Learning) • Keynote at EFQUEL conference • Workshop and Presentation at Online Educa Berlin (Nov/Dec 2005) • Detailed comparison of methodologies (NB costing) • See how this can be taken into account for the UK HEA strategy for benchmarking e-learning • 12+60 HEIs to be involved in 2006

  32. Thank you for listening. Any questions? Professor Paul Bacsich bacsich@matic-media.co.uk www.cs.mdx.ac.uk/staff/profiles/p_bacsich.html
