Standards, quality assurance, best practice and benchmarking in e-learning Professor Paul Bacsich Matic Media Ltd, and Middlesex University, UK
The Menu • Standards (technical) • Quality Assurance • Standards (content) • Standards (pedagogy and process) • Best Practice • Excellence? • Benchmarking • Conclusions
Standards (technical) • UK follows mainly IMS • CETIS, an agency set up by JISC, advises universities and colleges on IMS • A few mega-universities (OU, Ufi, etc.) are direct members of IMS • IMS Learning Design gaining influence • Also e-portfolios
Standards (Content) • Quality Assurance Agency has set up “subject benchmarks” • More about generalised competences than detailed syllabi • See www.qaa.ac.uk/academicinfrastructure/benchmark/
Standards (pedagogy and process) • Quality Assurance Agency (QAA) • “Code of practice for the assurance of academic quality and standards in higher education” • See www.qaa.ac.uk/academicinfrastructure/codeOfPractice/ • Not much on pedagogy – this is left to the discretion of the professor
QAA in e-learning • Little has been done specifically on e-learning – but see… • “Collaborative provision and flexible and distributed learning (including e-learning)” • Recent – September 2004 • Some feel it says too little, others do not want to be restricted
Digression on Pedagogy • Higher Education Academy • “works with universities and colleges, discipline groups, individual staff and organisations to help them deliver the best possible learning experience for all students” • Runs Subject Centres for each subject • Beginning to advise on e-learning
Best practice in e-learning • Not much studied in the UK yet • OU a major source of advice • UKeU set up to crystallise best practice into an operational business • It failed – but its legacy (including its Committee for Academic Quality) may help • US much more active – see e.g. “Quality on the Line” (IHEP, 2000)
In the UK, universities compete – and now in e-learning • Universities want to judge how well they are doing in e-learning • And funding agencies also want to know • But universities don’t want to tell if they are doing badly – not the public, not the funding agencies • And universities (like people) are not good at judging themselves
Benchmarking • Like Activity Based Costing (ABC), it has been around for many years • Unlike ABC, but like BPR, quality, excellence, etc., no one is now sure what it means…
Back to Basics (Xerox) • “a process of self-evaluation and self-improvement through the systematic and collaborative comparison of practice [process] and performance [metrics, KPIs] with competitors [or comparators] in order to identify own strengths and weaknesses, and learn how to adapt and improve as conditions change”
Benchmarking Dichotomies (after Jackson) • Implicit vs Explicit • Independent vs Collaborative [clubs] • Internal vs External • Vertical vs Horizontal • Inputs or Processes vs Outputs • Metric vs Qualitative
Focus of my work • Focussed purely on e-learning • But not tied to any particular style (e.g. DL) • Oriented to institutions past the “a few projects” stage • Suitable for desk research as well as invasive studies • Suitable for single- and multi-institution studies
Benchmarking (in Universities) • There are several reports that will tell you how to do benchmarking in general • Higher Education Academy (UK) • Learning and Skills Development Agency (UK) • Department of Education Training and Youth Affairs (Australia)
Benchmarking (in Universities) • And some agencies can help: • European Benchmarking Programme on University Management (ESMU, Brussels) • English Universities Benchmarking Club
Benchmarking in e-Learning • There are very few reports • National Learning Network (UK) – not for universities, but for colleges • E-Learning Maturity Model (NZ) – brand new!
Quality/Best Practice in e-Learning • There are a few reports (US): • APQC/SHEEO Study 1998 (US) • IHEP “Quality on the Line” 2000 (US) • And several projects (EU): • BENVIC • SEEQUEL • Swiss Virtual Campus @ Lugano: MINE
Excellence (?) in e-Learning • New project: E-xcellence (EADTU and others) • Outside e-learning, several projects – e.g. Consortium for Excellence in Higher Education (UK)
Benchmarking e-learning: a “synthesis”
Processes or Outputs? • Outputs first (can be done by desk research) • Processes later (best done in clubs or invasive studies) • Inputs not of interest to students – but of course of interest to funders
Metrics or Bureaucratic • Use a 6-point scale • 5 from Likert plus 1 more for “excellence” • Backed up by metrics where possible • Also contextualised by narrative • Remember the problems of judging “best practice”; judging “better practice” is easier
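To make the scale concrete, here is a minimal sketch in Python of one benchmark judgement (all names here are hypothetical illustrations, not part of any published methodology): levels 1–5 are broadly Likert-style, level 6 is reserved for “excellence”, and each judgement carries an optional metric and a contextualising narrative, as the slide suggests.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of one benchmark judgement on the 6-point scale:
# levels 1-5 are broadly Likert-style; level 6 is reserved for "excellence".
@dataclass
class Judgement:
    criterion: str                  # e.g. "Training"
    level: int                      # 1..6 on the 6-point scale
    metric: Optional[float] = None  # supporting number, where one exists
    narrative: str = ""             # contextualising commentary

    def __post_init__(self) -> None:
        if not 1 <= self.level <= 6:
            raise ValueError("level must lie on the 6-point scale (1-6)")
```

For example: `Judgement("Training", 4, metric=0.85, narrative="University-wide programme, monitored and incentivised")`.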
Other Decisions • Explicit (otherwise you are not trying) • Independent or collaborative • Internal or external • Horizontal: focus on processes across the whole institution; do not be seduced into individual projects
How Many Benchmarks? • It is like ABC: how many activities? • Answer: not 5, not 500 • Better answer: well under 100 • Combine some criteria into composites (see the sketch below) • Remove any not specific to e-learning • Be careful about any which are not provably critical success factors
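One way to composite criteria is sketched below; the “weakest link” min rule and the function name are assumptions chosen for illustration, not the published method.

```python
# Hypothetical sketch: combine several related sub-criteria into one
# composite benchmark, scored at the weakest sub-criterion so a strong
# sub-score cannot mask a weak one (the min rule is an assumption).
def composite_level(sub_levels: dict[str, int]) -> int:
    """Combine sub-criterion levels (each 1-6) into one composite level."""
    if not sub_levels:
        raise ValueError("a composite needs at least one sub-criterion")
    if any(not 1 <= v <= 6 for v in sub_levels.values()):
        raise ValueError("sub-criterion levels must lie on the 6-point scale")
    return min(sub_levels.values())

# e.g. composite_level({"training offered": 5, "training taken up": 3}) -> 3
```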
How Many do Others Have? • LSDA (UK) has 14 • IHEP (US) has 24 • APQC/SHEEO (US) had 14 • (Breaking news) EMM (NZ) has 43
Pick and Mix System • 25 criteria (liable to grow to around 30) • 6 levels, backed up by qualitative and numeric information • Student-oriented • Focussed on critical success factors • Requires no long training course to understand, if you know about e-learning • Methodology-agnostic
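Given Pick and Mix levels for an institution and a comparator, a comparison in the Xerox sense (identify own strengths and weaknesses) could look like the following sketch; the function and the dict representation are assumptions for illustration.

```python
# Hypothetical sketch: compare own Pick and Mix levels (1-6 per criterion)
# against a comparator institution, criterion by criterion.
def compare(own: dict[str, int], comparator: dict[str, int]) -> dict[str, int]:
    """For criteria both institutions scored, return the level difference:
    positive = ahead of the comparator, negative = behind."""
    shared = own.keys() & comparator.keys()
    return {c: own[c] - comparator[c] for c in sorted(shared)}

# e.g. compare({"Training": 4, "Adoption phase": 3},
#              {"Training": 2, "Adoption phase": 5})
#      -> {"Adoption phase": -2, "Training": 2}
```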
“Adoption phase” (Rogers) • Innovators only • Early adopters taking it up • Early adopters adopted; early majority taking it up • Early majority adopted; late majority taking it up • All taken up except laggards, who are now taking it up (or retiring or leaving) • First wave embedded, second wave under way (e.g. m-learning after e-learning)
“Training” • No systematic training for e-learning • Some systematic training, e.g. in some projects and departments • U-wide training programme but little monitoring of attendance or encouragement to go • U-wide training programme, monitored and incentivised • All staff trained in VLE use, training appropriate to job type – and retrained when needed • Staff increasingly keep themselves up to date in a “just in time, just for me” fashion except in situations of discontinuous change
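The six “Training” levels above could be held as a rubric so that a level number reads back as its descriptor in reports; a minimal sketch, with hypothetical names:

```python
# Hypothetical sketch: the six "Training" levels encoded as a rubric.
TRAINING_RUBRIC = {
    1: "No systematic training for e-learning",
    2: "Some systematic training, e.g. in some projects and departments",
    3: "University-wide programme, attendance neither monitored nor encouraged",
    4: "University-wide programme, monitored and incentivised",
    5: "All staff trained (and retrained) in VLE use, appropriate to job type",
    6: "Staff keep themselves up to date 'just in time, just for me'",
}

def describe_training(level: int) -> str:
    """Map a Training level (1-6) to its rubric descriptor."""
    return TRAINING_RUBRIC[level]
```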
Next Steps • Correlate with “quality” and “excellence” projects in the EU • Publish a review report on the UK Committee for Academic Quality (in e-Learning) in August • Review underpinning methodologies (CMM etc.) • Literature search outside Europe, the US and the Commonwealth • Series of workshops • at ALT-C 2005, Manchester, September • at ACODE, Australia, November • at Online Educa, Berlin, December
Thank you for listening. Any questions? Professor Paul Bacsich Global Campus, Middlesex University p.bacsich@mdx.ac.uk www.cs.mdx.ac.uk/staff/profiles/p_bacsich.html