Using e-assessment to support distance learners of science Sally Jordan and Philip Butcher GIREP-EPEC-PHEC 2009 Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)
My plan • Two presentations describing work in e-assessment • The context – the UK Open University • Why e-assessment? • Some examples of our use of e-assessment • Evaluation • And then a more specific example of our work…
The UK Open University • Founded in 1969; • Supported distance learning; • 150,000 students, mostly studying part-time; • Undergraduate courses are completely open entry, so students have a wide range of previous qualifications; • Normal age range from 18 to ?? • 10,000 of our students have declared a disability of some sort; • 25,000 of our students live outside the UK.
Implications for assessment • Within the Open University context, learners are geographically separated and we cannot assume that they will meet their tutor in order to receive feedback. • We are seeking to provide students with feedback on e-assessment tasks that is personalised and received in time to be used in future learning. • We are using small, regular e-assessment tasks to help students to pace their study. • We are also using e-assessment tasks to encourage students to reflect on their learning and to enter into informed discussion with their tutor. • We are using e-assessment tasks in a range of ways, e.g. summative, formative-only and diagnostic.
The OpenMark system • Uses a range of question types, going far beyond what is possible with multiple choice; • Question types include: numerical input, text input, drag and drop, hotspot; • Students are allowed three attempts, with an increasing amount of teaching guidance, wherever possible tailored to the student's previous incorrect answer; • Different students receive variants of each question, so each student has a unique assignment; • OpenMark has been incorporated into Moodle, the open-source virtual learning environment used by the Open University.
Embedding iCMAs (interactive computer-marked assignments) – an example • The assessment strategy for S104: Exploring Science includes 8 TMAs (tutor-marked assignments), 9 iCMAs and a written End of Course Assignment. • Home experiments, DVD activities, web-based activities and contributions to online tutor group forums are assessed, as is reflection on learning and on previously provided feedback. • Integration is key: are we talking about assessment or learning? • iCMAs are credit-bearing (summative) but low stakes. Is this the best approach?
Evaluation methodologies for iCMAs • Ask students • Observe students • Analyse data • Ask students • Observe students
The current project… • Is at the data analysis stage • Is looking at the use made of iCMAs by students in a wide range of different settings: Range: S103 (e-assessment was an added extra) to S151 (the ECA is an iCMA); Summative: S104, S154, SDK125 (iCMAs embedded within the assessment strategy; low-stakes summative); Formative: S279, S342, practice iCMAs for S154, SDK125, S151; Maths Skills Questions; Thresholded: new Physics and Astronomy courses; SM358 is formative-only, but with a 'carrot'; Diagnostic: 'Are you ready for…?' quizzes.
Conclusions • Students appear to engage with summative iCMA questions at a deeper level than when the questions are in formative-only use; • However, there are issues, especially a preoccupation with the minutiae of grading; • Does the answer lie in thresholding, or in formative-only use with a 'carrot'? • Some of the findings are unexpected; we are currently investigating the reasons for these.
Some more surprises… • S104 students are more likely than others to look at all the questions before attempting any. Why? • There have been some surprises in looking at actual student responses (more on this later); • It appears that students' perception of what they are 'meant' to do is a very strong driver; • And other course components have a part to play…
Acknowledgments • Funding from COLMSCT and piCETL; • The assistance of many people associated with COLMSCT and piCETL, especially Spencer Harben and Richard Jordan; • The co-operation of the course teams involved in the investigation.
Sally Jordan, OpenCETL, The Open University, Walton Hall, Milton Keynes, MK7 6AA s.e.jordan@open.ac.uk http://www.open.ac.uk/colmsct/projects/icma http://www.open.ac.uk/picetl/ http://www.open.ac.uk/openmarkexamples/