Online interactive computer marked assignments – active learning or just doing it? Frances Chetwynd, Chris Dobbyn and Helen Jefferis
Agenda
• Project background – HEA funded
• Interactive Computer Marked Assignments (iCMAs):
  • Students – attitudes and engagement
  • Authors – some recommendations
  • Teams – predicting student failure
Project background
• Based on an Open University distance learning module
• Level 1; 60 credits
• 9 months' duration
• Large population
• Local tutors
• Central module team
Project background
• Assessment
  • 7 iCMAs; 40% average
  • 4 TMAs; 40% average
  • End-of-module assignment; 40% pass mark
• Data gathering
  • Student survey: online, 175 students
  • VLE usage statistics: 2,500 students
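One reading of the assessment rule above is that a pass requires a 40% average across the seven iCMAs, a 40% average across the four TMAs, and at least 40% on the end-of-module assignment (EMA). That threshold semantics is an assumption from the slide, not a confirmed module regulation; the function name is hypothetical. A minimal sketch:

```python
def passes_module(icma_scores, tma_scores, ema_score, threshold=40.0):
    """Hedged reading of the slide: pass needs a 40% average on the
    7 iCMAs, a 40% average on the 4 TMAs, and >= 40% on the EMA."""
    assert len(icma_scores) == 7 and len(tma_scores) == 4
    icma_avg = sum(icma_scores) / len(icma_scores)
    tma_avg = sum(tma_scores) / len(tma_scores)
    return icma_avg >= threshold and tma_avg >= threshold and ema_score >= threshold
```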
iCMAs
• 20 questions
• 3 tries per question
• Increasing feedback
• Unlimited attempts at each quiz
• 5 question variants
• Previous scores wiped
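The quiz mechanics above (three tries per question with progressively fuller feedback, five variants per question) could be sketched roughly as follows. This is an illustrative reconstruction, not the OpenMark implementation; the feedback strings, function names, and mark scheme (full marks on the first try, decreasing thereafter) are assumptions.

```python
import random

# Hypothetical feedback that grows more detailed with each failed try.
FEEDBACK_LEVELS = [
    "Incorrect - try again.",
    "Incorrect - hint: revisit the relevant section.",
    "Incorrect - here is the worked answer.",
]

def pick_variant(question_id, n_variants=5):
    """Each question has 5 variants; serve one at random."""
    return (question_id, random.randrange(n_variants))

def attempt_question(answers, correct, max_tries=3):
    """Score one question: assumed mark scheme of 1, 2/3, 1/3 of the
    marks for a correct answer on tries 1, 2, 3; zero otherwise.
    Returns (score, last_feedback_shown)."""
    feedback = None
    for try_no, answer in enumerate(answers[:max_tries]):
        if answer == correct:
            return (max_tries - try_no) / max_tries, feedback
        feedback = FEEDBACK_LEVELS[try_no]
    return 0.0, feedback
```

Since each quiz allows unlimited attempts with previous scores wiped, only a student's latest attempt would be retained.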
Students – objectives
• Based on student survey (51 of 175 responded)
• Reviewers (54.9%): “Used iCMAs as guide to note taking. Didn’t submit until after completing relevant sections”
• Previewers (23.5%): “To see what I knew already so I could miss out studying parts of the text….” (8%)
• Cruisers (21.6%): “I only completed them in order to pass the module”
Students – VLE analysis
• Patterns of completion: early and mid-module
[Charts: submission of best attempt over time, iCMA51 and iCMA54]
Students – VLE analysis
• Number of attempts recorded
[Charts: number of attempts, iCMA51 and iCMA56]
Authors – student survey
Student views on feedback for each attempt
Authors – student survey
Comparing iCMAs with in-text self-assessment questions (SAQs):
• 70%: iCMA questions made me think more
• 57%: SAQs were more useful when I had no idea
• 59%: Completing iCMAs is more fun
• 14%: I didn’t answer any SAQs
Authors – VLE analysis
How daring are we?
Teams – The Borderliners
• Gather VLE and other data
• Identify early indicators of failure
• Some students are borderline failures:
  • score just below the EMA pass mark, or
  • do not submit the EMA despite respectable TMA grades
• Develop a software tool to predict failure
Teams – predicting failure
• 165 students stayed engaged but failed the final assessment
[Charts: submission of best attempt – iCMA51, whole cohort vs. the ‘Borderliners’]
Teams – The Borderliners
• Artificial neural network
  • Predicted groups – e.g. Borderliners; clear pass; out
  • Indicators (22) – e.g. all assignment scores and completion; age; motivation
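The classifier above could be sketched as a small feed-forward network mapping the 22 indicators to three predicted groups. This is not the authors' actual model: the architecture (one tanh hidden layer, softmax output), layer sizes, and names are all illustrative assumptions, and the weights here are random and untrained — in practice they would be fitted to historical module outcomes.

```python
import math
import random

# Hypothetical group labels and layer sizes (assumptions, not the
# authors' published architecture).
GROUPS = ["Borderliner", "clear pass", "out"]
N_INPUTS, N_HIDDEN = 22, 8

random.seed(0)
# Random, untrained weights - for illustration only.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(N_INPUTS)]
      for _ in range(N_HIDDEN)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(N_HIDDEN)]
      for _ in range(len(GROUPS))]

def predict(indicators):
    """Forward pass: tanh hidden layer, then softmax over the groups.
    `indicators` is the 22-element feature vector (assignment scores
    and completion, age, motivation, ...)."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, indicators)))
              for row in W1]
    logits = [sum(w * h for w, h in zip(row, hidden)) for row in W2]
    m = max(logits)                      # subtract max for stability
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return GROUPS[probs.index(max(probs))], probs
```

The predicted category per student (plus its probability) is what could then be passed on to tutors, as the next slide describes.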
Teams – The Borderliners
• Predicted categories passed on to tutors
• Help targeted at Borderliners
• Focus on real-time contact
Summary
• iCMAs are a new form of assessment
• Questions to answer:
  • Lack of student engagement – why?
  • Student motivations?
  • Best practice for authors?
  • Data for analytics?
References
• Crisp, B. (2007) ‘Is it worth the effort? How feedback influences students’ subsequent submission of assessable work’, Assessment & Evaluation in Higher Education, 32(5), 571–581.
• Jelfs, A. and Whitelock, D. (2000) ‘The notion of presence in virtual learning environments: what makes the environment “real”’, British Journal of Educational Technology, 31(2), 145–15
• Jordan, S. (2011) ‘Using interactive computer-based assessment to support beginning distance learners of science’, Open Learning: The Journal of Open, Distance and e-Learning, 26(2), 147–164.
• Roediger, H.L. and Butler, A.C. (2011) ‘The critical role of retrieval practice in long-term retention’, Trends in Cognitive Sciences, 15, 20–27.
• Timmers, C., Braber-van den Broek, J. and van den Berg, J. (2012) ‘Motivational beliefs, student effort, and feedback behaviour in computer-based formative assessment’, Computers and Education, 60, 25–31.