
Online interactive computer marked assignments – active learning or just doing it?




  1. Online interactive computer marked assignments – active learning or just doing it? Frances Chetwynd, Chris Dobbyn and Helen Jefferis

  2. Agenda • Project background – HEA funded • interactive Computer Marked Assignments (iCMAs): • Students - attitudes and engagement • Authors – some recommendations • Teams - predicting student failure

  3. Project background • Based on Open University distance learning module • Level 1; 60 credits • 9 months duration • Large population • Local tutors • Central module team

  4. Project background • Assessment • 7 iCMAs; 40% average • 4 TMAs; 40% average • End of module assignment; 40% pass mark • Data gathering • Student survey: online 175 students • VLE usage statistics: 2500 students

  5. iCMA • 20 questions • 3 tries each question • Increasing feedback • Unlimited attempts each quiz • 5 question variants • Previous scores wiped
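[Speaker's note: the attempt rules on this slide can be sketched as a small model. This is illustrative only, assuming the rules as listed; the OU's actual iCMA engine is not shown, and all class and function names here are hypothetical.]

```python
import random

# Hypothetical model of the iCMA rules described above: 20 questions,
# 3 tries per question with increasing feedback, several variants per
# question, and previous scores wiped by a fresh quiz attempt.

FEEDBACK = [
    "Incorrect - try again.",        # try 1: minimal feedback
    "Incorrect - see the hint.",     # try 2: more directed feedback
    "The correct answer is shown.",  # try 3: full answer revealed
]

class Question:
    def __init__(self, variants, answer_for):
        self.variants = variants        # interchangeable variants (e.g. 5)
        self.answer_for = answer_for    # maps variant -> correct answer

    def ask(self, respond):
        variant = random.choice(self.variants)
        for try_no in range(3):         # up to 3 tries per question
            if respond(variant, try_no) == self.answer_for[variant]:
                return True             # credit for this question
            print(FEEDBACK[try_no])     # feedback increases each try
        return False

class Quiz:
    def __init__(self, questions):
        assert len(questions) == 20     # an iCMA has 20 questions
        self.questions = questions
        self.score = None

    def attempt(self, respond):
        # A new attempt overwrites (wipes) any previous score.
        self.score = sum(q.ask(respond) for q in self.questions)
        return self.score
```

Under these assumptions, a student who answers every question correctly on any of the three tries scores 20, and retaking the quiz replaces the stored score rather than keeping the best.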

  6. Students – objectives • Based on student survey (51/175 responded) • Reviewers (54.9%): “Used iCMAs as guide to note taking. Didn’t submit until after completing relevant sections” • Previewers (23.5%): “To see what I knew already so I could miss out studying parts of the text….” (8%) • Cruisers (21.6%): “I only completed them in order to pass the module”

  7. Students - VLE Analysis • Patterns of completion: early and mid-module (charts: submission of best attempt, iCMA51 and iCMA54)

  8. Students - VLE Analysis

  9. Students - VLE Analysis • Number of attempts recorded (charts: iCMA51 and iCMA56)

  10. Tutors – from student survey

  11. Authors – student survey Student views on feedback for each attempt

  12. Authors – student survey Comparing iCMAs with in-text self-assessment questions (SAQs) • 70% iCMA questions made me think more • 57% SAQs more useful when I had no idea • 59% Completing iCMAs is more fun • 14% I didn’t answer any SAQs

  13. Authors – VLE analysis How daring are we?

  14. Teams – The Borderliners • Gather VLE and other data • Identification of early indicators of failure • Some students belong to the category of borderline failures • Score just below the end-of-module assessment (EMA) pass mark • Do not submit the EMA despite respectable TMA grades • Develop software tool to predict failure

  15. Teams – predicting failure • 165 students stayed engaged but failed the final assessment (charts: submission of best attempt, iCMA51 whole cohort vs iCMA51 ‘Borderliners’)

  16. Teams – predicting failure

  17. Teams – The Borderliners • Artificial neural network • Predicted groups – e.g. Borderliners; clear pass; out • Indicators (22) – e.g. all assignment scores and completion; age; motivation
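[Speaker's note: the slide names an artificial neural network trained on 22 indicators to predict groups such as Borderliners, clear pass, and out. The actual OU tool is not described here; the following is a minimal one-hidden-layer network on synthetic data, assuming all 22 indicators can be encoded numerically. Layer sizes, learning rate, and the synthetic labels are illustrative choices, not the project's.]

```python
import numpy as np

rng = np.random.default_rng(0)

N_INDICATORS = 22   # e.g. assignment scores and completion, age, motivation
GROUPS = ["clear pass", "borderliner", "out"]

# Synthetic stand-in for student records: 300 students, random indicators
# and random group labels (the real training data is not available here).
X = rng.normal(size=(300, N_INDICATORS))
y = rng.integers(0, len(GROUPS), size=300)
Y = np.eye(len(GROUPS))[y]              # one-hot targets

# One hidden layer, softmax output, trained by plain gradient descent.
W1 = rng.normal(scale=0.1, size=(N_INDICATORS, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, len(GROUPS)))
b2 = np.zeros(len(GROUPS))

def forward(X):
    H = np.tanh(X @ W1 + b1)            # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    return H, P / P.sum(axis=1, keepdims=True)   # softmax probabilities

lr = 0.1
for _ in range(200):
    H, P = forward(X)
    dZ = (P - Y) / len(X)               # softmax + cross-entropy gradient
    dW2 = H.T @ dZ
    dH = dZ @ W2.T * (1 - H ** 2)       # back-propagate through tanh
    W2 -= lr * dW2
    b2 -= lr * dZ.sum(axis=0)
    W1 -= lr * (X.T @ dH)
    b1 -= lr * dH.sum(axis=0)

# Predicted group for one (synthetic) student record:
_, probs = forward(X[:1])
print(GROUPS[int(probs.argmax())])
```

The design point carried over from the slide is the interface, not the architecture: the model maps a vector of 22 per-student indicators to one of a small set of named groups, which can then be passed to tutors.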

  18. Teams – The Borderliners • Predicted categories passed on to tutors • Help targeted at Borderliners • Focus on real-time contact

  19. Summary • iCMAs new form of assessment • Questions to answer: Lack of student engagement – why? Student motivations? Best practice for authors? Data for analytics?

  20. References • Crisp, B. (2007) ‘Is it worth the effort? How feedback influences students’ subsequent submission of assessable work’, Assessment & Evaluation in Higher Education, 32:5, 571-581 • Jelfs, A. and Whitelock, D. (2000) ‘The notion of presence in virtual learning environments: what makes the environment “real”’, British Journal of Educational Technology, 31(2), pp. 145–15 • Jordan, S. (2011) ‘Using interactive computer-based assessment to support beginning distance learners of science’, Open Learning: The Journal of Open, Distance and e-Learning, 26:2, 147-164 • Roediger, H.L. and Butler, A.C. (2011) ‘The critical role of retrieval practice in long-term retention’, Trends in Cognitive Sciences, 15, 20-27 • Timmers, C., Braber-van den Broek, J. and van den Berg, J. (2012) ‘Motivational beliefs, student effort, and feedback behaviour in computer-based formative assessment’, Computers and Education, 60, 25-31
