
‘At the Coal Face’ Experiences of Computer-based Exams


Presentation Transcript


  1. ‘At the Coal Face’ Experiences of Computer-based Exams
     John Winkley, BTL; Paul Humphries, Edexcel; Dorit Reppert, CCEA
     July 8th 2003

  2. Evaluating Computer-based Exams
     Key Stakeholders:
     • candidates
     • test centres
     • examiners
     • Awarding Bodies
     Key Success Factors:
     • engagement
     • accessibility
     • technical usability
     • system reliability
     • security
     • test validity, reliability
     • data analysis

  3. Project Background: PEP 2 Project
     Schools:
     • 19 schools
     • GCSE style exams
     • over 1,000 candidates
     • approx. 1,400 exams
     learndirect & Army Test Centres:
     • 13 test centres
     • real National Tests
     • over 300 candidates
     • approx. 300 tests

  4. The ExamBase System
     [Architecture diagram: the Assessment Producer's MicroBoard website exchanges exams, data and marked 'papers' with an ExamBase Server at the Test Centre, which serves questions to several ExamBase Clients and collects the marked responses.]
     Test Centre requirements:
     • standard PCs
     • must not limit use of PCs at other times
     • secure, reliable
     • easy-to-use, engaging, accessible
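
The diagram above describes the ExamBase architecture only at a high level, so the following is a purely illustrative toy model of the exchange it implies: a test-centre server holds a downloaded exam package and serves questions to clients, collecting responses as it goes. Every class, method and identifier name below is invented for this sketch and is not part of the real ExamBase API.

    # Toy model of the exam-delivery flow (illustrative names only, Python 3.10+).
    from dataclasses import dataclass, field

    @dataclass
    class ExamPackage:
        exam_id: str
        questions: list[str]                                            # downloaded from the producer
        responses: dict[str, list[str]] = field(default_factory=dict)   # candidate -> answers

    class TestCentreServer:
        """Stands in for the ExamBase Server: holds exams locally and serves clients."""
        def __init__(self) -> None:
            self.exams: dict[str, ExamPackage] = {}

        def download_exam(self, package: ExamPackage) -> None:
            # In the real system this would arrive from the MicroBoard website over the internet.
            self.exams[package.exam_id] = package

        def next_question(self, exam_id: str, candidate: str) -> str | None:
            exam = self.exams[exam_id]
            answered = len(exam.responses.get(candidate, []))
            return exam.questions[answered] if answered < len(exam.questions) else None

        def submit_answer(self, exam_id: str, candidate: str, answer: str) -> None:
            self.exams[exam_id].responses.setdefault(candidate, []).append(answer)

    # Usage: the server downloads a package; a client works through it question by question.
    server = TestCentreServer()
    server.download_exam(ExamPackage("demo-001", ["Q1: 2 + 2 = ?", "Q2: capital of France?"]))
    while (q := server.next_question("demo-001", "candidate-42")) is not None:
        server.submit_answer("demo-001", "candidate-42", f"answer to {q!r}")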

  5. On-screen Marking
     [Workflow diagram: partially marked 'papers' pass from the Test Centre through MicroBoard to the MarkerBase Server at the Marking Centre, where Examiners complete the marking; fully marked 'papers' are returned to MicroBoard.]

  6. Exam Security
     [Diagram: MicroBoard provides web services and a web interface; the ExamBase Server at the Test Centre acts as the web service consumer and passes exams on to the ExamBase Client.]
     • secure internet (https/ssl)
     • encrypted file transfer
     • encrypted file transfer (https/ssl/proprietary)
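
As a rough illustration of the layered approach the slide lists (encryption of the exam files themselves plus transfer over https), the sketch below combines symmetric encryption with an https upload. The actual MicroBoard/ExamBase protocol is proprietary; the endpoint URL, key handling and payload here are assumptions, and the third-party cryptography and requests packages are simply convenient stand-ins.

    # Illustrative only: application-level encryption plus https transport.
    from cryptography.fernet import Fernet   # pip install cryptography
    import requests                          # pip install requests

    def upload_exam_file(payload: bytes, key: bytes, url: str) -> None:
        token = Fernet(key).encrypt(payload)                 # encrypt the file itself
        resp = requests.post(url, data=token, timeout=30)    # https secures the transport
        resp.raise_for_status()

    # Hypothetical usage; in practice the key would be agreed in advance, not generated on the spot.
    key = Fernet.generate_key()
    upload_exam_file(b"<candidate responses>", key, "https://example.org/microboard/upload")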

  7. MicroBoard

  8. MicroBoard

  9. MicroBoard

  10. ExamBase Server

  11. ExamBase Server (red) Downloaded but lockedThis exam has been downloaded but has not been scheduled to run (orange) Downloaded and ScheduledThis exam has been scheduled but is not accessible by candidates until the scheduled start time. (green) Unlocked and ready to runThe four hour scheduling window has begun and the exam is ready to be accessed by candidates. (yellow) PausedThe exam is currently paused by the administrator. (blue) FinishedThe exam has been scheduled, should have been used and the four hour window for access has now finished or the exam has been closed (see 3.5 Finish exam).

  12. Example On-Screen Assessment Questions

  13. The Test Centre Experience
      Priority 1 – a reliable technical infrastructure:
      • includes trained support staff
      • unreliability may interrupt an exam
      • funding and management processes are key factors
      Priority 2 – an appropriate exam environment:
      • setting-up takes longer
      • can be difficult to provide adequate privacy

  14. The Test Centre Experience
      Schools:
      • computer-based exams entirely new to schools
      • transfer of responsibility from administrators to technical staff
      • 48% found software installation easy or very easy
      • 54% had no problems with candidate registration
      • all schools ran exams successfully

  15. The Test Centre Experience
      learndirect centres:
      • technical infrastructure already exists for e-learning and formative assessment
      • used to maintaining 'business critical' IT systems
      • 95% of 29 Skills for Life project centres found the installation easy or very easy
      • all centres found registration of candidates using MicroBoard easy

  16. The Candidate Experience
      School Pupils:
      • 'digital natives'
      • at ease with IT
      • over 94% use a computer at home at least once a week
      • exams are mandatory
      Adult Learners:
      • great disparity of IT experience
      • some with no IT experience
      • have opted for National Tests

  17. The Candidate Experience
      School Pupils:
      • overwhelmingly positive
      • 92% said they enjoyed the tests: "more enjoyable than writing"
      Adult Learners:
      • computer-based exams seen as new and positive
      • "almost unanimously they reported preferring to take an online test rather than a paper test" - Ufi
      • "It didn't feel like an exam"

  18. The Candidate Experience
      Common concerns:
      • technology failures:
        • rare but disconcerting
        • mainly due to PC system problems
        • candidates can continue on a spare PC without losing work (see the sketch below)
      • the exam environment:
        • keyboard noise in particular
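
The point about continuing on a spare PC implies that answers are persisted away from the candidate's machine as they are given. The toy sketch below shows one way such a resume step could look; it is a guess at the mechanism, not the real ExamBase implementation, and all names are invented.

    # Illustrative only: answers saved server-side so a session can resume elsewhere.
    class ExamSession:
        def __init__(self, store: dict) -> None:
            self.store = store                                    # stands in for server-side storage

        def save_answer(self, candidate: str, qnum: int, answer: str) -> None:
            self.store.setdefault(candidate, {})[qnum] = answer   # persist immediately

        def resume(self, candidate: str) -> dict[int, str]:
            """Called from a replacement PC: returns everything already answered."""
            return dict(self.store.get(candidate, {}))

    shared_store: dict = {}
    ExamSession(shared_store).save_answer("candidate-7", 1, "B")
    # The original PC fails; a new client attaches to the same store and carries on.
    print(ExamSession(shared_store).resume("candidate-7"))        # {1: 'B'}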

  19. MarkerBase

  20. The Examiner Experience
      General responses:
      • 9 out of the 10 examiners found the marking either very enjoyable or enjoyable
      • all found the software either very easy or easy to use
      • most would prefer to work at home
      • liked the choice of being able to mark 'by candidate', 'by question' or both

  21. The Examiner Experience
      Potential for more rapid marking:
      • marking speed increased with experience
      • marking 'by question' increases efficiency
      More accurate tallying:
      • automatic tallying of marks (see the sketch below)
      • error-prone task removed
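
The tallying that the system automates is, at heart, summing each candidate's per-question marks. A minimal sketch, with an invented data layout:

    # Illustrative only: total mark per candidate from a candidate -> {question: mark} mapping.
    def tally(marks: dict[str, dict[int, int]]) -> dict[str, int]:
        return {candidate: sum(per_question.values()) for candidate, per_question in marks.items()}

    print(tally({"candidate-7": {1: 3, 2: 5, 3: 0}}))   # {'candidate-7': 8}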

  22. The Awarding Body Experience
      Engaging learners:
      • closer to day-to-day experiences for many
      • exam stress reduced
      • positive impact on self-esteem
      • more engaging than paper-based exams

  23. The Awarding Body Experience
      Accessibility:
      • much greater flexibility:
        • can register candidates up to the exam start
        • 'on demand testing' will be possible
      • the 'on screen' approach allows tests to be taken:
        • in mobile situations (e.g. touring bus)
        • in remote locations (e.g. using laptops)

  24. The Awarding Body Experience
      Quality assurance:
      • less opportunity for human error
      • quality checks have to be refocused
      Security:
      • less vulnerable to lost or mishandled papers
      • encryption technologies increase security

  25. The Awarding Body Experience
      Exam reliability:
      • 100% reliability for 'closed' question types
      • 'open' question types:
        • marking remains subjective
        • MarkerBase helps examiners to be more consistent

  26. The Awarding Body Experience
      Exam validity:
      • good correlation with paper-based National Tests
      • GCSEs, using a wider range of question types, will require additional validation effort

  27. The Awarding Body Experience
      Results generation:
      • possibility of more rapid feedback to candidates
      • provide results to the test centre rather than to candidates
      Data analysis:
      • richer feedback to candidates and test centres
      • easier to detect anomalies in candidate responses (see the sketch below)
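
The slides do not say how anomalies in candidate responses are detected, so the sketch below shows just one generic possibility: flagging candidates whose total score sits unusually far from the cohort mean. The threshold and data layout are assumptions for illustration only.

    # Illustrative only: flag candidates whose total is an outlier within the cohort.
    from statistics import mean, pstdev

    def flag_anomalies(marks: dict[str, list[float]], threshold: float = 3.0) -> list[str]:
        totals = {candidate: sum(per_question) for candidate, per_question in marks.items()}
        mu, sigma = mean(totals.values()), pstdev(totals.values())
        if sigma == 0:
            return []
        return [c for c, total in totals.items() if abs(total - mu) > threshold * sigma]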

  28. Conclusions
      • very popular with candidates: more engaging and interactive, but additional validation needed
      • greater security and reduced risk of human error
      • improved accessibility and flexibility: highly desirable for Awarding Bodies
      • quality of technical infrastructure is vital
      • may initially place a burden on test centres, but with the promise of efficiency gains over time
