
Student assessment: lightening the load while increasing the learning

Explore strategies to lighten the assessment load for staff, increase student engagement, and boost learning outcomes. The presentation covers approaches based on student involvement, mechanisation, and strategic programme decisions to enhance assessment processes.


Presentation Transcript


  1. Student assessment: lightening the load while increasing the learning Dr Chris Rust Head, Oxford Centre for Staff and Learning Development Deputy Director, Assessment Standards Knowledge Exchange (ASKe) Centre for Excellence in Teaching and Learning Oxford Brookes University, UK

  2. 3 possible ways of reducing staff assessment load • Involve the students • Mechanise • Strategic programme decisions

  3. Self-assessment • Strengths of this piece of work • Weaknesses in this piece of work • How this work could be improved • The grade it deserves is… • What I would like your comments on

  4. Marking exercise Immediate results (average mark): • Cohort 1 (99/00): participants 59.78, non-participants 54.12 • Cohort 2 (00/01): participants 59.86, non-participants 52.86 • Cohort 3 (01/02): participants 55.7, non-participants 49.7 Results 1 year later: • Cohort 1: participants 57.91, non-participants 51.3 • Cohort 2: participants 56.4, non-participants 51.7 Rust, C., Price, M. & O’Donovan, B. (2003) "Improving students’ learning by developing their understanding of assessment criteria and processes", Assessment and Evaluation in Higher Education, Vol. 28, No. 2

  5. Peer marking using model answers (Forbes & Spence, 1991) Scenario: • Engineering students had weekly maths problem sheets marked and attended problem classes • Increased student numbers meant marking became impossible and problem classes were big enough to hide in • Students stopped doing the problems • Exam marks declined (average fell from 55% to 45%) Solution: • Course requirement to complete 50 problem sheets • Peer assessed at six lecture sessions, but the marks do not count • Exams and teaching unchanged Outcome: • Exam marks increased (average rose from 45% to 80%)

  6. Peer marking using model answers (2) Pharmacology practicals at Leeds [chart comparing marks for the 91-92 tutor-assessed cohort with the 92-93 self-assessed cohort (N=75)] Hughes, I.E. (1995) "Peer Assessment", Capability 1 (3)

  7. Peer feedback (Rust, 2001) Scenario: • Geography students wrote two essays but showed no apparent improvement from the first to the second, despite a lot of tutor time spent writing feedback • Increased student numbers made the tutor workload impossible Solution: • Only one essay, but a first draft is required part way through the course • Students read and give each other feedback on their draft essays • Students rewrite the essay in the light of the feedback • In addition to the final draft, students also submit a summary of how the 2nd draft has been altered from the 1st in the light of the feedback Outcome: • Much better essays

  8. Mechanise assessment • Statement banks • Assignment attachment sheets • Optical mark reader • Computer-aided assessment

  9. Mechanise assessment - 1 Statement Banks Write out frequently used feedback comments, for example: • I like this sentence/section because it is clear and concise • I found this paragraph/section/essay well organised and easy to follow • I am afraid I am lost. This paragraph/section is unclear and leaves me confused as to what you mean • I would understand and be more convinced if you gave an example/quote/statistic to support this • It would really help if you presented this data in a table • This is an important point and you make it well • etc.
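
A statement bank lends itself to simple automation: the comments are written once and assembled per student. Below is a minimal Python sketch of that idea; the shorthand codes, the compose_feedback helper, and the example usage are illustrative assumptions rather than anything specified in the slides.

```python
# Minimal sketch of a statement bank: frequently used feedback comments are
# stored once under shorthand codes and assembled into a note per student.
# The codes and function names are illustrative assumptions.

STATEMENT_BANK = {
    "clear": "I like this sentence/section because it is clear and concise.",
    "organised": "I found this paragraph/section/essay well organised and easy to follow.",
    "unclear": ("I am afraid I am lost. This paragraph/section is unclear and "
                "leaves me confused as to what you mean."),
    "evidence": ("I would understand and be more convinced if you gave an "
                 "example/quote/statistic to support this."),
    "table": "It would really help if you presented this data in a table.",
    "well_made": "This is an important point and you make it well.",
}

def compose_feedback(student: str, codes: list[str]) -> str:
    """Assemble a feedback note from the marker's shorthand codes."""
    comments = [STATEMENT_BANK[code] for code in codes]
    return f"Feedback for {student}:\n- " + "\n- ".join(comments)

if __name__ == "__main__":
    # The marker only records the codes; the full comments are generated.
    print(compose_feedback("A. Student", ["organised", "evidence", "table"]))
```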

  10. Weekly CAA testing - case study data (Brown, Rust & Gibbs, 1994)

  11. CAA quizzes (Catley, 2005) Scenario: • First-term, first-year compulsory law module • A new subject for most (75%) students • High failure rate (25%), poor general results (28% 3rd class, 7% 1st) Solution: • Weekly optional WebCT quizzes (50% take-up) Outcome: • Quiz takers: 4% fail, 14% 3rd class, 24% 1st • Non-quiz takers: same pattern as before • Overall: 14% fail (approx. half the previous figure), 21% 3rd class, 14% 1st (double the previous figure)
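
The weekly quizzes in this case study ran in WebCT, but the underlying mechanism is simply the automatic marking of objective questions against an answer key. The sketch below illustrates that mechanism in Python; the questions, answer key, and pass mark are invented for the example and are not taken from the module described.

```python
# Illustrative sketch of computer-aided assessment: objective answers are
# scored automatically against a key. The question IDs, answers, and pass
# mark are all invented for this example.

ANSWER_KEY = {"Q1": "b", "Q2": "d", "Q3": "a", "Q4": "c"}

def mark_quiz(responses: dict[str, str], pass_mark: float = 0.4) -> dict:
    """Score one student's responses and report whether they reached the pass mark."""
    correct = sum(1 for q, key in ANSWER_KEY.items() if responses.get(q) == key)
    score = correct / len(ANSWER_KEY)
    return {"correct": correct, "score": round(score, 2), "passed": score >= pass_mark}

if __name__ == "__main__":
    print(mark_quiz({"Q1": "b", "Q2": "a", "Q3": "a", "Q4": "c"}))
    # -> {'correct': 3, 'score': 0.75, 'passed': True}
```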

  12. Assessing a selection (Rust, 2001) Scenario: • Weekly lab reports submitted for marking • Increased student numbers meant a heavy staff workload and an increasingly lengthy gap before reports were returned, so the feedback was of limited or no use Solution: • Weekly lab reports still submitted • A sample is looked at, and generic feedback is e-mailed to all students within 48 hours • At the end of the semester, only three weeks’ lab reports are selected for summative marking Outcome: • Better lab reports and significantly less marking
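
Because only a sample of each week's reports is read for feedback, and only a few weeks are marked summatively, the selection step itself is easy to automate. Here is a minimal Python sketch of that kind of sampling; the semester length, sample size, and report names are assumptions made purely for illustration.

```python
# Illustrative sketch of "assessing a selection": read a random handful of
# reports each week before sending generic feedback, and pick three weeks at
# the end of the semester for summative marking. Semester length, sample
# size, and report names are assumptions.

import random

WEEKS = list(range(1, 13))  # assume a 12-week semester

def weekly_feedback_sample(submissions: list[str], n: int = 5) -> list[str]:
    """Pick a handful of reports to read before e-mailing generic feedback to all."""
    return random.sample(submissions, min(n, len(submissions)))

def summative_weeks(k: int = 3) -> list[int]:
    """Choose which weeks' reports will actually be marked for credit."""
    return sorted(random.sample(WEEKS, k))

if __name__ == "__main__":
    random.seed(0)  # reproducible demo only
    reports = [f"report_{i:02d}" for i in range(1, 41)]
    print(weekly_feedback_sample(reports))
    print(summative_weeks())
```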
