Self and Peer Assessment Through DUO
Hannah Whaley, University of Dundee
Background
• University of Dundee
  • Growing use of many forms of self and peer assessment
  • Paper based systems and ad hoc online approaches
  • Re-developed a system to integrate with Bb
• New system
  • User centered design
  • Involved academic staff from a range of subjects
  • Partnered with Blackboard
  • Fully integrated with Blackboard
  • Released in v8.0 so available now
1. Understanding Self and Peer Assessment
2. Deciding Where and When
3. Designing Assessments
4. Running Assessments
1. Understanding Self and Peer Assessment
• Important to fully understand the concept
• Confusion over terminology
• Focus on the real use of the pedagogy
• Only then can you realise the full potential for learning
[Word cloud: criterion based reference marking, peer marking, peer review, self assessment, peer reflection, marking rubrics, critical analysis, groupwork assessment]
1. Understanding Self and Peer Assessment
[Diagram: self and peer assessment characterised as individual work (not group), with fixed marking criteria, built on reflection, analysis and evaluation]
1. Understanding Self and Peer Assessment
• Process
  • Academic designs the assessment
    • Includes questions and marking criteria
    • Creates the assessment in Blackboard
  • Student completes the assessment
    • Could be one or more questions
    • Submits answers in Blackboard
  • Student marks the assessment
    • Returns to the assessment in Blackboard
    • Is given a list of students to mark
  • Academic moderates results
    • Monitors submission and marking phases
    • Moderates results before releasing them to students
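The four-phase process above (design, submit, peer-mark, moderate) can be sketched in code, purely as an illustration of the workflow. All class and method names here are hypothetical; this is not Blackboard's actual API.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the four-phase self/peer assessment workflow.
# Names are hypothetical, not Blackboard's API.

@dataclass
class Assessment:
    questions: list
    criteria: list
    submissions: dict = field(default_factory=dict)  # student -> answers
    marks: dict = field(default_factory=dict)        # (marker, student) -> mark
    released: bool = False

    def submit(self, student, answers):
        # Phase 2: student submits answers
        self.submissions[student] = answers

    def peer_mark(self, marker, student, mark):
        # Phase 3: a peer marks a submitted answer against the criteria
        if student in self.submissions:
            self.marks[(marker, student)] = mark

    def moderate_and_release(self):
        # Phase 4: academic reviews marks before students see them
        self.released = True

# Phase 1: academic designs the assessment
a = Assessment(questions=["Q1"], criteria=["Accuracy"])
a.submit("alice", ["answer text"])
a.peer_mark("bob", "alice", 7)
a.moderate_and_release()
```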
1. Understanding Self and Peer Assessment
• Challenging process for both staff and students
• Students: reflection, critical, constructive, engage
• Academics: be creative, be precise, let go!, moderate, sys admin
• Understand, support and get excited
1. Understanding Self and Peer Assessment
2. Deciding Where and When
3. Designing Assessments
4. Running Assessments
2. Deciding Where and When
• Formative or summative?
  • Formative works particularly well
  • Summative should include an offer of moderation
• Replace an old assessment or add a new one?
  • Updating old assessments works well
  • Chance to add innovative new practice
• What's the purpose of the assessment?
  • Add interaction, reduce marking load, extra practice, new skills
  • Focus on purpose in assessment design
• How long should it run for?
  • 2 weeks is standard: use the defaults that are given
  • 1 week answering, 1 week evaluating
• Supervise it in IT suites or not?
  • Generally, can be completed entirely online
  • Makes good use of a practical session
2. Deciding Where and When
[Diagram: assessment approaches placed on an interactivity spectrum from LOW to HIGH]
2. Deciding Where and When

Self and Peer                        Traditional
Creates question                     Creates question
Prepares answer                      Prepares answer
Creates criteria                     Creates criteria
Marking answers, writing feedback    Marking answers, writing feedback
Moderation                           Reviews feedback
Reviews feedback                     Moderation
Formal marks                         Formal marks
Exercise review                      Exercise review
1. Understanding Self and Peer Assessment
2. Deciding Where and When
3. Designing Assessments
4. Running Assessments
3. Designing Assessments
• Focus of assessment
• Learning objectives (primary and secondary)
• Discipline specific context
• Flexibility within the tool for design
3. Designing Assessments
• Essay style and exam style assessments are catered for
• Submission options include text, HTML and links
• Anonymous or not; change the number of peers to mark
[Diagram: an assessment contains one or more questions, each with one or more marking criteria]
3. Designing Assessments: Example 1
• Subject: Life Sciences
• Motive: Reduce marking time
• Extra learning: Practice at exam questions
• Old or new: Created from old tutorials
• Design: Exam style; 30 questions, 1 criterion each; very specific criteria; model answers
• Marking: Subject specific; text answers; file uploads; marking 2 peers and self; majority no feedback
• Outcome: Only 1 exercise used per year previously; now 4 hours moderating 4 exercises; students gain lots of practice; common mistakes, marking scales, model answers
3. Designing Assessments: Example 2
• Subject: Geography
• Motive: Innovative practical lab
• Extra learning: Understand their answers
• Old or new: New idea
• Design: Bit of both; 2 questions; subjective and specific criteria; granular and expansive marks
• Marking: Deep learning; text answers; subjective and specific criteria; marking 3 peers and self
• Outcome: Innovative way to introduce students to academic reading; promotes deep learning (synthesis and evaluation); students must give opinions and justify them; makes use of the system's flexibility, combining two approaches
3. Designing Assessments: Example 3
• Subject: Law
• Motive: Improve assessment
• Extra learning: Give better feedback
• Old or new: Added online component
• Design: Blended style; file upload; open criteria, with guidance; small workload
• Marking: Quick answers; 3 markers; self reflection; emphasis on constructive feedback
• Outcome: Blends the online component with existing teaching practices; enhances the face-to-face section and formalises feedback; students get better feedback, from a wider range of markers; understand better and worse presentations clearly
1. Understanding Self and Peer Assessment
2. Deciding Where and When
3. Designing Assessments
4. Running Assessments
4. Running Assessments
• Flexibility built into the system
  • Timing of assessments
  • Workload
  • Publishing results
  • Motivation
  • Moderation
4. Running Assessments
• Motivation
  • Student understanding of the process
  • Importance for their learning
  • Assignment only complete after both parts
  • Marks can be withheld
  • Actively encourage non-completers
    • Email and remind them
  • Sometimes…
    • Marks for marking
    • Deviation from average or tutor mark
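The "marks for marking" idea above can be sketched as a small calculation: a peer marker earns an accuracy score that shrinks with their deviation from the average mark an answer received. The function names and scoring formula here are hypothetical illustrations, not features of the Blackboard tool.

```python
# Hypothetical sketch of "marks for marking": reward peer markers whose
# grades sit close to the average mark each answer received. The formula
# and names are illustrative only, not from Blackboard.

def average(marks):
    return sum(marks) / len(marks)

def marking_accuracy(marker_grade, all_grades, scale=10):
    """Score out of `scale`, reduced by deviation from the peer average."""
    deviation = abs(marker_grade - average(all_grades))
    return max(0.0, scale - deviation)

# Three peers marked the same answer out of 10; the average is 7.
grades = [6, 7, 8]
print(marking_accuracy(7, grades))  # marker on the average scores 10.0
print(marking_accuracy(4, grades))  # marker 3 below average scores 7.0
```

A tutor mark could be substituted for the peer average in the same formula, matching the "deviation from average or tutor mark" option above.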
4. Running Assessments
• Moderation
  • Moderator can override any average grade
  • 3 key phases, each with moderator overview
    • Submission
    • Marking
    • Results
4. Running Assessments
• Moderating submissions
  • Encourage
  • Check submissions for problems
  • Download submissions
• Moderating evaluations
  • Encourage
  • Check for problems
  • Download evaluation results
• Moderating results
  • Check for problems
  • Finalise and publish to the Grade Center
  • Any grade can be overridden in the Grade Center
  • Add feedback or grading notes
4. Running Assessments
• Moderation styles
  • On request (recommended)
  • Highs and lows
  • Unexpected
  • Random sample (recommended)
• Understand the process
  • It's student marking
  • Can't get a 'correct' grade
  • Accept the average and the learning
Common Mistakes
• Not understanding it
• Poor criteria
• Overloading students
• Obsessive moderating
• Not using the preview
• Changing questions and criteria
• Changing dates back and forth
• Changing enrolments
Benefits
• Re-usable resource: shareable good practice
• Moved from paper based and ad hoc systems
• Promotes really deep learning
  • Comprehension, application, synthesis, evaluation
• Students learn 'soft skills'
  • Giving effective feedback, analysing, criticising
• Students gain learning skills
  • Assessment criteria, marking, answering questions
• Students can place their work
  • See work better and worse than their own, monitor their own learning
Some Ideas…
• First drafts
• Review resources
• Portfolio submission
• Video
• Past papers
• Research…
Conclusions
• Experience gained using the system for 2 years
• Flexible, robust and expandable pedagogy
• Challenge in creating challenging assessments
• Benefit from the experience of moderating
• Not always easy; may not get it right first time
• Inspired, motivated, ideas forming?
Contact
Hannah Whaley, University of Dundee, Scotland
h.whaley@dundee.ac.uk