Peer-Review/Assessment Aid to Learning & Assessment


Presentation Transcript


  1. Peer-Review/Assessment Aid to Learning & Assessment Phil Davies Division of Computing & Mathematical Sciences Department of Computing FAT University of Glamorgan

  2. Defining Peer-Assessment • In describing the teacher .. A tall b******, so he was. A tall thin, mean b******, with a baldy head like a light bulb. He’d make us mark each other’s work, then for every wrong mark we got, we’d get a thump. That way – he paused – ‘we were implicated in each other’s pain’ McCarthy’s Bar (Pete McCarthy, 2000, page 68)

  3. AUTOMATICALLY CREATE A MARK THAT REFLECTS THE QUALITY OF AN ESSAY/PRODUCT VIA PEER MARKING, AND ALSO A MARK THAT REFLECTS THE QUALITY OF THE PEER MARKING PROCESS i.e. A FAIR/REFLECTIVE MARK FOR MARKING AND COMMENTING

  4. Below are comments given to students. Place the top FOUR in order of importance to YOU • I think you’ve missed out a big area of the research • You’ve included a ‘big chunk’ that you haven’t cited • There aren’t any examples given to help me understand • Grammatically it is not what it should be like • Your spelling is atrocious • You haven’t explained your acronyms to me • You’ve directly copied my notes as your answer to the question • 50% of what you’ve said isn’t about the question • Your answer is not aimed at the correct level of audience • All the points you make in the essay lack any references for support

  5. Order of Answers • Were the results all in the ‘CORRECT’ order? Probably not • Why not? • Subject specific? • Level specific – school, FE, HE • Teacher/Lecturer specific? • Peer-Assessment is no different – Objectivity through Subjectivity

  6. Typical Assignment Process • Students register to use the system (CAP) • Create an essay in an area associated with the module • Provide RTF template of headings • Submit via Blackboard Digital Drop-Box • Anonymous code given to essay automatically by system • Create comments database / categories
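The CAP workflow above is not shown in implementation detail; purely as an illustration of the anonymisation step, here is a minimal Python sketch (the function and variable names are invented, not part of CAP):

```python
import secrets

def assign_anonymous_code(student_id: str, register: dict) -> str:
    """Attach an anonymous code to a submitted essay so that peer markers
    never see the author's identity; only the tutor-side register maps the
    code back to the student."""
    code = secrets.token_hex(4).upper()   # e.g. '9F2C1A7B'
    register[code] = student_id           # kept on the tutor's side only
    return code

register: dict = {}
print(assign_anonymous_code("student_01", register))
```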

  7. Each student uses a different set of weighted comments • Comments databases sent to tutor

  8. First Stage => Self-Assess own Work • Second Stage (button on server) => Peer-Assess 6 Essays

  9. Self/Peer Assessment • A Self-Assessment stage is often used • Set Personal Criteria • Opportunity to identify errors • Get used to the system • Normally peer-mark about 5 or 6 essays • Raw peer MEDIAN mark produced • Need for student to receive Comments + Marks • Need for a communication element?
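Since the raw peer mark is the MEDIAN of the five or six marks an essay receives, it can be computed directly; a minimal sketch (the names are mine, not CAP's):

```python
from statistics import median

def raw_peer_mark(peer_marks: list) -> float:
    """Raw peer mark for one essay: the median of the marks awarded by its
    peer markers (typically 5 or 6 markers per essay)."""
    return median(peer_marks)

# Six peer marks for one anonymised essay
print(raw_peer_mark([58, 62, 60, 71, 55, 64]))   # -> 61.0
```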

  10. AUTOMATICALLY EMAIL THE MARKER .. ANONYMOUSLY

  11. The communications element • Requires the owner of the file to ‘ask’ questions of the marker • Emphasis ‘should’ be on the marker • Marker does NOT see comments of other markers who’ve marked the essays that they have marked • Marker does not really get to reflect on their own marking, i.e. does not get a reflective 2nd chance • I’ve avoided this in the past -> get it right first time

  12. Feedback Index • Produce an index that reflects the quality of commenting • Produce a Weighted Feedback Index • Compare how a marker has performed against these averages • Judge quality of marking and commenting i.e. provide a mark for marking AUTOMATICALLY
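The slide does not give the formula for the (weighted) feedback index, so the sketch below is only one plausible reading: score each marking by the weights of the menu comments selected, then measure how far a marker sits from the cohort average for the same essays. All names and the exact calculation are assumptions, not the published method.

```python
from statistics import mean

def weighted_comment_score(selected_comments: list, weights: dict) -> float:
    """Comment-derived score for one marking: the sum of the weights of the
    menu comments the marker selected (weights come from the marker's own
    weighted comments database). ASSUMPTION: not necessarily CAP's formula."""
    return sum(weights[c] for c in selected_comments)

def feedback_index(marker_scores: list, cohort_avg_scores: list) -> float:
    """Weighted feedback index for a marker: mean absolute deviation of their
    comment-derived scores from the cohort average for the same essays
    (lower = commenting more in line with the cohort)."""
    return mean(abs(m - a) for m, a in zip(marker_scores, cohort_avg_scores))
```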

  13. Compensation: High and Low Markers • Need to take this into account • Each essay has a ‘raw’ peer generated mark - MEDIAN • Look at each student’s marking and ascertain if ‘on average’ they are an under or over marker • Offset mark given by this value • Create a COMPENSATED PEER MARK
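One way to read the compensation step, assuming "on average an under or over marker" means the mean signed difference between a student's awarded marks and the raw (median) peer marks of the essays they marked; the code is an illustrative sketch, not CAP's implementation:

```python
from statistics import mean

def marker_offset(awarded: list, raw_peer_marks: list) -> float:
    """Average amount by which this marker over- (+) or under- (-) marks,
    relative to the raw peer (median) mark of each essay they marked."""
    return mean(a - r for a, r in zip(awarded, raw_peer_marks))

def compensated_mark(awarded: float, offset: float) -> float:
    """Offset a single awarded mark by the marker's average tendency,
    giving a compensated peer mark."""
    return awarded - offset

# A marker awarded 70, 65, 75 to essays whose raw peer marks were 60, 58, 72
offset = marker_offset([70, 65, 75], [60, 58, 72])   # +6.67: an over-marker
print(compensated_mark(75, offset))                  # their 75 counts as ~68.3
```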

  14. How to work out Mark (& Comment) Consistency • Marker on average OVER marks by 10% • Essay worth 60% • Marker gave it 75% • Marker is 15% over • Actual consistency index (Difference) = 5 • This is done for all marks and comments • Creates a consistency factor for marking and commenting
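The worked example on this slide reads as the absolute difference between a marking's offset and the marker's average offset; written out below (my reading of the calculation, not code from CAP):

```python
def consistency_difference(awarded: float, essay_mark: float,
                           average_offset: float) -> float:
    """Consistency index for one marking: how far its offset deviates from
    the marker's usual over/under-marking tendency."""
    this_offset = awarded - essay_mark          # 75 - 60 = +15
    return abs(this_offset - average_offset)    # |15 - 10| = 5

# The slide's example: a 10% over-marker gives 75% to an essay worth 60%
print(consistency_difference(75, 60, 10))       # -> 5
```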

  15. Marks to Comments Correlation • Jennifer Robinson – a third of comments not useful • Liu – Holistic comments not specific • Davies – Really good correlation between marks and comments received

  16. Automatically Generate Mark for Marking • Linear scale 0-100 mapped directly to consistency … the way in HE? • Map to Essay Grade Scale achieved (better reflecting ability of group)? • Expectation of Normalised Results within a particular cohort / subject / institution?
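The slide lists mapping options rather than a formula; the two toy functions below only illustrate those two ideas (the cut-off value and the reuse of the next slide's ranges are my own assumptions, not published choices):

```python
def mark_for_marking_linear(consistency: float, worst: float = 20.0) -> float:
    """Option 1: map consistency straight onto a 0-100 scale, so perfect
    consistency (0) scores 100 and anything at or beyond `worst` scores 0.
    The `worst` cut-off is an illustrative choice."""
    return max(0.0, 100.0 * (1 - consistency / worst))

def mark_for_marking_grade_scale(consistency: float, best: float, worst: float,
                                 grade_lo: float, grade_hi: float) -> float:
    """Option 2: map the cohort's consistency range onto the essay grade range
    actually achieved by the group, so marks for marking sit on the same
    scale as marks for essays."""
    frac = (worst - consistency) / (worst - best)   # 1.0 = most consistent
    return grade_lo + frac * (grade_hi - grade_lo)

# Illustration only, reusing the ranges quoted on the next slide
print(mark_for_marking_grade_scale(5.37, best=2.12, worst=10.77,
                                   grade_lo=31, grade_hi=79))   # ~61
```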

  17. Current ‘Simple’ Method • Average Marks • Essay Mark = 57% • Marking Consistency = 5.37 • Ranges • Essay 79% <-> 31% • Marking Consistency 2.12 <-> 10.77 • Range Above Avge 22% <-> 3.25 (6.76=1) • Range Below Avge 26% <-> 5.40 (4.81=1)

  18. Innovation Grant Proposal • Put the emphasis on the marker to get it right • Get the opportunity to ‘reflect’ on COMMENTS before they go back to the essay owner • 2nd chance – not sure if I want the results to have a major effect – hope they get it right the 1st time – consistency • Is there a need to have discussion between markers at this stage? – NO as it is dynamic • Will the review stage remove the need for compensation?

  19. Used on Final Year Degree + MSc: DEGREE (DCS) • 36 students on module • 192 markings • 25 ‘replaced’ markings out of 192 (13%) • Average time per peer marking = 37 minutes • Range of time taken to do markings 6-116 minutes • Average number of menu comments/marking = 9.8 • Raw average mark for essays = 61% • Out of the 25 markings ‘replaced’ (1 student replaced a marking twice) only 6 marks changed 6/192 (3%) • Number of students who did replacements = 11 (out of 36) • 1 student ‘Replaced’ ALL his/her markings • 6 markings actually changed mark +7, -4, -9, +3, -6, +6 (Avge = -0.5)
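The quoted average of the six changed marks can be checked directly:

```python
from statistics import mean

changes = [+7, -4, -9, +3, -6, +6]   # the six mark changes on this slide
print(mean(changes))                 # -0.5, the quoted average
```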

  20. Used on Final Year Degree + MSc: MSc EL&A • 13 students • 76 markings • 41 replaced markings (54%) • Average time per marking = 42 minutes • Range of time taken to do markings 3-72 minutes • Average number of menu comments/marking = 15.7 • Raw average mark = 61% • Out of 41 markings ‘replaced’ -> 26 changed mark 26/76 (34%) • Number of students who did replacements = 8 (out of 13) • 2 students ‘Replaced’ ALL their markings • 26 markings actually changed mark • -1, +9, -2, -2, +1, -8, -3, -5, +2, +8, -2, +6, +18 (71-89), -1, -4, -6, -5, -7, +7, -6, -3, +6, -7, -7, -2, -5 (Avge -0.2)

  21. Current Conclusions • The results of the mapping of the compensated peer-marks to the average feedback indexes are very positive. Although the weighted development of the average feedback index only produces a slight improvement to an already very positive correlation, it addresses a concern that the comments derived from the menu-driven system were not totally subjective. • The main concern with this method of automatically developing a mark for marking & commenting is the mapping of the consistency factors to an absolute grade. It should be kept in mind how difficult it currently is to explain to a student why they have been awarded 69% while their colleague has 71% within a traditional assessment. • Review Stage -> Tangible or Non-Tangible -> MARKS OR REFLECTION

  22. Some Points Outstanding or Outstanding Points • What should students do if they identify plagiarism? • What about accessibility? • Is a computerised solution valid for all? • At what age / level can we trust the use of peer assessment? • How do we assess the time required to perform the marking task? • What split of the marks between creation & marking?

  23. Contact Information • Phil Davies • pdavies@glam.ac.uk • J316 x2247 • University of Glamorgan
