Writing in Engineering – peer assessment?
Peer and collaborative assessment of written coursework in engineering modules
Julia Shelton, Jens Mueller – School of Engineering and Materials Science
Teresa McConlogue – Language and Learning Unit
Load on Engineering students
• Taught contact hours: 12 – 16 hours / week
• Problem Based Learning: 25 hours / problem
• Experiments + write ups: 40 hours / semester
• Homework: 3 hours / week
• Problem solving through numerical examples: 20 hours / week
What were the issues?
• Content-heavy courses, especially in the second year
• No room in the curriculum for input on writing
• Large classes – difficult to mark writing and give detailed and timely feedback
We wanted students to
• be aware of tutor expectations in coursework
• develop judgement on the quality of their work
• become autonomous learners
What were the strategies?
• Embedded writing tasks with clear task instructions and assessment criteria, to improve students' understanding of tutor expectations
• Peer assessment of these tasks, to develop students' understanding of 'quality' in writing in engineering
Planning considerations
• the importance of practice marking and preparation for students
• the possibility and value of co-constructing assessment criteria
• the use of multiple markers
• anonymity issues
• the need to give guidance on feedback practices
• consideration of procedures for resolving complaints
• evaluation strategy
Engineering Case Studies
• Level 5, Year 2: Medical materials module (70 students)
• Level 4, Year 1: Fluid mechanics module (280 students)
• Level 7, Year 4: Computational fluid dynamics module (20 students)
• Level 7, Year 4: Implant design and technology module (20 students)
Implementation of peer assessment: level 5, 2nd year medical materials module
• Students were required to submit a 5-page technical lab report in a standard structure (provided)
• After report submission, a preparation session was run: students marked 4 rehearsal reports, ranked them, and discussed grades and comments in groups
• In a plenary session, grades were compared and reasons for discrepancies discussed
• Reports were anonymised and allocated to peer assessors (see the illustrative sketch below)
• Marks and comments were submitted rapidly
• Mean grades were determined by the module organiser and comments returned to students within a further week
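The slides do not say how reports were anonymised and allocated, but the step is essentially a rotation over a shuffled submission list. Below is a minimal illustrative sketch in Python; the function name, ID format and rotation scheme are all assumptions for illustration, not the module's actual procedure.

```python
import random

def allocate_reports(student_ids, assessors_per_report=1, seed=0):
    """Anonymise reports and allocate them so that no student marks
    their own work. Returns {anonymous_code: {author, assessors}}."""
    rng = random.Random(seed)
    order = list(student_ids)
    rng.shuffle(order)  # random order decouples report codes from authorship
    n = len(order)
    allocation = {}
    for i, author in enumerate(order):
        code = f"R{i + 1:03d}"  # anonymous report code shown to assessors
        # Rotate forward through the shuffled list: offsets 1..k can
        # never point back at the author (for k < n).
        assessors = [order[(i + k) % n]
                     for k in range(1, assessors_per_report + 1)]
        allocation[code] = {"author": author, "assessors": assessors}
    return allocation

if __name__ == "__main__":
    result = allocate_reports(["s01", "s02", "s03", "s04", "s05"])
    for code, rec in result.items():
        print(code, "->", rec["assessors"])
```

A rotation like this guarantees self-marking cannot happen without needing any checking loop, which is why it is a common choice over pure random assignment.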
Issues for consideration
• Keeping anonymity
• Giving students individual marks or a mean mark
• Assessing the value of markers' feedback
• Checking the accuracy / validity of feedback
• Rewarding student feedback
• Uncertainty about the value of marks from fellow students
Value of peer assessment exercise
• Students formally saw other students' reports
• Formulating feedback provided some learning value
• Students queried validity of feedback
Developments from 2009
• Implemented in a large 1st year, level 4 module
• Web interface for peer assessment, including submission, allocation, marks, comments and feedback (a hypothetical data-model sketch follows)
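The deck does not describe the web interface's internals. As a purely hypothetical sketch of the records such a system would need to track, the features listed above (submission, allocation, marks, comments and feedback) map naturally onto two record types; every name below is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    report_code: str    # anonymous code shown to peer assessors
    author_id: str      # kept hidden from assessors to preserve anonymity
    file_path: str      # the uploaded report

@dataclass
class PeerReview:
    report_code: str    # which report was assessed
    assessor_id: str    # who assessed it
    mark: int           # percentage mark awarded by the peer
    comments: str = ""  # written feedback returned to the author
```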
Implementation of peer assessment: level 7, 4th year implant design and technology module
• Students were set the task of writing a report to a small company CEO describing the background of a particular joint replacement, to inform a decision on its possible development within the company
• Before students prepared and submitted reports, suitable marking criteria were discussed and evolved: co-construction of assessment criteria
• Students submitted reports; rehearsal marking on 3 reports
• In groups, students discussed grades and comments; grades were compared and reasons for discrepancies discussed
• Each report was allocated to 4 peer assessors; marks and comments were submitted rapidly
• Mean grades were determined and comments returned to students within a further week (see the aggregation sketch below)
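With four marks per report, the organiser's final step reduces to averaging and returning comments. A hedged sketch of that aggregation follows; the data layout and the choice of a plain mean (rather than, say, a median, which would damp one careless mark) are assumptions, not the module's documented procedure.

```python
from statistics import mean

# Four peer marks per anonymous report code (invented example data)
peer_marks = {
    "R001": [62, 68, 65, 71],
    "R002": [55, 58, 74, 57],
}

def aggregate(marks_by_report):
    """Mean peer mark per report, rounded to the nearest integer."""
    return {code: round(mean(marks))
            for code, marks in marks_by_report.items()}

print(aggregate(peer_marks))  # {'R001': 66, 'R002': 61}
```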
Features
• Introduced co-construction of assessment criteria
• Students read and marked material they had not researched themselves
• Used the web interface for peer assessment, submission and comments
Student evaluation 2012
• Written open-ended evaluation at the end of the module, with three prompts:
• Judging the reports
• Your writing
• Overall comments
Value in undertaking the peer assessment exercise
• Learnt from unfamiliar topics: 'learnt on other topics without having to research...good as an overview'
• Saw examples of different styles and reports
• Saw clearly what not to do: 'we learnt alot from others, their writing style, things to improve and things to avoid when writing'
• Understood value of report structure: 'I will link my paragraphs more, add subheadings to aid understanding, add images..'
Issues students reported
• Difficult to assess unfamiliar topics: 'learning should be given about all the topics before conducting the review'
• Poor quality feedback: 'should make sure everyone puts the same effort in and those who don't should be penalised'
• Time consuming
• Difficult to grade – wanted more guidance on values to award: 'more structured in terms of specifying what makes a report very good (A) or how do you grade a report and give it B, C, D'
• Lacked confidence in final mark
How students would change their writing from this exercise
• Change emphasis on elements
• Answer the topic more precisely
• Change their use of language
• Add more tables and figures and describe them more fully
• Improve the structure of the report
• Focus on ensuring spelling and grammar are correct
Recognition
'the marking criteria given doesn't necessarily mean it is exhaustive and contains all factors of what makes an excellent/ideal report... Feedback was based on the marking criteria which was open to interpretation. This doesn't necessarily mean it is not valid or makes peer assessment difficult, but that it is a subjective process'
Future implementations
• Develop peer assessment in each year group
• Utilise ranking more widely for higher-level learning groups
• Embed the web-based system fully
Conclusions
• Peer assessment in SEMS is a useful tool
• Students start to judge quality when asked to give feedback – they may not have the skills to award marks
• Peer assessment has several complementary functions
• Reliability of marks is not the most important parameter
• Students learn from experience