An experience of peer evaluation in a b-learning environment EDMEDIA 2014, 23 to 26 June 2014, Tampere, Finland
Who participates in Kelluwen? Teachers and students open to innovation processes in their classes. Since 2010 we have worked with a community spanning 3 regions (Los Ríos, Los Lagos, and Aysén), 17 cities, 57 schools, 93 teachers and 4,517 students, and 160 classes from Valdivia to Aysén
What opportunity does Kelluwen address? Students are motivated by Social Web tools. Schools have received a large investment in technological infrastructure, but it is underused. Students show low development of socio-communicative skills
How do we contribute to the school context? Motivating the creation of a learning community. Coordinating school networks by means of a didactic proposal (Didactic Design) and a supporting Web platform
The peer review module in the Kelluwen platform • The Didactic Designs (DDs) have students working in teams within each classroom • For some activities within the DDs, we encourage teams from different schools and different geographical locations to carry out peer review activities • Coordination between different schools is challenging; to support this process we implemented a peer review module called the Works Tool
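The cross-school pairing that the Works Tool coordinates can be sketched as a simple matching routine. This is a hypothetical illustration, not the actual Works Tool algorithm: the function name `pair_teams` and the `(team_id, school_id)` representation are assumptions made for the example.

```python
import random

def pair_teams(teams):
    """Pair teams for peer review, preferring partners from different schools.

    `teams` is a list of (team_id, school_id) tuples. Hypothetical sketch
    of a cross-school pairing strategy; not the actual Works Tool code.
    """
    pool = list(teams)
    random.shuffle(pool)
    pairs = []
    while len(pool) >= 2:
        a = pool.pop()
        # Prefer a partner from a different school; if none remains,
        # fall back to a same-school partner (a "mixed" pairing).
        partner_idx = next(
            (i for i, b in enumerate(pool) if b[1] != a[1]),
            0,
        )
        pairs.append((a, pool.pop(partner_idx)))
    return pairs
```

With an even number of teams from two schools, every pair ends up cross-school; when the counts are unbalanced, some same-school pairs appear, which is the "mixed" pairing situation analyzed later in the study.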
The peer review module in the Kelluwen platform: teacher's view of reviews
An example of peer review Image showing the evaluation rubric
Qualitative study of the peer review module Sample of the survey questions for students (top) and teachers (bottom).
Classroom and data Didactic Design 21: Literary and non-literary texts on YouTube
Classroom and data Didactic Design 88: Building a slideshow of the Twentieth Century
Analysis and results Correlations between two reviewers
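The correlation between two reviewers of the same works can be computed with a rank correlation such as Spearman's rho. The scores below are illustrative placeholders, not the study's data; the slides do not specify which correlation coefficient was used, so Spearman is an assumption suited to ordinal rubric scores.

```python
from scipy.stats import spearmanr

# Hypothetical paired scores from two reviewers of the same works
# (ordinal 1-7 scale); illustrative only, not the study's data.
reviewer_1 = [4, 5, 6, 3, 7, 5, 6, 4]
reviewer_2 = [5, 5, 6, 4, 6, 4, 7, 3]

# Spearman's rho: rank correlation, appropriate for ordinal scores.
rho, p = spearmanr(reviewer_1, reviewer_2)
print(f"Spearman rho = {rho:.3f} (p = {p:.3f})")
```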
Analysis and results p-values of the Wilcoxon test to contrast location parameters of the score distributions H0: L_s = L_t vs H1: L_s > L_t
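A one-sided Wilcoxon rank-sum test on two independent score samples can be run as below (`scipy` exposes it as `mannwhitneyu`, an equivalent formulation). The samples here are invented for illustration; L_s and L_t stand for the location parameters of the two score distributions, as in the hypothesis above.

```python
from scipy.stats import mannwhitneyu

# Hypothetical score samples on a 1-7 scale; NOT the study's actual data.
same_class = [6, 7, 6, 5, 7, 6, 6, 5]   # reviews within the same class (L_s)
twin_class = [5, 5, 4, 6, 5, 4, 5, 4]   # reviews from a twin class (L_t)

# One-sided Wilcoxon rank-sum test: H0: L_s = L_t  vs  H1: L_s > L_t
stat, p_value = mannwhitneyu(same_class, twin_class, alternative="greater")
print(f"U = {stat}, p = {p_value:.4f}")
```

A small p-value would reject H0 in favor of same-class scores being systematically higher, which is the direction the study reports.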
Analyzing quality of comments • Two independent judges evaluated the comments with respect to three criteria: assessment, feedback, and form, with a four-level rubric • Weighted Kappa statistics to measure agreement between judges
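Weighted Kappa for two judges labeling with an ordered four-level rubric can be computed with `sklearn`. The labels below are illustrative, and the quadratic weighting scheme is an assumption (the slides do not state which weights were used); quadratic weights penalize distant disagreements more than adjacent ones.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels from two judges on a four-level rubric (0-3);
# illustrative only, not the study's data.
judge_a = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
judge_b = [3, 2, 1, 1, 0, 2, 2, 1, 0, 2]

# Weighted Cohen's kappa; "quadratic" weights suit ordered rubric levels.
kappa = cohen_kappa_score(judge_a, judge_b, weights="quadratic")
print(f"weighted kappa = {kappa:.3f}")
```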
Analyzing quality of comments • Correlations between reviewers’ scores and judges’ labels
Analyzing comments: same class vs. twin classes p-values of the Wilcoxon test to contrast location parameters of the label distributions
Conclusions • We developed a peer review module in our Web platform that • supports the pairing process • sets up a non-anonymous environment where • students can review, send free-text comments, and discuss among themselves • teachers can monitor the work • We analyzed potential differences between peer reviews when they are done by students: • in the same class group, or • from class groups of different schools. • We analyzed several class groups doing review activities in which the pairing turned out to be mixed.
Conclusions We found significant differences between reviewers from the same class versus those from different classes, both in the review scores and in the quality of the feedback. When reviewers are from the same class, they tend to give higher scores to their peers and tend to write more accurate assessment comments. This seems to confirm the idea that peer reviews gain quality when reviewers know the reviewed students, in the sense that non-anonymity makes reviewers more aware of what they are reviewing.