Innovation in Assessment and Evaluation Prof. dr. Martin Valcke http://allserv.ugent.be/~mvalcke/CV/CVMVA.htm Ghent University Maputo July 2011
Structure • Advance organizer • Traditional approaches • Trends in assessment & evaluation • Assumptions in relation to these trends • Research • Practices • Conclusions
Advance organizer
Critical issues • Validity of the evaluation approach in view of the assessment of skills and complex knowledge • Fant et al. (1985): rating scales, daily logs, anecdotal records, behavior coding, and self-assessment for evaluating student teachers • Oral examinations, portfolio assessment, central assessment centres, 360° assessment • …
Recent developments • [Diagram: the assessment system linking the individual learner, the group learner, teachers, the expert teacher, the external institution, and the institutional level]
Recent developments • Stronger focus on the “consequential validity” of measurement (Gielen, Dochy & Dierick, 2003) • Stronger emphasis on the feedback value of assessment • What is the “learning potential” of the assessment approach?
Recent developments • Stiggins (1987): performance assessment • Performance assessment is expected to be better geared to assessing complex behavior in medical, legal, engineering, … and educational contexts (Sluijsmans et al., 2004).
Concrete examples • Self- and peer assessment • Rubric-based assessment
Self- and peer assessment • Learn about your own learning process. • Schmitz (1994): “assessment-as-learning”. • ~ self-corrective feedback
See the experiential learning cycle of Kolb. • Boekaerts (1991): self-evaluation as a competency. • Development of metacognitive knowledge and skills (see Brown, Bull & Pendlebury, 1998, p. 181). • Freeman & Lewis (1998, pp. 56-59): developing pro-active learners
Is it possible? Group evaluations tend to fluctuate around the mean
Learning to evaluate • Develop checklists • Give criteria • Ask learners to look for quality indicators • Analysis of examples of good and less good practices: develop a quality “nose”
Learning to evaluate • Freeman & Lewis (1998, p. 127): • Learner develops a list of criteria. • Pairs of learners compare the listed criteria. • Pairs develop a criterion checklist. • Individual application of the checklist. • Use of the checklist to evaluate the work of another learner. • Individual reworks his/her work. • Final result checked by the teacher and the result compared to the learner's evaluation. • Pairs recheck their work on the basis of the teacher's feedback.
Learning to evaluate • Peer evaluation is not the same as peer grading • Final score is given by the teacher • Part of the score could build on the accuracy of self/peer evaluation and self-correction
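The split above can be sketched in code. A minimal sketch, assuming a 20-point scale and a 20% accuracy weight (both values are invented here, not taken from the presentation): the teacher's mark dominates, and a small component rewards a peer whose evaluation came close to the teacher's judgment.

```python
# Illustrative only: the scale, weight, and function name are assumptions,
# not part of the presented approach.

def combined_score(teacher_score, peer_score, accuracy_weight=0.2, max_score=20):
    """Teacher score dominates; a small component rewards peer-evaluation
    accuracy, i.e. how close the peer's score came to the teacher's."""
    # 1.0 when peer and teacher agree exactly, 0.0 at the maximal gap.
    accuracy = 1.0 - abs(teacher_score - peer_score) / max_score
    return round((1 - accuracy_weight) * teacher_score
                 + accuracy_weight * max_score * accuracy, 1)
```

A peer who matched the teacher exactly earns the full accuracy component; an evaluation far off the teacher's earns little of it.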
Rubrics • Rubrics focus on the relationship between competencies, criteria, and indicators and are organized along mastery levels (Morgan, 1999).
http://web.njit.edu/~ronkowit/teaching/rubrics/samples/rubric_apa_research.pdf
Rubrics • Rubric: a scoring tool for a qualitative assessment of the quality level of an authentic or complex activity • A rubric builds on criteria, enriched with a scale to indicate a mastery level. • For each level, standards are indicated that reflect that level. • A rubric tells both teacher and student what is concretely expected. • Rubrics are used for “high-stakes assessment” and for “formative assessment” (Arter & McTighe, 2001; Busching, 1998; Perlman, 2003). • Rubrics focus on the relationship between competencies, criteria, and indicators and are organized along mastery levels (Morgan, 1999).
Rubrics: indicator-based assessment • Assessment objective • Criteria • Enriched with indicators in terms of observable behavior • Limited number of indicators
Critical thinking rubric http://academic.pgcc.edu/~wpeirce/MCCCTR/Designingrubricsassessingthinking.html
Assumptions about rubrics • Greater consistency in scores (reliability). • More valid assessment of complex behavior. • Positive impact on the related learning process.
Critical issues • Adoption of this assessment approach is marred by teacher beliefs about the nature of evaluation (see e.g., Chong, Wong, & Lang, 2004) • Also by student beliefs (Joram & Gabriele, 1998) • Validity of the criteria and indicators (Linn, 1990) • Reliability of performance evaluation, e.g., when multiple evaluators assess and score performance (Flowers & Hancock, 2003).
Research about rubrics use • Review article of 75 studies on rubrics usage • Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130–144. • (1) the reliable scoring of performance assessments can be enhanced by the use of rubrics, especially if they are analytic, topic-specific, and complemented with exemplars and/or rater training; • (2) rubrics do not facilitate valid judgment of performance assessments per se. However, valid assessment could be facilitated by using a more comprehensive framework of validity; • (3) rubrics seem to have the potential to promote learning and/or improve instruction. The main reason for this potential lies in the fact that rubrics make expectations and criteria explicit, which also facilitates feedback and self-assessment.
Conditions for effective usage • Check the frame of reference for the rubric: tasks, objectives • Train the users • Use multiple assessors: check interrater agreement • Developed by teacher and/or students!
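Interrater agreement between multiple assessors is commonly checked with a chance-corrected statistic such as Cohen's kappa. A minimal sketch for two raters (the statistic is standard; the function itself is only an illustration, not part of the presentation):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters: observed agreement corrected for
    the agreement expected by chance from each rater's score distribution.
    Undefined when chance agreement is already 1.0 (both raters constant)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    dist_a, dist_b = Counter(rater_a), Counter(rater_b)
    expected = sum(dist_a[c] * dist_b[c] for c in dist_a) / (n * n)
    return (observed - expected) / (1 - expected)

# e.g. two raters scoring four pieces of work on rubric levels 1-2:
# kappa = 1.0 means perfect agreement, 0.0 means chance-level agreement.
```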
Development of rubrics • Choose quality criteria: 4 to 15 statements describing the nature of a criterion • Determine the bandwidth to judge differences in quality related to the criterion: e.g., 0–5 or qualitative descriptors • Elaborate descriptors for each bandwidth level: concrete, operational terms • Start from available student work!
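The development steps above can be sketched as a data structure: each criterion carries a bandwidth of levels with a concrete descriptor per level. The criteria names and descriptors below are invented examples, not taken from the slides.

```python
# Hypothetical rubric: two criteria on a 0-5 bandwidth, with descriptors
# for the anchor levels (a full rubric would describe every level).
rubric = {
    "argument structure": {
        0: "no recognizable line of argument",
        3: "argument present, but with gaps in the reasoning",
        5: "coherent, well-supported argument throughout",
    },
    "use of sources": {
        0: "no sources used",
        3: "sources used, but not critically evaluated",
        5: "sources integrated and critically weighed",
    },
}

def score_work(awarded_levels):
    """Total score: the awarded level summed over all rubric criteria."""
    return sum(awarded_levels[criterion] for criterion in rubric)

# score_work({"argument structure": 3, "use of sources": 5}) -> 8
```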
Rubrics: example • Writing a fiction story • Complex skill • Criteria?
http://isucomm.iastate.edu/files/image/OV3-WrittenAssignmentRubric.png
Performance assessment • Rubrics focus on the relationship between competencies, criteria, and indicators and are organized along mastery levels (Morgan, 1999).
More information? • Overview of tools, examples, theory, background, research: http://school.discoveryeducation.com/schrockguide/assess.html • Critical thinking rubrics: http://academic.pgcc.edu/~wpeirce/MCCCTR/Designingrubricsassessingthinking.html • Rubric generators: http://www.teach-nology.com/web_tools/rubrics/ • Interesting rubric sites: http://web.njit.edu/~ronkowit/teaching/rubrics/index.htm • Rubric for an APA research paper: http://web.njit.edu/~ronkowit/teaching/rubrics/samples/rubric_apa_research.pdf
Statements about evaluation • Learners should be trained to develop such rubrics themselves. • Staff should collaborate in developing formative and summative assessment rubrics. • Rubrics will help staff to be more concrete about their teaching and learning focus.