Welcome to Assessing a Skill or Performance • FOL Part 1 2013 • Dave Stewart • Loyalist College
Agenda • 1. Getting started – Welcome and session outcomes; Story Time (Connection Activity); Identifying best assessment practices • 2. Reviewing key concepts (Content Activity) – What, Why, How, When and Who of Performance Assessment • 3. Exploring – small group work (Practice Activity) – Creating, administering and evaluating checklists, grading scales and rubrics • 4. Summarizing and sharing (Summary Activity)
Session Outcomes • During this workshop you will • identify tools that can be used with performance assessments, • distinguish between rubrics, checklists and rating scales, • explore and share tips to guide the assessment of performances.
Acknowledgement Assessing a Skill or Performance
Think Pair Share (1 minute) • On your own, think of an assessment method (for a practical skill) you have used with students, or that you have encountered in your past, that left you feeling satisfied with the outcome. • Record it on the paper provided.
Think Pair Share (2 minutes) • Pair with a person beside you and share your answer. • Be sure to include WHY you found it a valuable assessment tool.
Group Brainstorming Exercise (5 minutes) • In your group, further discuss what you believe are characteristics of “best assessment practices” and record them individually on the post-it notes provided. • When you are finished, place your post-it notes on the flip chart at the front of the class.
What did you come up with? Brilliant!
The Way We Were! • Knowledge in the hands of the experts (behaviorist paradigm) • Teacher-centered • Focus on assessment of discrete, isolated knowledge and skills • Traditional face-to-face delivery • Norm-referenced testing • Emphasis on summative assessments • Limited perspectives (e.g. teacher) • Product-based
The Way We Are! • Knowledge through collaboration, cooperation and community (constructivist paradigm) • Immersive environments – learning IN technology – learning anytime, anywhere – distributive learning • Student-centered – outcome-based (complex, integrated learning) in a real-world context • Criterion-referenced testing • Assessment as learning (formative) • Multiple perspectives (self, peer, teacher) • Product- and process-based (authentic assessments)
Reviewing Key Concepts – Definitions (5 minutes) • Turn to page 8 of your handout. Complete the mix and match with a partner: • Assessment • Evaluation • Performance/Authentic Assessment • Validity • Reliability • Assessment Task Attributes • Learning Outcome • Diagnostic, Summative and Formative Evaluation • Rubric • Performance Scale • Performance Checklist • Performance Criteria (descriptors)
What is Performance/Authentic Assessment? • Performance assessment – close proximity to the “actual criterion situation”. Usually measures complex skills, cognitive processes and communication important in the real world (contextualized tasks) (Palm, 2008). • Authentic assessment – its defining features are the specific cognitive processes (disciplined inquiry) and products (knowledge beyond the mere reproduction of presented knowledge) considered important from the perspective of life beyond school (Newmann & Archbald, 1992). http://jolt.merlot.org/documents/vol1_no1_mueller_001.pdf
Definition for Today • Performance assessment, also known as alternative or authentic assessment, is a form of testing that requires students to perform a real-life task rather than select an answer from a ready-made list. http://abcresource.loyalistcollege.ca/learningassessment.htm#performance
Assessment vs. Evaluation • There is considerable controversy and confusion over these terms. • Assessment = feedback on practice; supports learning but may or may not generate marks/grades. • Evaluation = summary measurement providing some kind of grade/mark/final feedback.
Why and When? The Purpose of Assessment: Assessment FOR, AS and OF Learning (Earl, 2003)
Validity and Reliability • Validity refers to the degree to which an assessment measures the intended learning outcome. • i.e. Does the test measure what it is intended to measure (the learning outcome)? • Reliability refers to the consistency of test scores. Reliability is affected by testing conditions and rater variability. • i.e. Does the test produce consistent results?
…is valid and reliable • Measures what it is supposed to measure (valid) • Measures the same information consistently (reliable) • Recognizes that some aspects of learning are hard to measure or may be unplanned
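For the technically inclined, the reliability idea above can be made concrete with a quick calculation. The sketch below (not part of the workshop materials; the raters and scores are invented) computes simple percent agreement between two raters who scored the same set of performances, a rough first check on rater variability:

```python
def percent_agreement(scores_a, scores_b):
    """Fraction of performances on which two raters gave the same score."""
    if len(scores_a) != len(scores_b):
        raise ValueError("Raters must score the same set of performances")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical example: a teacher and a peer each score ten
# performances on a 1-4 scale.
teacher = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
peer    = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
print(percent_agreement(teacher, peer))  # 0.8
```

Percent agreement is the simplest such measure; it ignores chance agreement, so more formal inter-rater statistics exist, but it illustrates the point that reliability can be checked, not just assumed.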
Who is the “assessor”? • Multiple perspectives: • Self • Peer • Teacher
Sample Performance/Rating Scale • Participates in group problem solving: • 4 (Outstanding) • 3 (Satisfactory) • 2 (Tolerable) • 1 (Unsatisfactory)
Selecting the Performance/Skill Task • Assessment task (performance, product or process) determined to reflect the learning outcome. • Assessment task attributes – the essential elements or characteristics of a good performance of an assessment task. • Performance criteria (descriptors) – specific behavioral descriptions of performance at each level of performance.
Parts of a Rubric • Assessment task attributes • Descriptors
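As an aside, the structure of a rubric described above maps naturally onto a simple data structure. The sketch below is purely illustrative and not from the handout; the attribute names and descriptors are invented, loosely based on the radial-pulse task used later in the group activity:

```python
# A rubric as a mapping: each assessment task attribute has a set of
# descriptors, one per performance level (4 = highest, 1 = lowest).
rubric = {
    "Locates pulse site": {
        4: "Locates the radial pulse immediately",
        3: "Locates the pulse with minor hesitation",
        2: "Locates the pulse after repeated attempts",
        1: "Unable to locate the pulse",
    },
    "Counts and reports rate": {
        4: "Counts a full 60 seconds and reports accurately",
        3: "Counts 30 seconds, doubles, and reports accurately",
        2: "Reports the rate with a minor error",
        1: "Reports the rate incorrectly",
    },
}

def score_performance(rubric, ratings):
    """Sum the level assigned to each attribute; ratings maps attribute -> level."""
    return sum(ratings[attribute] for attribute in rubric)

ratings = {"Locates pulse site": 4, "Counts and reports rate": 3}
print(score_performance(rubric, ratings))  # 7
```

Seeing the rubric this way highlights the two parts named on the slide: the attributes (the keys) and the level descriptors (the values), with the score being a judgment per attribute rather than a single global impression.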
Group Assignment Page 6
Learning Activity: collaboratively create, administer and evaluate a performance assessment tool. (Page 6) • Assign group roles as follows: • Performance expert _________________ • Student ___________________________ • Peer (reporter) _____________________ • Teacher (timekeeper) ________________
Instructions • In consultation with the performance expert, determine performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task. • Learning Outcome: Accurately measure and report a radial pulse. OR • Learning Outcome: Design, construct and fly a paper airplane.
Complete the Following Steps • In consultation with the performance expert, determine the performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task. • Collaboratively develop the assessment tool (checklist, rating scale or rubric) using the template provided. • The performance expert instructs the student on the performance task, using the assessment tool as a guide. • The student completes a “return demonstration” of the performance task.
Steps Continued • The teacher and peer complete the developed assessment tool based on the observed student performance. • Discuss the advantages and disadvantages of the developed assessment tool from a variety of perspectives (student, peer and teacher). You can record on your handout. • Be prepared to share your group’s findings with the larger group.
Final Tips • Review the learning outcome and the purpose of the assessment tool (validity). • Collect samples of student work that exemplify levels of performance (validity). • Consult with experts in the field to develop “authentic performance tests” and to validate task attributes and performance descriptors (validity, authenticity). • Develop performance scoring instruments collaboratively with colleagues.
Final Tips Continued • Gather multiple perspectives on the scoring of the same performance (rater reliability). • Share the scoring rubric with students in advance of the performance task (equity, transparency, accountability). • Share performance exemplars (sample projects, assignments, video depictions) with students in advance of the performance task (transparency, accountability). • Be prepared to review and revise the assessment tool (accountability, validity).
In Conclusion • Did we achieve the learning outcomes of this session? • identify tools that can be used with performance assessments, • distinguish between rubrics, checklists and rating scales, • explore and share tips to guide the assessment of performances.