Welcome to Assessing a Skill or Performance. FOL Part 1, 2012. Sandy Odrowski, Durham College
Agenda 1. Getting started • Welcome and session outcomes • Story Time (Connection Activity) • Identifying best assessment practices 2. Reviewing key concepts (Content Activity) • What, Why, How, When and Who of Performance Assessment 3. Exploring – small group work (Practice Activity) • Creating, administering and evaluating checklists, rating scales and rubrics 4. Summarizing and sharing (Summary Activity)
Session Outcomes During this workshop you will • identify tools that can be used with performance assessments, • distinguish between rubrics, checklists and rating scales, • explore and share tips to guide the assessment of performances.
Wilma and Betty had different clinicians assessing their performance
Wilma Rocks 5/5
Betty didn’t do so well: 3/5. Wilma was very surprised! They compared impressions… and Wilma gave Betty some advice…
Assessing a Skill or Performance Acknowledgement
Think pair share • On your own, think of an assessment method (for a practical skill) that you have used with students, or encountered in the past, that left you feeling satisfied with the outcome. • Record it on the paper provided (1 minute)
Think pair share • Pair with a person beside you and share your answer. • Be sure to include WHY you found it a valuable assessment tool (2 minutes)
Group Brainstorming Exercise In your group • Further discuss what you believe are characteristics of “best assessment practices” and record them individually on the post-it notes provided • When you are finished, place your post-it notes on the flip chart at the front of the class (5 minutes)
What did you come up with? Brilliant!
The Way we Were! • Knowledge in the hands of the experts (Behaviorist paradigm) • Teacher-centered • Focus on assessment of discrete, isolated knowledge and skills • Traditional face-to-face (f-2-f) delivery • Norm-referenced testing • Emphasis on summative assessments • Limited perspectives (e.g. teacher) • Product based
The Way we Are! • Knowledge collaboration, cooperation and community (Constructivist paradigm) • Immersive environments – learning IN technology – learning anytime, anywhere – distributed learning • Student-centered, outcome-based (complex, integrated learning) in a real-world context • Criterion-referenced • Assessment as learning (formative) • Multiple perspectives (self, peer, teacher) • Product and process based (authentic assessments)
Reviewing key concepts: Definitions (5 minutes) Turn to page 8 of your handout and complete the mix and match with a partner. Terms to match: • Assessment • Evaluation • Performance/Authentic Assessment • Validity • Reliability • Assessment Task Attributes • Learning Outcome • Diagnostic, Summative and Formative Evaluation • Rubric • Performance Scale • Performance Checklist • Performance criteria (descriptors)
What is Performance/Authentic Assessment? Performance Assessment – close proximity to the “actual criterion situation”; usually measures complex skills, cognitive processes and communication important in the real world (contextualized tasks) (Palm, 2008). Authentic Assessment – its defining features are the specific cognitive processes (disciplined inquiry) and products (knowledge beyond the mere reproduction of presented knowledge) considered important from the perspective of life beyond school (Newmann & Archbald, 1992). http://jolt.merlot.org/documents/vol1_no1_mueller_001.pdf
Definition for Today • Performance assessment, also known as alternative or authentic assessment, is a form of testing that requires students to perform a real-life task rather than select an answer from a ready-made list. http://abcresource.loyalistcollege.ca/learningassessment.htm#performance
Assessment vs. Evaluation • Lots of controversy/confusion over these terms. Definitions: • Assessment = feedback on practice; supports learning but may or may not generate marks/grades • Evaluation = summary measurement providing some kind of grade/mark/final feedback
Why and When? The Purpose of Assessment: Assessment FOR, AS and OF Learning (Earl, 2003)
Validity and Reliability • Validity refers to the degree to which an assessment measures the intended learning outcome, i.e. does the test measure what it is intended to measure (the learning outcome)? • Reliability refers to the consistency of test scores and is affected by testing conditions and rater variability, i.e. does the test produce consistent results?
…is valid and reliable • Measures what it is supposed to measure (valid) • Measures the same information consistently (reliable) • Recognizes that some aspects of learning are hard to measure or may be unplanned
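Because rater variability is named above as a threat to reliability, a small worked illustration may help. The Python sketch below is not part of the workshop handout; the scores, the `percent_agreement` function and the rater names are invented for the example. It computes simple percent agreement between two hypothetical assessors (say a teacher and a peer) who scored the same set of performances on a 1-4 scale.

```python
# Illustrative sketch (not from the workshop materials): percent agreement
# between two raters who scored the same performances on a 1-4 scale.

def percent_agreement(scores_a, scores_b):
    """Proportion of performances on which both raters gave the same score."""
    if len(scores_a) != len(scores_b):
        raise ValueError("Both raters must score the same set of performances")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical scores: one entry per student performance.
teacher_scores = [4, 3, 4, 2, 4, 3, 1]
peer_scores    = [4, 3, 3, 2, 4, 3, 2]

print(f"Exact agreement: {percent_agreement(teacher_scores, peer_scores):.0%}")
# Prints "Exact agreement: 71%" for the scores above (5 of 7 match).
```

Low agreement is a signal that the descriptors are ambiguous or that the raters need calibration, which is the point of the later tip about gathering multiple perspectives on the scoring of the same performance.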
Who is the “assessor”? • Multiple perspectives • Self • Peer • Teacher
Sample Performance/Rating Scale Participates in group problem solving: • 4 (Outstanding) • 3 (Satisfactory) • 2 (Tolerable) • 1 (Unsatisfactory)
Selecting the Performance/Skill Task • The assessment task (performance, product or process) is determined to reflect the learning outcome. • Assessment task attributes – the essential elements or characteristics of a good performance on the assessment task • Performance criteria (descriptors) – specific behavioral descriptions of the performance at each level
Parts of a rubric (diagram): the assessment task attributes and the descriptors for each performance level.
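To make the structural difference between the three tools concrete, here is a small, purely illustrative Python sketch; it is not drawn from the workshop handout, and the radial-pulse attributes and descriptors in it are invented for the example.

```python
# Illustrative sketch (not from the workshop materials) of the three tool shapes.

# Checklist: each task attribute is simply observed or not observed.
checklist = {
    "Locates the radial pulse site": None,   # record True/False
    "Counts for a full 60 seconds": None,
    "Reports the rate accurately": None,
}

# Rating scale: one quality judgement per attribute on a shared scale.
scale_labels = {4: "Outstanding", 3: "Satisfactory", 2: "Tolerable", 1: "Unsatisfactory"}
rating_scale = {"Participates in group problem solving": None}  # record 1-4

# Rubric: every attribute carries its own descriptor at each performance level.
rubric = {
    "Measuring the radial pulse": {
        4: "Locates the site immediately and counts a full 60 seconds unprompted",
        3: "Locates the site with minor hesitation and counts a full 60 seconds",
        2: "Needs prompting to locate the site or shortens the count",
        1: "Cannot locate the site or reports an inaccurate rate",
    },
}
```

Reading the three side by side shows the progression: a checklist records presence or absence, a rating scale adds levels of quality, and a rubric attaches an explicit behavioral descriptor to every level.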
Group Assignment Page 6
Learning Activity: collaboratively create, administer and evaluate a performance assessment tool. Page 6 Assign group roles as follows: • Performance expert _________________ • Student ___________________________ • Peer (reporter)_____________________ • Teacher (timekeeper)________________
Instructions • In consultation with the performance expert, determine performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task. Learning Outcome: Accurately measure and report a radial pulse. OR Learning Outcome: Design, construct and fly a paper airplane.
Complete the following steps: 1. In consultation with the performance expert, determine performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task. 2. Collaboratively develop the assessment tool (checklist, rating scale or rubric) using the template provided. 3. The performance expert instructs the student on the performance task, using the assessment tool as a guide. 4. The student completes a “return demonstration” of the performance task.
Steps continued 5. The teacher and peer complete the developed assessment tool based on the observed student performance. 6. Discuss the advantages and disadvantages of the developed assessment tool from a variety of perspectives (student, peer and teacher); you can record on your handout. 7. Be prepared to share your group’s findings with the larger group.
Final tips • Review the learning outcome and purpose of the assessment tool (validity) • Collect samples of student work that exemplify levels of performance (validity) • Consult with experts in the field to develop “authentic performance tests” and to validate task attributes and performance descriptors (validity, authenticity) • Develop performance scoring instruments collaboratively with colleagues
Final tips (continued) • Gather multiple perspectives on the scoring of the same performance (rater reliability) • Share the scoring rubric with students in advance of the performance task (equity, transparency, accountability) • Share performance exemplars (sample projects, assignments, video depictions) with students in advance of the performance task (transparency, accountability) • Be prepared to review and revise the assessment tool (accountability, validity)
In Conclusion Did we achieve the learning outcomes of this session? • identify tools that can be used with performance assessments, • distinguish between rubrics, checklists and rating scales, • explore and share tips to guide the assessment of performances.