Explore a methodology for creating course-level performance measures that align with program-level outcomes. Develop skills for assessing, creating, and implementing performance measures in engineering education.
Designing Course-Level Performance Measures Aligned with Program-Level Learning Outcomes
Steven Beyerlein, Mechanical Engineering, University of Idaho
Daniel Apple, Pacific Crest
Frontiers in Education Workshop, Savannah, GA, October 20, 2004
Workshop Outcomes
• Assess a performance measure for the design process from the standpoint of a user
• Experience a team process for creating a new performance measure
• Motivate participants to incorporate more performance measurement in their courses
Schedule (Part I)
• Participant Introductions => 10 min
• Case Study: Performance Measure for Design (skim paper with facilitator commentary) => 10 min
• Assess holistic rubric for design => 10 min
• Assess analytical rubric for design => 10 min
• Assess process for creating performance measure as documented in paper => 10 min
• Select process area to study and form teams for Part II => 10 min
  - Conducting research
  - Managing a client relationship
  - Reading reference materials/research papers
  - Testing a product or a process
NEED: Shared Framework for Discussing and Measuring Development of Skills
• Freshman: Intro to Engineering and Design (foundational team design capabilities)
• Sophomore: Discipline-Specific Courses, Including Some Projects
• Junior: In-Depth Analysis, Design
• Senior: High-Level Design and Professional Capabilities
METHOD: Shared Language
• Performance - Individual/team actions that can be measured, monitored, and improved over time
• Performance Criteria - Expected behavior levels associated with a particular group of performers
• Performance Task - Integrated learning challenge that provides opportunities to demonstrate core knowledge, skills, and attitudes
• Performance Factor - Element of performance which can be directly observed and measured
• Performance Measure - Instrument that integrates relevant data for the purpose of rating performance (a data-model sketch follows this list)
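The shared vocabulary above maps naturally onto a small data model. The sketch below (Python, written for this summary rather than taken from the workshop materials) shows one hypothetical way to encode performance factors and a performance measure that integrates per-factor ratings into an overall rating; all class, field, weight, and factor names are illustrative assumptions, not part of the authors' instrument.

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceFactor:
    """Element of performance that can be directly observed and measured."""
    name: str
    description: str
    weight: float = 1.0  # relative importance when integrating ratings (assumed)

@dataclass
class PerformanceMeasure:
    """Instrument that integrates per-factor ratings into an overall rating."""
    process_area: str
    factors: list[PerformanceFactor] = field(default_factory=list)

    def rate(self, ratings: dict[str, float]) -> float:
        """Combine per-factor ratings (e.g., on 1-5 scales) into a weighted average."""
        total_weight = sum(f.weight for f in self.factors)
        return sum(f.weight * ratings[f.name] for f in self.factors) / total_weight

# Hypothetical example: a measure for the engineering design process area
design_measure = PerformanceMeasure(
    process_area="Engineering design",
    factors=[
        PerformanceFactor("problem definition", "Clarifies needs, constraints, criteria", 1.5),
        PerformanceFactor("concept generation", "Produces and screens alternatives", 1.0),
        PerformanceFactor("teamwork", "Shares work and integrates contributions", 1.0),
    ],
)
print(design_measure.rate({"problem definition": 4, "concept generation": 3, "teamwork": 5}))  # 4.0
```

Making the weights explicit is one way to capture how a single instrument can feed both a factor-by-factor (analytic) view and an overall (holistic) rating.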
Methodology for Creating a Performance Measure for a Process Area
1) Form a team with diverse training and perspectives (pg. 3)
2) Recruit a facilitator familiar with the process area (pg. 3)
3) Define boundaries of the skill set for the process area (pg. 4)
4) Analyze expert behavior in the process area (pg. 14)
5) Identify key sources of variability in performance (pg. 15)
6) Craft a holistic rubric from "novice" to "expert" (pg. 16)
7) Propose scales for measuring key factors (pg. 9-10); a sketch encoding these two artifacts follows this list
8) Test by reflecting on a variety of performances (pg. 7-8)
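Steps 6 and 7 yield two concrete artifacts: a holistic rubric with labeled levels from "novice" to "expert", and analytic scales for the key performance factors. The sketch below (Python, an illustration rather than the authors' design rubric) shows one hypothetical way to encode both and to map an analytic score onto a holistic level; the intermediate level names, descriptors, scale anchors, factor names, and thresholds are all assumptions.

```python
# Holistic rubric (step 6): ordered performance levels with descriptors.
# Only the "novice" and "expert" labels come from the methodology; the
# intermediate levels and all descriptors are illustrative assumptions.
HOLISTIC_LEVELS = [
    ("novice",       "Follows prescribed steps with heavy guidance"),
    ("apprentice",   "Applies the process with occasional coaching"),
    ("practitioner", "Adapts the process independently to new problems"),
    ("expert",       "Anticipates pitfalls and improves the process itself"),
]

# Analytic scales (step 7): one 1-5 scale per key factor (hypothetical factors/anchors).
ANALYTIC_SCALES = {
    "problem scoping": "1 = restates the assignment ... 5 = frames needs, constraints, criteria",
    "idea generation": "1 = single concept ... 5 = broad, systematically screened set",
    "decision making": "1 = unjustified choice ... 5 = criteria-based, documented trade study",
}

def holistic_level(analytic_scores: dict[str, int]) -> str:
    """Map the average analytic score (1-5) onto a holistic level (assumed thresholds)."""
    avg = sum(analytic_scores.values()) / len(analytic_scores)
    thresholds = [2.0, 3.0, 4.0]  # assumed upper bounds for the first three levels
    for (name, _), bound in zip(HOLISTIC_LEVELS, thresholds):
        if avg < bound:
            return name
    return HOLISTIC_LEVELS[-1][0]  # "expert"

print(holistic_level({"problem scoping": 4, "idea generation": 4, "decision making": 5}))  # expert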
TESTING: Three Contexts
• Measurement is the foundation for quality assessment as well as quality evaluation
• Assessment is low-stakes, "friendly" feedback
  - Measures knowledge, skill, or ability in order to improve future performance
  - Provides the assessee with an indication of strengths, areas for improvement, and ways to improve
• Evaluation is high-stakes, "judgmental" feedback
  - Judges the merit or worth of something against a set of standards
  - Produces a grade or score that is part of a permanent, public record
Schedule (Part II)
• Create new performance measure
  - initial description (2 sentences) => 10 min
  - list performance factors (up to 7) => 10 min
  - rank and report top factors => 10 min
  - achieve consensus on primary factors => 15 min
  - pair factors for use in holistic rubric => 10 min
  - label performance levels => 10 min
  - demonstrate writing behaviors => 10 min
  - practice writing behaviors => 15 min
• Identify sub-items for analytic rubric => 10 min
• Assess team process used in workshop => 10 min