Navigating the Process of Student Learning Outcomes: Development, Evaluation, and Improvement
Shannon M. Sexton, Julia M. Williams, & Timothy Chow
Rose-Hulman Institute of Technology
Introduction & Workshop Goals
• Discuss the process of developing institute-wide student learning outcomes, along with common difficulties.
• Assist participants in developing their own learning outcomes.
• Discuss evidence that can be used to evaluate achievement of student learning outcomes.
• Assist participants in identifying articles of evidence for their learning outcomes.
• Discuss using an e-portfolio as a means of collecting and rating evidence.
• Allow participants to rate sample evidence using a sample rating rubric.
• Discuss how to use the rating results to close the loop and improve student learning.
RHIT/IRPA - NASPA 2009
Group Activity #1
• Define teamwork.
• Include characteristics of teamwork.
• Include traits of those characteristics.
• Time allotted: 10 minutes
About Rose-Hulman
• Terre Haute, Indiana
• 1,800+ undergraduate students
• B.S. degrees in engineering, science, and mathematics
• 80%+ engineering students
• Accredited through ABET and North Central
CASO: Commission on the Assessment of Student Outcomes
• Faculty committee with one representative from each academic department
• Regular meetings throughout the academic year
• Tasked with developing Institute-wide student learning outcomes and a way to measure them
Historical Timeline
• 1996–1998
• 1999–2000
• 2001–2006
• 2007–present
Removing the Jargon
• Domains = categories of skills
• Learning outcomes = definitions of the skills students should have
• Performance criteria = characteristics of a learning outcome
• Rubrics = scales used to identify traits in a document
Domains and Performance Criteria
Developing Outcomes
• Importance of preliminary research
• Teams of faculty developers
• Developing in steps & gaining institute buy-in
• Importance of measurability
Writing Outcomes – Activity #2
• Revise your teamwork definition into a student learning outcome with rubrics.
• Time allotted: 15–20 minutes
Take a Break
10 minutes to stretch your legs
Evidence of Outcomes
• Choosing appropriate evidence
• New vs. current evidence
• Importance of the rubric
• Collecting evidence: how to physically collect & store evidence
• Course mapping: tracking where submissions are coming from
• Accountability
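The course-mapping and accountability ideas on this slide can be sketched as a simple data structure. This is a minimal, hypothetical illustration: the course names, criterion labels, and the `record_submission` helper are all invented for this sketch and are not part of the Rose-Hulman system described in the workshop.

```python
from collections import defaultdict

# Hypothetical course map: which courses supply evidence
# for which Teamwork performance criteria.
course_map = {
    "Teamwork B1": ["ME 201 Lab", "ECE 310 Design"],
    "Teamwork C1": ["CSSE 120 Projects"],
}

# Tally of submissions received, keyed by (criterion, course),
# so assessment staff can see where evidence is (not) coming from.
submissions = defaultdict(int)

def record_submission(criterion, course):
    """Track where a submitted document came from, for accountability."""
    if course not in course_map.get(criterion, []):
        raise ValueError(f"{course} is not mapped to {criterion}")
    submissions[(criterion, course)] += 1

record_submission("Teamwork B1", "ME 201 Lab")
record_submission("Teamwork B1", "ME 201 Lab")
print(submissions[("Teamwork B1", "ME 201 Lab")])  # 2
```

A tally like this makes gaps visible: a mapped course with zero submissions signals a collection problem before rating day.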
Criterion Description and Rubric
Teamwork B1: Demonstrate how you reached a decision as a team.
Primary traits: a passing submission for this criterion must:
1. Describe the team goal.
2. Provide a description of a specific team decision and describe:
• the process of making the decision
• how multiple team members contributed to the outcome
• how team members' ideas were critically evaluated
3. Show that the ultimate outcome was consistent with the team's decision-making process.
Potential documents: documents appropriate for this criterion include (but are not limited to) a memo or reflective statement on team process from a lab group, design team, debate team, or service project.
Course Mapping
Identifying Evidence – Activity #3
• Where do you already have existing assignments that could be used as articles of evidence for the Teamwork outcome?
• Group brainstorming activity
Domains and Performance Criteria
• C-level criteria
• B-level criteria
• A-level criteria
Collecting Evidence
• Use of an e-portfolio system
• Faculty input
• Student input
• Assessment staff input
Rating Evidence
• Rating preparation
• Administrative side of rating: paying faculty volunteers, tech support, number of volunteers, etc.
• Process of rating days
Portfolio Rating Methodology
• Evaluation rubric: A, B, C levels
• Inter-rater reliability: initial and subsequent
• Document rating: Pass / Fail / Exemplary
• Holistic rating, NOT grading
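Inter-rater reliability, mentioned above, can be quantified in several ways; one common choice for two raters assigning categorical ratings such as Pass/Fail/Exemplary is Cohen's kappa, which corrects raw agreement for chance. This is a generic sketch with hypothetical ratings, not the specific reliability procedure used at Rose-Hulman.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters assigning categorical labels
    (e.g. Pass/Fail/Exemplary) to the same set of documents."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of documents where the raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of six documents by two faculty raters.
rater1 = ["Pass", "Pass", "Fail", "Exemplary", "Pass", "Fail"]
rater2 = ["Pass", "Fail", "Fail", "Exemplary", "Pass", "Fail"]
print(round(cohens_kappa(rater1, rater2), 3))  # 0.739
```

Checking kappa on an initial calibration round, and again on subsequent rounds, is one way to verify that raters stay consistent across rating days.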
Rating Documents – Activity #4
• Rate 2–3 teamwork documents using the provided rubric.
• Time allotted: 15–20 minutes
Using Rating Results
• Identifying constituents
• How to communicate results to the institute
• What to do with the results
Example Result
Please contact us for more information after the conference.