Student Learning Outcomes • Los Angeles Valley College Training, Spring 2008 – Part II • SLO Coordinator: Rebecca Stein, steinrl@lavc.edu; (818) 947-2538
Review - Why SLOs? • New Accreditation Standards • “Covering” material does not guarantee students have learned it • Success is determined by students leaving a course/program with integrated, higher-level learning skills they can demonstrate • SLOs establish clear and transparent expectations for students
Review - What is this thing called SLO? • SLO stands for Student Learning Outcome. • SLOs represent broad themes beyond specific course content. • They cut across the curriculum. • They are measurable or observable.
Review - How is an SLO different from an objective? • Objectives: Tied directly to specific course content. Address skills, tools, or content that enable a student to engage in a particular subject. 5–7 per course. • Outcomes: Overarching understanding and application beyond specific course content. What students take away from the course that they can use in other courses or in life. 1–2 per course.
What’s Assessment All About? • An ongoing process aimed at understanding and improving student learning. • Faculty making learning expectations explicit and public. • Faculty setting appropriate standards for learning quality.
What is Assessment All About? • Systematically gathering, analyzing, and interpreting evidence to determine how well student performance matches agreed-upon faculty expectations and standards. • Using results to document, explain, and improve teaching and learning performance. (Tom Angelo, AAHE Bulletin, November 1995)
Roles of Assessment • “Assess to assist, assess to advance, assess to adjust” • Assist – provide formative feedback to guide student performance • Advance – summative assessment of student readiness for what’s next • Adjust – continuous improvement of curriculum and pedagogy (Ruth Stiehl, 2007)
Questions for Assessment • What do students need to DO “out there” that we are responsible for “in here”? (Stiehl) • How do students demonstrate the intended learning now? • What kinds of evidence must we collect and how do we collect it?
The Assessment Smorgasbord • When SLOs are well written, the method of assessment is often clear. • One size doesn’t fit all! • To select appropriate tools, you need to understand: • Types of tools available • Nature of the data • Potentials and limitations of each tool
Quality Data • Quality data: • Are based upon best practices • Answer important questions • Benefit the students and institution by providing evidence to complete the assessment loop • The assessment loop is a data-driven method of decision-making. • Questions are posed concerning what works and what does not.
Quality Data: Are Results Valid and Reliable? • Valid – the data accurately represent what you are trying to measure. For instance, the number of students who graduate does not necessarily represent good data on what has actually been learned. • Reliable – the data are reproducible. Repeated assessment yields the same data. • Authentic – the assessment simulates real-life circumstances. • Relevant – the data answer important questions and are not generated simply because they are easy to measure. • Effective – the data contribute to improving teaching and learning.
Direct vs. Indirect • Direct: What the student can do or actually demonstrate they know. Learning can be witnessed with your own eyes. The setting is structured/contained. • Indirect: What students say they can do. Things from which learning is inferred. The setting is not easily structured/contained.
Qualitative vs. Quantitative • Qualitative: Words. Broad emergent themes. Holistic judgments. Bulky to store and report. Often the most valuable and insightful. • Quantitative: Numbers. Individual components and scores. Easier calculations and comparisons. Easy to store and manage. Often has limited value. Must be carefully constructed to be valid.
Formative vs. Summative • Formative: Assessment for learning. “In progress.” Provides corrective feedback. Establishes foundational learning for the next step. • Summative: Assessment for evaluative purposes. “After the fact.” Determines progress/achievement/proficiency. Readiness for the next step, role, or learning experience.
Criterion-Based vs. Norm-Referenced • Criterion-based: Evaluated/scored using a set of criteria. Based on proficiency, not subjective measures such as improvement. • Norm-referenced: Assessment of an individual compared to other individuals, or of an individual’s improvement over time. Reported as rank or median. Addresses overall mastery but gives little detail about specific skills.
Standardized vs. Homegrown • Standardized: Assessments created, tested, and sold by an educational testing company. Usually scored normatively. • Homegrown/Local: Developed and validated for a specific purpose, course, or function. Usually criterion-referenced to promote validity.
Embedded Assessments • Occur within a regular class or curricular activity • Class assignments linked to SLOs • Individual questions on exams can be embedded in numerous classes • Immediate feedback
Grading vs. Assessing • A course grade is based on student achievement of course objectives. • It is possible for a student to pass a class but not meet a specific course outcome and vice versa. • Various assessment techniques can be used in a class that may or may not be part of a grade.
Grading vs. Assessing • What would we look at to grade this assignment? (columns) • What would we look at for assessment? (rows)
Assessment Activity/Assessment Measure • Need to address two components: • Assessment activity – what students will do to show you they have achieved the SLO • Assessment measure – how instructors will evaluate what the students have done
Assessment Activity Examples • Licensing Exams (e.g., Nursing) • Standardized Tests • Reflective Self-Assessment Essay • Satisfaction/perception surveys (student, faculty, staff, employer, community)
Case Study & Problem Solving • Use an “in situ” approach to simulate real-life situations and problems.
Flowchart or Diagram • A visual/graphic illustration of a process or system. • A high-level cognitive achievement requiring analysis and synthesis. • “Draw a flowchart for whatever you do. Until you do, you do not know what you are doing, you just have a job.” (W. E. Deming, quality guru)
Capstone • Capstone = a culminating event or crowning achievement • Capstone courses/projects
Portfolios • Portfolios are a collection of student work over a period of time, usually including student reflection on their achievement. • Portfolios have strengths and weaknesses; ask yourself whether they will work for you. • ePortfolios
Assessment Measures • Checklist • Rubric • Calibrated Peer Review
Checklists • Determine whether a criterion is present or not. Good for simple psychomotor skills or low-level recall. • Example: Hand Washing Checklist
Rubrics • A rubric • is “a scoring tool that lists the criteria for a piece of work or ‘what counts.’” (Heidi Goodrich) • describes levels of quality for each of the criteria, usually on a point scale • makes your expectations clear to students • reduces the time you spend grading student work and makes it easier for you to explain to students why they got the grade they did and what they can do to improve • is most effective when you provide students with actual examples of poor, average, and good work
Calibrated Peer Review • “Calibrated Peer Review (CPR)™ is a Web-based program that enables frequent writing assignments even in large classes with limited instructional resources. In fact, CPR can reduce the time an instructor now spends reading and assessing student writing.” • “CPR offers instructors the choice of creating their own writing assignments or using the rapidly expanding assignment library. Although CPR stems from a science-based model, CPR has the exciting feature that it is discipline independent and level independent.”
Assessing Program-Level SLOs • Licensing/Employment/Transfer • Capstone Courses or Projects • Student Surveys • Portfolios
Create an Assessment Tool • Look at the SLOs for your course. • Are there any assignments that provide good data on outcomes? • If not, you need to create one!
Create an Assessment Tool • Determine which type of assessment tool would best assess whether students can DO the outcome • Should be authentic – closely resembling a real-life experience • Will the student perform a task, create a product, analyze a case study, or solve a problem?
Identify the Purpose of the Assessment • Will it be formative or summative? • If formative – how will feedback be given? • If summative – will the student have ample practice and feedback to do what is expected?
What is a Successful Outcome? • Identify the major traits that determine a successful outcome • Describe the criteria relating to the traits and create a checklist, rubric or set of descriptive performance standards • Set criteria at the appropriate level of thinking (Bloom’s taxonomy)
Create an Assessment Tool • Try out your assessment on student work and make appropriate modifications. • Share the tool with other faculty and get feedback.
Online Resources • Calibrated Peer Review: http://cpr.molsci.ucla.edu • ePortfolios: http://eportfolio.org, http://www.osportfolio.org • Hot Potatoes: http://hotpot.uvic.ca/ • Rubrics: http://rubistar.4teachers.org, http://landmark-project.com/rubric_builder/index.php, http://rubrics.coastline.edu, http://school.discoveryeducation.com/schrockguide/assess.html
Online Resources • Internet Resources for Higher Education Outcomes Assessment: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
CLOSING THE LOOP: The Student Learning Outcomes Assessment Cycle
The Assessment Cycle (SLOAC) – Closing the Assessment Loop • Develop, modify, or review a curriculum, course, program, or service. • Develop Student Learning Outcomes (SLOs). • Design and measure student learning as a result of the curriculum, course, or program. • Collect, discuss, and analyze data. • Determine refinements based on data, then begin the cycle again.
Reporting the SLOAC • Goal – to assess every course and program in your discipline within the five-year program review cycle. • Annual reporting and Program Review reporting. • Report includes: • SLO and assessment methods used • Assessment Results • How results were used for improvement of the course or program
The Paper Trail • Course and Program SLO forms need a Department Approval form. • Submit to Erline Ewing in Academic Affairs (for VCCC approval). • Other areas submit to area coordinator: • Student Services – Walter Jones • Administrative Services – Brick Durley • President’s Office – Cherine Trombley