Annual Learning Outcomes Assessment Reporting
Joe Slowensky
Vice Chancellor for Institutional Effectiveness and Faculty Affairs, ALO
Chapman University
Why must we do this? • Accountability in Higher Ed • Federal Government / Dept. of Education • WASC – one of six regional accrediting agencies • Gatekeepers to accreditation and FINANCIAL AID $$$
Assessment • Ongoing process designed to monitor and improve student learning (Allen, 2004)
Steps in Assessment Planning: Process, Performance & Progress • Develop student learning outcomes (SLOs) • Align curriculum with SLOs (curriculum map) • Identify sources of evidence and create a sustainable plan for evaluation • Collect and analyze the data; were targets met? • Use results to make improvements • See if it worked (closing the loop)
I. PROCESS • Outcome Description • Where is the outcome published for students? • Evidence of Learning • Assessing the Evidence • Expected Level of Achievement
I. PROCESS: Learning Outcome Statements • Describe what students are expected to learn or be able to do. • Are student-centered, not input- or teaching-centered. Why don't grades count? Or the number of degrees awarded?
Evidence of Learning: Identifying Sources • DIRECT • concrete artifacts of student work that directly demonstrate learning or achievement • INDIRECT • opinion-based evidence; used when you must rely on indicators rather than direct demonstrations of learning
Direct • Authentic projects (real world) • ETS exams • Capstone or thesis projects • Portfolios • Course-embedded assignments • Signature assignments • Performances, exhibitions
Indirect • Student or alumni surveys • Focus groups • Interviews • Reflective essays
Best Practice = Use Multiple Measures (Direct + Indirect) • Triangulate data to draw conclusions
Assessing the Evidence • Rubrics • Standardized tests • Portfolios • Sample sizes • Who did the assessing?
Expected Level of Achievement • Should be discussed and defined by faculty • Should be specific and concrete • Avoid vague targets such as, “We want the majority of our students to be satisfactory...” • Should be based on rubric data • May be presented as a numerical average, a percentage, a range, a descriptor, etc. • Data aggregated and disaggregated as necessary • May feel intuitive or arbitrary at first • Targets should be reevaluated after each report
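A concrete target based on rubric data can be checked with a short script. The sketch below is purely illustrative: the 4-point rubric, the "80% of students at level 3 or higher" target, the section names, and all scores are hypothetical, not prescribed by any accreditor.

```python
# Hypothetical example: checking rubric scores against a concrete target.
# Assumes a 4-point rubric (1 = beginning ... 4 = advanced) and a target of
# "at least 80% of students score 3 or higher" -- both are illustrative.

scores = {  # rubric scores keyed by course section (made-up data)
    "Section A": [4, 3, 3, 2, 4, 3],
    "Section B": [2, 3, 2, 4, 3, 2],
}
TARGET_LEVEL = 3        # minimum rubric score counted as "met"
TARGET_PERCENT = 80.0   # percent of students expected at or above that level

def percent_meeting(values, level=TARGET_LEVEL):
    """Percentage of scores at or above the target rubric level."""
    return 100.0 * sum(s >= level for s in values) / len(values)

# Disaggregated: report each section separately.
for section, vals in scores.items():
    pct = percent_meeting(vals)
    status = "met" if pct >= TARGET_PERCENT else "not met"
    print(f"{section}: {pct:.0f}% at level {TARGET_LEVEL}+ ({status})")

# Aggregated: pool all sections for the program-level result.
all_scores = [s for vals in scores.values() for s in vals]
overall = percent_meeting(all_scores)
print(f"Overall: {overall:.0f}% (target {TARGET_PERCENT:.0f}%)")
```

Reporting both the disaggregated (per-section) and aggregated (program-level) results mirrors the "aggregated and disaggregated as necessary" guidance above.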
II. PERFORMANCE • Compare actual data on the current year’s student performance levels to the Expected Level of Achievement defined in I. Process • Provide some preliminary analysis • Describe your system for sharing the assessment results with faculty, and your plan for ensuring they analyze the data and make changes, if necessary
III. PROGRESS: Closing the Loop • How do you use what you find? • Demonstrate how you have improved student learning as a result of assessment. • Describe changes you have made based on the previous year’s assessment • Reference Expected Levels of Achievement and cite data from the previous year’s and current year’s performance levels to show the effects of those changes • Provide analysis of these results in the PROGRESS section of the annual report.
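The year-over-year comparison that the Progress section asks for can be sketched as a small script. All figures, the report-year labels, and the curricular change mentioned in the comments are hypothetical, included only to show the shape of the comparison.

```python
# Hypothetical "closing the loop" comparison: previous-year vs current-year
# results for one outcome, against the program's expected level of achievement.

EXPECTED = 80.0  # expected level: percent of students at rubric level 3+

results = {  # percent of students meeting the outcome, by report year
    "2021-22": 68.0,   # before the change (made-up figure)
    "2022-23": 81.0,   # after adding a signature assignment (made-up figure)
}

prev, curr = results["2021-22"], results["2022-23"]
change = curr - prev
print(f"Expected level: {EXPECTED:.0f}%")
print(f"Previous year: {prev:.0f}% | Current year: {curr:.0f}% "
      f"({change:+.0f} points)")
print("Target met" if curr >= EXPECTED else "Target not met")
```

Citing both years' data alongside the expected level, as the script does, is what lets the Progress narrative attribute the change to the improvement that was made.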