Refining Your Assessment Plan: Session 4 Use of the Results Ryan Smith and Derek Herrmann University Assessment Services
Refining Your Assessment Plan: Session 4 Introductions
Introductions • Ryan Smith, Director, UAS • Derek Herrmann, Coordinator, UAS • Assessment Advisory Council (AAC) • Assessment Academy Team
Introductions • Who you are • In which Department/School you work • Why you are here
Refining Your Assessment Plan: Session 4 Overview of PRAAP
Overview of PRAAP • Process for the Review of Academic Assessment Plans • Two years before Program Review and one year before Self-Study • AAC members and UAS staff • Use an established rubric
Refining Your Assessment Plan: Session 4 Use of the Results
Use of the Results • Who is involved? • Chairperson/Director (or Assistant/Associate) • Program Coordinator • Faculty • Individuals • Committees
Use of the Results • Where is it done? • Department/School meetings • Retreats (general or assessment) • Informal conversations
Use of the Results • Examining the data • Rubrics • A scoring guide: a list or chart that describes the criteria used to evaluate (or grade) completed student assignments • Using a rubric to evaluate (or grade) assignments… • Makes your life easier • Improves student learning
Use of the Results • Holistic rubric: short narrative descriptions of the characteristics of different levels of work
Use of the Results • Checklist rubric: simple list indicating the presence of the things you’re looking for in an assignment
Use of the Results • Rating scale rubric: checklist with a rating scale added to show the degree to which the things you’re looking for are present in an assignment
Use of the Results • Descriptive rubric: replaces the checkboxes of rating scale rubrics with brief descriptions of the performances that merit each possible rating
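To make the rating scale idea concrete, here is a minimal Python sketch of one way a rating scale rubric could be represented and scored. The criteria, scale labels, and ratings are hypothetical placeholders, not an established UAS rubric.

```python
# Hypothetical rating scale rubric: each criterion is rated on the same
# numeric scale, and an assignment's score is the mean of its ratings.
# Criteria and scale labels are placeholders, not a real UAS rubric.
CRITERIA = ["Thesis", "Evidence", "Organization", "Mechanics"]
SCALE = {1: "Beginning", 2: "Developing", 3: "Proficient", 4: "Exemplary"}

def score_assignment(ratings: dict[str, int]) -> float:
    """Return the mean rating across all rubric criteria."""
    missing = set(CRITERIA) - set(ratings)
    if missing:
        raise ValueError(f"Unrated criteria: {sorted(missing)}")
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# One student's (hypothetical) ratings on each criterion.
ratings = {"Thesis": 3, "Evidence": 2, "Organization": 4, "Mechanics": 3}
print(score_assignment(ratings))  # 3.0
```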
Use of the Results • Examining the data • Content analysis • Involves making sense of the material being reviewed • Summarize common themes and the extent of consensus concerning those themes • Should be approached with an open mind and a willingness to “hear” what respondents are saying
Use of the Results • Examining the data • Content analysis • Should begin with a clear understanding of the goals associated with the review • Sometimes coding categories are predetermined; sometimes they emerge as materials are reviewed
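As one illustration of summarizing coded material, the sketch below tallies how often each coding category appears across a set of responses. The categories and responses are hypothetical, and the coding judgments themselves would still come from human reviewers.

```python
# Minimal sketch of tallying coded open-ended responses.
# Codes and responses are hypothetical examples.
from collections import Counter

# Each response has already been assigned one or more coding categories.
coded_responses = [
    ["advising", "curriculum"],
    ["advising"],
    ["facilities", "curriculum"],
    ["advising", "facilities"],
]

# Count how often each theme appears to gauge the extent of consensus.
theme_counts = Counter(code for codes in coded_responses for code in codes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {len(coded_responses)} responses")
```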
Use of the Results • Inter-rater reliability • Indicates the extent of agreement among different reviewers • Generally want to verify that scores based on subjective judgments are reliable (consistent)
Use of the Results • Inter-rater reliability • Two approaches • Have at least two people independently review data, compare notes, and come to agreement • Have reviewers work together on all decisions, collaborating as they go to resolve discrepancies
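For the first approach, reviewers often quantify their agreement before reconciling scores. A minimal sketch, assuming two raters independently scored the same artifacts on a numeric rubric (the scores are hypothetical): it computes simple percent agreement and Cohen's kappa, a standard chance-corrected agreement statistic.

```python
# Minimal sketch of inter-rater reliability for two raters who
# independently scored the same artifacts (scores are hypothetical).
from collections import Counter

rater_a = [3, 2, 4, 3, 1, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 4, 2, 2]
n = len(rater_a)

# Percent agreement: share of artifacts the raters scored identically.
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Cohen's kappa corrects for agreement expected by chance, using each
# rater's marginal score distribution.
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
p_expected = sum(
    (counts_a[s] / n) * (counts_b[s] / n)
    for s in set(rater_a) | set(rater_b)
)
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"Percent agreement: {p_observed:.2f}")  # 0.75
print(f"Cohen's kappa:     {kappa:.2f}")       # 0.65
```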
Use of the Results • Establishing standards • Programs generally set their own standards • Standards should take the campus and program missions into account
Use of the Results • Establishing standards • Sometimes faculty want to know how their students compare to students on other campuses • Expectations can then be set based on students' relative performance
Use of the Results • Summarizing the results • Frequency table: a visual depiction of data that shows how often each value occurred • Measures of central tendency • Mean: arithmetic average • Median: 50th percentile • Mode: most common score
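A minimal Python sketch of these summaries, using a hypothetical set of rubric scores: it builds a frequency table and reports the mean, median, and mode.

```python
# Summarize a hypothetical set of rubric scores with a frequency table
# and the three measures of central tendency.
from collections import Counter
from statistics import mean, median, mode

scores = [2, 3, 3, 4, 3, 2, 4, 4, 3, 1]

# Frequency table: how often each score occurred.
for value, count in sorted(Counter(scores).items()):
    print(f"Score {value}: {count}")

print(f"Mean:   {mean(scores):.2f}")  # arithmetic average (2.90)
print(f"Median: {median(scores)}")    # 50th percentile (3)
print(f"Mode:   {mode(scores)}")      # most common score (3)
```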
Use of the Results • Summarizing the results • Qualitative summary: “The majority of respondents indicated that they hold positive attitudes toward their degree program.”
Use of the Results • Sharing the results • Be open, honest, balanced, and fair • Understand your audiences and their needs • Help your audiences see the big picture
Use of the Results • Sharing the results • Celebrate good results • Address areas needing improvement • Goals and/or outcomes? • Direct and/or indirect evidence? • Curriculum and/or instruction? • Incorporate results into planning and decision-making processes
Refining Your Assessment Plan: Session 4 Final thoughts
Final Thoughts • Goals and learning outcomes define how students demonstrate their mastery of what faculty want them to learn • A cohesive curriculum is aligned with goals and learning outcomes
Final Thoughts • Assessment involves collecting direct and indirect evidence concerning student development • Embedding assessment in the curriculum has many advantages • Assessment should not focus on individual students or faculty
Final Thoughts • A recurring theme in assessment is collaboration • Three major criteria apply to assessment: it should be meaningful, manageable, and sustainable
Refining Your Assessment Plan: Session 4 Questions, comments, and discussion