Developing Assessment Plans Office of Assessment
The LEARNING Initiative Office of Undergraduate Education
The Basic Elements of an Assessment Plan
• Mission statement
• Outcomes that support student learning
• Mapping document: maps student learning outcomes to the activities or learning experiences that support them
• Explanation of who is responsible for assessment
• Description of unit assessment methods and procedures
• Clear articulation of the assessment cycle
SACS Requirements & Expectations for Evidence of Student Learning
• Evidence that supports compliance must be:
  • Reliable
  • Current
  • Verifiable
  • Coherent
  • Objective
  • Relevant
  • Representative
• Evidence should also:
  • Entail interpretation and reflection
  • Represent a combination of trend and snapshot data
  • Draw from multiple indicators
General Definitions of Evidence
• Information that tells you something directly or indirectly about the topic of interest
• Evidence is neutral: neither "good" nor "bad"
• Evidence requires context to be meaningful
• Two types of assessment evidence:
  • Indirect methods measure a proxy for student learning
  • Direct methods measure actual student learning
• "Learning" = what students know (content knowledge) + what they can do with what they know (performance)
Direct Evidence
• Should be the core evidence of learning in all programs/units
• Students show achievement of learning goals through demonstrated knowledge and skills:
  • Scores and pass rates on licensure/certificate exams
  • Individual research projects, presentations, and performances (i.e., substantial course assignments that require performance of learning)
  • Collaborative (group) projects/papers that tackle complex problems
  • Theses and dissertations
  • Ratings of skills provided by internship/clinical supervisors
  • Portfolios
Indirect Evidence
• Provides information on the context for learning in the program
• Indirect methods measure proxies for learning: data from which inferences about learning can be made but that do not demonstrate actual learning, such as perception or comparison data
• Surveys
  • Student opinion/engagement surveys
  • Student ratings of knowledge and skills
  • Employer and alumni surveys, national and local
• Focus groups/exit interviews
• Institutional performance indicators
  • Enrollment data
  • Retention rates, placement data
Developing Measures of Effectiveness
• Intentional planning:
  • Determine assessment points: what activities and/or learning experiences are logical places to collect data?
    • Program/unit entry, mid-point (if possible), and exit
  • Identify assessable activities and/or learning experiences that measure the stated student learning outcomes
Developing Measures of Effectiveness (Continued)
• Design performance-based assignments/instruments when possible
  • Ex: written work, oral presentations, assignments using open-ended questions …
• Identify opportunities for processing results and formulating improvement action plans
Tips for Creating a Practical Assessment Cycle
• The cycle must be long enough to assess every outcome
• The cycle ought to be short enough that students can enter UK, be assessed early in their educational experience, and still enjoy the benefit of the assessments they participated in
  • For example, a 3-year cycle for a program/unit with 5 learning outcomes
• Don't try to assess everything all the time
  • Plan on assessing 1-2 outcomes per year
• Don't try to assess all students
  • Plan to use a representative sample
  • If you're going to disaggregate by level, the sample must be stratified (see the sketch after this list)
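To make the stratification point concrete, here is a minimal sketch in Python: it draws the same fraction of students from every level so that results can later be disaggregated by level. Everything in it is a hypothetical illustration (the record format, the function name, the 20% fraction); it is not part of the Assessment Plan Template or any official procedure.

```python
# Minimal sketch of a stratified sample, assuming each student record
# is a hypothetical (student_id, level) pair.
import random
from collections import defaultdict

def stratified_sample(students, fraction, seed=None):
    """Sample the same `fraction` of students within each level (stratum)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for student_id, level in students:
        strata[level].append(student_id)
    # Sample within each stratum; keep at least one student per level.
    return {level: rng.sample(ids, max(1, round(len(ids) * fraction)))
            for level, ids in strata.items()}

# Example: a 20% sample at each level.
students = [("s1", "freshman"), ("s2", "freshman"), ("s3", "junior"),
            ("s4", "senior"), ("s5", "senior")]
print(stratified_sample(students, 0.20, seed=1))
```

A simple random sample of all students could, by chance, miss an entire level; sampling within each level guarantees every level is represented in the results.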
Activity
• Use the Assessment Plan Template, in conjunction with the Assessment Plan Review Checklist and the program/unit information you brought with you, to begin drafting an Assessment Plan for your program or unit
One More Thing …
• Please fill out the Workshop Evaluation Form

Thanks!