Using Evidence to Complete the Assessment Cycle
UK Office of Assessment
The LEARNING Initiative, Office of Undergraduate Education
[Cycle diagram] Instruments (assessment; translation from qualitative to quantitative) and benchmarks → Data collection → Validation process → Preliminary data analysis → Dissemination of results and formulation of improvement action plans → back to instruments
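One step in the cycle, the translation from qualitative to quantitative, can be made concrete with a minimal sketch: mapping rubric ratings onto a numeric scale so they can be summarized and compared against a benchmark. The four-level rubric, the sample ratings, and the benchmark below are hypothetical illustrations, not UK's actual instruments.

    # Minimal sketch: translating qualitative rubric ratings to numbers
    # so they can be summarized against a benchmark.
    # Rubric levels, ratings, and benchmark are all hypothetical.

    RUBRIC_SCALE = {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4}

    ratings = ["proficient", "developing", "exemplary", "proficient", "beginning"]

    scores = [RUBRIC_SCALE[r] for r in ratings]   # qual -> quant
    mean_score = sum(scores) / len(scores)

    BENCHMARK = 3.0  # hypothetical target: average at least "proficient"
    print(f"Mean rubric score: {mean_score:.2f} (benchmark: {BENCHMARK})")
    print("Benchmark met" if mean_score >= BENCHMARK else "Benchmark not met")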
Warehousing Data is not the Goal
• Assessment results must be used to improve learning
• A spiraling process that involves:
  • Articulating outcomes and selecting benchmarks
  • Gathering data
  • Turning data into evidence
  • Identifying areas that need improvement
  • Planning improvement actions
  • Implementing improvement action plans
  • Assessing the effectiveness of improvements
  • Improving the assessment and improvement process(es) as well as learning … and so on
Preliminary Analysis
• Consider how the "story" the data is telling addresses the questions that matter to you
• You may already have questions about whatever is being measured
  • Ex: Are UK undergraduates achieving college-level math competency through the Gen Ed program? (see the sketch after this slide)
  • Ex: Are UK undergraduates achieving college-level information literacy as a result of Library efforts?
• If you are just starting and don't know what to ask, focus on the three basic questions:
  • Who are our students?
  • What are our students learning?
  • How effective is our program/unit/department?
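As a concrete illustration of the first example question, here is a minimal sketch of what a preliminary look at the data might involve: computing the share of students who met a competency benchmark. The student records and the 70-point cut score are hypothetical, not actual UK or Gen Ed data.

    # Minimal sketch of a preliminary analysis: what share of students
    # met a college-level math competency benchmark?
    # All records and the cut score are hypothetical illustrations.

    students = [
        {"id": 1, "math_score": 82},
        {"id": 2, "math_score": 64},
        {"id": 3, "math_score": 75},
        {"id": 4, "math_score": 91},
        {"id": 5, "math_score": 58},
    ]

    CUT_SCORE = 70  # hypothetical competency threshold
    met = [s for s in students if s["math_score"] >= CUT_SCORE]
    pct = 100 * len(met) / len(students)
    print(f"{pct:.0f}% of students met the math competency benchmark")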
Begin with the Basic Questions
• Who are our students?
  • How engaged are they? How do they perceive the program? What and how much do they think they're learning? What do they value most and least about the program?
• What are our students learning?
  • Direct evidence of learning in the program
• How effective are we as a department, college, or unit?
  • Enrollment numbers, usage numbers, retention/graduation rates, placement rates, employer perspectives on the program
Analyzing Data
• Look for gaps, threads, and patterns in the evidence
  • Performance
  • Satisfaction
  • Enrollment
  • Retention/Persistence
  • Perception
• Ask how the gaps or patterns address your questions about the program
• Look at the whole data "picture" (see the sketch below)
  • One data set out of context can mean very little, and can lead you to a wrong conclusion
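To show why one data set out of context can mislead, here is a minimal sketch with invented numbers: mean performance looks flat year over year, but pairing it with enrollment and retention reframes the story.

    # Minimal sketch: one measure in isolation vs. the whole "picture".
    # All figures are invented for illustration.

    by_year = {
        2008: {"mean_score": 3.1, "enrollment": 120, "retention": 0.88},
        2009: {"mean_score": 3.1, "enrollment": 180, "retention": 0.81},
        2010: {"mean_score": 3.0, "enrollment": 240, "retention": 0.74},
    }

    for year, d in sorted(by_year.items()):
        print(f"{year}: mean score {d['mean_score']:.1f}, "
              f"enrollment {d['enrollment']}, retention {d['retention']:.0%}")

    # Read alone, the scores suggest "no change"; alongside doubling
    # enrollment and falling retention, they raise a different question.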
Audience Awareness
• Who are the audiences for assessment results, and what kind of results does each need?
• For what main purpose will each audience use assessment results?
What Kind of Improvements Can Be Made?
• Better student learning experiences
• Increased student learning
• Increased institutional efficiency of the program/unit/department
• Increased institutional effectiveness of the program/unit/department
One More Thing …
• Please fill out the Workshop Evaluation Form in your folder
Thanks!