Engaging learning outcomes across a discipline and in institutions
Brian Frank, Queen's University
Learning outcomes are not new… …but closing the loop is: using evidence from learning outcomes to improve student learning and inform curriculum.
Learning outcomes are already widespread, e.g. in Ontario's college sector, in professional programs in Canada, and in accreditation requirements in the US.
Survey: only 6% of 146 profiles of good practice submitted contained evidence that student learning had improved (Banta & Blaich, 2011).
Baker, G. R., Jankowski, N. A., Provezis, S., & Kinzie, J. (2012). Using Assessment Results: Promising Practices of Institutions That Do It Well. Retrieved from http://www.tamiu.edu/adminis/iep/documents/NILOA-Promising-Practices-Report-July-2012.pdf
800 meta-analyses, 50,000+ studies, 200+ million students
Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research (pp. 259-275). Wellington, New Zealand: Ako Aotearoa.
Role of learning outcomes in delivery, by level:
• Course: learning outcomes drive assessment and learning & teaching activities
• Program: curriculum and assessment planning
• Faculty / Institution: university learning space planning; university-wide student services and academic support planning
• Potentially: competency-based credentials
Disciplinary, national
Institutional, provincial
Canadian Engineering Accreditation Board:
• 3.1: Demonstrate that graduates of a program possess 12 graduate attributes
• 3.2: Continual program improvement processes using the results of graduate attribute assessment
Engineering Graduate Attribute Development (EGAD) Project
WHO: Engineering educators and educational developers across Canada
MANDATE: Collect and develop resources and training; run annual national workshops and customized institutional workshops
EGAD Workshops • Introduction to Continuous Program Improvement Processes • Graduate Attribute Assessment as a Course Instructor • Creating Useful Learning Outcomes • What to Look for in an Outcomes-Based Process • Leading a program improvement process
Example process (a five-step continuous improvement cycle, centred on the question "What do you want to know about the program?"):
1. Program objectives and outcomes
2. Mapping curriculum and assessment planning
3. Collecting evidence
4. Analyze and interpret
5. Curriculum & process improvement
• Developing or adapting outcomes. Tool: Learning outcomes collection
• Aligning outcomes and curriculum. Tool: Curriculum map (see the sketch below)
• Aligning outcomes within a course. Tool: Course planning table
• Scoring performance. Tool: Rubrics
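To make the curriculum map tool concrete, here is a minimal sketch in Python, assuming a hypothetical program: the course codes, outcome names, and I/D/A (Introduced / Developed / Applied) levels are illustrative and not taken from the EGAD resources. The map is a courses-by-outcomes table that can be queried for coverage and gaps.

# Minimal sketch of a curriculum map (hypothetical courses, outcomes, and levels).
# Levels: "I" = Introduced, "D" = Developed, "A" = Applied.
curriculum_map = {
    "ENG 101": {"Problem analysis": "I", "Communication": "I"},
    "ENG 201": {"Problem analysis": "D", "Design": "I"},
    "ENG 301": {"Design": "D", "Communication": "D"},
    "ENG 401": {"Problem analysis": "A", "Design": "A", "Communication": "A"},
}

def coverage(outcome):
    """Courses that address an outcome, with the level at which they address it."""
    return {course: levels[outcome]
            for course, levels in curriculum_map.items()
            if outcome in levels}

def gaps(outcomes):
    """Outcomes never taken to the 'Applied' level anywhere in the program."""
    return [o for o in outcomes if "A" not in coverage(o).values()]

if __name__ == "__main__":
    print(coverage("Design"))   # where is 'Design' introduced, developed, applied?
    print(gaps(["Problem analysis", "Design", "Communication", "Teamwork"]))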
HEQCO learning outcomes consortium
Issue: no one has effectively closed the loop in Ontario.
Consortium goal: develop useful learning outcomes assessment techniques and support their wide-scale implementation in the member institutions.
Focus: generic learning outcomes and cognitive skills (critical thinking, communications, lifelong learning, etc.)
College sector
• Durham: is the Student Success ePortfolio effective for assessing Essential Employability Skills (EES)?
• George Brown: tools & rubrics to assess EES (communication and problem solving)
• Humber: a reliable instrument for reading, writing, critical thinking, and problem solving across the curriculum
University sector
• Guelph: process and tools for mapping & assessment of university-wide learning outcomes using VALUE rubrics
• Toronto: analytic rubrics for communications, application of knowledge, and teamwork in engineering
• Queen's: mixed-methods assessment of generic learning outcomes across four fields
Approaches to direct assessment of learning outcomes across a program:
• Course-specific criterion-referenced scoring using course deliverables
• Stand-alone standardized instruments
• General criterion-referenced scoring using course deliverables
Approaches to direct assessment of learning outcomes across a program:
• Course-specific criterion-referenced scoring using course deliverables
  – Provides clear guidance to students
  – Useful for course improvement
  – Limited ability to assess development over multiple years
• Stand-alone standardized instruments
• General criterion-referenced scoring using course deliverables
E.g. course-specific outcomes assessed using course deliverables
Approaches to direct assessment of learning outcomes across a program:
• Course-specific criterion-referenced scoring using course deliverables
• Stand-alone standardized instruments (CLA, etc.)
  – Measure development over multiple years and allow institutional comparison; validity & reliability data are available
  – Can be expensive, and measure a limited set of skills
  – Low completion rates and poor motivation, particularly among fourth-year students, make results suspect
• General criterion-referenced scoring using course deliverables
Approaches to direct assessment of learning outcomes across a program:
• Course-specific criterion-referenced scoring using course deliverables
• Stand-alone standardized instruments
• General criterion-referenced scoring using course deliverables (a brief aggregation sketch follows below)
  – Can assess development over multiple years
  – No additional student work, so no problems with motivation or completion rates
  – Encourages alignment between program outcomes, course outcomes, and course delivery
  – Requires some additional grading time
  – Limited availability of validated rubrics
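As a sketch of what general criterion-referenced scoring can feed into the analyze-and-interpret step, the snippet below aggregates rubric scores collected from course deliverables into the share of assessments at each performance level, by attribute and program year. The attribute names, rubric levels, and records are hypothetical, and the record format is only one possible choice, not a prescribed one.

# Minimal sketch: aggregate criterion-referenced rubric scores from course
# deliverables into a per-attribute, per-year distribution of performance levels.
# Attribute names, level labels, and records below are hypothetical.
from collections import Counter, defaultdict

LEVELS = ["Not demonstrated", "Marginal", "Meets expectations", "Exceeds expectations"]

# Each record: (attribute, program year, rubric level index 0-3).
records = [
    ("Communication", 1, 1), ("Communication", 1, 2), ("Communication", 1, 2),
    ("Communication", 4, 2), ("Communication", 4, 3), ("Communication", 4, 3),
    ("Problem analysis", 1, 0), ("Problem analysis", 4, 2),
]

def level_distribution(records):
    """Percentage of assessments at each rubric level, grouped by (attribute, year)."""
    groups = defaultdict(Counter)
    for attribute, year, level in records:
        groups[(attribute, year)][level] += 1
    report = {}
    for key, counts in groups.items():
        total = sum(counts.values())
        report[key] = {LEVELS[lvl]: round(100 * counts[lvl] / total) for lvl in sorted(counts)}
    return report

if __name__ == "__main__":
    for (attribute, year), dist in sorted(level_distribution(records).items()):
        print(f"{attribute}, year {year}: {dist}")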
Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics
• Meta-rubrics that synthesize the common criteria and performance levels gleaned from numerous individual campus rubrics for 14 Essential Learning Outcomes
• Can be used to mimic the approach taken by some critical thinking tests that allow programs to provide their own "artifact", which is scored against a common set of criteria
Rhodes, Terrel, ed. 2010. Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics. Washington, DC: Association of American Colleges and Universities.
Assessing development using VALUE rubrics
Greenhoot, A., & Bernstein, D. Using VALUE Rubrics to Evaluate Collaborative Course Design. Peer Review, vol. 13, no. 4, AAC&U.
Engaging learning outcomes across a discipline and in institutions
"If you don't know where you're going, you'll probably end up somewhere else."
Brian Frank, Queen's University
Other slides (used only if there are questions)
CEAB requirements include:
• Identified learning outcomes that describe specific abilities expected of students
• A mapping of where attributes are developed and assessed within the program
• A description of the assessment tools used to measure student performance (reports, exams, oral presentations, …)
• An evaluation of measured student performance relative to program expectations
• A description of the program improvements resulting from the process
Performance by student in a course
4 approaches to facilitating change, along two axes: individual vs. environmental focus, and prescribed vs. emergent outcomes.
Effective strategies: are aligned with or seek to change beliefs, use long-term interventions, treat the university as a complex system, and are honest about issues and problems.
Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. doi:10.1002/tea.20439
Software packages evaluated (want to merge into one tool!):
• Canvas
• Desire2Learn
• eLumen
• LiveText
• Moodle
• Waypoint Outcomes
(No response from Blackboard)
Assessment Analytics
Software summary
• Desire2Learn: the closest to a complete package for managing courses, learning outcomes, rubrics, and reporting; its analytics tool is at an early stage
• eLumen: outstanding at analysis, but poor integration with a general LMS
• Waypoint Outcomes / LiveText: outstanding at managing outcomes, rubrics, and feedback
Norm-referenced vs. criterion-referenced evaluation
• Norm-referenced evaluation: grades; used for large-scale evaluation to compare students against each other. Example feedback: "You are here! (67%)"
• Criterion-referenced evaluation: used to evaluate students against stated criteria. Example feedback: "The student has marginally met expectations because the submitted work mentions social, environmental, and legal factors in the design process, but there is no clear evidence that these factors impacted decision making."
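To make the contrast concrete, here is a minimal sketch in Python, assuming hypothetical cut-offs, descriptors, and cohort data (none of these numbers come from the presentation): the same raw score is reported once as a percentile against the cohort (norm-referenced) and once as a descriptor level against stated criteria (criterion-referenced).

# Minimal sketch with hypothetical thresholds, descriptors, and cohort data:
# the same raw score reported in a norm-referenced and a criterion-referenced way.
from bisect import bisect_right

def norm_referenced(score, cohort_scores):
    """Percentile rank: how the student compares with the rest of the cohort."""
    below = sum(1 for s in cohort_scores if s < score)
    return round(100 * below / len(cohort_scores))

THRESHOLDS = [50, 70, 85]  # hypothetical cut-offs between descriptor levels
DESCRIPTORS = [
    "Does not meet expectations",
    "Marginally meets expectations",
    "Meets expectations",
    "Exceeds expectations",
]

def criterion_referenced(score):
    """Map a score onto a descriptor level defined in advance by the program."""
    return DESCRIPTORS[bisect_right(THRESHOLDS, score)]

if __name__ == "__main__":
    cohort = [45, 55, 60, 67, 72, 78, 85, 90]
    student = 67
    print(f"Norm-referenced: {norm_referenced(student, cohort)}th percentile")
    print(f"Criterion-referenced: {criterion_referenced(student)}")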