Program development process at Queen's University to demonstrate graduate attributes
Brian Frank, Director (Program Development)
Faculty of Engineering and Applied Science, Queen's University
Focus
“The institution must demonstrate that the graduates of a program possess the attributes under the following headings... There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.”
CEAB Instructions
Describe the processes that are being or are planned to be used. This must include:
• (a) a set of indicators that describe specific abilities expected of students to demonstrate each attribute
• (b) where attributes are developed and assessed within the program…
• (c) how the indicators were or will be assessed. This could be based on assessment tools that include, but are not limited to, reports, oral presentations, …
• (d) evaluation of the data collected, including analysis of student performance relative to program expectations
• (e) discussion of how the results will be used to further develop the program
• a description of the ongoing process used by the program to assess and develop the program as described in (a)-(e) above
Engineering Graduate Attribute Development (EGAD) Project
Approach
• Short term objectives (2010-2011):
  • Set up a comprehensive process limited to a small number of courses to help programs understand the process
  • Use data to help faculty see value in outcomes assessment for program improvement
• Long term:
  • Comprehensive assessment of all attributes throughout programs
  • Evaluate validity of data
  • Students take responsibility for demonstrating some attributes
Queen's University timeline
• Summer 2009: Working groups of faculty, students, and topical experts created specific program-wide indicators (next slide, and in Appendix 3.1A)
• Summer 2009: Set up a learning management system (Moodle) to manage assessments
• Sept 2009-April 2010: Piloted assessment in first year
• Sept 2010-April 2011: Piloted assessment in first year, faculty-wide second year, and fourth year (common across programs)
• April-July 2011: Student surveys and focus groups, curriculum mapping, data analysis
Curriculum planning happening throughout
Why initial emphasis on first year?
• First year is faculty-delivered and core to all students
• Provides an opportunity to pilot a process
• Helps disseminate outcomes assessment procedures to other instructors
• Long term: the assessment process continues in the first-year program to inform development
Aside: Idealistic course development process
[Cycle diagram: identify course objectives and content; create specific outcomes for each class; map to experiences (lectures, projects, labs, etc.); identify appropriate tools to assess (reports, simulation, tests, ...); create and execute a plan; deliver, grade, seek feedback; analyze and evaluate data; overall improvement; with student input feeding the cycle]
Engineering Graduate Attribute Development (EGAD) Project
Program-wide assessment process flow
[EGAD assessment cycle diagram: defining purpose and outcomes; program mapping; identifying and collecting data; analysis and interpretation; program & course improvement; create a program improvement plan; with stakeholder input throughout]
Engineering Graduate Attribute Development (EGAD) Project
Human capital
• Director, Program Development to manage the process
• Faculty member from each program
• Other experts as appropriate (economics, information management, etc.)
Currently separate from the faculty-wide curriculum development committee
Resources/time commitment
• Creating assessment criteria: 7 committees of approximately 5 people who each met about 4 times
• Mapping criteria to a course and creating rubrics for assessment: ~10 hours
• Large-scale curricular changes: ~10-person committee, most of whom had 1 course relief bought out by the dean
• Coordination (resource gathering, planning, curricular planning): ~30% of a position
Academic and curricular structure
[Organizational chart: Dean; Associate Dean (Academic); Faculty-wide curriculum committee; Dean’s Retreat Curriculum Review Committee (DRCRC); Director (Program Development); Graduate attribute assessment committee; NSERC Design Chair; DuPont Canada Chair in Engineering Education]
What are indicators?
Lifelong learning: An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge
Can this be directly measured? Would multiple assessors be consistent? How meaningful would the assessment be?
Probably not, so more specific measurable indicators are needed. This allows the program to decide what is important.
Engineering Graduate Attribute Development (EGAD) Project
Indicators: examples
Graduate attribute: Lifelong learning (an ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge)
Indicators (the student):
• Critically evaluates information for authority, currency, and objectivity when referencing literature
• Identifies gaps in knowledge and develops a plan to address them
• Uses information ethically and legally to accomplish a specific purpose
• Describes the types of literature of their field and how it is produced
Engineering Graduate Attribute Development (EGAD) Project
Establishing Indicators
• A well-written indicator includes:
  • what students will do
  • the level of complexity at which they will do it
  • the conditions under which the learning will be demonstrated
Example: "Critically evaluates information for authority, currency, and objectivity in reports." Here "critically evaluates" is the level of expectation ("describes", "compares", "applies", "creates", etc.), "information for authority, currency, and objectivity" is the content area, and "in reports" is the context.
Engineering Graduate Attribute Development (EGAD) Project
Assessment criteria
[Table excerpt showing each graduate attribute, its linkage to the OCAV UDLEs, levels, and categories]
Engineering Graduate Attribute Development (EGAD) Project
Rubric example
• Creating defined levels ("scales") of expectations reduces variability between graders and makes expectations clear to students
[Rubric image with "threshold" and "target" levels marked]
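For concreteness, one way such a rubric could be stored alongside an indicator is sketched below. This is not the actual Queen's rubric: the indicator name, the four levels, and the descriptors are invented placeholders, with level 2 standing in for "threshold" and level 3 for "target".

```python
# Minimal sketch only: NOT the actual Queen's rubric.
# Indicator name, level numbers, and descriptors are invented placeholders;
# level 2 stands in for "threshold" and level 3 for "target".
RUBRIC = {
    "critically_evaluates_information": {
        1: "Below threshold: accepts sources without questioning them",
        2: "Threshold: checks authority and currency of most sources",
        3: "Target: evaluates authority, currency, and objectivity consistently",
        4: "Exceeds target: weighs conflicting sources and justifies selections",
    }
}

def meets_threshold(score: int, threshold: int = 2) -> bool:
    """Return True if a rubric score is at or above the threshold level."""
    return score >= threshold

# Example: a grader assigns level 3 on this indicator for one student.
print(meets_threshold(3))  # True
```

Publishing the level descriptors is what makes expectations explicit to students; the numeric levels are what feed the histograms discussed later in the deck.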
Sample first-year indicators for Problem Analysis and Design
Engineering Graduate Attribute Development (EGAD) Project
Sample fourth-year indicators for Problem Analysis and Design
Engineering Graduate Attribute Development (EGAD) Project
Program-wide assessment process flow
[EGAD assessment cycle diagram, repeated from above]
Engineering Graduate Attribute Development (EGAD) Project
Student surveys and focus groups
• Provides student input on:
  • implementing attribute assessment in the program
  • perceptions of where attributes are developed within the program, as a complement to curriculum mapping via a faculty survey
  • perceived importance of attributes within the program
Questions
• What do you think are priorities within the program?
• What courses contribute to development of attribute {}?
• Which attributes are difficult to demonstrate?
• How would you recommend that attributes be developed?
Self-reported demonstration at program entry
• Top five graduate attributes where students reported a rating of 2 or 3 ("yes" or "to a great degree") out of 3:
  • Individual and Team Work: 88.73%
  • Communication Skills: 78.17%
  • Professionalism: 69.02%
  • Problem Analysis: 61.26%
  • Investigation: 60.56%
Potential for students to perceive little value in learning activities directed toward developing these attributes
First year program supports: attributes in students' top five responses
• Individual and Team Work*: 94.97%
• Knowledge Base in Engineering: 93.53%
• Problem Analysis*: 93.53%
• Professionalism*: 85.58%
• Investigation*: 82.48%
• Design: 80.58%
• Impact of Engineering on Society: 80.58%
*Identified as a strength coming into the program
First year program supports: bottom three responses
• Ethics and Equity: 64.03%
• Economics and Project Management: 69.56%
• Lifelong Learning: 73.19%
These three are a significant focus in APSC-100, embedded in various activities.
Focus group suggestions
• Communicate graduate attributes and draw attention back to them
• "What is lifelong learning?"
• Professionalism and ethics and equity should be focused on in upper years
Program-wide assessment process flow
[EGAD assessment cycle diagram, repeated from above]
Engineering Graduate Attribute Development (EGAD) Project
Analyze and evaluate…
• Histogram of results by level (did or did not meet expectations)
• Histogram of results by student (how many indicators did each student fall below?)
• Trend over time
• Triangulation: examination of correlation between results on multiple assessments of the same indicator (e.g. rubric data with exam results)
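As a rough sketch of what these analyses can look like on data exported from the learning management system, the snippet below computes the two histograms and a triangulation correlation. The records, the four-level scale, and the exam marks are all invented for illustration.

```python
# Illustrative sketch only: records, scale, and exam marks are invented.
from collections import Counter
from statistics import correlation  # Python 3.10+

# (student, indicator, rubric level) records, as might be exported from the LMS
records = [
    ("s1", "model_creation", 3), ("s1", "tool_selection", 2),
    ("s2", "model_creation", 1), ("s2", "tool_selection", 2),
    ("s3", "model_creation", 4), ("s3", "tool_selection", 3),
]
THRESHOLD = 2  # minimum level that counts as meeting expectations

# Histogram of results by level for one indicator
by_level = Counter(level for _, ind, level in records if ind == "model_creation")
print(dict(by_level))  # e.g. {3: 1, 1: 1, 4: 1}

# Histogram by student: how many indicators each student fell below threshold on
below = Counter(s for s, _, level in records if level < THRESHOLD)
print(dict(below))  # e.g. {'s2': 1}

# Triangulation: correlation between rubric results and exam marks
# for the same indicator (exam marks are made up here)
rubric_scores = [3, 1, 4]
exam_marks = [78, 55, 90]
print(correlation(rubric_scores, exam_marks))
```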
Knowledge base: Mathematics
Calculus instructor asked questions on the exam that specifically targeted 3 indicators for "Knowledge":
• "Create mathematical descriptions or expressions to model a real-world problem"
• "Select and describe appropriate tools to solve mathematical problems that arise from modeling a real-world problem"
• "Use solutions to mathematical problems to inform the real-world problem that gave rise to it"
Indicator 1:
• The student can create and/or select mathematical descriptions or expressions for simple real-world problems involving rates of change and processes of accumulation (overlaps problem analysis)
Context: calculating the intersection of two trajectories
Indicator 2:
• Students can select and describe appropriate tools to solve the mathematical problems that arise from this analysis
Context: differentiation similar to the high school curriculum
Indicator 2:
• Students can select and describe appropriate tools to solve the mathematical problems that arise from this analysis
Context: implicit differentiation, inverse trig
Program-wide assessment process flow
[EGAD assessment cycle diagram, repeated from above]
Engineering Graduate Attribute Development (EGAD) Project
Graduating year
• Starting point: histograms
• Very few students fall below the threshold level in capstone courses for most indicators
Area for improvement in graduating year: technical literature
Data evaluation
• Across multiple capstone courses, students scored lower on indicators involving:
  • Evaluating validity of results
  • Evaluating techniques and tools
  • Evaluating effectiveness of results
  • Evaluating information
• Pattern: evaluation
Curriculum Mapping: CurriKit
• Curriculum mapping software developed at the University of Guelph
• Provides information to identify:
  • the courses which develop each graduate attribute
  • what assessment is done and when
  • which instructional approaches are used
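The sketch below is not CurriKit or its interface; it only illustrates, with placeholder course codes and entries, the kind of course-to-attribute map such a tool produces and the queries a program can then run against it.

```python
# Illustrative data structure only; not CurriKit. Course codes, attribute
# levels, and assessment types are placeholders.
curriculum_map = {
    "APSC-100": {"attributes": {"Lifelong Learning": "Introduced",
                                "Design": "Developed"},
                 "assessments": ["project report", "oral presentation"]},
    "APSC-200": {"attributes": {"Design": "Developed",
                                "Problem Analysis": "Developed"},
                 "assessments": ["design project", "exam"]},
}

def courses_developing(attribute: str) -> list[str]:
    """List the courses whose map entry includes the given attribute."""
    return [course for course, info in curriculum_map.items()
            if attribute in info["attributes"]]

print(courses_developing("Design"))  # ['APSC-100', 'APSC-200']
```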
Program-wide assessment process flow
[EGAD assessment cycle diagram, repeated from above]
Engineering Graduate Attribute Development (EGAD) Project