Assessing Outcomes: How Can We Tell If Our Students and Programs Measure Up? Barbara Masi MIT School of Engineering June 5, 2009
Curriculum design and assessment Questions for faculty • How can engineering programs create a curriculum design that addresses both content and student outcomes (what students can do, not just what they know)? • Are subjects aligned with overall curriculum design? • Do we have ways of knowing what our students know and can do at graduation other than grades? Or, how can we use student grades in a manner that provides additional information on student learning? • How can programs meet the new engineering program accreditation requirements of CEAB?
Today • Designing and assessing programs using tailored competencies. • Designing and assessing programs using ABET or CEAB-oriented program learning outcomes. • Creating a program assessment plan.
STEPS FOR CREATING AN ASSESSMENT PLAN • Develop/revise program-level learning outcomes or competencies. • Align/realign the program curriculum and other important education processes (internships, etc.) with program learning outcomes or competencies. Check for sufficient coverage of all learning outcomes. • Develop/redevelop measurable performance criteria for each outcome. • Choose/revise assessment methods for measuring each performance criterion. • Determine a plan (who/how often) for review of assessment data. • Determine feedback channels for reporting recommendations for improvement. • Work with instructors to ensure that needed curriculum improvements are addressed in a timely manner.
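The steps above can be sketched as a simple data structure that a program could use to track its plan. This is a minimal illustration only; the outcome names, criteria, methods, and reviewer names are hypothetical, not from any actual program.

```python
# A minimal sketch of an assessment plan as a data structure. Every name
# here (outcomes, criteria, methods, reviewers) is a hypothetical example.
plan = {
    "communicate effectively": {
        "criteria": ["80% of seniors rate writing ability 5 or above (1-7 scale)"],
        "methods": ["senior survey", "capstone report scoring"],
        "review": {"who": "curriculum committee", "how_often": "yearly"},
    },
    "apply knowledge of mathematics": {
        "criteria": [],   # not yet developed
        "methods": [],    # not yet chosen
        "review": {"who": "curriculum committee", "how_often": "yearly"},
    },
}

def incomplete_outcomes(plan):
    """Outcomes still missing performance criteria or assessment methods."""
    return [o for o, v in plan.items()
            if not v["criteria"] or not v["methods"]]

print(incomplete_outcomes(plan))  # ['apply knowledge of mathematics']
```

A check like this makes the "coverage of all learning outcomes" step concrete: any outcome the function returns still needs criteria or methods before the plan is complete.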
Content-based versus outcome-based education • Content-based: Thermodynamics; Fluid dynamics; Combine systems; Measurement/instrumentation lab. • Outcome-based: At graduation, students will be able to: apply knowledge of engineering in problem solving and design; carry out the steps in the engineering design process; contribute effectively as a member of a multi-disciplinary technical team.
Outcome-based education • Also called standards-based or performance-based education. • Shifts focus from a content-driven curriculum to student performance on program learning outcomes. • Curriculum design is based on what a student can do at the point of graduation rather than what the student knows. • Fosters more authentic forms of assessment (e.g., an instructor would examine how a student completes a math problem to demonstrate math ability rather than just checking whether the student has given the right answer). • Encourages decision making regarding all aspects of the education experience.
ABET Engineering Criteria 2000 and CEAB 3.1 Graduate Attributes • Requires that programs demonstrate graduate achievement of required set of program learning outcomes (ABET’s Criterion 3(a-k), CEAB 3.1 Graduate Attributes 3.1.1-3.1.12). • Requires that programs show a continuous improvement process to review assessment data and improve educational program to address potential weak areas. • Similar model recently adopted by CEAB.
ABET Criterion 3 Program learning outcomes • an ability to apply knowledge of mathematics, science, engineering and technology • an ability to design and conduct experiments, as well as to analyze and interpret data • an ability to design a system, component, or process to meet desired needs • an ability to function on multi-disciplinary teams • an ability to identify, formulate, and solve engineering problems • an understanding of professional and ethical responsibility • an ability to communicate effectively • the broad education necessary to understand the impact of engineering solutions in a global and societal context • a recognition of the need for, and an ability to engage in life-long learning • a knowledge of contemporary issues • an ability to use the techniques, skills, and modern engineering tools necessary for engineering and technology practice
CEAB 3.1 Graduate Attributes • 3.1.1 A knowledge base for engineering: ABET 3a • 3.1.2 Problem analysis: ABET 3e • 3.1.3 Investigation: ABET 3b • 3.1.4 Design: ABET 3c • 3.1.5 Use of engineering tools: ABET 3k • 3.1.6 Individual and team work: ABET 3d • 3.1.7 Communication skills: ABET 3g • 3.1.8 Professionalism: ABET 3f • 3.1.9 Impact of engineering on society and the environment: ABET 3h • 3.1.10 Ethics and equity: ABET 3f • 3.1.11 Economics and project management: no match • 3.1.12 Life-long learning: ABET 3i
Outcomes, Abilities, Competencies • Outcomes are statements that describe what students are expected to know or be able to do by the time of graduation from a program. • ABET outcomes, as stated, are complex abilities. Abilities are complex combinations of competencies. • Competencies are the application of behavior and motivation to knowledge, understanding, and skill. • Key actions that demonstrate workplace competencies can be measured.
Competencies in engineering programs: a sampling • Technical and behavioral competencies • Technical competencies are predominantly about acquired knowledge and technical abilities and skills. • Behavioral competencies, such as communication skills or team member skills, can be harder to see and develop but are key indicators of how an individual approaches his/her work. • Technical (Engineering) Knowledge • 1. Thermodynamics • 2. Fluid Mechanics • Thinking Skills • 3. Analytical Thinking and Decision Making • 4. Project Planning & Organizing • Personal/Professional Effectiveness • 5. Communications Skills • 6. Professional Conduct • Team Skills • 7. Team Member Skills
Sample competency: analysis and decision making • Definition: Identifying/understanding issues, problems, and opportunities; comparing data from different sources to draw conclusions; using effective approaches for choosing a course of action or developing appropriate solutions; taking action consistent with available facts, constraints, and consequences. • Key Actions • ADM1. Identifies issues, problems, and opportunities. Recognizes issues, problems, or opportunities and determines whether action is needed. • ADM2. Gathers information. Identifies the need for and collects information to better understand issues, problems, and opportunities. • ADM4. Generates alternatives. Creates relevant options for addressing problems/opportunities and achieving desired outcomes. • ADM6. Chooses appropriate actions. Formulates clear decision criteria; evaluates options by considering implications; chooses an effective option. • ADM8. Values diverse inputs/perspectives. Embraces/values a diverse collection of inputs, values, perspectives, and thought paradigms in applying engineering and technology to products and processes.
Sample competency: oral communication in presentations • Definition: Clearly conveying information and ideas through a variety of media to individuals or groups in a manner that engages the audience and helps them understand and retain the message. • Key Actions • C1. Organizes the communication. Clarifies purpose and importance; stresses major points; follows a logical sequence. • C2. Maintains audience attention. Keeps the audience engaged through use of techniques such as analogies, illustrations, body language, and voice inflection. • C3. Adjusts to the audience. Frames the message in line with audience experience, background, and expectations; uses terms, examples, and analogies that are meaningful to the audience. • C5. Adheres to accepted conventions. Uses syntax, pace, volume, diction, and mechanics appropriate to the media being used. • C6. Comprehends communication from others. Attends to messages from others; correctly interprets messages and responds appropriately.
Why develop competencies for engineering programs? • A program can design competencies around specific knowledge and behaviors that “fit” an engineering discipline and graduate career paths. • Competencies do not address just the skills and knowledge of a content-based educational program; well-defined competencies can embody the integration of skills and knowledge needed to become part of the engineering profession in a discipline. • “Transparent” to all participants: a great match with employers’ goals and student and faculty understanding. • Provides a map and tools for students to achieve their goals in a meaningful way. • Easily measurable when assessing student performance in a program. • Can be aligned with engineering accreditation board required outcomes.
Competency educational model (figure): an entering student’s traits, characteristics, skills, abilities, and knowledge are developed through learning experiences into competencies; competencies are demonstrated in integrative learning experiences and measured through assessment of performance demonstrations. From Voorhees et al., U.S. Dept. of Education document, 2001.
Aligning competencies with engineering accreditation board outcomes
CDIO competencies and ABET outcome alignment • Conceive, Design, Implement, Operate • CDIO is also based on the concept of outcomes-based education. • The CDIO curriculum takes an additional step in supporting achievement of CDIO learning outcomes by precisely setting 12 curriculum standards: • CDIO context- sets the scene for curricular goals. The essence of engineering practice is conceiving, designing, implementing, and operating complex value-added engineering products and systems. • CDIO syllabus outcomes- detailed outcomes for personal, interpersonal, and product/system building skills, consistent with program goals and validated by stakeholders. • Integrated curriculum with an explicit plan to achieve CDIO outcomes. • See handout of alignment of CDIO and required ABET learning outcomes.
CDIO Syllabus- There are several sublevels for each competency
Using “CEAB-style” program learning outcomes • While competency development can be very effective, a program could start this process from “CEAB-style” program learning outcomes. • These outcomes can be tailored to a specific program discipline, department educational themes (engineering science versus practice orientation) and graduate career paths. • For each high level program learning outcome, a program can then develop specific sub-levels of learning outcomes that describe graduate performance. • Such sub-levels of learning outcomes must be concrete statements of performance that can be assessed. • University of Pittsburgh EC 2000 Attributes Project has excellent list for programs to use in this process. • Uses B. Bloom’s Taxonomy of Educational Objectives and D. Krathwohl’s Taxonomy of Educational Objectives to develop levels of performance.
Bloom’s and Krathwohl’s Taxonomy of Educational Objectives (action verbs!) • Knowledge- remember previous information (define, describe, order) • Comprehension- grasp meaning (explain) • Application- apply to situations (compute) • Analysis- break down into simpler parts, see how parts are organized (diagram, experiment) • Synthesis- rearrange component ideas into new whole (construct, design) • Evaluation- make judgments based on internal evidence or external criteria (appraise, argue, choose) • Valuation- organize ideas when values conflict (challenge, defend)
University of Pittsburgh EC2000 Attributes Project: Sample Ability to apply knowledge of mathematics.
Competencies and curriculum design matrix • Developing a matrix that clearly shows coverage of behavioral competencies (not knowledge) in the curriculum is useful. • Determine a curriculum design that logically develops student competencies across a series of subjects. • The matrix shows the level at which a competency is addressed in a subject: T1 = primary coverage in the subject, T2 = secondary coverage in the subject. • The matrix also shows whether the competency was introduced (I), reinforced (R), or used at an advanced level (Adv).
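The matrix described above can be represented as a small table keyed by subject and competency; a program can then check coverage mechanically. The subject and competency names below are hypothetical placeholders.

```python
# Hypothetical coverage matrix: each cell records (level, stage), where
# level is T1 (primary) or T2 (secondary), and stage is I/R/Adv.
matrix = {
    "Intro to Engineering Design": {"Communication": ("T2", "I"),
                                    "Team Skills": ("T1", "I")},
    "Capstone Design": {"Communication": ("T1", "Adv"),
                        "Team Skills": ("T1", "Adv")},
}

def primary_coverage(matrix, competency):
    """List the subjects that give a competency primary (T1) coverage."""
    return [subject for subject, cells in matrix.items()
            if cells.get(competency, (None, None))[0] == "T1"]

print(primary_coverage(matrix, "Communication"))  # ['Capstone Design']
```

A competency whose `primary_coverage` list is empty is a gap the curriculum design should address.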
Subject design: example of “Introduction to engineering design” matrix • Design subjects that include precise activities that permit students to develop and demonstrate competencies. • Developing a matrix that shows the relationship between activities and program competencies is useful. • Instructors can develop their own subject level learning outcomes to improve subject design as well.
Designing and choosing assessment methods • Effective lists of program competencies or program learning outcomes form the basis of nearly all program assessment methods. Statements of program learning outcomes can be used to design surveys, score sheets, etc. • Rule of thumb for program assessment: every learning outcome requires 3 measures in order to “triangulate” data. Can use 2 direct measures and 1 or 2 indirect measures. • Measure types: • Direct: expert examination or observation of student knowledge or skills. • Indirect: student or alumni perceptions of the extent or value of learning for a given learning outcome. • Internal: whether direct or indirect, the measure of student learning is completed by the program before the student graduates. • External: whether direct or indirect, the measure of student learning is completed by the program or external bodies after the student graduates.
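The triangulation rule of thumb above can be sketched as a simple check: each outcome needs at least 3 measures, at least 2 of them direct. The outcome and measure names are illustrative, not from any specific program.

```python
# Sketch of the triangulation rule of thumb: >= 3 measures per outcome,
# with at least 2 direct measures. All names below are hypothetical.
measures = {
    "ability to communicate effectively": [
        ("capstone report scoring", "direct"),
        ("senior oral examination", "direct"),
        ("senior survey self-rating", "indirect"),
    ],
    "knowledge of contemporary issues": [
        ("senior survey self-rating", "indirect"),
    ],
}

def triangulated(measure_list):
    """True if the outcome has >= 3 measures, >= 2 of them direct."""
    direct = sum(1 for _, kind in measure_list if kind == "direct")
    return len(measure_list) >= 3 and direct >= 2

for outcome, ms in measures.items():
    print(outcome, "triangulated:", triangulated(ms))
```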
Choosing an assessment measure and creating a program assessment plan • Choose a measure that fits the competency or program learning outcome, e.g., behavioral observations for communication and teamwork; exams for technical skills and knowledge. • Arrange a schedule for gathering data; the program does not need to implement every measure every year. • Determine who will review the gathered assessment data and when (curriculum committee? every term, once a year?). • Determine a simple method for reporting out to instructors and external accreditation bodies (website database). • Determine a process for addressing weaknesses identified by review of assessment data, and work with instructors to ensure this occurs in a timely manner.
Sample excerpt of a program assessment plan I=INDIRECT METHOD; D=DIRECT METHOD
Performance Criteria • A performance criterion is a specific statement that describes a measurable aspect of performance required to meet the corresponding outcome. • For each type of assessment method, there are different performance criteria. • Senior survey: To what extent did the program improve your ability to write short reports to update on technical project work? (rating scale 1-7, where 7 = greatly improved) • Senior response: 46% of seniors rate this ability as 5 or above. • Performance criterion: 80% of seniors will rate their ability as 5 or above. • Industry team performance rating of student ability to present engineering design results in a technical report. Scoring scale: 1 = very poor, 3 = adequate, 5 = excellent. • Industry team mean rating: 4.3 • Performance criterion: the industry team mean rating will be 4 or above.
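The survey-style criterion above reduces to a simple computation: the fraction of seniors rating at or above a threshold, compared to the target fraction. The ratings below are made-up sample data for illustration.

```python
# Checking survey results against a performance criterion of the kind on
# this slide: at least 80% of seniors must rate the ability 5 or above
# on the 1-7 scale. The ratings list is hypothetical sample data.
def meets_criterion(ratings, threshold=5, required_fraction=0.80):
    """True if enough ratings reach the threshold."""
    at_or_above = sum(1 for r in ratings if r >= threshold)
    return at_or_above / len(ratings) >= required_fraction

ratings = [5, 6, 3, 4, 7, 2, 5, 4, 3, 6]
print(meets_criterion(ratings))  # False: only 50% rated 5 or above
```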
Performance Criteria as Scoring Rubrics • Scoring embedded exam, lab, or project work is more complex. • Developing a scoring rubric that raters can use to score work for select program learning outcomes is useful. • A scoring rubric describes specific features of student work being rated by level of performance.
Scoring Rubric Example: Ability to identify, formulate, and solve engineering problems • Level 5 performance characterized by: • Can relate theoretical concepts to practical problem solving • Can predict and defend problem outcomes • Takes new information and effectively integrates it with previous knowledge • Demonstrates understanding of how pieces of problem relate to each other • Formulates strategies for solving problems • Level 3 performance characterized by:…. • Level 1 performance characterized by: • Demonstrates solutions implementing simple applications of one formula or equation with close analogies to class/lecture problems • Does not see the connection between theory and practical problem solving • Is unable to predict or defend problem outcomes • Uses no resources to solve problems • Has no concept of how previous knowledge and new information relate • Does not realize when major components of the problem are missing • Has no coherent strategies for problem solving
Student portfolios • Students can be given the opportunity to develop portfolios of their work at each level in the program. • Students choose work that demonstrates achievement of each program competency in each year of the program. • Value in this method: students are aware of their progress in achieving each competency as they move through the program. • Randomly selected set of student portfolios can be scored for each program learning outcome using developed scoring rubrics.
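The random selection of portfolios described above is straightforward to implement; the student identifiers and sample size here are hypothetical.

```python
# Randomly selecting a set of student portfolios for rubric scoring.
# Identifiers and sample size are illustrative only.
import random

portfolios = [f"student_{i}" for i in range(1, 101)]  # 100 portfolios
random.seed(0)                # fixed seed so the committee's sample is reproducible
sample = random.sample(portfolios, k=10)
print(sample)
```

Fixing the seed lets the same sample be regenerated later, e.g. if a second rater scores the selected portfolios against the rubric.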
Sample industry team score sheet for senior design projects (excerpt) * Incorporates program learning outcomes.
Many different ways to organize data. Program learning outcome: ability to communicate effectively
What do we do with all this data? • MIT has developed a website to organize data for program educational improvement: • Useful program and subject assessment tool templates. • Program assessment data organized by year.
DEGREE PROGRAM CURRICULUM AND DATA PAGE • Program and subject design • Program content and learning outcomes matrix • Program assessment methods and learning outcomes matrix • Key summative assessment measures of program learning outcomes • Program assessment data- organized longitudinally to show change • Indirect measures of learning • Senior surveys 1998-2009 • Alumni survey 2006 (Class ’99-’04) • Department Advisory Board Survey of Learning Outcomes, 2006 and 2009 • Employer survey 2003 and 2009 • Senior exit interviews or focus groups • Graduate school admission • Direct measures of learning • Senior oral examination • Fundamentals of Engineering (FE) exam results • Senior capstone review by industry/ alumni team • Senior thesis review • Embedded upper/ senior level exam questions in areas not covered by senior capstone
Materials Science/Engineering senior survey (chart): ability development, Classes of 1998-2007, on a 1-7 scale. Series shown: understanding the impact of engineering in society, writing, oral reporting, ethics.