ABET: Assessing Program Outcomes
Amir Rezaei
Outline
• Context of assessment
• Process of assessment
• Similarities and differences between classroom and program assessment
• Developing measurable outcomes
• Developing scoring rubrics
• Assessment methods
• Direct measures / indirect measures
• Data collection
• Closing the loop
• Writing the report
Context of Assessment
• Inputs: What comes into the system? How many?
• Process: What are we doing with the inputs?
• Outputs: What is the effect?
• Assessment of outputs provides a direct measurement of effectiveness.
• Assessment of inputs and process only establishes the capability or capacity of the program; it serves as an indirect measure of effectiveness.
Process of Assessment
• Can we demonstrate that students have learned outcome xx to an appropriate level by the time of graduation? (Collect data when they graduate.)
• Can we demonstrate that we have added value to student learning of outcome xx to an appropriate level by the time of graduation? (Collect data pre and post.)
ABET uses these terms; use the same language to reduce confusion.
Similarities and differences between classroom and program assessment
• Degree of complexity
• Time span
• Accountability for the assessment process
• Cost
• Level of faculty buy-in
• Level of precision of the measure
Classroom Assessment (example: Statics)
• Timeline: 1 quarter
• Context: subject matter, faculty member, pedagogy, student, facility
• Subject: Statics
• Topics: statics of particles, equivalent systems of forces, equilibrium of rigid bodies, structures, friction
• Concepts: forces in 2D & 3D, moment of a force about .., equilibrium in 2D & 3D, FBD, trusses/frames and machines, friction
• Assessment focus: evaluate individual student performance (grades); evaluate teaching/learning
Program Assessment
• Timeline: xx years
• Student pre-college traits
• Institutional context and environmental factors
• Classroom experience: pedagogy; facility; faculty & student characteristics
• Coursework & curricular patterns: classes chosen; major
• Out-of-class experiences: co-curricular; co-ops; internships
• Educational objective
Developing measurable outcomes
• Objective: ability to function on multidisciplinary teams
• Outcome: works effectively with others
• Performance criteria: researches and gathers information; fulfills duties of team roles; shares work equally; listens to other teammates; makes contributions; takes responsibility; values other viewpoints
Developing scoring rubrics
• A rubric is a set of categories that define and describe the important components of the work being completed, critiqued, or assessed.
• Purposes:
  – Information to/about individual student competence (analytic)
  – Overall examination of the status of the performance of a group of students (holistic)
Developing scoring rubrics (continued)
• Generic: big-picture approach; element of subjectivity
• Task-specific: single task; focused approach; less subjective
• Note: you don't have to develop a rubric for every outcome (see the rubric template); a minimal scoring sketch follows.
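As an illustration only (not from the original slides), here is a minimal Python sketch of how one rubric can serve both purposes named above: analytic use keeps per-criterion detail for individual feedback, while holistic use collapses the criteria into a single number for group-level reporting. The criterion names and the 1–4 scale are assumptions.

```python
# Hypothetical sketch: scoring one student against the teamwork rubric above.
# Criterion names and the 1-4 scale are assumptions, not ABET requirements.

CRITERIA = [
    "researches and gathers information",
    "fulfills duties of team roles",
    "shares work equally",
    "listens to other teammates",
]

def holistic_score(scores):
    """Holistic use: collapse criteria into one number for group-level status."""
    return sum(scores.values()) / len(scores)

# Analytic use: keep the per-criterion detail for individual feedback.
student = dict(zip(CRITERIA, [3, 4, 2, 3]))
print(student)                  # analytic: per-criterion scores
print(holistic_score(student))  # holistic: 3.0 overall
```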
Assessment methods (scored with or without a rubric)
• Written surveys and questionnaires (no rubric)
• Exit and other interviews (rubric)
• Commercial, standardized exams (rubric)
• Locally developed exams (no rubric)
• Archival records (no rubric)
• Focus groups (no rubric)
• Portfolios (rubric)
• Simulations (rubric)
• Performance appraisal (rubric)
• External examiner (rubric)
• Oral exam (rubric)
• Behavioral observations (rubric)
Direct and Indirect Measures
• Direct measures provide for direct examination or observation of student knowledge or skills against measurable learning outcomes.
• Indirect measures ascertain the opinion or self-report of the extent or value of learning experiences.
Direct measures
• Exit and other interviews
• Standardized exams
• Locally developed exams
• Portfolios
• Simulations
• Performance appraisal
• External examiner
• Oral exam
• Behavioral observation

Indirect measures
• Written surveys and questionnaires
• Exit and other interviews
• Archival records
• Focus groups
Sampling and data collection
• For program assessment, sampling is acceptable and desirable for programs of sufficient size (a minimal sampling sketch follows the timeline below).
• Four-year assessment cycle:
  – Year 1: define outcomes / map curriculum
  – Year 2: data collection
  – Year 3: evaluation & design of implementation
  – Year 4: implement improvements & data collection
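As an illustration only (not in the original slides), here is a minimal Python sketch of drawing a reproducible random sample of student work to score for program assessment. The artifact names, sample size, and fixed seed are hypothetical choices.

```python
# Hypothetical sketch: sample student work for program-level rubric scoring
# instead of scoring every artifact. Names and sizes are illustrative only.
import random

def sample_artifacts(artifact_ids, sample_size, seed=0):
    """Draw a reproducible random sample of artifacts to score."""
    rng = random.Random(seed)  # fixed seed so the sample can be re-audited
    return rng.sample(artifact_ids, min(sample_size, len(artifact_ids)))

all_reports = [f"capstone_report_{i}" for i in range(120)]
to_score = sample_artifacts(all_reports, sample_size=30)
print(f"{len(to_score)} of {len(all_reports)} artifacts selected for scoring")
```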
Closing the Loop
1. The institute assessment committee prepares reports of the collected data (e.g., survey results, e-portfolio ratings) for submission to department heads.
2. The evaluation committee receives and evaluates all data, makes a report, and refers recommendations to the appropriate areas.
3. The institute acts on the recommendations of the evaluation committee.
4. Reports of actions taken by the institute and the targeted areas are returned to the evaluation committee for iterative evaluation.