Best Assessment Practices ERAU Assessment Workshop for Faculty, October 2007
Assessment Role Best Practices • Deans/Dept. Chairs • Ensure assessment is meaningful, robust, and grounded in best practices • Appoint program assessment coordinators • Provide opportunities for faculty to reflect collaboratively on assessment • Approve program-level assessment plans
Assessment Role Best Practices • Program Assessment Coordinators • Coordinate assessment / request resource support • Involve colleagues • Submit program assessment plans to Dept. Chair / Dean for approval
Assessment Role Best Practices • Other Faculty • Be aware of course contribution to program outcomes • Communicate program outcomes to students • Get involved in program assessment efforts
Program-level Assessment… • Is a holistic look at students’ learning experience IN THE PROGRAM • Goals: • Ensure and improve student learning • Transparency • Satisfy accreditation requirements • Takes the guesswork out of whether student learning has occurred • Challenges the assumption that passing all courses means students learned what they need to know… NOT NECESSARILY
Assessment at ERAU • ERAU uses a “5-step” assessment process • Both academic programs and administrative/support departments conduct assessment • The annual cycle begins in fall and ends in fall of the following year • Close out the current cycle by submitting Steps 4-5; ALSO launch the new cycle with Steps 1-3 • Document online through ERPP (http://spa.erau.edu)
Areas of Progress • Departments / Colleges Planning Better Assessment • Validating Outcomes and Curriculum Alignment • Assessment Mini-Grants • Direct Assessment • External exams (ETS, FL Board of Prof. Engineers) • Evaluating student work with faculty juries • Rubrics • Department-created comprehensive exam • Assessing work in capstone courses • Identifying ‘indicator’ courses from which to draw assessment material and feed back results
Areas to Improve • Grades Used for Program Assessment • Over-abundance of Indirect Evidence • Senior exit interviews • Alumni survey results • Anemic Descriptions of Methods, Results, and Use of Results • Very Little Evidence of Improvements
5-Step Assessment Method • Step 1: Convey Mission Statement • Step 2: Define Outcomes • Discipline-Specific Student Learning Outcomes • General Education Learning Outcomes • Program Outcomes (job placement, grad. rates) • Step 3: Identify How to Measure Outcomes • Select Methods • Set Performance Criteria • Step 4: Interpret and Report Data • Step 5: Implement and Document Improvements
Step 1: Program Mission Statement • Does your mission statement convey program uniqueness? • Are you assessing what makes you unique?
Step 2: Develop Learning Outcomes • What should students know, think, and be able to do? • Reflective of any program accreditation requirements • Validated with external constituencies • Comprehensive • Communicated to students
Step 2: Develop Learning Outcomes • Examples: • “Graduates will have the ability to design and implement a computer system or component to meet desired needs.” • “Graduates will demonstrate the ability to write and formulate a technical report.”
Step 3: Methods and Performance Criteria • Where in students’ progression to assess? • Just before graduation? After graduation (ABET)? At the midpoint? Once students have adequate prerequisite knowledge? • Appropriate level (Bloom’s taxonomy, etc.) • Select methods • Multiple methods (triangulation) • Direct evidence • Set expected performance criteria – what is ‘good enough’?
Step 3: Methods and Performance Criteria • Outcome: Graduates will demonstrate the ability to write and formulate a technical report. • Method #1: Group Project Report • Criterion for Success #1: “At least 70% of students will receive a B or better in the report formulation section of the group projects administered in each of the following courses: SF 355 Industrial Hygiene and Toxicology, SF 365 Fire Protection…” • Method #2: ERAU Alumni Survey • Criterion for Success #2: “At least 80% of alumni who respond to ERAU’s Alumni Survey will rate their degree-specific skills obtained at ERAU addressing this educational outcome to be “good” or better…”
Step 4: Data Interpretation / Reporting • Gather data • Share results among faculty and interpret collaboratively • Is performance acceptable? • Is further inquiry / analysis needed? • Are improvements to be made? • Document thoroughly
Step 5: Implement Improvements • Make improvements • Changes to course content • Changes to course sequencing • Pedagogical changes • Changes to prerequisites • Share improvements with faculty and students • Document well • Mission-Critical Budget Request Form • Check whether intended improvements worked in the next assessment cycle
Step 5: Implement Improvements Poor Example: • “Pedagogical modifications” Good Example: BS Human Factors & Psychology (DB) • “We have changed our core curriculum to integrate a broader range of system modeling skills in the HFI - HFIV series. Specifically, HFIII was changed to incorporate more system skills that had been absent from the sequence. The old HFIII content (Ergonomics and Bioengineering) was given a new course number and is still required in our core curriculum.”
Next Steps • Assessment plans due to Dept. Chairs / Deans by end of November 2007 • Steps 4-5 to complete 2006-07 cycle • Steps 1-3 to launch 2007-08 cycle • Final plans reviewed and approved by Dept. Chairs / Deans by end of December 2007 • New Program Coordinators / Dept. Chairs / Deans – Set up Training for Plan Input and Approval with Tiffany Phagan