Assessment and Accreditation: Planning for a Compatible Relationship. Is it possible?
Answer: Only with planning!
• Begin at the end: What does the accrediting agency require, and what elements of your assessment initiative can meet those needs?
1. Assessment data at all levels: institution, program/unit, and course/office. Have a plan, and work your plan.
2. Documented use of the data for improvement.
3. An established cycle that ensures continued assessment activities.
4. Faculty and staff who are involved in and committed to assessment.
Key elements
• Employing measurable institutional goals at the macro level.
Example (old): Promote an excellent learning environment to foster student success.
Changed to (new): Increase students’ success in achieving their educational goals.
Need: Establish baseline data and a benchmark, gather data, review outcomes, and determine the improvements needed to help students better achieve their goals.
Develop linkages between institutional levels.
• Demonstrate clear connections between institutional-level outcomes and the outcomes at every level derived from them.
• Connect the Mission to course-level outcomes (see attached).
• Connect the Mission to office-level outcomes (see attached).
• Show that this has been done across all programs and offices.
Documenting Improvements: The most basic element in accreditation.
• Don’t spend so much time collecting data that you have no time to analyze it and make improvements.
• Prepare a report mechanism that allows you to collect data on how faculty/staff have used assessments for improvement.
• Survey faculty/staff on their involvement in the assessment initiative so you can document changes needed in the way assessment is conducted.
Key question: What changes have been made as a result of assessment? Are they documented?
External Surveys as Validation
• Employ national surveys to validate internal assessments.
• CCSSE (Community College Survey of Student Engagement) results supported course-embedded assessments indicating that students were challenged academically.
• External surveys also provide opportunities to benchmark against peers.
• SUNY Student Opinion Survey (SOS) peer comparison data showed that Genesee ranked first or second in 45 areas.
Using Assessment Software
• Employing software such as TracDat (www.Nuventive.com) to link all levels and to document historical changes in courses and programs.
• Providing electronic submission of data and recording of changes for improvement.
• Managing an assessment database (a minimal data-model sketch follows this slide).
• Summarizing General Education benchmark attainment.
Key: Show that you can handle a large volume of data and that the data provide you with information that leads to improvements.
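The slide above mentions linking outcomes across levels and summarizing benchmark attainment. The sketch below is a hypothetical illustration of that idea in Python; it is not TracDat’s data model or API, and the class names, fields, and example counts are invented for illustration.

```python
# Hypothetical sketch (not TracDat): linking outcomes across levels and
# summarizing benchmark attainment.
from dataclasses import dataclass, field

@dataclass
class AssessmentRecord:
    term: str                   # e.g., "Spring 2005"
    met_benchmark: int          # students meeting or exceeding the benchmark
    total_students: int
    improvement_note: str = ""  # documented change made as a result

@dataclass
class Outcome:
    level: str                  # "institution", "program", or "course/office"
    statement: str
    parent: "Outcome | None" = None                 # link upward toward the Mission
    records: list[AssessmentRecord] = field(default_factory=list)

    def attainment(self, term: str) -> float:
        """Percent of students meeting the benchmark in a given term."""
        rec = next(r for r in self.records if r.term == term)
        return 100 * rec.met_benchmark / rec.total_students

# Example linkage: course outcome -> program outcome -> institutional outcome.
institution = Outcome("institution", "Communicate effectively using appropriate skills.")
program = Outcome("program", "Employ proficient written and verbal communication skills.",
                  parent=institution)
course = Outcome("course/office", "HUS110: write three case plans meeting the rubric.",
                 parent=program)
# Counts (47 of 50) are hypothetical, chosen to match the 94% reported later in this deck.
course.records.append(AssessmentRecord("Spring 2005", met_benchmark=47, total_students=50,
                                       improvement_note="Added in-class drafts and feedback."))
print(f"{course.attainment('Spring 2005'):.0f}% met the benchmark")  # 94% met the benchmark
```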
Areas where work remains:
• Document that results of outcomes assessment drive planning and resource allocation.
• Identify multiple strategies for using assessment results.
• Develop/use a glossary of standard terms regarding assessment.
• Establish a pattern in which program improvement initiatives flow directly from outcomes assessments.
• Continue ongoing institutional and academic assessment training.
Retrospective: What would I do differently?
• Make sure that every level has independent, measurable outcomes.
• Develop a mechanism for faculty and staff to report changes/improvements based upon assessment data; show clearly that assessment has mattered!
• Concentrate on the linkages between assessment, planning, and resource allocation.
• Fully develop electronic data management capabilities.
• Establish comprehensive communication systems, including an assessment website.
So, yes, it is compatible but…
• You have to establish the system that MAKES it compatible; it is not automatic.
• Accreditation focuses on the END. You have to know the end, but you have to focus on the beginning or you won’t have an END!
• Remember that it will take much more time than you believe it will to ensure the compatibility.
• It requires the commitment of key administrators as well as faculty and staff.
• You must communicate, communicate, communicate! Don’t be so involved in the details that you neglect the people!
Final points: Evidence, Evidence, and more Evidence!
• Employ multiple sources of data, both internal and external, both quantitative and qualitative.
• Demonstrate that assessment is Mission-driven and that institution, program, and course/office assessment outcomes are directly linked.
• Show how assessment outcomes are used for improvement. Document the improvements.
• Also clearly document that assessment outcomes ‘drive’ planning and resource allocation.
• Communicate with all constituents so that a ‘culture of assessment’ permeates the institution.
Questions???
• If you have questions or would like more information, please contact:
Dr. Ruth Andes
Assistant Dean for Assessment and Special Projects
Genesee Community College
Batavia, NY 14020
(585) 343-0055, Ext. 6308
reandes@genesee.edu
Sample Academic Assessment Report Cycle
Mission: Genesee Community College commits to providing educational experiences which promote intellectual and social growth, workforce and economic development, and global citizenship.
College Goal: Promote an excellent learning environment to foster student success.
Institution Learning Outcome: Communicate effectively using appropriate written, verbal, and non-verbal skills.
Program Learning Outcome: Employ proficient written and verbal communication skills, including the appropriate uses of technology.
Course Learning Outcome: In HUS110, students will be able to write three case plans, identifying for each: relevant background information, three objective behavioral descriptions, a measurable objective, and two positive, empowering helper/teacher methods.
Assessment Strategy: Using the second case plan, 70% of students met or exceeded the benchmark of 70%. While the benchmark was attained, data indicated that students had difficulty with two components of the case plan: behavioral descriptions and the behavioral objective.
Remedy: More instructor feedback will be given during the designing of the first case plan. Students will work on drafts in class and write their behavioral descriptions and objectives in their daily class and intern journal.
Assessment Results: In Spring 2003, 90% of students achieved the benchmark. In Fall 2004, 83% of students achieved the benchmark. In Spring 2005, 94% of students achieved the benchmark. In Fall 2005, 100% of students scored 80% or above.
In a subsequent Assessment Impact survey, the Human Services program coordinator noted, “Case reports for HUS students have undergone significant changes as far as my teaching and evaluation of student work. For example, a grading sheet was developed that makes it much clearer what students need to do to achieve maximum points on their case report. From the grading sheet, I redesigned the lesson plan for teaching students about case writing. That resulted in a more obvious connection between the learning in class and the end result (student case report and subsequent grade). I also added samples of student case reports that were done well. I continue to monitor assessment results to determine where I need to adjust my teaching to maximize student learning. Assessment has been the critical factor in effective teaching of case report writing.”
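A note on the arithmetic in the sample report: the two 70% figures play different roles. One is the score a student must reach on the case plan; the other is the share of students who must reach it (the benchmark). The small sketch below, with term labels and percentages taken from the report above, is a hypothetical illustration of tracking that share across terms; it is not part of the sample report itself.

```python
# Hypothetical sketch of the benchmark-tracking arithmetic in the sample report.
# BENCHMARK_SHARE is the stated target: 70% of students meeting or exceeding a 70% score.
BENCHMARK_SHARE = 70

attainment_by_term = {
    "Initial assessment": 70,
    "Spring 2003": 90,
    "Fall 2004": 83,
    "Spring 2005": 94,
    "Fall 2005": 100,
}

for term, pct in attainment_by_term.items():
    status = "benchmark met" if pct >= BENCHMARK_SHARE else "benchmark not met"
    print(f"{term:18s} {pct:3d}%  {status}")
```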