University of Cincinnati Program Review (E-Review)
Comprehensive Program Review
Academic Coordinating Committee
Progress to Date
Andrea Lindell, Co-Chair
Kristi Nelson, Co-Chair
January 2008
Comprehensive Program Review: Purpose
• Evaluate programs against criteria
• Gather data about assessment and evaluation practices
• Prepare for alignment with the University System of Ohio (USO)
• Gather information to assist with preparations for the HLC/NCA accreditation visit
• Prepare for the implementation of performance-based budgeting (PBB)
Timeline Update
• Phase I (Spring 2007 – Winter 2008)
  • Review the list of programs with the Registrar and College Deans to ensure consistency
  • Identify potential problems that create discrepancies and make corrections
  • Clarify PASLA nomenclature across colleges
• Phase II
  • December 2007 – January 2008
    • Established review criteria
    • Developed the E-Review prototype
    • Developed the process for review
    • Shared with ACC, Associate Deans Council, and Cabinet
Timeline Update (cont’d)
• Phase II (cont’d)
  • January – February 2008
    • Continue development of the E-Review program
    • Test the program
  • February – March 2008
    • Finalize and extract the list of programs for review
    • Test the interface for IR data
    • Beta testing of the E-Review program
    • Migration of the database to Provost Office servers
    • Test run from Provost Office servers
Timeline Update (cont’d)
• Phase II (cont’d)
  • March 2008
    • E-Review goes live for data input by colleges
  • April 2008
    • ACC conducts a test review of college submissions, focusing on graduate programs
  • May – August 2008
    • ACC reviews program data
    • ACC creates sub-committees as necessary
Program Review Criteria
• Accreditation / Graduate / Undergraduate Review / No Specialized Review
  • When
  • By whom
  • Outcomes of the review
  • Response to review recommendations
  • Next review date
• IR Data
  • Enrollment history (5 years)
  • Graduation rates
  • Retention data
  • Instructional FTE
  • Faculty course summary
Program Review Criteria (cont’d)
• Number of tracks within the program/major (if applicable)
• Brief program description
• Mission statement and alignment with the college/university mission and UC|21 academic priorities
• Alignment with Fingerhut’s USO (placeholder at present)
• Impact of market/competitive needs for the program and/or impact of the program on the market
• Budget/resources (placeholder at present)
Program Review Criteria (cont’d)
• Features of the Program
  • Study Abroad
  • Service Learning (domestic)
  • Service Learning (international)
  • Problem- or case-based instruction
  • Instructional Technology (text box)
  • Distance Learning
  • Library research component
  • Significant student writing (at least a four-page paper)
  • Student presentations (in class)
  • Student presentations (out of class, including professional meetings)
  • Corporate collaborative
  • Interdisciplinary connections or perspective
  • Students are members of professional organizations
  • Other
Program Review Criteria (cont’d)
• Assessment Practices
  • Standardized tests (list)
  • Portfolio assessment
  • E-portfolio assessment
  • Pre-/post-testing
  • Focus groups
  • Professional certification (student pass rates)
  • General Education assessment
  • Skill proficiency testing (list)
  • Internal review
  • External review
  • Other
Program Review Criteria (cont’d)
• Best Practices for Tracking Graduates
  • Exit interviews
  • Focus groups
  • Graduating student surveys
  • Employer surveys
  • Electronic communications (new technologies), e.g., InCircle, Facebook groups, LinkedIn
  • Inviting graduates to share their opportunities through newsletters or e-newsletters
  • Other
Program Review Criteria (cont’d)
• If the program were to be closed, list the impact on the university and community:
  1)
  2)
  3)
• If the program were to be closed and you could retain the resources, list three ways you would redirect the resources:
  1)
  2)
  3)
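Taken together, the criteria above amount to the data each college will enter into E-Review when it goes live in March 2008. The slides do not describe E-Review’s internal data model, so the following is only a minimal sketch, assuming one record per program submission; every class and field name here (ProgramReviewSubmission, IRData, AccreditationReview, etc.) is a hypothetical illustration, not the actual E-Review implementation.

```python
# Hypothetical sketch only: the slides do not specify E-Review's data model.
# All names below are assumptions used to illustrate how the listed criteria
# might be grouped into a single per-program record.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class AccreditationReview:
    """Accreditation / graduate / undergraduate / no specialized review."""
    review_type: str                          # e.g., "Accreditation", "No Specialized Review"
    when: Optional[str] = None                # date of the most recent review
    by_whom: Optional[str] = None             # reviewing body
    outcomes: Optional[str] = None
    response_to_recommendations: Optional[str] = None
    next_review_date: Optional[str] = None


@dataclass
class IRData:
    """Institutional Research data pulled for each program."""
    enrollment_history: List[int] = field(default_factory=list)  # one entry per year, 5 years
    graduation_rate: Optional[float] = None
    retention_rate: Optional[float] = None
    instructional_fte: Optional[float] = None
    faculty_course_summary: Optional[str] = None


@dataclass
class ProgramReviewSubmission:
    """One college's data-input record for a single program."""
    program_name: str
    tracks: int = 1
    description: str = ""
    mission_alignment: str = ""                  # college/university mission, UC|21 priorities
    uso_alignment: str = ""                      # placeholder, per the slides
    market_impact: str = ""
    budget_resources: str = ""                   # placeholder, per the slides
    accreditation: Optional[AccreditationReview] = None
    ir_data: Optional[IRData] = None
    program_features: List[str] = field(default_factory=list)       # e.g., "Study Abroad"
    assessment_practices: List[str] = field(default_factory=list)   # e.g., "Portfolio assessment"
    graduate_tracking: List[str] = field(default_factory=list)      # e.g., "Exit interviews"
    closure_impacts: List[str] = field(default_factory=list)        # up to three
    resource_redirections: List[str] = field(default_factory=list)  # up to three


if __name__ == "__main__":
    # Minimal usage example with made-up values.
    submission = ProgramReviewSubmission(
        program_name="BS in Example Studies",
        tracks=2,
        program_features=["Study Abroad", "Service Learning (domestic)"],
        assessment_practices=["Portfolio assessment", "External review"],
        graduate_tracking=["Exit interviews", "Employer surveys"],
    )
    print(submission.program_name, "-", len(submission.program_features), "features reported")
```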