IE Presentation, November 20, 2008 Institutional Effectiveness at the University of North Alabama Dr. Andrew L. Luna Institutional Research, Planning, and Assessment
Telephones and IE? The modern quality movement began in the telephone industry: Walter Shewhart, W. Edwards Deming, and Joseph Juran all worked at the Hawthorne Works and Bell Laboratories.
FACT... The Shewhart Cycle is the foundation for all quality and continuous improvement processes that we use today. The cycle: Plan → Do → Check → Act, repeated for Continuous Improvement.
Points of Discussion • Similarities between the Shewhart Cycle and Institutional Effectiveness • Overview of Institutional Effectiveness at UNA • Review of Outcomes and Improvement Processes • Review of Assessment • Questions
More on the Shewhart Cycle • Plan – Create a strategy for what you want to do and how you will measure success • Do – Follow the plan and do what you said you would do • Check – Assess the effectiveness of the current plan by examining the outcome measures of success • Act – Make changes to the strategies to improve the measured outcomes • Repeat the Cycle!
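The four Plan-Do-Check-Act steps above can be sketched as a simple feedback loop. This is a minimal illustration, not part of the presentation: the metric, target, and adjustment rule are hypothetical stand-ins for whatever a department actually measures.

```python
# Minimal sketch of the Shewhart (Plan-Do-Check-Act) cycle as an iterative loop.
# The target, starting value, and adjustment rule below are hypothetical.

def pdca(target, initial, adjust, cycles=5):
    """Run Plan-Do-Check-Act until the measured outcome meets the target."""
    value = initial
    history = []
    for _ in range(cycles):
        plan = target                  # Plan: set the intended outcome
        result = value                 # Do: execute and observe the outcome
        gap = plan - result            # Check: compare outcome to plan
        if abs(gap) < 1e-9:            # outcome meets plan: stop iterating
            break
        value = adjust(value, gap)     # Act: change strategy, then repeat
        history.append(value)
    return value, history

# Usage: a made-up improvement rule that closes half the remaining gap each cycle.
final, steps = pdca(target=100.0, initial=60.0,
                    adjust=lambda v, gap: v + 0.5 * gap, cycles=10)
```

Each pass through the loop is one turn of the cycle; the point of the slide is that the loop never terminates in practice, because the target itself is revisited each cycle.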
Why is the Shewhart Cycle Important? • If you can’t measure something, you can’t understand it… • If you can’t understand it, you can’t control it… • If you can’t control it, you can’t improve it… • If you can’t improve it…then why the heck are you doing it?
So, What is Institutional Effectiveness? • A sharpened statement of institutional mission and objectives • Identification of intended departmental/programmatic outcomes or results (Plan) • Establishment of effective means of assessing the accomplishment of those outcomes and results (Do, Check, Act) FACT... Institutional Effectiveness is primarily undertaken to improve what we do…not just to pass accreditation.
Shewhart Cycle and SACS: Macro IE Core Requirement 2.5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; (3) demonstrate the institution is effectively accomplishing its mission.” This requirement spans the full Shewhart Cycle: Plan, Do, and Check and Act.
Key Points to Core Requirement 2.5 • Emphasizes an expectation that the institution is the primary focal point for compliance • Sets expectations for the description of planning and evaluation processes that are active and continuous rather than static or single occurrences. • Points to a clear and strong expectation for documentation of the systematic review of institutional mission, goals and accomplishments consistent with its mission • Sets expectations for the documented use of results of institutional planning and evaluation to achieve institutional improvements
Shewhart and SACS, Cont.: Micro IE Comprehensive Standard 3.3.1: “The institution identifies expected outcomes for its education programs … and its administrative and educational support services; assesses whether it achieves those outcomes; and provides evidence of improvement based on analysis of those results.” Identifying outcomes is Plan, assessing them is Check, and improving on the results is Do and Act.
Key Points to Comprehensive Standard 3.3.1 • Emphasizes the unit level of individual educational programs and support services • The expected achievements of educational programs and support services should be articulated, and evidence presented concerning accomplishments • Distinguishes between program outcomes and learning outcomes • Sets expectations that improvement is guided by the establishment and evaluation of program and learning outcomes
Shewhart and SACS, Cont.: General Education and IE Comprehensive Standard 3.5.1: “The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.” Identifying competencies is Plan; providing evidence of attainment covers Do, Check, and Act.
Key Points to Comprehensive Standard 3.5.1 • General Education should be part of the institutional mission • The expected achievements of the General Education program should be articulated, and evidence presented concerning accomplishments • Improvement should be guided by the establishment and evaluation of learning outcomes
Overview of Institutional Effectiveness: Focus on Assessment. Mission and Strategic Goals drive Institutional Effectiveness along two tracks: • Comprehensive Dept./Program Review – Program Outcomes (Quality Indicators, Productivity Indicators, Viability Indicators), leading to Continuous Improvement of Programs and Departments • Evaluation of Learning – Learning Outcomes (what graduates know, what graduates can do, what attitudes/values graduates possess), leading to Continuous Improvement of Student Learning
Institutional Effectiveness System at UNA • Annual Report - Annual Action Plan and Assessment Report • Comprehensive Program and Department review – Five-year Review • Review of General Education – Five-year cycle of General Education assessment
Schematic of Institutional Effectiveness Process: Annual Reports in Years 1 through 5 feed Five-Year Reviews for selected departments each year. In parallel, General Education Areas I–IV each receive a Five-Year Assessment, rolling up into an Overall Gen. Ed. Assessment. When a five-year cycle is complete, OIRPA, the IE Committee, and the Gen. Ed. Committee conduct a Review of Strategic Goals; otherwise the annual cycle continues.
Five-Year Program/Department Review Timeline (pending IE Committee Approval), running September through the following September: • Departments that underwent last year’s five-year review submit the outcomes of that review as AAPAR priority initiatives • OIRPA submits the Five-Year Enrollment report to academic departments • OIRPA meets with Deans/VPs for an overview • Deans/VPs meet with departments to discuss the review • OIRPA conducts an assessment workshop for the UNA campus • OIRPA initiates individual department meetings • OIRPA meets with departments up for review • Five-Year Reviews are completed and sent to the Dean/VP • OIRPA submits an overview of the Five-Year process to the IE Committee
Annual Action Plan and Assessment Report Timeline (pending IE Committee Approval), running September through the following September: • OIRPA submits the AAPAR overview to the IE Committee • 1st part of the AAPAR due for the current fiscal year, with one Priority Initiative for the next FY • Next FY Priority Initiatives by Deans • Next FY Priority Initiatives by VPs • President, VP, and Dean Initiatives due • SPBS reviews Next FY Priority Initiatives • Budget initiatives based on Priority Initiatives are established • 2nd part of the AAPAR completed by departments
Outcomes • Operational Outcomes – measures of how well the institution, division, or department is meeting or exceeding requirements • Learning Outcomes – statements of the knowledge, skills, and abilities the individual student possesses and can demonstrate upon completion of a learning experience or sequence of learning experiences (e.g., course, program, degree)
Problems with Outcomes • Outcomes are too broad • Outcomes do not address core requirements/competencies or mission • Outcomes are not measurable
Types of Measurement • Discrete or attributes data – categorical information, often binary with only two values (e.g., Yes/No, Good/Bad, On/Off, Male/Female, Pass/Fail) • Continuous or variable data – information that can be measured on a continuum or scale (e.g., Height/Weight, Temperature, Test Scores, Time, Distance)
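The distinction matters because the two data types call for different summaries: attribute data are counted as proportions, while variable data are averaged. A small sketch with made-up course data (the pass/fail flags and scores below are hypothetical):

```python
# Hypothetical course data illustrating the two measurement types.
# Discrete (attribute) data: pass/fail flags -> summarized as a proportion.
# Continuous (variable) data: test scores   -> summarized as an average.

pass_fail = ["pass", "pass", "fail", "pass", "fail", "pass"]   # discrete
scores    = [91.0, 85.5, 62.0, 78.5, 55.0, 88.0]               # continuous

pass_rate = pass_fail.count("pass") / len(pass_fail)   # proportion for attribute data
mean_score = sum(scores) / len(scores)                 # mean for variable data

print(f"pass rate: {pass_rate:.0%}, mean score: {mean_score:.1f}")
# prints: pass rate: 67%, mean score: 76.7
```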
Forms of Measurement Longitudinal data are gathered over an extended period: Semester_1, Semester_2, Semester_3, …, Semester_t
Forms of Measurement, Cont. Cross-sectional data represent a snapshot of one point in time
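The two forms above are just two different slices of the same records: follow one student across semesters (longitudinal) or take every student in one semester (cross-sectional). A sketch with hypothetical enrollment records:

```python
# Hypothetical enrollment records: (student_id, semester, gpa).
records = [
    ("S1", 1, 2.8), ("S1", 2, 3.0), ("S1", 3, 3.2),
    ("S2", 1, 3.5), ("S2", 2, 3.4), ("S2", 3, 3.6),
    ("S3", 1, 2.9), ("S3", 2, 3.1), ("S3", 3, 3.3),
]

# Longitudinal view: one student's GPA tracked over an extended period.
longitudinal = [gpa for sid, sem, gpa in records if sid == "S1"]

# Cross-sectional view: a snapshot of every student at one point in time.
cross_sectional = [gpa for sid, sem, gpa in records if sem == 3]
```

The same dataset supports both views; which slice to take depends on whether the assessment question is about change over time or about status at a moment.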
What is Improvement? • Measurable actions that increase learning, efficiency, effectiveness, and/or the bottom line • Decrease the Bad • Increase the Good • Decrease Variability
Decrease Variability? What the heck is that? Class A scores: 100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42 (Mean = 75.5, STD = 21.93). Class B scores: 91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60 (Mean = 75.5, STD = 8.43). Same mean, very different spread: Class B’s results are far more consistent.
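The Class A / Class B figures on this slide can be reproduced with the sample standard deviation, which is the usual measure of this kind of spread:

```python
from statistics import mean, stdev

# Scores from the Class A / Class B example on this slide.
class_a = [100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42]
class_b = [91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60]

# Both classes have the same mean, but Class B is far less variable.
print(mean(class_a), round(stdev(class_a), 2))   # 75.5 21.93
print(mean(class_b), round(stdev(class_b), 2))   # 75.5 8.43
```

Two classes can look identical by their averages alone; only the standard deviation reveals that Class A mixes very strong and very weak results while Class B delivers consistently.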
Inputs, Processes, and Outcomes: input factors (the X’s) from six sources (Measurement, Materials, Methods, Environment, People, and Machines) feed the process, and the resulting Outcomes are the Y’s.
Assessment Steps • Develop learning/operational objectives • Check for alignment between the curriculum/business process and the objectives • Develop an assessment plan • Collect assessment data • Use results to improve programs/department • Routinely examine the assessment process and correct, as needed
Types of Assessment – Direct, Academic • Published Tests • Locally Developed Tests • Embedded Assignments and Course Activities • Competence Interviews • Portfolios
Types of Assessment – Direct, Educational Support/Administrative • People enrolled/participating/served • Work accomplished • Revenue generated • Turnaround time • Decrease in nonconformities
Types of Assessment - Indirect • Surveys • Interviews • Focus Groups • Reflective Essays
How Can OIRPA Assist? • Create university-wide reports – Five-year departmental reports • Analyze university-wide assessment data – NSSE, CAAP • Hold workshops on assessment and IE • Work with individual departments on annual reports, program review, and outcomes assessment • Provide ad hoc data reports for departments • Work with committees to develop assessment plans – IE Committee, Gen. Ed. Committee