Using Accountability to Support Quality Aligning Program, Campus, System, State, and Federal Models of Higher Education Effectiveness Victor M. H. Borden, Ph.D. Associate Professor of Psychology Associate Vice Chancellor Information Management and Institutional Research Indiana University-Purdue University Indianapolis President, Association for Institutional Research
In Conclusion • Successful accountability systems are sensitive to diverse perspectives on quality • The best accountability measures derive from systematic assessment and evaluation • Alignment across levels is optimized by: • Recognizing and rewarding multiple kinds of quality • Active listening to multiple constituent groups • Responsible accountability educates all constituents
Part I What is a quality higher education?
A Pause for Reflection • What qualities of a college experience are most valued by… • Students? • Parents? • Faculty? • Trustees? • Legislators? • Accreditors? • Business and Industry?
Qualities of the College Experience • Students • Excellent teachers; access to rewarding careers/professions; academic support; personal enrichment; quality of student life; exposure to people • Parents • Personal and professional development; safety; access to positive role models; stay out of trouble • Faculty • Small classes; good students; space and resources (library, equipment, etc.) for research and scholarship; good colleagues; support for venture development; lack of bureaucracy • Trustees • Effective use of resources; faculty accomplishments (recognition and funding); student progress and achievement; contributions to economic development • Legislators • Access by constituents; employability of graduates; contributions to state welfare and development; effective use of resources; ties with business and industry; controlled costs • Accreditors • Adequate organizational structures and processes (administration and governance); appropriate credentials of faculty; effective use of resources; demonstrated student learning outcomes; quality assurance processes • Business and Industry • Preparation of graduates for workforce; support for research and development
[Diagram: a two-axis map of quality perspectives (Undifferentiated vs. Differentiated; value To State vs. To Student) locating Higher Education System Quality, Program Quality, Cost, and Access.]
Models of Quality • Reputation and Resources • Ratings and rankings • Institutional Effectiveness • Quality assurance • Value-Added Approaches • Student learning outcomes • Student Engagement • Quality of academic experience • Higher Education Balanced Score Card • Measuring Up: The State Report Card
Reputation and Resources • Prevailing view? • Revolves around institutional archetypes • Research university; Liberal arts college; Experience of residential 18-22 year olds • Peer judgments included • academic reputation • Measures emphasize inputs and resources • Selectivity, faculty/student ratio, library volumes, research and salary dollars • Graduation rates as outcomes • Another reflection of selectivity and residential nature • Epitomized by U.S. News & World Report
Value-Added Approaches • More talked about than implemented • Focus on change from input to output • Student development • Regional improvement • Favors longitudinal measurement • Controversy about amount of change vs. reaching standards of excellence (or even adequacy) • Highlights key question: To what degree is institutional quality related to selectivity? • Can be incorporated as contextual element of other models
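The value-added logic above can be made concrete with a minimal sketch. Everything here is hypothetical (the toy linear model, the institution names, and the numbers); a real analysis would fit a statistical model to longitudinal student records.

```python
# Minimal value-added sketch (all data hypothetical): an institution's
# "value added" is the gap between its observed outcome and the outcome
# predicted from its inputs (here, an entering test-score index).

def predict_outcome(entry_score, slope=0.8, intercept=20.0):
    """Toy linear model: expected outcome index given entering scores."""
    return intercept + slope * entry_score

institutions = {
    "Selective U":   {"entry_score": 90, "outcome": 92},
    "Regional C":    {"entry_score": 60, "outcome": 75},
    "Open-Access C": {"entry_score": 50, "outcome": 58},
}

for name, d in institutions.items():
    expected = predict_outcome(d["entry_score"])
    print(f"{name}: expected {expected:.1f}, actual {d['outcome']}, "
          f"value added {d['outcome'] - expected:+.1f}")
```

Under this framing the regional college, not the selective university, can show the largest gain, which is exactly the question the slide raises about how much institutional quality merely reflects selectivity.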
Institutional Effectiveness • Mechanistic approach • Ratio of resources to inputs, processes, and outcomes • Qualitative approach • Peer review of structures, credentials, and processes • “Audit” of quality assurance processes, including student outcomes assessment • Traditional accreditation approach
Student Engagement • Focuses on quality of college experience • Derived from “best practices” in undergraduate education • Wingspread; Chickering & Gamson • Basis of National Survey of Student Engagement (NSSE) • Fastest-growing and most widely used survey in U.S. higher education history • Engagement indices analyzed within context of student population characteristics
Principles of Good Practice • Encourages contact between students and faculty • Develops reciprocity and cooperation among students • Encourages active learning • Gives prompt feedback • Emphasizes time on task • Communicates high expectations • Respects diverse talents and ways of learning
HE Balanced Score Card • Kaplan & Norton (1992) propose business model • Financial performance • Customer service and satisfaction • Process effectiveness and efficiency • Organizational learning • Ruben (1999) adapts to higher education • http://www.odl.rutgers.edu/pdf/score.pdf
Ruben’s HE BSC • Teaching/Learning • Programs/Courses, Student Outcomes • Service/Outreach • University, profession, alumni, state, prospective students, families, employers • Scholarship/Research • Productivity/Impact • Workplace satisfaction • Faculty/staff • Financial • Revenues/expenditures
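Ruben's five perspectives can be sketched as a simple data structure. The perspective names follow the slide; the example measures attached to each are illustrative placeholders, not Ruben's actual indicator set.

```python
# Sketch of Ruben's higher-education balanced scorecard as a mapping
# from perspective to example measures. Perspective names follow the
# slide; the measures listed are illustrative placeholders only.
he_balanced_scorecard = {
    "Teaching/Learning":      ["program/course quality", "student outcomes"],
    "Service/Outreach":       ["service to university and profession",
                               "outreach to alumni, state, employers"],
    "Scholarship/Research":   ["productivity", "impact"],
    "Workplace satisfaction": ["faculty satisfaction", "staff satisfaction"],
    "Financial":              ["revenues", "expenditures"],
}

for perspective, measures in he_balanced_scorecard.items():
    print(f"{perspective}: {', '.join(measures)}")
```

The point of the structure is balance: no single perspective (financial, say) is allowed to stand in for institutional quality on its own.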
Measuring Up • State-Level Focus • Preparation • H.S. graduation rate; math/science course taking; math, science, reading, writing proficiency; college entrance exams; AP scores • Participation • H.S.-to-college rate; young adult enrollment; working adult enrollment • Affordability • Family ability to pay at community colleges; family ability to pay at 4-year institutions; need-based financial aid; low-priced colleges; low student debt • Completion • Retention and graduation rates at 2- and 4-year colleges • Benefits • Adults with bachelor’s degree or higher; increased income from BA degree; increased income from some college/2-year degree; population voting; charitable contributions; quantitative literacy; prose literacy; document literacy
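Measuring Up reports a letter grade per state in each graded category. The sketch below uses a simplified 0-100 index and conventional grade cutoffs purely for illustration; the actual report card indexes each state against top-performing states, and the state scores shown here are hypothetical.

```python
def letter_grade(index):
    """Map a 0-100 category index to a letter grade (illustrative cutoffs)."""
    for cutoff, grade in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if index >= cutoff:
            return grade
    return "F"

# Hypothetical state indices for the five graded categories
state_scores = {"Preparation": 84, "Participation": 91,
                "Affordability": 62, "Completion": 78, "Benefits": 88}

for category, idx in state_scores.items():
    print(f"{category}: {letter_grade(idx)}")
```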
Measuring Up Student Learning • All states received an incomplete in the 2000 and 2002 rounds • Several position papers regarding approaches to assessing college-level learning (Peg Miller and Peter Ewell) • Several ongoing studies, e.g., RAND/CAE • Kentucky pilot, using licensing exams; GRE, MCAT, LSAT, etc.; National Adult Literacy Survey; National Survey of Student Engagement • Move toward a college-level NAEP?
Part I Summary • Different people have different views of what defines the quality of higher education • Any attempt to create a comprehensive view will require • Multiple dimensions, multiple perspectives, and multiple measures • Balance between quantitative and qualitative components • But is a comprehensive view necessary, or can we take multiple, less complex views and then consider how well they align?
Part II How do we measure quality in higher education?
Common Methods • Peer Review of Self Study • Benchmarking • Performance Indicators
Peer Review of Self Study • How would you assess the quality of the history major at Harvard University as compared to at Miami-Dade College? • Self-study requires unit to take stock and reflect upon effectiveness in attaining goals, which requires • Having goals that relate to overall mission • Having capacity (infrastructure, resources, and processes) to effect goals • Monitoring effectiveness and adjusting
Peer Review of Self Study • Peer review provides expert and credible judgment sensitive to context • Manifestations • Faculty-led assessment • Academic program review • Specialized (professional and disciplinary) accreditation • Regional accreditation • U.K. External Examiners
Benchmarking Definition • Evaluation against an established standard • A process for evaluating business operations by detailed comparison with those of another business, in order to establish best practice and improve performance
Best Practice • A process, technique, or innovative use of technology, equipment or resources that has a proven record of success in providing significant improvement in cost, schedule, quality, performance, safety, environment, or other measurable factors which impact an organization • Context specific example – substance abuse • "best practices" are those strategies and programs which are deemed research-based by scientists and researchers at: • National Institute for Drug Abuse (NIDA), • Center for Substance Abuse Prevention (CSAP), • National Center for the Advancement of Prevention (NCAP), • Office of Juvenile Justice and Delinquency Prevention (OJJDP), and • Centers for Disease Control and Prevention (CDC).
Best Practice • Stanley Fish, in his Chronicle column: “…practices that had worked for some people in some context where some problem had been identified and was addressed successfully by some solution” • “Best Practices” is itself a practice, an industry focused on itself and equipped with its own internal machinery, including a version of the Academy Awards that allows practitioners to recognize and honor one another publicly
Data and Measures • Comprise the metrics of benchmarking, but are not benchmarking itself • Often mistaken for outcome goals • U.S. News & World Report • Carnegie Classification • More correctly viewed as process indicators • Examples • Graduate program admissions selectivity and yield • Student credit hours per faculty FTE
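The process indicators named above are simple ratios. A minimal sketch with made-up counts shows how they are computed; the resulting numbers are starting points for inquiry into process, not meanings in themselves.

```python
# Two of the process indicators named on the slide, computed from
# hypothetical counts. All input numbers are illustrative only.

def selectivity(admitted, applied):
    """Share of applicants admitted (lower = more selective)."""
    return admitted / applied

def yield_rate(enrolled, admitted):
    """Share of admitted students who actually enroll."""
    return enrolled / admitted

def sch_per_faculty_fte(student_credit_hours, faculty_fte):
    """Student credit hours taught per full-time-equivalent faculty member."""
    return student_credit_hours / faculty_fte

print(f"selectivity: {selectivity(400, 1600):.0%}")          # 25%
print(f"yield:       {yield_rate(240, 400):.0%}")            # 60%
print(f"SCH per FTE: {sch_per_faculty_fte(54000, 180):.0f}")  # 300
```

Note that none of these values says anything by itself about whether teaching or admissions processes are working well; that judgment requires the process context the slide describes.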
Realities of Benchmarking • Often used outside process context • Measures seen as having meaning in and of themselves • How well do we do compared to others? • This is why ratings and ranking systems are inherently invalid • But measures can be starting point for process inquiry
Performance Indicators • History of external mandates • Recent efforts more closely related to internal evaluation and improvement efforts • BSC • Performance reports
Criteria for Effective PIs* • Start with purpose • Align throughout organization (or system) • Align across input, process, output • Coordinate a variety of methods • Use PIs in decision making *Banta, T. W. and Borden V. M. H. (1994). Performance indicators for accountability and improvement. In V. M. H. Borden & T. W. Banta (eds.) Using performance indicators to guide strategic decision making. New Directions for Institutional Research, 82. San Francisco: Jossey-Bass.
Principles for Effective PI Reporting* • Framework: Transparent; Inclusive; Auditable • Content: Complete; Relevant; Sustainable • Quality & Reliability: Accurate; Neutral; Comparable • Access: Clear; Timely *Source: Global Reporting Initiative – Sustainability Reporting Guidelines
PIs as Measures • Inductive – Deductive Cycle
Measurement - Theory • Validity and reliability • Unless very careful attention is paid to one’s theoretical assumptions and conceptual apparatus, no array of statistical techniques will suffice – Blalock, 1982, p.9 • i.e., garbage in, garbage out • e.g., graduation rate, funding per FTE, etc.
APQC MIPO • An American Productivity and Quality Center (APQC) benchmarking study • Measuring Institutional Performance Outcomes (Higher Education)
Effective Performance Measures… • …communicate the institution’s core values • …are carefully chosen, reviewed frequently, and point to action to be taken on results • …may be stimulated by external requirements • …are best used as “problem detectors” to identify areas for management attention and further exploration • …are linked to resource allocation indirectly (non-punitively)
Effective Performance Measures… • …are publicly available, visible, and consistent across the organization • …are best considered in the context of a wider transformation of organizational culture • …take time to develop, require considerable “socialization” of the organization’s members, and are enhanced by stable leadership • …change the role of managers and the ways in which they manage
E.G.: PIs@IUPUI • www.iport.iupui.edu
Planning & Budgeting 1. Mission, Vision, Goals developed 2. Unit goals aligned 3. Annual reports on web 4. Programs based on assessable goals, with performance indicators 5. Biennial planning/budgeting hearings conducted • Evaluation 1. Academic and administrative program reviews 2. Evaluation of process effectiveness 3. Assessment of learning outcomes (in major; in general education) 4. Course evaluations 5. Student assessment 6. Constituent surveys 7. Management information and analysis 8. Program cost analysis 9. Web-based assessment tools 10. Annual campus performance report 11. NCA accreditation • Improvement Implementation (everyone on campus implements goals) 1. Reporting to internal constituents 2. Demonstrating accountability to external stakeholders 3. Applying findings in campus improvement initiatives 4. Proposing improvement initiatives 5. Improving assessment methods (web-based data; electronic portfolios) • [Diagram cycle labels: Assessable Outcomes · Instrumentation · Data Collection & Analysis · Application of Findings · Culture of Evidence]
Part II Summary • Assessing quality requires conceptual and contextual frames • Measures of quality derive from and pertain to the assessment process • They do not have a life in and of themselves • Accountability for quality may have as much or more to do with demonstration of process than of particular outcomes
Part III How do we align approaches to quality assessment across levels?
Aligning Approaches to Quality Assessment Across Levels • How can programs, schools, colleges, universities, systems, states,…each define terms of quality that… • Are appropriate to local contexts and conditions? • Fit within higher-level organizations’ goals and objectives? • What mechanisms promote fit among and between levels? • How does mission differentiation fit in?
Mission Differentiation v. Uniformity • Do current measurement models and accountability systems push 4-year institutions to strive to be either doctoral/research-extensive universities or elite liberal arts colleges? • or push 2-year institutions to become 4-year institutions? • Do systems and states provide appropriate incentives for defining excellence in a variety of ways? • Is faculty culture amenable, or can it become amenable, to multiple dimensions of excellence?
Promoting Diversity of Excellence • U.S. News has a range of ranking categories • National, liberal arts, comprehensive, best value, business, engineering, programs that enhance learning • Institutions like Alverno, Truman State, Portland State, and (I’d like to think) IUPUI have ‘made the map’ on the basis of attention to student learning • These institutions are influencing their states’ conceptions and funding of higher education
Guidelines for Assessing HE Quality • Promote development of standards of excellence at each level • Direct accountability at each level for outcomes directly impacted by that level’s processes • Indirect accountability for sub-level outcomes through demonstration of quality assurance processes
Guidelines (2) • Collaborative review of how it all fits together • Once standards for excellence are negotiated and accepted, they should be accommodated in higher-level goals and objectives • Ensure that, while no institution contributes equally to every outcome, every institution has a place to contribute significantly • Simplification, which is necessary for effective management, should be balanced by rich detail between the lines (or within the hyperlinks)
Guidelines (3) • Distinguish between outcome measures and measures that reflect contributions to outcomes • State-Level Measure: Access to Postsecondary Education • Contributing measures (and who is most likely to contribute best) • Geographic diversity across state (selective, residential universities) • Enrollment of low-income/place-bound students (regional colleges and universities) • Enrollment by members of underserved populations (minority serving institutions, urban universities) • Access by students with academic deficiencies (Community Colleges)
Concluding Questions • How does what we measure relate to what we hope to achieve? • How does how we measure contribute to goal attainment? • How do our goals and related measures align with • The goals and measures of component organizations and units? • The goals and desired outcomes of constituent users and stakeholders?