
Enhancing Higher Education Programs through Institutional Research

This research focuses on the essential role of Institutional Research (IR) in making data-driven decisions to improve higher education programs. The Office of Institutional Research at Pratt Institute provides valuable information for evaluating and planning educational initiatives. By analyzing data from sources such as student records, surveys, and external databases, IR supports teaching and development by identifying trends, assessing academic performance, and pinpointing areas for improvement. Through a range of examples and case studies, this research demonstrates how IR can help educational institutions optimize programs, enhance student outcomes, and advance institutional objectives.

Presentation Transcript


  1. INSTITUTIONAL RESEARCH AND DECISION-MAKING • Vladimir Briller, Ed.D., Executive Director of Strategic Planning and Institutional Research, Pratt Institute, New York, U.S.A. • Higher School of Economics, Moscow, October 18, 2012

  2. Institutional Research • Institutional Research is the practice whereby an institution assesses itself, its activities and its position within a given milieu. Higher Education Institutional Research offices conduct these assessments with the objective of serving as a comprehensive resource for information about the institution.

  3. Institutional Research at Pratt • The Office of Institutional Research (IR) at Pratt Institute is part of the President's Office. The IR mission is to support data-driven decision making in the evaluation and planning efforts of the Institute's senior administration by initiating and conducting studies on Pratt's policies, academic programs, and environment.

  4. The IR office: • Gathers information from internal and external sources (e.g., students, parents, faculty, staff, other institutions, and external agencies) for assessment and strategic planning. • Provides information and projections needed for planning. • Coordinates Pratt's response to required external reports, including the federal IPEDS report, NYSED and NASAD reports, retention and graduation rate studies, etc.

  5. IR Office (continued): • Provides information required for certain institutional affiliations, such as accreditation reports, AICAD, and any special research projects in which Pratt Institute chooses to participate. • Responds to external information requests and surveys that are determined to be of value to Pratt Institute.

  6. Assignment • You have decided that a new program should be opened or an ineffective one closed. What information will you request (and from whom) to make an educated decision?

  7. Institutional Research • The data resources usually comprise information derived from surveys, student records and other internal record systems, sectoral and national databases and reports, and published research. • The actual assessments, analyses and tested hypotheses cover issues requiring ongoing monitoring as well as the exploration of emerging issues to inform an institution’s decision-making with regard to its own development.

  8. IR Support of Teaching and Development (examples) • Grade ranges applied in particular subjects over time and correlation with changing characteristics in student cohorts with regard to prior achievement. • The impact of separate components (e.g. modules) on overall award classifications over time. • The effect of the size of continuous assessment components on overall grades awarded.

  9. IR Support of Teaching and Development (examples) • The entry standard below which students have a substantially increased risk of failure • The importance of mathematical ability in overall performance in Science and Engineering • Application, acceptance, registration and withdrawal figures for programs reflecting demand, perception and experience.
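
The grade-trend analysis and the entry-standard analysis above reduce to joining grades to entry records and summarizing by cohort. Below is a minimal sketch of both, assuming a hypothetical extract with entry_score, final_grade, and cohort_year columns; none of the file or column names come from the presentation.

```python
# Minimal sketch, hypothetical data: track the link between prior achievement
# and awarded grades across cohorts, then locate a risky entry band.
import pandas as pd

records = pd.read_csv("student_records.csv")  # entry_score, final_grade, cohort_year

# Per-cohort correlation between entry achievement and final grade: if awarded
# grade ranges drift while this correlation holds, changing cohort preparation
# is a likelier driver than changed assessment practice.
by_cohort = records.groupby("cohort_year").apply(
    lambda g: g["entry_score"].corr(g["final_grade"])
)
print(by_cohort)

# Failure rate by entry-score quintile, to locate the entry standard below
# which failure risk rises substantially (60 is an illustrative pass mark).
records["failed"] = records["final_grade"] < 60
records["entry_band"] = pd.qcut(records["entry_score"], 5)
print(records.groupby("entry_band", observed=True)["failed"].mean())
```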

  10. IR Support of Teaching and Development Example of a faculty question: Failure rates have risen dramatically in one of my courses, but I have not changed my methods and I can’t see why this has happened. Possible IR-based explanations (see the sketch after this list): • Changes in entry requirements • Changes in actual pre-entry educational achievement of the cohort.

  11. IR Support of Teaching and Development • Achievement in core pre-entry subjects such as English or Mathematics • Changes in class size • Changes in origins of class (are all students in the class native English speakers?) • Gender, Age, Educational and socioeconomic characteristics, and attendance type profiles • Range of grades used over time in assessing the course
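
One way to check these candidate explanations together is a simple model on historical course data. The sketch below is an illustration, not the presenter's method; the columns (failed, entry_score, class_size, native_speaker, cohort_year) and the file name are hypothetical.

```python
# Minimal sketch, hypothetical data: regress failure on entry achievement and
# class context to see which candidate explanation carries statistical weight.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("course_history.csv")  # one row per student per offering

# Logistic regression with cohort year as a control; large, significant
# coefficients point to the explanations worth pursuing further.
model = smf.logit(
    "failed ~ entry_score + class_size + native_speaker + C(cohort_year)",
    data=df,
).fit()
print(model.summary())
```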

  12. IR Support of Teaching and Development Example of a Dean/Department Chair question: Student retention in my program is poor; I understand some of the reasons why, but I want to address the problem and need a comprehensive picture of what is happening. Possible IR-based actions: Analyze: • Student profile now, how it has changed, and how it is likely to change in the future • Which program elements are contributing most consistently to non-completion (see the sketch below)

  13. IR Support of Teaching and Development Analyze: • The students’ perception of the program and overall college experience • Whether student expectations of the program were realistic prior to entry

  14. IR Support of Teaching and Development Analyze: • Whether entry requirements need to be recalibrated based on changes in standards or curricula outside the Institution • Whether a change in program content, combined with extra support in problem areas, would help students to progress.
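
A minimal sketch of the element-level analysis referenced above, assuming a hypothetical extract with one row per student-course outcome and columns course, passed (0/1), and completed (0/1):

```python
# Minimal sketch, hypothetical data: rank program elements by how strongly
# failing them is associated with non-completion of the program.
import pandas as pd

df = pd.read_csv("program_records.csv")  # course, passed (0/1), completed (0/1)

# Program-completion rate conditional on passing vs. failing each element;
# the widest gaps flag the elements contributing most to attrition.
pivot = df.pivot_table(index="course", columns="passed",
                       values="completed", aggfunc="mean")
pivot["completion_gap"] = pivot[1] - pivot[0]
print(pivot.sort_values("completion_gap", ascending=False).head(10))
```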

  15. Case Study: Attitudes, experiences and characteristics influencing student degree completion • Focus on Freshmen • The project uses information from the student database and information derived from a series of three student surveys. • The surveys track changing attitudes as well as academic progress through the first year. • The study also eliminates factors that do not actually have any significant effect on student achievement (a screening sketch follows the study goals below).

  16. Study Goals • Explore a wide range of aspects of the experience of undergraduate students with the specific purpose of identifying factors that may influence program completion. • Identify the factors and relationships determining the qualitative nature of the student experience. • Explore the relationship between pre-entry expectations and reality of the university experience. • Identify factors affecting student retention, with a view to focusing efforts and resources on the most potent influencing factors.
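
The screening step mentioned in the case study could look like the sketch below, which compares completers and non-completers on each candidate factor with Welch's t-test; the factor names and the merged file are hypothetical.

```python
# Minimal sketch, hypothetical data: screen survey factors for association
# with first-year completion so non-significant factors can be set aside.
import pandas as pd
from scipy import stats

df = pd.read_csv("freshman_panel.csv")  # survey responses joined to records

factors = ["financial_concern", "hours_working", "self_confidence"]
for f in factors:
    completed = df.loc[df["completed_year1"] == 1, f].dropna()
    withdrew = df.loc[df["completed_year1"] == 0, f].dropna()
    t, p = stats.ttest_ind(completed, withdrew, equal_var=False)  # Welch's t-test
    print(f"{f}: t = {t:.2f}, p = {p:.3f}")
```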

  17. First Survey: Point of Admission • Demographics • Self-evaluation of personal characteristics, including persistence, mathematical and writing ability, ambition, academic ability and self-confidence, • Factors affecting the decision to study at University, • Level of prior understanding of the program, • Anticipated time spent on specified work, study and social activities, • Difficulties anticipated,

  18. First Survey: Point of Admission • Perceived locus of responsibility for learning and the role of the lecturer, • Priorities while at University, academic ambitions and career goals, • Family educational background, • Financial concerns, • Perception of the experience of studying at higher education level in practical terms, and • The anticipated best and worst elements of the experience of study at University.

  19. Second Survey – Mid-year (compares student responses with the first survey) • Self-evaluation of characteristics, • Level of prior understanding of the program, • Actual time spent on specific activities, • Difficulties encountered, • Perceived locus of responsibility for learning and the role of the lecturer, • Priorities while at University, academic ambitions and career goals,

  20. Second Survey – Mid-year • Financial concerns, • The best and worst elements of the experience thus far, • Self-identified changes in perception of study at higher education level having spent one semester in the University, • Support services accessed, and • Integration into campus life/sense of belonging.

  21. Third Survey – End-of-Year • Academic history including high school results and SAT (ACT) scores, and level of preference for the institution where the participants were accepted, • Exam results achieved through the year, including continuous assessment grades, • End-of-year results, • Other official items of record including withdrawal and reasons for withdrawal, changes in optional program elements and transfer, and • Completion rates at the institutional and program level.
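
A minimal sketch of how the three survey waves might be joined to the official record for this kind of study, assuming a shared student_id key; the file names are hypothetical.

```python
# Minimal sketch, hypothetical files: build one analysis panel per student by
# joining the survey waves to the record system on a shared ID.
import pandas as pd

records = pd.read_csv("student_records.csv")  # SAT/ACT, grades, withdrawal
wave1 = pd.read_csv("survey_admission.csv")   # point of admission
wave2 = pd.read_csv("survey_midyear.csv")     # mid-year
wave3 = pd.read_csv("survey_endyear.csv")     # end of year

panel = (
    records
    .merge(wave1, on="student_id", how="left", suffixes=("", "_w1"))
    .merge(wave2, on="student_id", how="left", suffixes=("", "_w2"))
    .merge(wave3, on="student_id", how="left", suffixes=("", "_w3"))
)
```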

  22. Other Cases/Studies • Factors affecting student retention and graduation • Barrier Courses • Placement tests and their impact on subsequent student course performance • SAT scores as predictors of student persistence
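
For the last study listed, a first-pass look is the persistence rate by SAT band, before any formal modeling; the column names and band edges below are hypothetical.

```python
# Minimal sketch, hypothetical data: persistence rate by SAT band.
import pandas as pd

cohort = pd.read_csv("entering_cohort.csv")  # sat_score, persisted (0/1)
cohort["sat_band"] = pd.cut(cohort["sat_score"],
                            bins=[400, 800, 1000, 1200, 1400, 1600])
print(cohort.groupby("sat_band", observed=True)["persisted"].mean())
```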

  23. Enrollment Management

  24. STUDENT RECRUITMENT The Educational Pipeline • Understanding Student Choice: ~ Marketing studies that determine what factors influence students to apply, become admitted, and enroll at the institution. ~ Identifying databases and software analysis tools that facilitate the institution’s ability to locate, recruit and attract students in the pipeline.

  25. STUDENT RECRUITMENT ~ Generate a trend analysis that compares characteristics of this year’s applicants with applicants from previous years at the same point in time. ~ Compare admitted students who ultimately chose to enroll with those who did not. ~ Provide institutional data to college ranking services. ~ Provide data about student and parent perception of the institutional image as compared with peers.
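
The same-point-in-time trend analysis might be sketched as follows, assuming a hypothetical applications extract with app_date, cycle_year, student_id, and gpa columns; the day-60 snapshot cutoff is illustrative.

```python
# Minimal sketch, hypothetical data: compare this cycle's applicant pool with
# prior cycles at the same calendar point.
import pandas as pd

apps = pd.read_csv("applications.csv")  # app_date, cycle_year, student_id, gpa
apps["app_date"] = pd.to_datetime(apps["app_date"])

# Restrict every cycle to applications received by the same point in time.
snapshot = apps[apps["app_date"].dt.dayofyear <= 60]
print(snapshot.groupby("cycle_year").agg(
    applicants=("student_id", "nunique"),
    mean_gpa=("gpa", "mean"),
))
```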

  26. STUDENT RECRUITMENT • Yield rates: ~ Admit yield rates ~ Enrollment yield rates • Enrollment projections: • What are the needs of the institution? • What are the dimensions of the analysis? • What is the time horizon? • What methodology should be used? • How should qualitative and quantitative input be balanced?
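
The two yield rates are simple ratios, and a naive enrollment projection follows directly from them; the counts below are illustrative, not Pratt data.

```python
# Minimal sketch, illustrative counts: admit rate, enrollment yield, and a
# naive projection that holds yield constant (real projections would also
# model the drivers of yield).
applied, admitted, enrolled = 6200, 3100, 850

admit_rate = admitted / applied    # share of applicants admitted
admit_yield = enrolled / admitted  # share of admits who enroll

planned_admits = 3300
projected_enrollment = planned_admits * admit_yield
print(f"admit rate {admit_rate:.1%}, yield {admit_yield:.1%}, "
      f"projected enrollment {projected_enrollment:.0f}")
```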

  27. STUDENT RECRUITMENT • Financial Aid (FA) • What is the college’s FA policy? Who determines the policy? How well integrated are the admissions and FA policies? • What types of aid are available? How do students qualify? • How is FA packaged? How and when are students offered aid? How is it disbursed? • How are scholarships, loans and student employment balanced? • How are recruitment and retention functions of aid balanced?

  28. STUDENT RECRUITMENT • What statistics does the FA office report? (examples) ~ How many students receive aid? New students? Continuing students? ~ How many receive scholarships? Loans? Work-study awards? ~ How many receive need-based aid? How many show unmet need? ~ How much FA is disbursed? What is the net tuition revenue? ~ What is the price of attendance?

  29. STUDENT RECRUITMENT ~ What is the level of student indebtedness? ~ How do those statistics vary by student demographics and other characteristics? ~ What are the trends over time? National Concerns: • The interplay between FA, tuition and college price overall. • The impact of federal and state policies on FA. • Use of “discounting” for effective recruiting. • The rapid escalation of student loans and indebtedness.
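
Net tuition revenue and the tuition discount rate, the statistics behind the discounting concern above, reduce to simple arithmetic; the figures below are illustrative only.

```python
# Minimal sketch, illustrative figures: net tuition revenue (NTR) and the
# tuition discount rate.
sticker_price = 48_000
enrolled = 700
institutional_aid = 9_800_000  # aid funded from the institution's own budget

gross_tuition = sticker_price * enrolled
net_tuition_revenue = gross_tuition - institutional_aid
discount_rate = institutional_aid / gross_tuition
print(f"NTR ${net_tuition_revenue:,}, discount rate {discount_rate:.1%}")
```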

  30. STUDENT FLOW • Academic Preparation • Selecting Students • Student Placement • Other Academic Assets • The Curriculum • Types of Studies • Campus Climate • Academic and Student Support Programs • Formative (Process) Evaluation • Summative (Outcome) Evaluation

  31. STUDENT FLOW • Graduation and Retention Rates • Increasing the institution’s retention and graduation rates • Increasing transfer rates (in) and baccalaureate degree completion of associate degree students • Reducing time to graduation • Closing the gap between underrepresented groups and other students • Increasing academic preparation – the link between recruitment & retention • Implementing & evaluating efficient & effective retention programs.
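
Several of these goals track one statistic per student group. A minimal sketch of first-to-second-year retention by group, with hypothetical columns, that also reports the gap the gap-closing goal above targets:

```python
# Minimal sketch, hypothetical data: first-to-second-year retention by student
# group, and the gap between the highest and lowest groups.
import pandas as pd

cohort = pd.read_csv("fall_cohort.csv")  # group, returned_year2 (0/1)
rates = cohort.groupby("group")["returned_year2"].mean()
print(rates)
print(f"gap between highest and lowest group: {rates.max() - rates.min():.1%}")
```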

  32. STUDENT FLOW • Descriptive Data • Multivariate Analyses • Qualitative Methods • Survey Research • Interviews • Focus Groups • Peer Data

  33. Student Flow • Beyond Graduation: • The overall quality and training of an institution’s graduates/students. • The preparation of graduates in specific areas: writing skills, technical skills, quantitative reasoning, oral communication, leadership & teamwork. • The accessibility of the campus, and its students, to the employer for interviewing. • Trends in past hiring and expectations for the future.

  34. Assessment • Operational Terms • Drivers of assessment • Assessment of institutional effectiveness • Assessment of student learning outcomes • Blending assessments • Benefits and cautions • Questions and concerns

  35. Assignment • Please list all the factors you use to evaluate students / faculty / administrators.

  36. Institutional Research and Assessment Assessment is the process of asking and answering questions that seek to align our stated intentions with documentable realities. As such, in higher education, it deals with courses, programs, policies, procedures, and operations.

  37. Evaluation: An Operational Definition • Evaluation focuses on individual performance in the sense of job completion and quality, typically resulting in merit raises, plans for future improvement, or—in less satisfying cases—probation and possibly firing.

  38. Assessment vs. Evaluation • Assessment focuses on the work to be done, the outcomes, and the impact on others—not on the individuals doing the work. • Evaluation focuses on the work of the individuals—their contributions, effectiveness, creativity, responsibility, engagement, or whatever factors the organization deems most desirable.

  39. Assessment of Institutional Effectiveness vs. Student Learning • Institutional effectiveness = the results of operational processes, policies, duties and sites—and their success in working together—to support the management of the academy • Student learning = the results of curricular and co-curricular experiences designed to provide students with knowledge and skills

  40. What or who is driving assessment? • Accreditors… • charged with distinguishing reputable from non-reputable institutions and programs • charged with checking on practices that affect the viability and sustainability of the institution and its offerings • represent disciplinary and institutional interests

  41. Assessment drivers (cont’d.) • The public: “Ivory Tower,” liberal bias, ratings/rankings • Legislators: responsive to citizens’ concerns about quality, costs, biases….or? • Prospective faculty: Quality and meaningful contributions to students’ lives • Prospective parents: real learning and preparation for careers • Prospective students: How will I measure up? And what kind of job can I get when I graduate? • Funding agencies/foundations: evidence of commitment to learning and knowledge and evidence of [prior] success

  42. Higher Education Realities • Competitive nature of higher education • National rankings • Institutional research and data • Marketing • Niche markets • Tuition Costs • Consumer attitudes of students: learning outcomes and institutional effectiveness

  43. Matters of Institutional Quality • Can we justify costs/prices of attendance? • Can we verify the quality of our educational offerings in measurable terms? • Can we verify the effectiveness of operational contributors to a sustainable educational experience? • Can we use data and other findings to improve the quality of our educational and operational offerings? • Can we use those findings to align resources (financial, staff, curricular, co-curricular) to enhance desired outcomes?

  44. Sites of Institutional Effectiveness • Processes [existence and transparency] • Enrollment: Admissions, financial aid, registration • Curricular: Advising, progress toward degree completion • Budgeting: operations/salaries; capital; bond ratings and ratios; endowment management; benefits; etc. • Planning: strategic planning, compact planning, curricular planning, etc. • Judicial: education/training, communication, sanctions, etc. • Residence Life: housing selection, training for RAs, conflict resolution/mediation, etc.

  45. Sites of Institutional Effectiveness • Units/Offices of operations • Advancement • Admissions, Bursar, Registrar • Center for Advising, Academic Support, etc. • Campus Safety • Maintenance • IT • Institutional Research • Athletics • Student Engagement

  46. The Assessment Cycle: Key Questions for Institutional Effectiveness • What services, programs, or benefits should our offices provide? • For what purposes or with what intended results? • What evidence do we have that they provide these outcomes? • How can we use information to improve or celebrate successes? • Do the improvements we make work?

  47. What are we looking for? EXAMPLES of evidence: • Our admission of students for whom our institution is the first choice has risen 20%. • 90% of students report satisfaction with the housing selection process. • Four faculty and two student committees participated in the last strategic planning cycle. • Overall, faculty, staff, and students report feeling safe on campus, following the new Campus Safety Improvement initiatives.

  48. Where do we seek improvement [and what evidence will help us]? • We need to raise the number of students who choose our institution as their first choice to 50% by 2017. • All faculty committees will be invited to participate in the next planning cycle. • Students (95%) will report feeling safe on campus and in its neighborhood. • Fifty percent of credit-hours will be taught by full-time faculty.

  49. What qualities point to institutional effectiveness? • A well-articulated set of processes for critical functions • A clear line of responsibility and accountability for critical functions • An alignment of the importance of the function and sufficient resources (staff, budget, training, etc.) to support the function • Evidence of institution-wide knowledge of those critical functions, processes, and lines of responsibility

  50. What kinds of evidence point to institutional effectiveness? • Well-managed budgets • Accreditation and governmental compliance • Clearly defined and supported shared governance (board, president, administration, faculty, staff, and students) • Communication pathways and strategies [transparency] • Consensus on mission, strategic plan, goals, priorities, etc. • Student (and other constituencies’) satisfaction
