League and Ranking Tables and their Influence on Graduate Surveys Presentation to Victorian Statistical Officers (November 2004) by AJ Calderon Institutional Research http://www2.rmit.edu.au/departments/planning/ircu/ircu_home.php
In this presentation… • I focus on the publication of league and ranking tables and how students, institutions and governments are using them to ascertain the relative standing of institutions. As institutions compete for students and resources, prestige and relative rankings are gaining prominence. • I then examine how graduate surveys such as the CEQ are being used to rank institutions and explore whether these comparisons are valid and meaningful.
Why interest in league and ranking tables? • Massification of higher education (increase in the number of institutions and students) • Commodification of higher education • Tighter competition for students and resources
Key Messages - League and ranking tables are here to stay! - Rankings and course guides are becoming more influential among students and parents - The critical issue is how such rankings might best be constructed: what sort of data should be included? - Governments are interested in increasing accountability via performance indicators (rankings are an attractive vehicle because they are cost neutral for governments, with the compliance burden falling on institutions)
Evolution of University Rankings - Ranking and league tables began with “America’s Best Colleges”, published by U.S. News and World Report in 1983. • In Australia, Dean Ashenden and Sandra Milligan first published the Good Guide in 1992 – it is now published by Hobsons Australia.
Evolution of University Rankings – contd – Elsewhere - The Times Higher Education Supplement has published the UK’s League Table of Universities since 1993. - The Sunday Times has published the UK’s University Guide since 1995. - Perspektywy (Poland) first published the Polish ranking of universities in 1992. - The Centre for Higher Education Development has collected and analysed the data for Germany’s ranking league of universities, published in Stern, since 1998.
Evolution of University Rankings – contd – Elsewhere • The Universities Guide in the Netherlands has been published since 1997. • Maclean’s magazine publishes the Ranking of Canadian Universities, based on graduate surveys and universities’ responses. • SwissUp has published a ranking of Swiss universities by discipline since 2001.
Evolution of University Rankings – contd – Asia • Asiaweek published “Asia’s Best Universities” (last edition 2000). • Asiaweek also published “Asia’s Best MBA Schools” and the “Science and Technology Schools Ranking”. • Asia-Inc published an MBA Survey of Business Schools in Asia (2001). • The Australian Financial Review published a “Survey of Australia’s MBA Schools” in 2002.
Evolution of University Rankings – contd – Australia • Standard & Poor’s Financial Rating of [selected] Australian Universities (for a mere $40k fee). • The Australia and New Zealand Graduate Careers Survey 2004, sponsored by The Australian and NZ Herald newspapers. • Melbourne Institute Index of the International Standing of Australian Universities (Nov 2004), which combines survey data from university CEOs with data from the Shanghai Jiao Tong University ranking.
Evolution of University Rankings – contd – ‘Worldwide’ Perspectives • The Times Higher Education Supplement published its “World University Rankings” in November 2004, rating the world’s top 200 universities. • The Institute of Higher Education (Shanghai Jiao Tong University, China) has published an “Academic Ranking of World Universities”.
What do these surveys measure/methodology? • Topics covered range from teaching quality, student recruitment and graduate destinations to publications, income, investment, and staff and student surveys. • Overall ranking sometimes relies on self-assessment: the CEO/respondent ranks universities other than their own. • Methodology often changes from year to year; some data are gathered from external agencies and government departments. • Data are often weighted, but it is frequently unclear how; rankings are sometimes based on a single item or score and sometimes derived from a set of items (see the sketch below). • Diversity of approaches: some are discipline or course/program based; others are institution based.
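To make the weighting point concrete, the sketch below shows one common way a composite ranking could be built: normalise each indicator across institutions, apply weights, and sort by the resulting score. All institution names, indicator values and weights here are hypothetical, not drawn from any actual guide's methodology; published rankings rarely disclose this level of detail.

```python
# Minimal, hypothetical sketch of a weighted composite ranking.
# Indicator names, values and weights are illustrative placeholders only.

institutions = {
    "Uni A": {"teaching_quality": 72.0, "graduate_employment": 85.0, "research_income": 60.0},
    "Uni B": {"teaching_quality": 80.0, "graduate_employment": 78.0, "research_income": 70.0},
    "Uni C": {"teaching_quality": 65.0, "graduate_employment": 90.0, "research_income": 55.0},
}

# Weights are rarely disclosed by publishers; these are assumed values.
weights = {"teaching_quality": 0.4, "graduate_employment": 0.4, "research_income": 0.2}


def min_max_normalise(values):
    """Rescale raw indicator values to 0-1 so indicators in different units are comparable."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {name: (v - lo) / span for name, v in values.items()}


# Normalise each indicator across institutions, then form the weighted composite score.
normalised = {
    ind: min_max_normalise({name: data[ind] for name, data in institutions.items()})
    for ind in weights
}
composite = {
    name: sum(weights[ind] * normalised[ind][name] for ind in weights)
    for name in institutions
}

# Rank institutions by composite score, highest first.
for rank, (name, score) in enumerate(sorted(composite.items(), key=lambda x: -x[1]), start=1):
    print(f"{rank}. {name}: {score:.3f}")
```

Note how sensitive the final order is to the choice of weights and normalisation; this is precisely why the observation above about undisclosed weighting matters.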
A few observations… • Rankings exist for a range of activities across industries (e.g. financial services, health services and top companies). • Institutions at the top tend to accept the rankings, while those at the bottom question their validity and meaning. • Institutional responses are often unco-ordinated and constructed on spurious data. • Definitions are often inconsistent and poorly constructed (e.g. Asiaweek’s survey items on academic salaries and entry standards).
A few observations – contd – II Ranking based on Course Experience Questionnaire • Results from the Good Guides cannot be reproduced: the methodology has been built up piecemeal over the years. • RMIT IRCU’s own findings from analysing CEQ scales in an attempt to mirror the Good Guides. Ranking based on Graduate Destination Survey • Problems with the interpretation of employment figures, standards and definitions. • Problems with salary data – again, results cannot be reproduced. • The Good Guides’ methodology is unclear or unavailable.
Rankings and Graduate Surveys • We are facing survey fatigue (not only amongst graduates but also at the institutional level). • Prospective students are becoming more aware of rankings and rating surveys, which are promoted by the media; the extent to which these influence student choice is unknown. • Domestic students are less likely than international students to pay attention to rankings, yet more likely to respond to graduate surveys.
Rankings and Graduate Surveys – contd – II • The interconnection between rankings and graduate surveys is not clear. • Survey data play a significant role in determining rankings, but such use is often flawed. • Government uses graduate surveys in an ad hoc manner yet adds pressure on institutions to increase accountability, even though the methodology is sometimes flawed.