The Convergence of University Rankings and System Benchmarking An Apparent Paradox of “Rankology”
Questions Two approaches: University Rankings System Benchmarking Are they: Complementary? Competing? Consistent? IREG - Warsaw, 16-17 May 2013
Outline (1) Background: from ranking to benchmarking (2) Method of investigation (3) Results (4) Interpretation and conclusion
(1) University Rankings
U Rankings: a Polarizing Exercise
U Rankings: hated/loved, criticized/commended, threatening/stimulating, but proliferating (“here to stay”)
Ph. Altbach’s advice [“Don’t take too much notice of rankings” (UWN, March 23, 2013)]: unlikely to be widely followed
More pitfalls discovered, uncovered, elucidated => more attempts to improve methods
U Rankings: the Disease
Methodological caveats:
• Biases: research, English, STEM
• Composite indicators: weighting => elitism
• Subjective (reputation) / non-transparent
Dangerous use (“misuses”, “abuses”):
• Universities: (1) focus on competition with others instead of own improvement / affects strategic planning; (2) focus on biased criteria (research)
• Policy makers: focus on a few WCUs instead of the whole system
• Students: impact on university selection
• Overall: impact on financing
• Commercialization: (crowded) market
From Ranking to Benchmarking
“If Ranking is the Disease, Is Benchmarking the Cure?” (Jamil Salmi, Sunita Kosaraju. Evaluation in Higher Education, Vol. 5, no. 1, June 2011)
“Rankings: Neither a Disease nor a Cure” (Ph. Altbach, UWN, 2013)
(2) System Benchmarking
[Diagram: the TE (tertiary education) system (governance, resources, access, quality control, private providers, equity) set within its economic, social & technological environment]
Benchmarking: Objective & Criteria
Objective: assess the strength, health and performance of countries’ tertiary education systems
Criteria: resources, inputs, governance, outputs and outcomes of the system (access, equity, quality, relevance)
Benchmarking: Main Initiatives
• SABER: System Approach for Better Education Results (World Bank): still under construction
• U21 (Universitas 21 / University of Melbourne): most recent, comprehensive available case (see below)
• Benchmarking University Governance (World Bank, MENA): hybrid
• AHELO: Assessment of Higher Education Learning Outcomes (OECD): still under experimentation
Hypothesis
Benchmarking developed in reaction to Rankings.
The objectives, level of observation and criteria of Benchmarking and Ranking are quite different.
=> Shouldn’t they yield different results?
Method (1)
1/ Select four of the most popular university rankings: ARWU, THE, QS, Webometrics
2/ Pick the most recent system benchmarking: U21
3/ Compare their results
Method (2)
Issue: how to compare universities and systems?
Solution: translate university rankings into country rankings.
Method: from the number of top universities in a country to the number of tertiary-aged youths in that country potentially served by its top universities (i.e., the “supply” of top universities).
NB: no correlation between the 2 measures
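The translation step above can be sketched in a few lines. Note that all figures below are hypothetical, invented purely for illustration; the deck does not publish the underlying counts or population data:

```python
# Sketch of the "supply" translation: number of top-400 universities in a
# country divided by its tertiary-aged population (here in millions, so the
# result reads as "top universities per million tertiary-aged youths").
# All numbers are hypothetical placeholders.

top400_counts = {       # hypothetical counts of top-400 universities
    "Finland": 5,
    "Poland": 2,
}
tertiary_aged_pop_m = { # hypothetical tertiary-aged population, in millions
    "Finland": 0.31,
    "Poland": 2.3,
}

def supply(country: str) -> float:
    """Top-400 universities per million tertiary-aged youths."""
    return top400_counts[country] / tertiary_aged_pop_m[country]

for c in top400_counts:
    print(f"{c}: {supply(c):.1f} top universities per million tertiary-aged youths")
```

The point of the normalization is that a country with many universities but a huge youth population (e.g. China or India in the supply table below) scores low, while a small country with a few top universities scores high.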
NB: [Chart: Number of Top 400 U vs. Supply of Top 400 U (THE), by country rank]
Method (3) Quick look at the 4 leagues selected The “sample”: Top 400 universities
Comparing the results of the 4 Rankings (1) Correlation between results of the 4 leagues: (Number of top universities in each country)
Comparing the results of the 4 Rankings (2) Correlation between results of the 4 leagues: (1) number of top universities in each country
Comparing the results of the 4 Rankings (3) Correlation between results of the 4 leagues: (2) supply of top universities
Supply: Nbr of top U / TE-aged population

The first five countries:
     Country        QS    ARWU   THE   WEBO
 1   Finland       16.1    6.9  11.5    9.2
 2   New Zealand   14.5    4.8  14.5    2.4
 3   Switzerland   13.4   11.8  13.4   11.8
 4   Ireland       13.3    8.0  13.3    5.3
 5   Denmark       11.5    9.2  11.5    9.2

The last five countries:
     Country        QS    ARWU   THE   WEBO
30   Poland         0.3    0.5   0.5    0.8
31   Mexico         0.1    0.1   0.1    0.1
32   Brazil         0.1    0.2   0.1    0.4
33   China          0.1    0.1   0.1    0.2
34   India          0.04   0.01  0.02   0.01
Benchmarking: “U21” Method (1)
1/ A priori selection of 48 countries (+2)
2/ Assessment of countries’ performance based on one overall indicator and 4 “measures”: (1) Resources (2) Environment (3) Connectivity (4) Output
Benchmarking: Method (2)
(1) Resources (25%): 5 indicators on expenditures
(2) Environment (25%): 2 indicators on gender balance, 1 indicator on data quality, 3 indicators on the policy and regulatory environment, 1 homegrown index on internal governance
Benchmarking: Method (3)
(3) Connectivity (10%): 2 indicators on the degree of internationalization (students & research)
(4) Output (40%): 5 indicators on research, 1 indicator on the probability that a person attends a top-500 university (*) based on ARWU…, 1 indicator on enrollment, 1 indicator on the tertiary-educated population, 1 indicator on unemployment among the tertiary-educated population
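The four measure weights above (25/25/10/40) define the U21 overall indicator as a weighted composite. A minimal sketch, using hypothetical measure scores on a 0-100 scale (the weights are from the slides; everything else is invented for illustration):

```python
# Weighted composite of the four U21 measures.
# Weights are taken from the slides; the example scores are hypothetical.

WEIGHTS = {
    "resources":    0.25,
    "environment":  0.25,
    "connectivity": 0.10,
    "output":       0.40,
}

def overall_score(measures: dict) -> float:
    """Weighted sum of the four measure scores (each assumed on a 0-100 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(WEIGHTS[m] * measures[m] for m in WEIGHTS)

example = {"resources": 80.0, "environment": 90.0,
           "connectivity": 60.0, "output": 75.0}
print(overall_score(example))
```

Because Output alone carries 40% of the weight, and its research indicators dominate that measure, the composite leans toward research performance, which is one route by which benchmarking results can converge with research-biased university rankings.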
Benchmarking: Links between the 5 measures
Comparing Results of Rankings and Benchmarking (1a)
Countries Overlap between UR and SB:
U21 & THE: 37 common countries
U21 & QS: 40 common countries
U21 & ARWU: 37 common countries
U21 & WEBO: 41 common countries
=> Essentially the same pool of countries
Comparing Results of Rankings and Benchmarking (1b)
Comparing Results of Rankings and Benchmarking (2)
Comparing Results of Rankings and Benchmarking (3) U21 (Overall) and THE Rankings (R² = 0.74)
Comparing Results of Rankings and Benchmarking (4) U21 (Resources) & ARWU (Supply): R² = 0.78
Conclusions / Interpretation
1/ Hypothesis not confirmed: a/ same set of countries b/ similar results
2/ Two types of explanations: a/ methodological b/ structural
Epilogue
• System Benchmarking ends up ranking countries
• Boundaries between UR and SB are blurred
• SB suffers common symptoms with UR
• Convergence of the two streams of “Rankology” not surprising
• Benchmarking needs to expand its pool of countries to become more relevant
Take Away
Thank You