Rankings from the perspective of European universities
Tia Loukkola, Director of Institutional Development
EUA General Assembly, April 2015
EUA’s work on rankings in short
• 2009-2010 Council working group
  • Two reviews with main focus on ranking methodologies:
    • 2011 Global University Rankings and Their Impact
    • 2013 Global University Rankings and Their Impact – Report II
• 2012-2015 RISP project
  • 2014 Rankings in Institutional Strategies and Processes: Impact or Illusion?
• Mapping EUA members’ participation in U-Multirank
  • 2015 Report on the experiences from the first round
Summary of key RISP findings
• While highly critical of rankings, HEIs still use them to:
  • fill information gaps
  • benchmark against other institutions
  • inform institutional decision-making
  • develop marketing material
• Institutional processes affected by rankings fall into four categories:
  • mechanisms to monitor rankings
  • clarification of institutional profile and adaptation of core activities
  • improvements to institutional data collection
  • investment in enhancing institutional image
Conclusions from RISP
• Institutions need to improve their capacity to generate comprehensive, high-quality data and information:
  • to underpin strategic planning and decision-making
  • to provide meaningful, comparative information about institutional performance to the public
• Rankings can be an important ingredient in strategic planning; nevertheless, it is vital that each university stays “true” to its mission and is not “diverted or mesmerised” by rankings
• The report ends with guidelines on how institutions could use rankings for strategic purposes
Background to UMR survey
• A variety of views among the membership
• EUA represented on the Advisory Board
• First UMR results published in May 2014
• A survey to map the views and experiences of individual members on the initiative
EUA members in UMR
• 47.1% of EUA members participated in 2014
• 59.4% of EUA members participated in 2015
UMR in short
• Multidimensional ranking with seed funding from the European Commission
• Indicators cover five areas: teaching and learning, research, knowledge transfer, international orientation, and regional engagement
• Data sources:
  • data provided directly by the institutions
  • international bibliometric and patent databases
  • student surveys completed by students at participating institutions
• Fields covered by UMR:
  • 2014: business studies, electrical engineering, mechanical engineering, and physics
  • 2015: psychology, computer science, and medicine
Survey results: 85 universities that actively participated in data provision
Resources required
• Considerable resources used to provide data:
  • only 12% had fewer than 5 persons involved
  • half involved 5-15 persons
  • 29.2% used more than 30 working days
  • 20% spent fewer than 10 working days

“The field data collection process is very time consuming. There were some difficulties in interpreting some definitions and to adjust them to the national context.” (University from Portugal)
Using results
• 60% of respondents are using UMR results for some purpose
Survey results: 7 universities included in UMR through publicly available data

“We cannot see how U-Multirank can overcome the differences in how data is interpreted among universities and between countries.” (University from Sweden)

“We had concerns about the validity of the exercise, the cost of data collection and the difficulty of quality assurance for this self-reported data.” (University from United Kingdom)
Key findings
• There is increasing interest among EUA members in taking part in UMR
• Cooperation with the UMR consortium worked quite well
• Benefits of participating in UMR, or of using its results, remain unclear
• Data collection required considerable resources
• Concerns over data validity following difficulties in interpreting the indicators
• UMR struggles with the reliability and comparability of the data → how can this be overcome?
Conclusions
• Use of rankings at the institutional level is not systematic
• Developing institutional research capacity is vital
• Do we need a common international or European dataset?
All publications are available at http://www.eua.be/Publications.aspx