Research Reviews in the Netherlands: A Practical Approach
Roel Bennink, coordinator research reviews
Quality Assurance Netherlands Universities
www.qanu.nl
Contents
• What? The Dutch System of Research Reviews
• Why? Aims and Owners of the System
• How? 1993-2003 and 2003-2009
Universities in the Netherlands
• Fourteen research-based universities (incl. OU)
• €4.8 billion budget; 50,000 staff; 160,000 students
Funding of Research
Three funding sources:
• Direct funding (60%)
  • Main source, stable, plus extras for dissertations and research schools (each about 10%)
• Competitive funding (10%)
• Contracts (30%)
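As a minimal worked example, the split can be made concrete in a few lines of Python. The budget figure below is a hypothetical placeholder (the €4.8 billion sector total quoted earlier also covers education, not research alone), so only the shares come from the slide:

```python
# Purely illustrative: the shares come from the slide above; the budget
# figure is a hypothetical placeholder, not the actual research budget.
research_budget_eur = 1.0e9  # hypothetical annual research budget

shares = {
    "direct funding": 0.60,
    "competitive funding": 0.10,
    "contracts": 0.30,
}

assert abs(sum(shares.values()) - 1.0) < 1e-9  # the three sources cover everything

for source, share in shares.items():
    amount_m = share * research_budget_eur / 1e6
    print(f"{source}: {share:.0%} -> {amount_m:,.0f} million euro")
```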
Research Reviews in NL
• All publicly funded research must be submitted for external review every six years.
• The system started in 1993, organised by the universities collectively (VSNU).
• Nationwide, per discipline (± 35 disciplines).
• New protocol in 2003: the Standard Evaluation Protocol (SEP).
  • Not always nationwide.
  • (Individual) boards are responsible.
Standard Evaluation Protocol
Internal and external objectives combined:
• Improving the quality of research
• Improving research management and leadership
• Accountability to government and society
Object of the assessment:
• Research quality by international standards
• Depending on the mission of each institute/programme:
  • Social or economic objectives
  • Technical or infrastructural objectives
Standard Evaluation Protocol
Four main aspects:
• Quality: international recognition and innovative potential
• Productivity: scientific output
• Relevance: scientific and socio-economic impact
• Prospects: flexibility, management, leadership
Five-point scale
5. Excellent: internationally leading; important and substantial impact
4. Very good: internationally competitive, national leader; significant contribution
3. Good: internationally visible, nationally competitive; valuable contribution
2. Satisfactory: nationally visible; adds to understanding
1. Unsatisfactory: flawed, not worth pursuing
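A minimal sketch of the scale as a lookup table; the names (SEP_SCALE, describe) and structure are illustrative, not part of the protocol itself:

```python
# The SEP five-point scale as a simple lookup table; the descriptors
# are taken from the slide above, everything else is illustrative.
SEP_SCALE = {
    5: ("Excellent", "internationally leading; important and substantial impact"),
    4: ("Very good", "internationally competitive, national leader; significant contribution"),
    3: ("Good", "internationally visible, nationally competitive; valuable contribution"),
    2: ("Satisfactory", "nationally visible; adds to understanding"),
    1: ("Unsatisfactory", "flawed, not worth pursuing"),
}

def describe(score: int) -> str:
    label, descriptor = SEP_SCALE[score]
    return f"{score} ({label}): {descriptor}"

print(describe(4))
# 4 (Very good): internationally competitive, national leader; significant contribution
```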
Method: Self-analysis and Peer Review
• Close link between research management, quality control and accountability to higher levels
• Multi-purpose data collection
• Uniform criteria (all universities use the SEP)
• Citation analyses (sometimes)
• Simultaneous and comparative reviews (often)
• Public reports (always)
Descriptive elements
• Mission, leadership, strategy, policies
• Research processes (teamwork, supervision, quality control)
• Reputation (reviews, awards, citations)
• Internal evaluation (management, culture)
• External validation (spin-offs, stakeholder survey)
• SWOT analysis
• Key publications (list of 5, copies of 3)
Quantitative elements
• Research staff (tenured, non-tenured, PhD, support), per year
• Funding (ministry; research councils; contracts), per year
• Spending (personnel; other), per year
• Results (publications)
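As a minimal sketch, these per-year elements could be collected in a record like the one below; all field names and numbers are illustrative assumptions, not prescribed by the SEP:

```python
# A per-year quantitative record for a self-evaluation; field names and
# values are hypothetical placeholders, not SEP-prescribed.
from dataclasses import dataclass

@dataclass
class YearRecord:
    year: int
    staff_fte: dict       # tenured, non-tenured, PhD, support
    funding_eur: dict     # ministry, research councils, contracts
    spending_eur: dict    # personnel, other
    publications: int     # results

record = YearRecord(
    year=2008,  # hypothetical year, purely illustrative numbers below
    staff_fte={"tenured": 12.0, "non_tenured": 8.5, "phd": 20.0, "support": 4.0},
    funding_eur={"ministry": 2.4e6, "research_councils": 0.4e6, "contracts": 1.2e6},
    spending_eur={"personnel": 3.1e6, "other": 0.9e6},
    publications=85,
)
print(record.year, record.publications)
```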
Evaluative questions
• Documentation
  • What is missing, what is not needed?
  • How much effort is spent producing the self-studies?
• Committees
  • How are committee members selected?
  • What is a good site visit? What can go wrong?
  • What determines the quality of the committee reports?
• Consequences
  • What are the effects of the reviews? What measures are taken by faculties/universities/funding agencies/ministries?
• Lessons learned
  • What mistakes can be avoided?
  • What are the critical success factors for research reviews?
Documentation
What is missing, what is not needed?
• It is always too much and never enough
• The SEP data set is what every institute or programme should have anyway
• Bibliometrics are expensive, time-consuming and only useful in some disciplines
How much effort is spent producing the self-studies?
• That depends on what you already have
• "It's a lot of work, but very useful"
Committees
How are committee members selected?
• Proposed by faculties, approved by boards (and QANU)
• Members must be independent, unbiased, internationally acknowledged experts
What is a good site visit?
• Honest and open discussions, well-prepared peers
• Critical and constructive questions; good teamwork
What determines the quality of the reports?
• Public nature, feedback loop
• Comparative overview; general chapters per subfield
• Combination of scores and text (including recommendations)
• Directed at management, not the ministry
Consequences
What are the effects and measures?
• Visibility is increased
• Management dialogues are enhanced
• Management information improves
• Publishing in high-impact international journals is stimulated
• Groups are merged, extended, redirected or stopped
• High marks are (sometimes) financially rewarded
• Low marks lead to critical questions
• Recommendations are taken seriously
• The ministry is kept at a distance
Lessons learned in NL
What mistakes can be avoided?
• Don't assess individuals, only groups
• Don't focus too much on the scores alone, or on ranking
• Don't ask too far-reaching ('strategic') questions
What are the critical success factors?
• Keep it simple
• Stay close to real organisational structures and management processes
• Agree on uniform definitions and data
• Cooperate with other universities
• Build/buy and maintain research information systems (METIS RU)
Weaknesses
• Committees find it difficult to assess the institute level
• Scoring of 'management' was abolished
• Central scheduling was abolished
• Costs (time and money) remain an issue
• Tools for measuring "exchange of knowledge" need further development
• Managers (and journalists and politicians) attach too much value to rankings
Strengths
• The SEP works as a basic and practical tool
• Peer review is authoritative about programmes (the main strength)
• External reviews are a useful addition to quality assurance
• External reviews shift power to faculty and university
• Policy decisions at the responsible level are supported
• Reviews are used to look ahead
• Peers from abroad add an international dimension to quality
• Cooperation between universities facilitates benchmarking