
R&D evaluation: Opportunities and threats Giorgio Sirilli Research Director


Presentation Transcript


  1. Campus Luigi Einaudi R&D evaluation: Opportunities and threats Giorgio Sirilli Research Director

  2. Outline of the presentation • Definitions of research and evaluation • Some data • Evaluation in the context of S&T policy • Some experiences of R&D evaluation • Lessons learned • Concluding remarks

  3. Definitions

  4. Evaluation Evaluation may be defined as an objective process aimed at the critical analysis of the relevance, efficiency and effectiveness of policies, programmes, projects, institutions, groups and individual researchers in the pursuance of their stated objectives. Evaluation consists of a set of coordinated activities of a comparative nature, based on formalised methods and techniques applied through codified procedures, aimed at formulating an assessment of intentional interventions with reference to their implementation and their effectiveness. Evaluation may be internal or external.

  5. The first evaluation (Genesis) In the beginning God created the heaven and the earth. And God saw everything that He had made. “Behold”, God said, “it is very good”. And the evening and the morning were the sixth day. And on the seventh day God rested from all His work. His Archangel came then unto Him asking, “God, how do you know that what You have created is ‘very good’? What are Your criteria? On what data do You base Your judgement? Aren’t You a little close to the situation to make a fair and unbiased evaluation?” God thought about these questions all that day and His rest was greatly disturbed. On the eighth day, God said, “Lucifer, go to hell!” (From Halcom’s “The Real Story of Paradise Lost”)

  6. Research and Development (R&D): definition Research and experimental development (R&D) comprise creative work undertaken on a systematic basis in order to increase the stock of knowledge, including knowledge of man, culture and society, and the use of this stock of knowledge to devise new applications. • Basic research • Applied research • Experimental development

  7. OECD Frascati Manual

  8. R&D performing organisations • Private non-profit institutions • Higher education • Government • Business enterprises

  9. The knowledge bundle in higher education and government

  10. The knowledge institutions Higher education: teaching, research, “third mission”. Research agencies: research, problem solving, management.

  11. Some data

  12. R&D resources, Italy [chart] Source: OECD Science, Technology and Industry Scoreboard, 2015

  13. R&D expenditure/GDP (percentage), 2013 [chart] Source: OECD Science, Technology and Industry Scoreboard, 2015

  14. The R&D performing sectors

  15. R&D expenditure by performing sectors, 2013

  16. The context: S&T policy

  17. The context: S&T policy • Vannevar Bush, “Science, the Endless Frontier”, 1945

  18. “Science, the Endless Frontier” Problems to be addressed and solved in the US through science: • defence • health Solution: science policy and the National Science Foundation

  19. The neo-conservative wave of the 1980s “All areas of public expenditure should demonstrate ‘value for money’” Thatcher’s three Es: economy, efficiency, effectiveness

  20. The new catchwords • New public management • Value for money • Accountability • Relevance

  21. Methodology

  22. Questions Why to evaluate? How to evaluate? What are the results of the exercise?

  23. Why do we need evaluation? Governments need tools to help determine: • how much to invest in R&D • in which areas to invest • which institutions/organisations to finance

  24. Types of decisions in research policy • Distribution between sciences (e.g. physics, social sciences) • Distribution between specialties within sciences (e.g. high-energy physics, optical physics) • Distribution between different types of activity (e.g. university research, postgraduates, central labs) • Distribution between centres, groups, individuals

  25. Scope and object of evaluation Type of research, e.g.: • academic research vs targeted research • international big-science programmes Level and object of the evaluation: • individual researcher • research group • project • programme • whole discipline

  26. Criteria for evaluation Criteria vary according to the scope and purpose of evaluation; they range from criteria for identifying the quality/impact of research to criteria for identifying value for money. Four main aspects are often distinguished: • quantity • quality • impact • utility Criteria can be: • internal – likely impact on the advance of knowledge • external – likely impact on other S&T fields, the economy and society

  27. Research evaluation Evaluation of what: research, education, the “third mission” of universities and research agencies (consultancy, support to local authorities, etc.) Evaluation by whom: experts, peers Evaluation of which objects: organisations (departments, universities, schools), programmes, projects, individuals (professors, researchers, students) Evaluation when: ex-ante, in-itinere, ex-post

  28. Policy objectives

  29. Evaluation: a difficult task

  30. Difficult to evaluate science William Gladstone, then British Chancellor of the Exchequer (minister of finance), asked Michael Faraday about the practical value of electricity: “But, after all, what use is it?” Faraday replied: “Why, sir, there is every probability that you will soon be able to tax it.”

  31. Difficult to evaluate science: the case of physicists Bruno Pontecorvo “Physics is a single discipline, but unfortunately nowadays physicists belong to two different groups: the theoreticians and the experimentalists. If a theoretician does not possess an extraordinary ability, his work does not make sense…. For experimentalists, also ordinary people can do useful work…” (Enrico Fermi, 1931)

  32. Evaluation experiences

  33. In the UK • Research Assessment Exercise (RAE) • Research Excellence Framework (REF) (impact)

  34. In Italy Evaluation of the Quality of Research (VQR) • Model: Research Assessment Exercise (RAE) • Objective: evaluation of Areas, research structures and departments (not of individual researchers) • Reference period: 2004-2010 • Report: 2014 • Actors: ANVUR; 14 GEVs (Evaluation Groups), involving 450 experts plus referees; research structures (universities, research agencies); departments • Subjects evaluated: researchers (university teachers and researchers of public research agencies)

  35. Evaluation of the Quality of Research by ANVUR Researchers’ products to be evaluated: • journal articles • books and book chapters • patents • designs, exhibitions, software, manufactured items, prototypes, etc. University teachers: 3 “products” over the period 2004-2010; Public Research Agencies researchers: 6 “products” over the period 2004-2010. Scores: from 1 (excellent) to -1 (missing)

  36. Evaluation of the Quality of Research by ANVUR Indicators linked to research: • quality (0.5) • ability to attract resources (0.1) • mobility (0.1) • internationalisation (0.1) • high-level education (0.1) • own resources (0.05) • improvement (0.05) Attention basically here!

  37. Evaluation of the Quality of Research by ANVUR Indicators of the “third mission”: • fundraising (0.2) • patents (0.1) • spin-offs (0.1) • incubators (0.1) • consortia (0.1) • archaeological sites (0.1) • museums (0.1) • other activities (0.2)
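
Read together, slides 36 and 37 describe a weighted linear aggregation: each indicator receives a normalised score, the score is multiplied by the weight shown in parentheses (the weights on each slide sum to 1), and the products are added up. A minimal sketch in Python of this kind of aggregation, using the research weights from slide 36; the raw indicator scores are hypothetical:

```python
# Weighted aggregation of the VQR research indicators listed above.
# The weights come from the slide; the raw scores (0..1) are hypothetical.

RESEARCH_WEIGHTS = {
    "quality": 0.5,
    "ability_to_attract_resources": 0.1,
    "mobility": 0.1,
    "internationalisation": 0.1,
    "high_level_education": 0.1,
    "own_resources": 0.05,
    "improvement": 0.05,
}

def aggregate(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of normalised indicator scores."""
    return sum(weights[k] * scores.get(k, 0.0) for k in weights)

# Hypothetical department: strong on quality, weaker elsewhere.
scores = {
    "quality": 0.9,
    "ability_to_attract_resources": 0.6,
    "mobility": 0.5,
    "internationalisation": 0.4,
    "high_level_education": 0.7,
    "own_resources": 0.5,
    "improvement": 0.3,
}
print(round(aggregate(scores, RESEARCH_WEIGHTS), 3))  # 0.71
```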

  38. Multi-dimensional matrix of evaluation

  39. The h-index (Jorge Eduardo Hirsch) In 2005, the physicist Jorge Hirsch suggested a new index to measure the broad impact of an individual scientist’s work, the h-index. A scientist has index h if h of his or her Np papers have at least h citations each and the other (Np − h) papers have ≤ h citations each. In plain terms, a researcher has an h-index of 20 if he or she has published 20 articles receiving at least 20 citations each.
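
The definition maps directly onto a small algorithm: sort the citation counts in descending order and find the largest rank h at which the h-th paper still has at least h citations. A minimal sketch; the citation counts are hypothetical:

```python
# Minimal h-index computation following Hirsch's definition above.

def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical record of 5 papers: three papers have >= 3 citations,
# but no four papers have >= 4 citations, so h = 3.
print(h_index([10, 8, 5, 3, 0]))  # 3
```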

  40. Impact factor (Eugene Garfield) The impact factor of a journal is a measure that reflects the average number of citations received in a given year by items (articles, reviews, proceedings, etc.) published in the journal in the previous two years. In plain terms, if a journal has an impact factor of 3 in 2008, the articles published in 2006 and 2007 received, on average, 3 citations each during 2008.
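
In formula terms, the impact factor of a journal in year Y is the number of citations received in Y by items published in Y-1 and Y-2, divided by the number of citable items published in Y-1 and Y-2. A minimal sketch reproducing the slide’s 2008 example; the figures are hypothetical:

```python
# Two-year impact factor as a simple ratio; all figures are hypothetical.

def impact_factor(citations_this_year: int, items_prev_two_years: int) -> float:
    """Citations received this year to articles published in the previous
    two years, divided by the number of citable items in those years."""
    return citations_this_year / items_prev_two_years

# A journal that published 120 citable items in 2006-2007 and whose
# items were cited 360 times during 2008 has a 2008 impact factor of 3.0.
print(impact_factor(360, 120))  # 3.0
```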

  41. Nobel laureates and bibliometrics (the Higgs boson, 2013) Peter Ware Higgs: 13 works, mostly in “minor” journals, h-index = 6 • François Englert: 89 works, in both prestigious and minor journals, h-index = 10 • W. S. Boyle: h-index = 7 • G. E. Smith: h-index = 5 • C. K. Kao: h-index = 1 • T. Maskawa: h-index = 1 • Y. Nambu: h-index = 17

  42. Performance-Based research funding systems “The rationale of performance funding is that funds should flow to institutions where performance is manifest: ‘performing’ institutions should receive more income than lesser performing institutions, which would provide performers with a competitive edge and would stimulate less performing institutions to perform. Output should be rewarded, not input.” Performance based research funding systems are national systems of ex-post university research output evaluation used to inform distribution of funding. Herbst, 2007

  43. Performance-Based research funding systems Criteria: • research (teaching excluded) • ex-post evaluation (ex-ante excluded) • research output • research funding must depend on the results of the evaluation • national system (internal university evaluations excluded)
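
The allocation logic behind such systems can be made concrete with a toy calculation: a fixed national budget is divided among institutions in proportion to their ex-post evaluation scores, so better-scoring institutions receive a larger share. This is only an illustrative sketch, not the actual VQR or REF formula; the institutions and scores are hypothetical:

```python
# Hypothetical proportional split of a national research budget
# by ex-post evaluation score; not the actual VQR or REF formula.

def allocate(budget: float, scores: dict[str, float]) -> dict[str, float]:
    """Distribute the budget in proportion to each institution's score."""
    total = sum(scores.values())
    return {name: budget * score / total for name, score in scores.items()}

scores = {"University A": 0.8, "University B": 0.5, "University C": 0.2}
for name, share in allocate(100.0, scores).items():
    print(f"{name}: {share:.1f}")  # A: 53.3, B: 33.3, C: 13.3
```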

  44. Performance-Based research funding systems Share of university funding dependent on “Performance-Based Research Funding Systems” [chart] Hicks D., Research Policy (2012)

  45. Performance-Based research funding systems Share of university funding dependent on “Performance-Based Research Funding Systems” “The distribution of university research funding is something of an illusion” “It is the competition for prestige that creates powerful incentives within university systems” “Performance-based research funding systems aim at excellence: they may compromise other important values such as equity and diversity” Hicks D., Research Policy (2012)

  46. Ranking of universities Four major sources of ranking: • ARWU (Shanghai Jiao Tong University) • QS World University Ranking • THE World University Ranking (Times Higher Education) • US News & World Report (Best Global Universities)

  47. Ranking of universities TopUniversities (worldwide university rankings, guides & events) Criteria selected as the key pillars of what makes a world-class university: • Research • Teaching • Employability • Internationalisation • Facilities • Social Responsibility • Innovation • Arts & Culture • Inclusiveness • Specialist Criteria

  48. Ranking of universities: the case of Italy ARWU Shanghai: Bologna 173, Milano 186, Padova 188, Pisa 190, Sapienza 191 • QS World University Ranking: Bologna 182, Sapienza 202, Politecnico Milano 229 • THE World University Ranking: Sapienza 95, Bologna 99, Pisa 184, Milano 193 • US News & World Report: Sapienza 139, Bologna 146, Padova 146, Milano 155

  49. Evaluation is an expensive exercise Research Assessment Exercise (RAE): 540 million Euro • Research Excellence Framework (REF): 1 million Pounds (500 million) • Evaluation of the Quality of Research (VQR): 300 million Euro (180,000 “products”); 182 million Euro • Rule of thumb: less than 1% of the R&D budget is devoted to its evaluation

  50. Cost of evaluation: the saturation effect [chart: “A systematic loss”; “Saturation effect”] Source: Geuna and Martin
