How risk averse is our research funding? Reinhilde Veugelers Prof@KULeuven-MSI; ERC Scientific Council Member
Context of public funding for research • Low growth & austerity leading to shrinking public (research) budgets in many countries; exceptions: • China's expanding budget • Flanders FWO budget • EU FP/Horizon budget • Monitoring and evaluation of public research budgets • More emphasis on (measuring) the impact of (public) research (funding) on society • More emphasis on the contribution of public research to (local) economic & societal development • Public funders more short-term impact oriented • Public funders more risk averse…
Risk aversion in funding • Selection procedures are increasingly accused of favoring "safe" projects which exploit existing knowledge at the expense of novel projects that explore untested waters. Yet: • A fundamental reason for government support of public research is precisely to promote risk taking (Arrow 1962)
James Rothman, 2013 Nobel Laureate in Physiology/Medicine, on Risk Taking. Rothman told an interviewer that he was "grateful [he] started work in the early 1970s when the federal government was willing to take much bigger risks in handing out funding to young scientists." "I had five years of failure, really, before I had the first initial sign of success. And I'd like to think that that kind of support existed today, but I think there's less of it. And it's actually becoming a pressing national issue, if not an international issue." Interview on NPR
Roger Kornberg on Risk Taking • To quote Nobel laureate Roger Kornberg, “If the work that you propose to do isn’t virtually certain of success, then it won’t be funded.” Paula Stephan, Georgia State University, pstephan@gsu.edu
Greg Petsko: Why Columbus's proposal "Finding a New Route to the Indies by Sailing West" is (hypothetically) rejected (Genome Biology 2012, 13:155) • Too ambitious; suggest he go to Portugal instead • Lack of preliminary data • Failure would be disastrous for the funder: "think of how it would look if we funded something that didn't pan out." • Poor fit for reviewers: experts (da Gama and Magellan) too busy to review the proposal • Limited funds • Funds are used for data collection ("Grape Vine Sequencing") rather than hypothesis testing, since data-collection projects are "guaranteed to work"
Variety of Reasons • Focus on funding "successful" projects • Obsession with "preliminary" findings: "no crystal, no grant" • Higher success rate for recurrent, established grant holders • Increased reliance on bibliometric measures, particularly short-term bibliometric measures (Leiden Manifesto) • "Instant bibliometrics for reviewers": JIF, three-year citation window
Goes Beyond Funding Decisions • Hiring and promotion decisions based in part on short-term bibliometric measures • Allocation of research funds to universities and departments based on such measures • Netherlands, UK, Australia, Finland, Italy, Norway • BOF Key in Flanders
The Case of Novel Science • Scientific breakthroughs advance the knowledge frontier and contribute disproportionately to economic growth. • Research that underpins scientific breakthroughs often requires novel approaches. • Novel research has high risk and is often controversial. • "High risk/high gain" novel research therefore merits public support. Is there a bias against novel research in funding? A bias against novel research in bibliometric indicators?
Wang, J., Veugelers, R., Stephan, P., 2017, "Bias against novelty in science: a cautionary tale for users of bibliometric indicators", Research Policy, 46, 1416-1436. http://voxeu.org/article/bias-against-novelty-science; http://www.nber.org/digest/jun16/w22180.html • Develop a bibliometric measure of novelty: papers making new combinations of journal references, taking into account the difficulty of making such new combinations via the distance between the journals • Study the relationship between novelty and citations, using 2001 WoS journal articles. See also: Stephan, Veugelers, Wang, 2017, "Evaluators blinkered by bibliometrics", Nature, 544, 411-412.
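The novelty measure can be sketched in code. This is a minimal, hypothetical illustration only: the actual measure in Wang et al. (2017) is built from Web of Science data and gauges journal distance via the cosine similarity of journals' co-citation profiles, whereas the function name, the precomputed `distance` lookup, and the default weight of 1.0 below are all assumptions made for the sketch.

```python
from itertools import combinations


def novelty_score(ref_journals, known_pairs, distance):
    """Toy novelty score for one paper (illustrative sketch).

    A paper is more novel the more *never-before-seen* pairs of
    journals appear in its reference list; each new pair counts
    more if the two journals are "distant" from each other.

    ref_journals : journals cited by the paper
    known_pairs  : set of journal pairs already combined in prior work
    distance     : dict mapping a (sorted) journal pair to a distance
                   weight in [0, 1]; unseen pairs default to 1.0
    """
    score = 0.0
    for a, b in combinations(sorted(set(ref_journals)), 2):
        pair = (a, b)
        if pair not in known_pairs:
            # a new combination; distant journals contribute more
            score += distance.get(pair, 1.0)
    return score
```

A paper whose references only recombine familiar journal pairs scores zero, capturing the intuition that novelty lies in making new, difficult combinations rather than in citing many sources.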
Findings in a Nutshell • Novel research shows a "high risk/high gain" profile: • More citations on average, but also higher variance in citations; • More likely to become top cited (top 1%), but only when using a long enough citation window (at least 4 years); • More likely to stimulate follow-on breakthroughs; • Appreciation of novel research comes from outside its own field, not from within it. • These are the characteristics one expects if novelty is correlated with breakthrough research. • Also find a bias against novelty in standard bibliometric indicators: • Less likely to be highly cited within the typical short-term citation window • More likely to be published in journals with a lower Journal Impact Factor
Novelty and Journal Impact Factor: Findings in a Nutshell. Novel papers are less likely to be published in high-JIF journals, and even when they do get into high-JIF journals, they face delayed recognition.
Implications • Science policy based on short-term bibliometric indicators and the Journal Impact Factor has a bias against novelty • Over-reliance on such measures: • Directly discourages novel research that might be of great value. • Indirectly misses follow-on breakthroughs built on novel research. • These findings may help explain why funding agencies that increasingly rely on bibliometric indicators are at the same time perceived as increasingly risk averse • Results also point to the importance of having interdisciplinary panels evaluate research
Implications. Funders should not provide (or ask applicants to provide) short-term bibliometric measures, and should prevent such measures from being decisive in reviews of grant proposals. They should insist on multiple ways to assess applicants' and institutions' publications. They should resist evaluating success based on short-term citation counts and journal impact factors. They should also include experts from outside the field. Panel members should resist seeking out and relying too heavily on metrics, especially those calculated over a window of less than three years.
Funding frontier research: some evidence from the European Research Council (ERC)
What is the ERC? Established 2007 • Budget: €13 billion (2014-2020), i.e. €1.9 billion/year; €7.5 billion (2007-2013), i.e. €1.1 billion/year • The ERC represents 17% of the total EC H2020 budget • Scientific governance: independent Scientific Council with 22 members; full authority over funding strategy. Legislation • Excellence as the only criterion • Support for the individual scientist (no networks!) • Global peer review • No predetermined subjects (bottom-up) • Support of frontier research in all fields of science and humanities
ERC's ambitions • "the ERC aims at reinforcing excellence, dynamism and creativity in European research by funding investigator-driven projects of the highest quality at the frontiers of knowledge". • "its grants will help to bring about new and unpredictable scientific and technological discoveries - the kind that can form the basis of new industries, markets, and broader social innovations of the future". • "Scientific excellence is the sole selection criterion. In particular, high risk/high gain pioneering proposals which go beyond the state of the art, address new and emerging fields of research, introduce unconventional, innovative approaches are encouraged".
ERC's modus operandi • The evaluation of ERC grant applications is conducted by peer-review panels composed of scholars selected by the ERC Scientific Council from all over the world; they are assisted by remote referees. • Reviewers are asked to evaluate proposals on their groundbreaking nature and their ambition to go beyond the state of the art and push the frontier. • Panels decide on the ranking and who gets funded. • The ERC does not provide or ask for bibliometric indicators (JIF, citations, …). • The ERC instructs its panels to consider only the submitted material (i.e., not to look up or use other information). • Nevertheless, PIs often self-report such indicators in their applications (often advised by their host institutions or peers), and panel members are often found to search for bibliometric indicators themselves.
Evaluating the ERC's ambition of supporting frontier research: insights from quantitative analysis • Ex ante: What does the ERC select? Check big impact, novelty, and interdisciplinarity of grantees on pre-grant publications • Comparing granted vs. rejected ERC applicants • Comparing marginally accepted vs. marginally rejected ERC applicants • Ex post: What is the impact of ERC funding? Check big impact, novelty, and interdisciplinarity of grantees on post-grant publications • Compared to a counterfactual: similar applicants without ERC funding • Difference-in-Differences. Note: the ERC is only just starting to have finished grants (2007, 2008, 2009 calls)
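The ex-post difference-in-differences logic above can be sketched as a toy estimator: compare the before/after change for grantees with the before/after change for the control group of rejected applicants. This is an illustrative sketch of the generic DiD idea only, not the actual ERC analysis; the function name and the sample numbers in the test are assumptions.

```python
from statistics import mean


def did_estimate(pre_treated, post_treated, pre_control, post_control):
    """Toy difference-in-differences estimator (illustrative sketch).

    Each argument is a list of outcome values (e.g. a publication-based
    measure per project) observed before or after the call year, for
    treated (funded) and control (rejected) applicants.

    DiD = (change for treated) - (change for control),
    which nets out any common trend shared by both groups.
    """
    change_treated = mean(post_treated) - mean(pre_treated)
    change_control = mean(post_control) - mean(pre_control)
    return change_treated - change_control
```

The design choice here is the control group: using rejected (or marginally rejected) applicants rather than the general population makes the common-trend assumption more plausible, since both groups self-selected into applying.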
DiD analysis of the publication profile of all Starting Grant (StG) projects [chart]: selection on "high gain"; treatment effect on "high risk" (less risk aversion). Preliminary results (not to be quoted); control group = rejected applicants. Source: own calculations based on ERCEA data; unit of analysis: project. Measures based on the 3-year publication profile before/after the call year (call year excluded).
Caveats/Concerns • Novelty is only one measure of frontier research; others are needed • Not all frontier research is "novel" • Important for public agencies to have a portfolio that includes risk; not all funded research should be risky. There is a real role for "ditch diggers"
Thank you for your attention. Comments very welcome!! Reinhilde.Veugelers@kuleuven.be