This seminar discusses the science of science policy and proposes a conceptual framework for research evaluation. It explores empirical approaches such as the NSF Engineering Dashboard, ASTRA in Australia, and HELIOS in France, highlights the importance of linking administrative and grant funding data, and describes public-private partnerships in France.
Building a Scientific Basis for Research Evaluation Rebecca F. Rosen, PhD Senior Researcher Research Trends Seminar October 17, 2012
Outline • Science of science policy • A proposed conceptual framework • Empirical approaches: • NSF Engineering Dashboard • ASTRA – Australia • HELIOS – France • Final thoughts
The emergence of a science of science policy • Jack Marburger’s challenge (2005) • Science of Science & Innovation Policy Program at the National Science Foundation (2007) • An emerging, highly interdisciplinary research field • Science of Science Policy Interagency Task Group publishes a “Federal Research Roadmap” (2008): • The data infrastructure is inadequate for decision-making • STAR METRICS (2010)
Why a science of science policy? • Evidence-based investments • Good metrics = good incentives • Science is networked and global • Build a bridge between researchers and policymakers • Researchers ask the right questions • The adjacent possible: leverage existing and new research and expertise • New tools to describe & measure communication
Getting the right framework matters • What you measure is what you get • Poor incentives • Falsification • Usefulness • Effectiveness
A proposed conceptual framework Adapted from Ian Foster, University of Chicago
A framework to drive person-centric data collection • WHO is doing the research • WHAT is the topic of their research • HOW are the researchers funded • WHERE do they work • With WHOM do they work • What are their PRODUCTS
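To make the person-centric framing concrete, here is a minimal sketch (not from the slides) of a record built around the WHO / WHAT / HOW / WHERE / WHOM / PRODUCTS questions; the class name, field names, and example values are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResearcherRecord:
    """Illustrative person-centric record: WHO, WHAT, HOW, WHERE, with WHOM, PRODUCTS."""
    who: str                                             # researcher identifier (e.g., a disambiguated ID)
    what: List[str] = field(default_factory=list)        # research topics (e.g., topic-model tags)
    how: List[str] = field(default_factory=list)         # funding sources / award numbers
    where: str = ""                                      # institution or department
    with_whom: List[str] = field(default_factory=list)   # collaborator identifiers
    products: List[str] = field(default_factory=list)    # papers, patents, datasets, software

# Example usage with invented values
record = ResearcherRecord(
    who="researcher-001",
    what=["t11 speech sound acoustic", "t7 estimation"],
    how=["NSF-1234567"],
    where="University X",
    with_whom=["researcher-042"],
    products=["doi:10.xxxx/example"],
)
print(record)
```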
Challenge: the data infrastructure didn't exist. However, some of the data do exist.
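As a hedged illustration of what "some of the data do exist" can mean in practice, the sketch below links a hypothetical administrative (payroll) extract to a hypothetical award file on a shared award number; all column names and values are invented.

```python
import pandas as pd

# Hypothetical administrative (payroll) records: who was paid from which award
payroll = pd.DataFrame({
    "person_id":  ["p1", "p2", "p3"],
    "award_id":   ["NSF-001", "NSF-001", "NSF-002"],
    "occupation": ["faculty", "grad student", "postdoc"],
})

# Hypothetical grant records: what each award funds
awards = pd.DataFrame({
    "award_id": ["NSF-001", "NSF-002"],
    "title":    ["Colloidal particle dynamics", "Speech recognition in noise"],
    "amount":   [450_000, 320_000],
})

# Link people to the science they are funded to do
linked = payroll.merge(awards, on="award_id", how="left")
print(linked[["person_id", "occupation", "title"]])
```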
Empirical Approaches Leveraging existing data to begin describing results of the scientific enterprise
An empirical approach • Enhance the utility of enterprise data • Identify authoritative “core” data elements • Develop an Application Programming Interface (API) • Data platform that provides programmatic access to public (or private) agency information • Develop a tool to demonstrate value of API
Topic modeling: Enhancing the value of existing data • Topic model: use the words from (all) NSF proposal text and learn T topics • Automatically learned topics (e.g.): t6 conflict violence war international military; t7 model method data estimation variables; t8 parameter method point local estimates; t9 optimization uncertainty optimal stochastic; t10 surface surfaces interfaces interface; t11 speech sound acoustic recognition human; t12 museum public exhibit center informal outreach; t13 particles particle colloidal granular material; t14 ocean marine scientist oceanography • Result: topic tags (e.g., t49, t18, t114, t305) for each and every proposal (David Newman, UC Irvine)
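A minimal sketch of the topic-modeling idea above, using scikit-learn's LDA on invented toy documents; the slide does not specify the actual pipeline used on NSF proposals, so this is only an assumption-laden illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-ins for proposal abstracts (invented text)
docs = [
    "ocean marine oceanography plankton sampling cruise",
    "speech sound acoustic recognition human listener",
    "particles colloidal granular material suspension",
    "optimization stochastic uncertainty optimal control",
]

# Learn T topics from the words in (all) text, then tag each document
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for k, component in enumerate(lda.components_):
    top_words = [terms[i] for i in component.argsort()[-5:][::-1]]
    print(f"t{k}:", " ".join(top_words))

# Topic tags for each and every document (proposal)
print(lda.transform(X).argmax(axis=1))
```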
Stepwise empirical approach • Enhance the utility of enterprise data • Identify authoritative “core” data elements • Develop an Application Programming Interface (API) • Data platform that provides flexible, programmatic access to public (or private) agency information • Develop a tool to demonstrate value of API
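As a sketch of what "programmatic access to public (or private) agency information" might look like from the client side, the snippet below queries a hypothetical awards endpoint; the URL, parameters, and response fields are assumptions, not a documented agency API.

```python
import requests

# Hypothetical endpoint -- not a documented agency API
BASE_URL = "https://api.example.gov/awards"

def fetch_awards(keyword, limit=10):
    """Query a (hypothetical) awards API and return a list of award dicts."""
    resp = requests.get(BASE_URL, params={"keyword": keyword, "limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("awards", [])

if __name__ == "__main__":
    for award in fetch_awards("topic modeling"):
        print(award.get("id"), award.get("title"))
```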
Outline • Science of science policy • A proposed conceptual framework • Empirical approaches: • NSF Engineering Dashboard • ASTRA – Australia • HELIOS – France • Final thoughts
Describing public-private partnerships in France [figure: links between people across public and private organizations]
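A minimal sketch, assuming networkx and invented names, of how people-organization affiliations might be projected into a network of candidate partnership ties like those the HELIOS slide describes.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Invented people-organization affiliations (public lab vs. private firm)
edges = [
    ("Dr. A", "Public Lab 1"), ("Dr. A", "Firm X"),
    ("Dr. B", "Public Lab 1"), ("Dr. C", "Firm X"),
]

G = nx.Graph()
G.add_edges_from(edges)

# People connected through a shared organization -> candidate partnership ties
people = [n for n in G.nodes if n.startswith("Dr.")]
partnerships = bipartite.projected_graph(G, people)
print(sorted(partnerships.edges()))
```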
What does getting it right mean? • A community driven empirical data framework should be: • Timely • Generalizable and replicable • Low cost, high quality • The utility of “Big Data”: • Disambiguated data on individuals • Comparison groups • New text mining approaches to describe and measure communication • ??
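As a heavily simplified illustration of the "disambiguated data on individuals" point, the sketch below groups invented records by a crude blocking key (last name, first initial, institution); real disambiguation work, such as the Torvik & Smalheiser effort cited on the next slide, uses far richer evidence.

```python
from collections import defaultdict

# Invented records with name variants
records = [
    {"name": "Smith, Jane A.", "inst": "Univ. A"},
    {"name": "J. A. Smith",    "inst": "Univ. A"},
    {"name": "Smith, J.",      "inst": "Univ. B"},
]

def blocking_key(rec):
    """Crude key: last name + first initial + institution."""
    parts = rec["name"].replace(",", "").split()
    has_comma = rec["name"].count(",") > 0
    last = parts[0] if has_comma else parts[-1]
    first_initial = (parts[1][0] if has_comma else parts[0][0]).upper()
    return (last.lower(), first_initial, rec["inst"].lower())

clusters = defaultdict(list)
for rec in records:
    clusters[blocking_key(rec)].append(rec["name"])
print(dict(clusters))
```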
Policymakers can engage SciSIP communities: • Patent Network Dataverse; Fleming at Harvard and Berkeley • Medline-Patent Disambiguation; Torvik & Smalheiser at U Illinois • COMETS (Connecting Outcome Measures in Entrepreneurship Technology and Science); Zucker & Darby at UCLA
The power of open research communities • Internet and data technology can transform effectiveness of science: • Informing policy • Communicating science to the public • Enabling scientific collaborations • Interoperability is key • Publishers are an important part of the community
THANK YOU! Rebecca F. Rosen, PhD E-Mail: rrosen@air.org 1000 Thomas Jefferson Street NW, Washington, DC 20007 General Information: 202-403-5000 TTY: 887-334-3499 Website: www.air.org