Looking past the usual metrics to help researchers demonstrate excellence to support grant applications. Dr Peter Darroch, SciVal Consultant, p.darroch@elsevier.com
World view – Colin Macilwain, Nature, Vol. 500, 2013. http://www.nature.com/nature/journal/v500/n7462/index.html#col
"Halt the avalanche of performance-based metrics"
• "Bogus measures of 'scientific quality' can threaten the peer-review system"
• "The increasing dominance of quantitative research assessment threatens the subjective values that really matter in Academia"
Comment on Snowball Metrics:
• "I suspect that in practice, however, it will end up being used mainly to exercise yet more control over academic staff, with every aspect of their professional lives tracked on the system."
Response from Glenn Swafford, University of Oxford:
• "However his portrayal of Project Snowball does not ring true with us here at Oxford. We are founding participants in this Project. Snowball puts HEIs in the driver's seat. We (not others) elect what to measure and why. Oxford is involved to assist academic-led planning at a unit level. There is and will be no 'central', much less local, use of these data to assess individuals."
Coming up…
• Conditions for a good metric, and factors that can affect the value of citation metrics
• A model to help you select appropriate metrics
• Showcasing excellence to support, for example, a grant application
• An example of useful metrics for showcasing a senior researcher
• An example of useful metrics for showcasing a junior researcher
• What currently happens?
Health warning
• Using metrics is not black and white
• This session is a discussion about potential uses
• There are always exceptions, so always engage your brain!
Conditions for a good metric
1. Transparent underlying data – can you trace the data? Is there authority and accountability in the data set (related to point 5)?
2. External validity of the metric – it needs a theoretical connection to what you are trying to measure, which is not always clear
3. Reliable – query several times and get the same or a similar result
4. Easy to replicate – some metrics are based on complicated algorithms/calculations
5. Hard to distort – structural and human systems need to be in place to prevent distortion/gaming
Adapted from the Scholarly Kitchen podcast, July 10th 2013. Phil Davis – Bibliometrics in an Age of Abundance
http://scholarlykitchen.sspnet.org/2013/07/10/scholarly-kitchen-podcast-bibliometrics-in-an-age-of-abundance/
Factors that can affect the value of citation metrics
• Variety in the size of entities within the data set
• Several disciplines within the data set
• Multiple publication types within the data set
• Coverage of the data source, by geography and/or discipline
• Ease of manipulation
• Quality of performance
Accounting for the first five factors reveals the sixth: the quality of performance.
A model to help you select appropriate metrics, based on four questions
Q1: What am I trying to achieve?
This may be the most difficult part of any process. Your question/goal should drive the data and metrics that you use, not the other way round.
Example questions/goals:
• How can I show I deserve this award/grant?
• Which of several applicants would be a good fit with our existing group?
• How can I show that I should be promoted or get tenure?
• How can I attract more students/researchers to my group?
Researchers will also experience the reverse direction:
• Funders/line managers using metrics as one input into decisions (in addition to opinion and peer review)
Note: metrics don't need to be about top-down evaluation or benchmarking.
Evaluations/showcasing can fall into 3 types:
• Distinguishing between performance: "looking for the best". Typical question: "Will I have the best chance of a positive outcome if I invest in X or in Y?" Useful approach: average across all publications, e.g. Citations per Publication.
• Demonstrating excellence: "showing off". Typical question: "How can I showcase Z to look the best possible?" Useful approach: highlight the few top publications in a data set, e.g. Publications in Top Percentiles.
• Modeling scenarios: "fantasy academia". Typical question: "What if I…?" Useful approach: depends…
Snowball Metric: www.snowballmetrics.com/metrics
Q2: What am I looking to evaluate or showcase?
• Institution / group of institutions / discipline within an institution
• Country / group of countries / discipline within a country
• Research area / group of research areas
• Researcher / group of researchers
• Publication set / group of publication sets
• Awards program of a funding agency
• Effectiveness of a policy
• Etc.
Q3: How will I recognise/demonstrate good performance?
A: usually, relative to peers that you have selected to be:
• Distinguishing: equivalent to, and a bit above, your status
• Demonstrating excellence: equivalent to, and somewhat below, your status
A few considerations that may affect your selection of peers:
• Size – of publication output / student program / funding
• Status – recognition / league tables / reputation / influence
• Disciplinary focus
• Geographic location
• Degree of focus on research or teaching
• Comparators for your university, or for a department or academic
Q4: Which metrics could help me make my decision?
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact; Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
Showcasing excellence to support, for example, a grant application
Which metrics help me showcase performance?
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
Note: productivity metrics are useful to make big entities look good, but unfair if you are comparing entities of different sizes, and recent publications have had little time to attract citations.
Snowball Metric: www.snowballmetrics.com/metrics
Extensive Citation Impact metrics address many needs
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
• Citation Count is a "power" metric: useful to make big entities look good, but unfair if you are comparing entities of different sizes, and recent publications have had few citations.
• Citations per Publication is a size-normalized metric: useful to compare entities of different sizes, though recent publications have had few citations.
• Field-Weighted Citation Impact is very useful because it accounts for several variables that affect the metric value, and recent values do not drop. But it is not transparent for new users. (A sketch of the last two calculations follows.)
Snowball Metric: www.snowballmetrics.com/metrics
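To make the difference concrete, here is a minimal Python sketch of Citations per Publication and a simplified field-weighted ratio. It is illustrative only: real Field-Weighted Citation Impact uses Scopus baselines per field, publication year, and publication type, whereas the expected values below are hypothetical placeholders.

```python
# Minimal sketch (illustrative only): Citations per Publication, plus a
# simplified field-weighted ratio. The `expected` values are
# hypothetical stand-ins for real field/year/type baselines.

def citations_per_publication(citations):
    """Average citations across a list of per-paper citation counts."""
    return sum(citations) / len(citations) if citations else 0.0

def field_weighted_impact(papers):
    """Mean of (actual citations / expected citations for similar papers).

    `papers` is a list of (citations, expected) pairs, where `expected`
    is the average citations for publications of the same field, year,
    and type. A value of 1.0 means "cited exactly as expected".
    """
    ratios = [cites / expected for cites, expected in papers if expected > 0]
    return sum(ratios) / len(ratios) if ratios else 0.0

group = [(12, 4.0), (3, 4.0), (0, 2.5), (30, 10.0)]      # hypothetical data
print(citations_per_publication([c for c, _ in group]))  # 11.25
print(field_weighted_impact(group))                      # 1.6875
```

Note how the two metrics can disagree: the raw average is dominated by the single highly cited paper, while the field-weighted ratio asks whether each paper beat the expectation for its own field and year.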
h-index variants emphasize different strengths (examples given for researchers only)
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
• An h-index of 7 means that 7 of a researcher's papers have each been cited at least 7 times. For researchers, it is useful because it indicates both productivity and citation impact. It is not useful for new researchers with few citations.
• The g-index emphasizes and rewards the most highly cited papers, and is always the same as or higher than the h-index. For researchers, it is good for emphasizing exceptional papers. It is not useful for average researchers, where h = g, or for new researchers.
• The m-index is the h-index per year of publishing activity. It levels the playing field for researchers with different career lengths. It is not useful for researchers who have had a career break.
The sketch below shows how the three indices are computed.
Snowball Metric: www.snowballmetrics.com/metrics
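A minimal Python sketch of the three indices described above, computed from a plain list of per-paper citation counts (hypothetical data; SciVal computes these from Scopus records):

```python
# Minimal sketch of the h-, g-, and m-index calculations described
# above, from a plain list of per-paper citation counts.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations."""
    ranked = sorted(citations, reverse=True)
    running, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        running += c
        if running >= rank * rank:
            g = rank
    return g

def m_index(citations, years_active):
    """h-index divided by years since first publication."""
    return h_index(citations) / years_active if years_active else 0.0

papers = [25, 18, 12, 10, 9, 8, 7, 4, 2, 1]  # hypothetical counts
print(h_index(papers))      # 7: seven papers cited at least 7 times
print(g_index(papers))      # 9: the top 9 papers total 95 >= 81 citations
print(m_index(papers, 10))  # 0.7 for a 10-year publishing career
```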
Not all Citation Impact metrics need the data set to have citations!
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
• Publications in Top Percentiles: useful to distinguish between entities whose averages are similar, and to show off. Not always inclusive of average entities, and time is needed for citations to be received.
• Publications in Top Journal Percentiles: a good metric for engaging researchers. It is also useful early in a strategy or career, because publications do not need their own citations. However, publications are judged on the average performance of the journal. (A sketch of the percentile idea follows.)
Snowball Metric: www.snowballmetrics.com/metrics
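A minimal sketch of the percentile idea, assuming the citation threshold for the top 10% of the field is supplied from elsewhere (in practice it would come from a database such as Scopus; the threshold below is hypothetical):

```python
# Minimal sketch: share of a publication set at or above a top-percentile
# citation threshold. The threshold is a hypothetical stand-in for a
# field-wide cutoff derived from the full data set.

def share_in_top_percentile(citations, threshold):
    """Fraction of papers with citations at or above the threshold."""
    if not citations:
        return 0.0
    return sum(1 for c in citations if c >= threshold) / len(citations)

papers = [25, 18, 12, 10, 9, 8, 7, 4, 2, 1]
top10_threshold = 15  # hypothetical field-wide cutoff for the top 10%
print(share_in_top_percentile(papers, top10_threshold))  # 0.2
```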
Topical Collaboration metrics have broad value
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
Collaboration metrics only need the affiliation information that authors have included on their publications; they do not need any citations. They are very useful, e.g. at the start of a new strategy or early in a researcher's career, when publications exist but too little time has passed for citation-based metrics to be reliable. (A sketch follows.)
Snowball Metric: www.snowballmetrics.com/metrics
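A minimal sketch showing why no citations are needed: the inputs are just per-paper sets of affiliation country codes. The records and the home country ("GB") are hypothetical; real tools would read affiliations from Scopus.

```python
# Minimal sketch: Collaboration metrics from affiliation data alone,
# with no citations involved. Each record is the set of affiliation
# country codes on one paper (hypothetical data).

def collaborating_countries(papers, home="GB"):
    """Distinct foreign countries appearing across a publication set."""
    countries = set()
    for affiliations in papers:
        countries.update(affiliations)
    countries.discard(home)
    return countries

def international_share(papers, home="GB"):
    """Share of papers with at least one non-home affiliation."""
    if not papers:
        return 0.0
    return sum(1 for p in papers if any(c != home for c in p)) / len(papers)

records = [{"GB", "US"}, {"GB"}, {"GB", "DE", "CN"}, {"GB", "US"}]
print(collaborating_countries(records))  # {'US', 'DE', 'CN'} (order varies)
print(international_share(records))      # 0.75
```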
Showcasing the performance of a senior researcher
• There is likely a large body of work available to showcase
• Publication volume and total citation counts, as well as the associated indices (h- and g-index), should work well?
Metrics to consider:
• Citations per Publication (cpp)
• Top citation percentiles
• Top journal percentiles
• h- and g-index
• Collaboration metrics – demonstrate the expertise of the collaborators who support your research, and demonstrate your network/reach
Showcasing the performance of a junior researcher
• Potentially a smaller body of work available to showcase
• Simple counts are perhaps not so useful, due to lower volume and maybe not enough time to accumulate the necessary citations
• h- and g-index potentially not so useful, but the m-index? (see the worked comparison below)
Metrics to consider:
• Citations per Publication (cpp)
• Top citation percentiles
• Top journal percentiles
• m-index
• Collaboration metrics – demonstrate the expertise of the collaborators who support your research, and demonstrate your network/reach
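A worked comparison, with hypothetical citation counts, of how the m-index can level the field between a senior and a junior researcher:

```python
# Worked comparison with hypothetical citation counts: the m-index
# divides the h-index by career length, putting a junior researcher on
# a more even footing with a senior one.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

senior = [60, 40, 30, 22, 18, 15, 12, 9, 8, 8, 5, 3]  # 20 years active
junior = [14, 9, 6, 5, 2]                             # 3 years active

print(h_index(senior), h_index(senior) / 20)  # h = 8, m = 0.40
print(h_index(junior), h_index(junior) / 3)   # h = 4, m ≈ 1.33
```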
A model for selecting metrics, with 4 questions:
Q1: What am I trying to achieve/answer?
Q2: What am I evaluating/showcasing?
Q3: How will I recognise good performance?
Q4: Which metrics will help me?
What currently happens?
• How do academics at your institution currently showcase their expertise? (grant applications, PDR discussions, promotion)
• Do they/you use citations or any metrics?
• How do you currently support academics in showcasing their expertise?
Snowball Metrics are a subset of SciVal metrics
This list displays metrics being developed by SciVal.
Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Disciplinarity metrics: Journal Count; Category Count
Snowball Metrics are endorsed by distinguished universities. They are a manageable, convenient way to start using benchmarking data in university strategy. Other metrics allow more sophisticated analysis, and are useful for other entities.
Snowball Metric: www.snowballmetrics.com/metrics