
Presentation Transcript


  1. Bibliometrics (citation analysis) for evaluation and management of research Higher Education Academy Centre for ICS, 28 April 2010, Birmingham, UK Nancy K. Bayers, Bibliometrician, University of Leicester

  2. Agenda • Definition • Key metrics • Paper level metrics • Group level metrics • Selected metrics in Computer Science • Caveats

  3. Definition Bibliometrics is NOT… • …the only tool for assessing research • Peer Review, Research Income, Awards, Editorships • …just the Journal Impact Factor • …just a count of number of papers and citations Bibliometrics IS… • …a toolkit of multiple statistics, based primarily on the journal literature, that offer quantitative indicators of research output and impact

  4. Key metrics

  5. Scientific paper: basic unit • TI: Experimental study on population-based incremental learning algorithms for dynamic optimization problems • AU: Yang, SX and Yao, X • SO: SOFT COMPUTING • YR: 2005 • TC: 42 • SC: Computer Science 46
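To make the record concrete, here is a minimal Python sketch of how such a bibliographic record might be held in code. The PaperRecord class and its attribute names are illustrative mappings of the slide's TI/AU/SO/YR/TC/SC tags, not part of the talk or of any particular database's API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PaperRecord:
    """One bibliographic record: the basic unit of citation analysis."""
    title: str          # TI
    authors: List[str]  # AU
    source: str         # SO (journal)
    year: int           # YR
    times_cited: int    # TC
    subject: str        # SC (subject category / field)

# The example paper from the slide
example = PaperRecord(
    title=("Experimental study on population-based incremental learning "
           "algorithms for dynamic optimization problems"),
    authors=["Yang, SX", "Yao, X"],
    source="SOFT COMPUTING",
    year=2005,
    times_cited=42,
    subject="Computer Science",
)
```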

  6. Key paper-level metrics • Cites: 42 • Expected citation rate: 2.4 (all articles from Soft Computing in 2005 received on average 2.4 cites through year-end 2008) • Ratio of actual to expected: 17.5 • Field: Computer Science • Field percentile: 3.3% (the 42 cites to this Computer Science paper place it in the top 3.3% of papers, based on the citation distribution of all papers published in this field in 2005)
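The arithmetic behind these paper-level metrics is simple enough to sketch. The Python below is illustrative only: it assumes you already have the journal-year baseline (the expected citation rate) and the citation counts of all papers published in the same field and year; the function names are invented for this example.

```python
def relative_citation_ratio(actual_cites: int, expected_cites: float) -> float:
    """Ratio of actual to expected citations, where 'expected' is the
    average cites to all papers from the same journal and year."""
    return actual_cites / expected_cites

def field_percentile(actual_cites: int, field_cite_counts: list) -> float:
    """Share of papers in the same field and year cited at least as often
    as this paper (lower is better)."""
    at_or_above = sum(1 for c in field_cite_counts if c >= actual_cites)
    return 100.0 * at_or_above / len(field_cite_counts)

# Figures from the slide: 42 actual cites against a journal-year baseline of 2.4
print(relative_citation_ratio(42, 2.4))  # 17.5
```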

  7. Aggregated Metrics

  8. Aggregated metrics: universities – departments – groups • Journal expected cites: sum of all actual citations divided by the sum of all journal-based expected citations • Category expected cites: sum of all actual citations divided by the sum of all category-based (field) expected citations • Mean percentile: average of the field percentile measures, which are based on field and year of publication • Percentile distribution: percentage of papers in each percentile
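A hedged sketch of how these aggregated indicators could be computed, assuming each paper in the group already carries its actual citations, its expected citations (journal-based or category-based), and its field percentile. The function names and percentile band edges are illustrative, not a standard.

```python
from statistics import mean

def aggregate_impact(actual_cites, expected_cites):
    """Sum of all actual citations divided by the sum of all expected
    citations; pass journal-based baselines for 'journal expected cites'
    and field/year baselines for 'category expected cites'."""
    return sum(actual_cites) / sum(expected_cites)

def mean_percentile(percentiles):
    """Average of each paper's field-and-year percentile."""
    return mean(percentiles)

def percentile_distribution(percentiles, bands=(1, 5, 10, 25, 50, 100)):
    """Percentage of a group's papers falling in each percentile band;
    keys are the band's upper edge (e.g. 1 means 'top 1%')."""
    dist = {}
    lower = -1.0  # so papers at the 0th percentile land in the first band
    for upper in bands:
        in_band = sum(1 for p in percentiles if lower < p <= upper)
        dist[upper] = 100.0 * in_band / len(percentiles)
        lower = upper
    return dist
```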

  9. Comparison across departments

  10. Comparison across universities: % papers in the institution, 1981-2008

  11. Comparison across universities: impact (average cites per paper), 1981-2008

  12. World comparisons: % papers in region [chart comparing CH, AP, EU, World, US and UK]

  13. World comparisons: impact [chart comparing US, UK, EU, World (Wld), AP and CH]

  14. Various rankings (1981-2008) [charts: # papers, % cited, % in uni]

  15. Caveats

  16. Citation analysis and Computer Science • Conference proceedings vs. journal articles • UK, 2000-2010: • 1,300 journal articles • 15,500 conference proceedings papers

  17. Re-ranking based on Proceedings Papers (1990-2010)

  18. Bibliometrics and SSH (1) • Knowledge dissemination • Sciences: Journal Articles • Social Sciences: Journal articles, Books and Conference Proceedings • Humanities: Books, performances, compositions, designs, artefacts, exhibitions • Research focus • Sciences: International • SSH: International, National, regional and local

  19. Bibliometrics and SSH (2) • Social Sciences: • 45-70% of outputs are journal articles • Role of journals increasing for: • Economics: increased globalization • Linguistics • Humanities: • 20-35% of outputs are journal articles • History and Literature: role of journals diminishing even further

  20. Bibliometric tools can help… • Identify papers for REF submission… • Highly cited • Geographic outreach • Hot—disproportionate immediate attention • Identify journals for submission • Analyze current state of research • Provide documentation for grant submission • Identify potential collaborations • Benchmark and compare performance
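As an illustration of the first point above (identifying candidate papers for submission), a short sketch that shortlists a group's papers by field percentile. The 10% cutoff and the data shape are invented for the example; they are not a REF rule.

```python
def shortlist_for_submission(papers, percentile_cutoff=10.0):
    """Return papers whose field percentile is within the cutoff,
    most highly placed first (lower percentile = more highly cited)."""
    candidates = [p for p in papers if p["field_percentile"] <= percentile_cutoff]
    return sorted(candidates, key=lambda p: p["field_percentile"])

papers = [
    {"title": "Paper A", "field_percentile": 3.3},
    {"title": "Paper B", "field_percentile": 42.0},
    {"title": "Paper C", "field_percentile": 8.9},
]
print([p["title"] for p in shortlist_for_submission(papers)])  # ['Paper A', 'Paper C']
```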

  21. Caveats • Field Differences • Multiple metrics • Complement and inform peer review • Currently not suitable for the humanities and some social science fields • Caution when using in fields that communicate via conferences • Clinical practitioners and policymakers • Are the results reasonable?

  22. Thank you. Nancy K. Bayers Bibliometrician, University of Leicester nb193@le.ac.uk
