Citation Counting, Citation Ranking, and h-Index of HCI Researchers: Scopus vs. WoS
Lokman I. Meho and Yvonne Rogers
Network and Complex Systems
March 24, 2008
Why citation analysis?
• Study the evolution of scientific disciplines
• Examine and/or map the social, economic, political, and intellectual impact of scientific research
• Inform decisions on promotion, tenure, hiring, grants, collaboration, etc.
Research problem
• To date, most citation-based research has relied exclusively on data obtained from the Web of Science database
• The emergence of Scopus and Google Scholar has raised many questions about the exclusive use of Web of Science
Literature review
• Whether to use Scopus and/or Web of Science for a mapping or research assessment exercise may be domain-dependent; more in-depth studies are needed to verify the strengths and limitations of each source
• Scopus covers 84% of all journal titles indexed in Web of Science; Web of Science covers 54% of all journal titles indexed in Scopus
Research questions
• How do the two databases compare in their coverage of HCI literature and the literature that cites it, and what are the reasons for the differences?
• What impact do the differences in coverage between the two databases have on the citation counting, citation ranking, and h-index scores of HCI researchers?
• Should one or both databases be used for determining the citation counting, citation ranking, and h-index scores of HCI researchers?
Significance/value of study
• Determine whether citation searching in HCI should be extended to both Scopus and Web of Science or limited to one of them
• Will help people who use citation analysis for research evaluation and mapping exercises justify their choice of database
Databases
• Web of Science
  • Approximately 9,000 journals going back to 1955
  • Books in series and an unknown number of conference proceedings, including LNCS, LNAI, LNM
• Scopus
  • 14,000 journals, with citations going back to 1996
  • 500 conference proceedings
  • 600 trade publications
Methods
• Sample: 22 top HCI researchers from the Equator Interdisciplinary Research Collaboration, a six-year project funded by the UK’s Engineering and Physical Sciences Research Council
• Publications (n=1,440, mainly conference papers and journal articles)
  • 594 (41%) were covered by Scopus
  • 296 (21%) were covered by Web of Science
  • 647 (45%) were covered by the two databases combined
Methods, cont’d
• Searching methods used to identify citations to the 1,440 items published/produced by the sample members:
  • Scopus: (1) exact match of each item in the “References” field; (2) the “More” tab; and (3) “Author” search results + “Cited by”
  • WoS: cited-references search
• Citation information was parsed by author, publication type, year, source name, institution, country, and language
• Source names were manually standardized, and missing institutional affiliation and country information (3%) was gleaned from the web
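The manual source-name standardization step described above can be sketched as a lookup table that maps variant spellings of a venue to one canonical name, so that citations to the same venue group together. This is only an illustrative sketch: the variant strings, the canonical forms, and the `standardize` helper are hypothetical, not taken from the paper.

```python
# Illustrative sketch of source-name standardization: map variant spellings
# of a venue to one canonical name so citation counts aggregate correctly.
# The variants and canonical forms below are hypothetical examples.
CANONICAL = {
    "proc. chi": "CHI Conference on Human Factors in Computing Systems",
    "chi conf. hum. factors comput. syst.": "CHI Conference on Human Factors in Computing Systems",
    "lect. notes comput. sci.": "Lecture Notes in Computer Science",
    "lncs": "Lecture Notes in Computer Science",
}

def standardize(source_name: str) -> str:
    """Return the canonical venue name, or the trimmed input if unknown."""
    key = " ".join(source_name.split()).lower()
    return CANONICAL.get(key, source_name.strip())
```

In the study this mapping was built by hand; a table like this simply records those manual decisions so they can be applied consistently across both databases.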
Methods, cont’d
• Data from both databases were cross-examined for accuracy
• h-index: definition, strengths, and limitations
  • System-based counting method (takes into account only indexed works, n=647)
  • Manual-based counting method (takes into account all 1,440 works published/produced by the sample)
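The h-index referred to above is the largest number h such that a researcher has h works with at least h citations each. A minimal sketch of the system-based vs. manual-based distinction follows; the per-work citation counts are invented for illustration:

```python
def h_index(citations):
    """Largest h such that h works have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical per-work citation counts for one researcher:
all_works = [10, 8, 5, 4, 3, 2]   # manual-based: all published works
indexed_works = [10, 5, 3]        # system-based: only works the database indexes

manual_h = h_index(all_works)     # 4
system_h = h_index(indexed_works) # 3
```

Because the system-based method sees only a subset of the works (647 of the sample's 1,440 here), its score can only understate the manual-based score, which is one reason the authors later recommend calculating the h-index manually.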
Results: Distribution of unique and overlapping citations
• Combined total (Scopus + WoS) = 7,439*
• Scopus: n=6,919 (93%), of which 3,428 (46%) were unique to Scopus
• Web of Science: n=4,011 (54%), of which 520 (7%) were unique to WoS
• Overlap (found in both databases): 3,491 (47%)
*Excludes 255 citations from WoS published before 1996
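The overlap figures above are simple set arithmetic over the combined pool of citing documents; the sketch below just re-derives the slide's percentages from the reported totals:

```python
# Reported post-1996 citation totals from the study.
scopus_total = 6_919  # citations found in Scopus
wos_total = 4_011     # citations found in Web of Science
overlap = 3_491       # citations found in both databases

union = scopus_total + wos_total - overlap  # combined pool: 7,439
scopus_unique = scopus_total - overlap      # 3,428
wos_unique = wos_total - overlap            # 520

def pct(n):
    """Share of the combined citation pool, as a rounded percentage."""
    return round(100 * n / union)

# pct(scopus_total) -> 93, pct(wos_total) -> 54,
# pct(scopus_unique) -> 46, pct(overlap) -> 47, pct(wos_unique) -> 7
```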
Results: Reasons for the significant differences
Note: 76% of all citations found in conference proceedings were unique to a single database, compared with 34% of citations found in journals
Results: Quality of Scopus unique citing journals This is a partial list of the top 20 citing journals
Results: Quality of Scopus’s citing conference proceedings (top 9 citing titles) *Source: Scopus.
Differences in citation counting and ranking of individual researchers (top 12)
Differences in mapping scholarly impact of individual researchers: an example
Differences in mapping scholarly impact of individual researchers, cont’d *Percentage of mismatch would have been higher had we removed citations from the home institution of the researcher
Differences in h-index of individual researchers
This is a partial list of the top 10 researchers
Conclusions and implications
• In HCI, conference proceedings constitute a major channel of written communication
• Most of these proceedings are published by ACM and IEEE, and also by Springer in the form of LNCS and LNAI
• Scopus should be used instead of WoS for citation-based research and evaluation in HCI
Conclusions and implications, cont’d
• The h-index should be calculated manually rather than relying on system-generated scores
• Researchers can no longer limit themselves to WoS just because they are familiar with it, have access to it, or because it is the more established data source
• A challenge is to systematically explore citation data sources to determine which one(s) are better for which research domains
Conclusions and implications, cont’d
• Principles of good bibliometrics research:
  • Analysis should be applied only by professionals with a theoretical understanding and thorough technical knowledge of the databases, retrieval languages, and the abbreviations, concepts, and terminologies of the domain under investigation
  • Analysis should only be used in accordance with established principles of “best practice” in professional bibliometrics
  • If used for research assessment purposes, citation-based information should only be used in conjunction with qualitative, peer-review-based information
Thank You
Questions?
meho@indiana.edu
Full paper available at: http://www.slis.indiana.edu/faculty/meho/meho-rogers.pdf