Scopus - An Overview
Presented by Virginia Chiu
Agenda • How did we develop Scopus? • Why Scopus?
Why Develop Scopus? • Navigation is the Next Big Thing: • There is simply too much information available • And too little time to search it all • On the web, in databases, in libraries • Users and librarians told us they want • A simple, single entry-point to the world’s scientific information • Easy to use • Combining official publications and everything on the web • Integrated with other library resources • And with the full text only one click away • Elsevier wants to supply researchers with workflow tools that increase their productivity
Starting from the users’ needs • If we understand the researcher workflow, we can design better products • So we invest significantly in user-based design
How do users cope with this complex environment?
Searching the four domains • Websites and digital archives • Patents • Peer-reviewed literature • Institutional repositories • Across science, medicine, technology and the social sciences
How we conduct usability testing • Sit together at the user’s site • Use a combination of a functional prototype and static pages • One-hour structured interview • Discuss professional background, current research, level of computer expertise, information sources they use • Let the user explore the prototype, doing searches, with minimal prompting • Go through specific parts of the product and let the user do specific tasks, encouraging ‘thinking aloud’ • The user carries out their work and explains what they are doing
Learned to facilitate the major tasks • Finding new articles in a familiar subject field • Finding author-related information • articles by a specific author • information that would help in evaluating a specific author • Staying up-to-date • Getting an overview or understanding of a new subject field
Content Selection Criteria • Content Selection Committee established, consisting of 20 scientists and 10 subject librarians • Suggest new titles/sources • Yearly approval of the title list • Contribute to overall strategy • Important criteria: • At least an abstract in English • Regular publication • Peer review/quality
Scopus for Researchers • Designed and developed with users to meet their needs: • better navigation through the research literature • easy evaluation of scientific information • Researchers want to find the information they need, not become expert searchers • They want a tool that’s as easy to use as web search but delivers precise results • That takes them to the full-text article they’re subscribed to in just one click
Agenda • How did we develop Scopus? • Why Scopus?
Searching with Google: more than 5 million results?!
Searching with Google Scholar: more than 100,000 results?!
Web & Patent Citations • Scopus records now link to Cited By for… • Cited By – Web Sources • Cited By – Patents • Sources include:
Why is it important? • Leads researchers to relevant web and patent information that might otherwise have been missed • Expands the available content for users • It is an additional quality indicator: • Theses and dissertations • Preprint servers • Patents – these are high-quality sources • It makes a clear distinction between peer-reviewed citations (Scopus) and non-peer-reviewed citations (Web & Patent)
Evaluating scientific research output • Why is evaluation so important? • Case study – evaluating an author
Why do we evaluate scientific output? • Who evaluates: Government • Funding agencies • Institutions • Faculties • Libraries • Researchers • Why: Funding allocations • Grant allocations • Policy decisions • Benchmarking • Promotion • Collection management
Criteria for effective evaluation • Objective • Quantitative • Relevant variables • Independent variables (avoid bias) • Globally comparative
Why do we evaluate authors? • Promotion • Funding • Grants • Policy changes • Research tracking • Important to get it right
Data requirements for evaluation • Metrics: citation counts • Article counts • Usage counts • Data: broad title coverage • Affiliation names • Author names, including co-authors • References • Subject categories • ISSN (electronic and print) • Article length (page numbers) • Publication year • Language • Keywords • Article type • Etcetera … • There are limitations that complicate author evaluation
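To make these data requirements concrete, here is a minimal sketch of a publication record carrying the fields listed above. The class and field names are illustrative assumptions, not Scopus’s actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PublicationRecord:
    """Hypothetical record holding the fields needed for author evaluation."""
    title: str
    authors: List[str]                 # including co-authors
    affiliations: List[str]
    references: List[str]              # titles (or IDs) of cited documents
    subject_categories: List[str]
    issn_print: Optional[str] = None
    issn_electronic: Optional[str] = None
    pages: Optional[str] = None        # article length, e.g. "101-112"
    publication_year: Optional[int] = None
    language: Optional[str] = None
    keywords: List[str] = field(default_factory=list)
    article_type: Optional[str] = None
```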
Data limitations • Author disambiguation • Normalising affiliations • Subject allocations may vary • Matching authors to affiliations • Deduplication/grouping • Etcetera • Finding/matching all the relevant information to evaluate authors is difficult
The Challenge: finding an author • How to distinguish the results belonging to one author from those belonging to other authors who share the same name? • How to be confident that your search has captured all results for an author when their name is recorded in different ways? • How to be sure that names with unusual characters such as accents have been included – including all variants?
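As a rough illustration of the name-variant problem, the sketch below normalises a name by stripping accents and reducing it to a surname-plus-initials key. This is a hypothetical heuristic for illustration only, not Scopus’s matching logic:

```python
import unicodedata

def normalize_author_name(name: str) -> str:
    """Reduce a name to a simple comparison key: strip accents,
    split on commas/hyphens, and keep surname plus initials.
    A toy heuristic for illustration, not Scopus's actual logic."""
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    parts = ascii_name.replace(",", " ").replace("-", " ").split()
    if not parts:
        return ""
    surname, given = parts[0], parts[1:]
    initials = "".join(p[0] for p in given)
    return f"{surname} {initials}".lower()

# Different recorded variants collapse to the same key:
print(normalize_author_name("Wang, Chua-Chin"))   # -> 'wang cc'
print(normalize_author_name("Wáng Chua-Chin"))    # -> 'wang cc'
```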
The Solution: Author Disambiguation • We approach these problems by using the data available in the publication records, such as: • Author names • Affiliation • Co-authors • Self-citations • Source title • Subject area • … and use this data to group the articles that belong to a specific author
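The sketch below shows, in a very simplified form, how such signals could be combined to group records that appear to belong to the same author. The weights and threshold are invented for illustration, and the records are assumed to look like the PublicationRecord sketched earlier; the real Scopus algorithm is proprietary and far more sophisticated:

```python
def disambiguation_score(rec_a, rec_b) -> float:
    """Toy similarity score between two records, using the kinds of
    signals listed above (affiliation, co-authors, shared references,
    subject area). Weights are illustrative assumptions."""
    score = 0.0
    if set(rec_a.affiliations) & set(rec_b.affiliations):
        score += 0.4
    if len(set(rec_a.authors) & set(rec_b.authors)) > 1:   # shared co-authors beyond the candidate
        score += 0.3
    if set(rec_a.references) & set(rec_b.references):
        score += 0.2
    if set(rec_a.subject_categories) & set(rec_b.subject_categories):
        score += 0.1
    return score

def group_records(records, threshold: float = 0.5):
    """Greedily group records that look like the same author.
    A sketch of the idea only, not the production algorithm."""
    groups = []
    for rec in records:
        for group in groups:
            if any(disambiguation_score(rec, other) >= threshold for other in group):
                group.append(rec)
                break
        else:
            groups.append([rec])
    return groups
```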
Step 1: Searching for an author • Enter the name in the Author Search box • Example: Professor Chua-Chin Wang, National Sun Yat-sen University • Group: System-on-Chip • Academic expertise: integrated circuit design, communication interface circuit design, neural networks • Lab: VLSI Design Lab • Office extension: 4144
Step 2: Select Professor Wang • Which author are you looking for? • Available information
Step 3: Details of Professor Wang • Unique author ID & matched documents
No 100% recall… • The same author can appear with different author IDs
Why were these not matched? • Quality above all: • Precision (>99%) was given priority over recall (>95%) • Not enough information to match with enough certainty • For instance: missing affiliations or different departments, all different co-authors or no co-authors, no shared references • As there are many millions of authors, there will be unmatched papers and authors
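For clarity: precision here is the share of documents assigned to a profile that truly belong to its author, and recall is the share of an author’s documents that actually end up in the profile. A small illustrative calculation with invented numbers:

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Fraction of documents assigned to a profile that truly belong to it."""
    return true_positives / (true_positives + false_positives)

def recall(true_positives: int, false_negatives: int) -> float:
    """Fraction of an author's documents that end up in their profile."""
    return true_positives / (true_positives + false_negatives)

# Favouring precision: 990 correct matches, 5 wrong matches, 40 missed papers
print(precision(990, 5))   # ~0.995 -> above the 99% precision target
print(recall(990, 40))     # ~0.961 -> above the 95% recall target
```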
Feedback loop includes a check by a dedicated team to ensure accuracy • A dedicated team investigates feedback requests to guarantee quality
Evaluation data • … we have matched the author to documents – now what? • Instant citation overview for an author
Step 4: The citation overview • Excluding self-citations
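The sketch below shows one way a per-document citation count excluding self-citations could be computed over simple record objects. It is an assumption-laden illustration (references matched by title, self-citation defined as the author appearing on the citing paper), not how Scopus actually implements the overview:

```python
from collections import defaultdict

def citation_counts_excluding_self(author_records, all_records, author_name):
    """Count citations per document for one author, skipping citations that
    come from papers the same author (co-)wrote. Assumes PublicationRecord-like
    objects with 'title', 'authors' and 'references' fields (hypothetical)."""
    counts = defaultdict(int)
    author_titles = {rec.title for rec in author_records}
    for citing in all_records:
        is_self_citation = author_name in citing.authors
        if is_self_citation:
            continue
        for cited_title in citing.references:
            if cited_title in author_titles:
                counts[cited_title] += 1
    return dict(counts)
```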
Conclusion • Search has had a significant impact on how researchers work and on scientific publishing • Scientists have very specific needs and rely heavily on their ability to find the information they need • General web search engines are not the answer • Enable users to get the most out of large content collections without needing knowledge of search syntax • Ensure the discovery tool fits the researcher’s workflow