OVERCOMING APPLES TO ORANGES: ACADEMIC RESEARCH IMPACT IN THE REAL WORLD
Netspeed 2014
Thane Chambers, Faculty of Nursing & University of Alberta Libraries
Leah Vanderjagt, University of Alberta Libraries
Welcome • Who we are • What we’ll cover • What is the broad context for the need for research performance metrics in post-secondary institutions? • What are some issues with research performance metrics? • How can we improve research performance assessment?
What is this all about? A brief history of research evaluation
Fundamental principles of research metrics: • Single indicators are misleading on their own • Integration of both qualitative & quantitative data is necessary • Various frameworks for research performance already exist
Fundamental principles of research metrics: • Times cited ≠ quality • Discipline-to-discipline comparison is inappropriate • Citation databases cover certain disciplines better than others (a worked illustration follows below)
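A toy illustration (invented citation counts, not real data) of why a single indicator misleads: two authors with identical citation totals can have very different publication profiles.

```python
# Toy sketch with invented citation counts: a single indicator hides the shape
# of a publication record, which is one reason times cited does not equal quality.

def h_index(citations):
    """h = the largest number such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

author_a = [90, 5, 3, 1, 1]      # one highly cited paper, little else
author_b = [20, 20, 20, 20, 20]  # a consistently cited body of work

print(sum(author_a), h_index(author_a))  # 100 total citations, h-index 3
print(sum(author_b), h_index(author_b))  # 100 total citations, h-index 5
```

Raw totals also say nothing about field norms: 100 citations is modest in some biomedical areas and exceptional in parts of the humanities, which is why comparing unnormalized counts across disciplines is inappropriate.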
Broad/Global Context • Recent upswing in interest in research evaluation • Nationwide assessments in UK, Australia, New Zealand • An audit culture is growing
Metrics wanted • For what? • Performance (strengths and weaknesses) • Comparison (with other institutions) • Collaboration (potential and existing) • Traditional metrics • Altmetrics
Who’s in the game: consumers • Senior university administrators • Funding agencies • Institutional funders • Researchers • Librarians
Who’s in the game: producers (vendors) • Elsevier: Scopus and SciVal • Thomson Reuters: Web of Science and InCites • Digital Science: Symplectic Elements • Article-level metrics (altmetrics) solutions
Vendor Claims • Quick, easy, and meaningful benchmarking • Ability to devise optimal plans • Flexibility • Insightful analysis to identify unknown areas of research excellence … all with a push of a button!
What’s needed: Persistent Identifiers • Without DOIs, how can impact be tracked? • ISBNs, repository handles • Disciplinary and geographic differences in DOI coverage: DOI assignment costs $$ • What about grey literature? • Altmetrics may still depend on DOIs (a lookup sketch follows below)
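As one hedged illustration of why the identifier matters: a work with a DOI can be looked up directly in open metadata services such as the Crossref REST API, and the same DOI is the key most altmetrics services use. The DOI below is a placeholder, not an example from the talk.

```python
# Minimal sketch: given a DOI, fetch open metadata from Crossref.
# The DOI is a placeholder; swap in a real one before running.
import requests

doi = "10.1000/example-doi"  # placeholder DOI
resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
if resp.ok:
    work = resp.json()["message"]
    print(work.get("title"))
    print("Crossref citation count:", work.get("is-referenced-by-count"))
else:
    print("No record found; without a (valid) DOI there is nothing to track.")
```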
What’s needed: Name Disambiguation (the biggest problem; illustrated below)
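A toy sketch with invented records (generic names, not real authors) of why name strings cannot identify people: exact-string matching splits one author across variants, while surname-plus-initial matching merges different people. Persistent identifiers such as ORCID iDs are the usual way out, since they key on an ID rather than a string.

```python
# Toy sketch of the name-disambiguation problem; all records are invented.
from collections import defaultdict

records = [
    ("Smith, Jane", "Paper A"),
    ("Smith, J.",   "Paper B"),   # same person, abbreviated form
    ("Smith, John", "Paper C"),   # a different person entirely
]

# Grouping on the exact string SPLITS one author across two identities.
by_string = defaultdict(list)
for name, title in records:
    by_string[name].append(title)

# Grouping on "Surname, Initial" MERGES two different people.
by_initial = defaultdict(list)
for name, title in records:
    surname, given = (part.strip() for part in name.split(","))
    by_initial[f"{surname}, {given[0]}."].append(title)

print(dict(by_string))   # three keys for two people
print(dict(by_initial))  # one key, 'Smith, J.', for everyone
```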
What’s needed: Source Coverage • The most prominent products still draw their source coverage from Scopus and Web of Science (STEM-heavy) • Integration of broader sources is packaged with more expensive implementations • Some products specifically market broad source coverage (Symplectic Elements)
What’s needed: National Subject Area Classification (to a fine level) • Subject classification in products is extremely broad, so broad that comparisons are inappropriate • Integration of a national standard of granular subject classification would help everyone (a normalization sketch follows after the example below)
Subject classification example: http://www.rcuk.ac.uk/RCUK-prod/assets/documents/documents/ResearchAreasProposalClassificationsList.pdf
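A sketch under invented numbers of why granularity matters: a field-normalized score divides a paper's citations by the average for its field, and a coarse category mixes disciplines with very different citation norms, so the baseline (and the comparison) is wrong. The category names and averages below are assumptions for illustration only.

```python
# Invented baselines: average citations per paper in a broad bucket vs. granular areas.
broad_baseline = {"Health Sciences": 30}                 # one coarse category
fine_baseline  = {"Oncology": 55, "Nursing Ethics": 8}   # granular subject areas

def normalized(citations, field, baseline):
    """Citations relative to the field's average (1.0 = exactly average)."""
    return citations / baseline[field]

paper_cites = 16  # a hypothetical nursing-ethics paper

print(normalized(paper_cites, "Health Sciences", broad_baseline))  # ~0.53: looks "below average"
print(normalized(paper_cites, "Nursing Ethics", fine_baseline))    # 2.0: twice its own field's average
```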
What’s needed: Training & Knowledge • Do all consumers want/need training? • Have we analyzed our services for citation impact and metrics analysis? • Top-to-bottom organizational training, grounded in the strategic needs identified for metrics
What’s needed: Processes & Workflows • Data cleaning • Verification of new data • Running analyses • Verifying analyses (a minimal cleaning sketch follows below)
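A minimal sketch, not a prescribed workflow, of one routine cleaning step before any analysis: normalize DOIs in an exported record set, drop duplicates, and set aside identifier-less records for manual verification. The file name and column name are assumptions.

```python
# Sketch of a data-cleaning pass over a hypothetical citation export (export.csv).
import csv

def normalize_doi(doi):
    """Lowercase and strip common DOI prefixes so duplicates compare equal."""
    doi = (doi or "").strip().lower()
    return doi.removeprefix("https://doi.org/").removeprefix("doi:")

def clean(rows):
    seen, kept, needs_review = set(), [], []
    for row in rows:
        doi = normalize_doi(row.get("DOI"))
        if not doi:
            needs_review.append(row)   # verify by title/author match instead
        elif doi not in seen:
            seen.add(doi)
            kept.append(row)           # first occurrence wins; later duplicates dropped
    return kept, needs_review

with open("export.csv", newline="", encoding="utf-8") as f:
    kept, needs_review = clean(csv.DictReader(f))
print(len(kept), "unique records;", len(needs_review), "flagged for manual checking")
```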
What’s needed: Cultural Understanding • How is the data going to be used, and who will be rewarded? • An audit culture • The social sciences, humanities, and arts would have justified concerns about the adoption of citation-based tools
How can academic libraries help? • Share our knowledge of best practices and other effective implementations • Challenge vendors to address problems • Train users on author ID systems and their assignment, and integrate author IDs with digital initiatives
How can academic libraries help? • Advocate for national comparison standards (CASRAI) • Employ our subject-focused outreach model • As a central unit, make broad organizational connections to help with implementation • Promote our expertise: bibliographic analysis is an LIS domain
Recommendations • Strategic leaders need to initiate university-wide conversations about what research evaluation means for the institution • Tools need to be flexible enough to incorporate non-journal-based scholarly work and data • New workflows need to be minimized and incorporated into existing workflows as much as possible • Broad adoption of the ORCID iD system (a lookup sketch follows below)
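A hedged sketch of what broad ORCID adoption enables: a researcher's works can be pulled unambiguously by iD from the public ORCID API rather than matched by name string. The iD below is a placeholder, and the response field names follow the v3.0 public API as documented; treat this as a sketch to verify, not a tested integration.

```python
# Hedged sketch: list a researcher's works by ORCID iD (placeholder iD below).
import requests

orcid_id = "0000-0000-0000-0000"  # placeholder ORCID iD
resp = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
if resp.ok:
    for group in resp.json().get("group", []):
        summary = group["work-summary"][0]
        print(summary["title"]["title"]["value"])
else:
    print("No public record for this iD.")
```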
References • Marjanovic, S., Hanney, S., & Wooding, S. (2009). A historical reflection on research evaluation studies, their recurrent themes and challenges (Technical report). RAND Corporation. • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.