All That Data Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources Maribeth Manoff, Eleanor Read, and Gayle Baker Library Assessment Conference September 27, 2006
MaxData Project “Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis” Funded by the Institute of Museum and Library Services (IMLS), 2005-2007 Carol Tenopir, PI
MaxData Project Purpose • Evaluate and compare methods of usage data collection and analysis • Develop cost/benefit model to help librarians select appropriate method(s) for electronic resource usage assessments
MaxData Project Teams • UT: COUNTER data from vendors, link resolver, database usage logs, federated search engine • David Nicholas et al. (CIBER): deep log analysis of OhioLINK journal usage data • Carol Tenopir and Donald King: readership surveys at UT and four Ohio universities
Outline • Journal-level data • Vendor reports • Link resolver reports • Database-level data • COUNTER reports • Journal-level totals • Local usage logs • Other systems
Vendor Reports: Background • Vendor-supplied data primary source of e-journal usage information • Project COUNTER helpful, but… • Manipulation may be required to compare use among vendors
Vendor Reports: Consolidating • COUNTER Journal Report 1 (JR1) • Data from each vendor combined in Excel spreadsheet • Facilitates additional analyses • Sorting by selected fields • Subject analysis • Cost per use calculations
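A minimal sketch of this consolidation step, assuming hypothetical JR1 exports saved as one CSV per vendor (columns such as Journal, ISSN, and monthly counts) plus a separate subscription cost file; none of the file or column names below come from the project itself.

```python
# Consolidate COUNTER JR1 exports from several vendors and compute cost per use.
# All file and column names are illustrative assumptions, not the project's actual data.
import glob
import pandas as pd

frames = []
for path in glob.glob("jr1_reports/*.csv"):          # one JR1 export per vendor
    df = pd.read_csv(path)                           # assumed columns: Journal, ISSN, Jan..Dec
    df["Vendor"] = path.split("/")[-1].removesuffix(".csv")
    frames.append(df)

jr1 = pd.concat(frames, ignore_index=True)

# Year-to-date full-text article requests per journal per vendor
month_cols = [c for c in jr1.columns if c not in ("Journal", "ISSN", "Vendor")]
jr1["YTD"] = jr1[month_cols].sum(axis=1)

# Join subscription costs (hypothetical file with ISSN and Cost columns) for cost-per-use figures
costs = pd.read_csv("subscription_costs.csv")
merged = jr1.merge(costs, on="ISSN", how="left")
merged["CostPerUse"] = merged["Cost"] / merged["YTD"].where(merged["YTD"] > 0)

# Sort by selected fields, e.g. highest cost per use first, and save the combined table
merged.to_csv("consolidated_jr1.csv", index=False)
print(merged.sort_values("CostPerUse", ascending=False).head(20))
```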
Vendor Reports: Challenges • Inconsistencies in data fields • Journal title (articles, upper/lower case, extra information) • ISSN (with and without hyphen) • Time-consuming to fix • ScholarlyStats, SUSHI, or an ERMS may help
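Part of this clean-up can be scripted. A sketch with illustrative normalization rules matching the inconsistencies listed above (leading articles, case, extra information in titles; hyphenated vs. unhyphenated ISSNs); the function names are hypothetical helpers, not anything from the project.

```python
import re

def normalize_issn(issn: str) -> str:
    """Strip the hyphen and any stray characters so 1234-5678 and 12345678 match."""
    return re.sub(r"[^0-9Xx]", "", issn or "").upper()

def normalize_title(title: str) -> str:
    """Lower-case, drop a leading article, and trim trailing notes such as '(Online)'."""
    t = (title or "").strip().lower()
    t = re.sub(r"^(the|a|an)\s+", "", t)      # leading articles
    t = re.sub(r"\s*\(.*?\)\s*$", "", t)      # extra information appended by some vendors
    return re.sub(r"\s+", " ", t)

# Example: these now compare equal across vendor reports
assert normalize_issn("0028-0836") == normalize_issn("00280836")
assert normalize_title("The Journal of Stuff (Online)") == normalize_title("Journal of Stuff")
```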
Link Resolver Data • Increasing number of libraries using an OpenURL linking service • SFX from Ex Libris includes a statistical module • One report in particular (“Requests and clickthroughs by journal and target”) is somewhat analogous to COUNTER JR1
SFX “Request” and “Clickthrough” Data • UT student searching in an SFX “source” discovers an article of interest • Clicks on the FindText button • Article is available electronically in Journal A from Packages Y and Z – a “Request” statistic is recorded for each • Student chooses the link to Journal A in Package Y – a “Clickthrough” statistic is recorded
SFX “Clickthroughs” vs. JR1 “Full-Text Article Requests” • Clickthrough is less specific, does not measure actual download • But, clickthrough is a “known quantity,” not dependent on package interface • SFX report as a useful supplement to JR1, comparing trends and patterns • SFX contains data not in JR1 reports, e.g., non-COUNTER packages, open access journals, backfiles
Formatting the SFX Report • Report from SFX is not formatted like JR1, but it does contain the needed data elements • Request to software vendor: include a JR1-style report in the statistical module • Incorporate into ERMS • Manual or programming approach, depending on time and expertise available (see the sketch below)
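One possible programming approach, sketched on the assumption that the SFX “requests and clickthroughs by journal and target” report can be exported with columns like Journal, ISSN, Target, Month, and Clickthroughs (illustrative names, not the actual SFX export schema): sum across targets and pivot the months into columns so each journal occupies a single row, mirroring the JR1 layout.

```python
import pandas as pd

# Hypothetical export of the SFX "requests and clickthroughs by journal and target" report
sfx = pd.read_csv("sfx_clickthroughs.csv")   # assumed columns: Journal, ISSN, Target, Month, Clickthroughs

# Sum across targets, then pivot months into columns to mirror the JR1 layout
jr1_like = (
    sfx.groupby(["Journal", "ISSN", "Month"], as_index=False)["Clickthroughs"].sum()
       .pivot(index=["Journal", "ISSN"], columns="Month", values="Clickthroughs")
       .fillna(0)
       .astype(int)
       .reset_index()
)

# Year-to-date clickthroughs, analogous to the JR1 annual total column
jr1_like["YTD"] = jr1_like.drop(columns=["Journal", "ISSN"]).sum(axis=1)
jr1_like.to_csv("sfx_jr1_format.csv", index=False)
```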
Other Useful Link Resolver Reports and Data • Unmet user needs • Journals “requested” with no electronic full-text available • Interlibrary loan requests • Unused full-text report • Overlap reports • Subject categories
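As an illustration of the “unmet user needs” idea above, the following sketch assumes a hypothetical per-journal export of SFX request counts with a column giving the number of available full-text targets; journals with requests but no full-text target become candidates for subscription review or interlibrary loan monitoring.

```python
import pandas as pd

# Hypothetical export: Journal, ISSN, Requests, FullTextTargets (0 if no electronic full text)
requests = pd.read_csv("sfx_requests_by_journal.csv")

# Journals users asked for that the library cannot currently deliver electronically
unmet = requests[(requests["Requests"] > 0) & (requests["FullTextTargets"] == 0)]
unmet = unmet.sort_values("Requests", ascending=False)

unmet.to_csv("unmet_fulltext_needs.csv", index=False)
```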
Package-Level Usage Data • Totals from journal-level reports • COUNTER Database Report 1 • Local methods • UT – database “hits” recorded from database menu pages • Some libraries using proxy server logs • Metasearch system statistics
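Where COUNTER Database Report 1 is unavailable, package-level totals can be rolled up from journal-level data. A sketch assuming the hypothetical consolidated JR1 file produced in the earlier example, with its Vendor and YTD columns:

```python
import pandas as pd

# Roll journal-level counts up to the package/vendor level, roughly comparable
# to a COUNTER Database Report 1 total (hypothetical file from the earlier sketch).
jr1 = pd.read_csv("consolidated_jr1.csv")    # assumed columns include Vendor and YTD

package_totals = (
    jr1.groupby("Vendor", as_index=False)["YTD"].sum()
       .rename(columns={"YTD": "FullTextRequests"})
       .sort_values("FullTextRequests", ascending=False)
)
print(package_totals)
```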
What About All That Data? • Collecting, consolidating, and analyzing vendor data is time-consuming and difficult • Acquiring data from local systems provides consistency, but also requires time and effort • A survey of electronic resource librarians indicates many do not have enough time • Libraries face difficult decisions about which methods are most practical and useful