
The JISC Usage Statistics Portal




  1. The JISC Usage Statistics Portal Ross MacIntyre, Mimas The University of Manchester

  2. Personal timeline
  1996-98 SuperJournal
  1998 Nesli
  2000 UKSG w/shop, ICOLC, ARL, PALS, …
  2001 ‘US Task Force’
  2002 COUNTER*
  2003 *J&Db[R1]
  2004 Evidence Base report: ‘Nesli2 Analysis of Usage Stats’
  2005 *J&Db[R2]
  2006 *eB[R1], Key Perspectives report: Usage Stats Service Feasibility Study, Mesur
  2007 Content Complete report: JUSP Scoping Study, KE event – article-level stats & COUNTER
  2008 JISC ITT for JUSP Scoping Study 2 (prototype), *J&Db[R3], PIRUS
  2009 JUSP Report, PIRUS2
  2010 April – JISC fund JUSP to service
  Areas for Discussion:
  • Different perspectives: Publishers, Aggregators, Learning Institutions, Commercial Organisations, Product Vendors...
  • What do you want to monitor & why?
  • What is “usage”?
  • ‘Are you getting enough?’
  • What are you supplying/gathering?
  • What do you do with it?
  • What is your ‘holy grail’?

  3. Context
  To assist and support libraries in the analysis of NESLi2 usage statistics and the management of their e-journals collections.
  • 20 NESLi2 e-journal deals
  • 132 HEIs taking up NESLi2 deals

  4. JISC Collections
  USAGE STATISTICS PORTAL SCOPING STUDY: PHASE II – TECHNICAL DESIGN AND PROTOTYPING – INVITATION TO TENDER
  Summary
  1. This invitation to tender invites bidders to submit proposals to undertake the technical design and prototyping for a Usage Statistics Portal.
  2. The deadline for proposals is 12:00 noon on Monday 14 July 2008. The work should start no later than the end of July 2008. The work should deliver a detailed technical specification and design for the Usage Statistics Portal and a scoping of the costs required to bring it to production; the final report should be complete by 1st March 2009.

  5. 3 Project Partners
  • Paul Needham
  • Evidence Base = Pete Dalton & Angela Conyers
  • Ross MacIntyre & Sean Dunne

  6. Aims of this JUSP study
  • Refine user requirements for the usage portal
  • Develop prototype portal
  • Obtain feedback from participating libraries on the portal developed
  • Specify what would be required to scale the portal into a full service
  • To be based on COUNTER usage reports: JR1 (total number of full-text article requests) and JR1A (requests from archives or backfiles)
  COUNTER report definitions:
  JR1 = Number of Successful Full-Text Article Requests by Month and Journal
  JR1a = Number of Successful Full-Text Article Requests by Month and Journal for a Journal Archive
  JR2 = Turnaways by Month and Journal
  JR3 = Number of Successful Item Requests and Turnaways by Month, Journal and Page Type
  JR4 = Total Searches Run by Month and Service
  JR5 = Number of Successful Full-Text Article Requests by Year of Publication and Journal
  DB1 = Total Searches and Sessions by Month and Database
  DB2 = Turnaways by Month and Database
  DB3 = Total Searches and Sessions by Month and Service

  7. JUSP prototype
  5 HE libraries:
  • University of Birmingham (A) – Sarah Pearson
  • Cranfield University (E) – Simon Bevan
  • University of Glasgow (A) – Tony Kidd
  • University of Liverpool (B) – Terry Bucknell
  • University of Westminster (C) – Pat Barclay
  3 NESLi2 publishers: Elsevier, OUP & Springer
  1 intermediary: Publishing Technology (Ingenta)

  8. User Requirements
  • Initial (10) requirements turned into specifications
  • 2007 & 2008 JR1 & JR1A stats for the 3 publishers
  • Data obtained from the 5 libraries (& 1 publisher)
  • Libraries also provided relevant aggregator / gateway stats
  • All data parsed and loaded
  • Authenticated (username/password) access provided to each library for its own stats
  • Full access provided to JISC (Exec, Collections & JWG)
  • Requirements refined and a further (10) requirements specified

  9. Technical – entity relationships
  [ER diagram: Publishers and Aggregators publish/supply Journals (identified by ISSN); Institutions have agreements with Publishers and Aggregators; Institutions' Users generate access 'hits'.]

  10. The database v0.4
  [Schema diagram: Suppliers, Deals, Platforms, Journals, Statistics, Institutions. Relationships key: one to many / many to many.]

  11. The database v0.4
  [Full entity-relationship diagram. Relationships key: one to many / many to many.]
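The v0.4 schema diagram itself is not reproduced in the transcript, but the entities and relationship types it names can be sketched as tables. This is a minimal illustrative sketch only: the table and column names below are assumptions, not the project's actual schema.

```python
import sqlite3

# Hypothetical sketch of the JUSP v0.4 entities; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE suppliers    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE platforms    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE institutions (id INTEGER PRIMARY KEY, name TEXT, jisc_band TEXT);
CREATE TABLE journals     (id INTEGER PRIMARY KEY, issn TEXT, title TEXT);
-- one supplier offers many deals (one to many)
CREATE TABLE deals        (id INTEGER PRIMARY KEY, name TEXT,
                           supplier_id INTEGER REFERENCES suppliers(id));
-- institutions take up deals (many to many, via a link table)
CREATE TABLE deal_takeup  (deal_id INTEGER REFERENCES deals(id),
                           institution_id INTEGER REFERENCES institutions(id));
-- monthly usage counts per institution and journal
CREATE TABLE statistics   (institution_id INTEGER, journal_id INTEGER,
                           month TEXT, requests INTEGER);
""")
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```

The link table `deal_takeup` is how the many-to-many relationship in the diagram key would typically be realised in a relational store.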

  12. Technical – conversion from .xls

  13. Technical – conversion to .xml
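Slides 12 and 13 show the conversion screens only as screenshots. As a purely illustrative sketch (not the project's actual code or formats), a single JR1 row parsed from a spreadsheet export might be serialised to XML like this; the field and element names are assumptions:

```python
import xml.etree.ElementTree as ET

# Hypothetical JR1 row as parsed from an .xls export (illustrative values).
row = {"title": "Journal of Examples", "issn": "1234-5678",
       "month": "2008-01", "requests": "42"}

report = ET.Element("Report", attrib={"id": "JR1"})
journal = ET.SubElement(report, "Journal", attrib={"issn": row["issn"]})
ET.SubElement(journal, "Title").text = row["title"]
count = ET.SubElement(journal, "Count", attrib={"month": row["month"]})
count.text = row["requests"]

xml_out = ET.tostring(report, encoding="unicode")
```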

  14. What does it demonstrate?
  For libraries:
  • Single point of access to own usage statistics
  • Monthly figures presented in both calendar and academic years
  • Addition, where relevant, of gateway/aggregator statistics
  • Usage of current collections with backfiles removed
  • Assistance with the SCONUL statistical return
  • Trend analysis, high-use titles, publisher summaries, etc.
  For JISC Collections:
  • Access to usage statistics for all libraries and all NESLi2 deals

  15. Link to JUSP Prototype

  16. 1. Single point of access to all JR1 and JR1A usage statistics as currently downloaded individually from publisher websites
  User informational text:
  • From this page, you can download JR1 and JR1A (archive) reports.
  • You can select calendar year or academic year.
  Interface shows:
  • Publisher – drop-down list (Elsevier, Springer, OUP)
  • Report – drop-down list (JR1 (all), JR1A (archive only))
  • Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))
  • Submit request
  Data processing notes for prototype:
  • Show title-by-title JR1 and JR1A (archive) reports in a form that can be downloaded by the library

  17. 2. Addition of aggregator/gateway JR1 statistics where relevant
  User informational text:
  • To get a full picture of usage you may need to add usage statistics provided by other services such as Swetswise. This will depend on the publisher.
  • Select publisher and year to download JR1 reports with Ingenta, Swetswise, Ebsco EJS etc. included where appropriate.
  Hyperlink note text:
  • If you use Ingenta, you will always have to add their JR1 usage statistics to those from the publisher to get a full record of use. For some publishers you will also have to add JR1 usage statistics from services like Swetswise.
  Interface shows:
  • Publisher – drop-down list (Elsevier, Springer, OUP)
  • Report – drop-down list (JR1 (all))
  • Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))
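The processing step this screen describes, adding aggregator JR1 counts to the publisher's own, amounts to summing counts per title keyed on ISSN. A minimal sketch with invented figures (not real JUSP data):

```python
# Illustrative JR1 totals per ISSN from two sources.
publisher_jr1  = {"1234-5678": 120, "2345-6789": 30}
aggregator_jr1 = {"1234-5678": 15, "3456-7890": 8}   # e.g. Ingenta / Swetswise

# Combine: add aggregator counts to publisher counts, keeping titles
# that appear in only one source.
combined = dict(publisher_jr1)
for issn, n in aggregator_jr1.items():
    combined[issn] = combined.get(issn, 0) + n
```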

  18. 3. Excluding usage of backfile collections
  User informational text:
  • JR1 reports include all usage. Some publishers also produce JR1A reports which give only usage of their archive or backfile collections. If you have access to these, you can download here reports that exclude backfile use and show only usage of current titles.
  Interface shows:
  • Publisher – drop-down list (Elsevier, OUP, Springer (no JR1A report))
  • Year – drop-down list (2007 (Jan-Dec), 2007/08 (Aug-Jul), 2008 (Jan-Dec))
  Data processing notes for prototype:
  • Titles in JR1 and JR1A matched by ISSN. JR1A usage subtracted from JR1. Use the total JR1 figures as in Screen 2.
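The data processing note above, matching titles by ISSN and subtracting JR1A from JR1, can be sketched as follows (illustrative figures, not real JUSP data):

```python
# Illustrative totals: JR1 is all usage, JR1A is archive/backfile usage only.
jr1  = {"1234-5678": 135, "2345-6789": 30}
jr1a = {"1234-5678": 40}

# Current-titles usage: subtract JR1A from JR1 wherever the ISSNs match;
# titles with no JR1A entry keep their full JR1 count.
current_use = {issn: n - jr1a.get(issn, 0) for issn, n in jr1.items()}
```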

  19. 4. Summary tables to show trends over time, compare publishers etc.
  User informational text:
  • Use these tables to look at usage trends over time, and to compare usage of the various publisher deals to which you subscribe.
  Interface shows:
  • Publisher – drop-down list (all, Elsevier, Springer, OUP)
  • Monthly / yearly – select whether you want to look at monthly or yearly stats
  • Year – calendar years all, 2007, 2008; academic year 2007/08
  • Select a time period from & to
  Data processing notes for prototype:
  • Monthly – use the monthly total JR1 figures as in Screen 2
  • Yearly – use the total JR1 figures for the year
  • Academic year – use the total JR1 figures for the year; figures for SCONUL return

  20. 5. Summary table to show use of aggregators
  User informational text:
  • Use this table to see how much of your total usage goes through aggregators, e.g. Ingenta and Swetswise
  Interface shows:
  • Publisher – drop-down list (Elsevier, Springer, OUP)
  • Year – calendar years all, 2007, 2008; academic year 2007/08
  Data processing notes for prototype:
  • Give separate columns for publisher, Ingenta, Swets, and total, and show JR1 usage in each. Show the percentage of use from each source.
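The per-source percentage in the note above is a straightforward share-of-total calculation. A sketch with invented figures:

```python
# Illustrative JR1 counts by source for one publisher deal.
by_source = {"Publisher": 120, "Ingenta": 15, "Swetswise": 15}

# Percentage of total usage from each source, to one decimal place.
total = sum(by_source.values())
share = {src: round(100 * n / total, 1) for src, n in by_source.items()}
```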

  21. 6. Summary table to show use of backfiles
  User informational text:
  • Use this table to see how much of your total usage comes from backfiles
  Interface shows:
  • Publisher – drop-down list (Elsevier, Springer, OUP)
  • Year – calendar years all, 2007, 2008; academic year 2007/08
  Data processing notes for prototype:
  • Use the JR1 total including aggregators. Use JR1A as in the first screen. Show the percentage of total JR1 usage that comes from JR1A.

  22. 7. ‘Some more figures’ [sic]
  User informational text:
  • Find the average, median and maximum number of requests
  Interface shows:
  • Publisher – drop-down list (all, Elsevier, Springer, OUP)
  • Year – calendar years all, 2007, 2008; academic year 2007/08
  Data processing notes for prototype:
  • Use total JR1 for each publisher, i.e. including aggregators
  • Calculate the average, median and maximum number of requests
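The three figures this screen computes over per-title JR1 totals can be sketched directly with the standard library (illustrative counts, not real data):

```python
from statistics import mean, median

# Illustrative per-title JR1 request totals for one publisher.
requests = [5, 12, 40, 3, 12]

figures = {"average": mean(requests),
           "median":  median(requests),
           "maximum": max(requests)}
```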

  23. 8. What titles have the highest use?
  User informational text:
  • Find the titles which have the highest use
  Interface shows:
  • Publisher – drop-down list (all, Elsevier, Springer, OUP)
  • Year – calendar years all, 2007, 2008; academic year 2007/08
  • Display the 20 titles with the highest usage (publisher, title, ISSN, no. of requests, in descending order of no. of requests)
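Ranking titles by request count, as this screen describes, is a sort in descending order followed by a cut. A sketch with invented data (the prototype shows the top 20; the top 2 are taken here for brevity):

```python
# Illustrative (title, ISSN, requests) tuples.
usage = [("Journal A", "1111-1111", 40),
         ("Journal B", "2222-2222", 95),
         ("Journal C", "3333-3333", 12)]

# Highest-use titles first, truncated to the display limit.
top = sorted(usage, key=lambda t: t[2], reverse=True)[:2]
```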

  24. 9. Tables and graphs
  User informational text:
  • See your monthly or annual usage over time as a chart
  Interface shows:
  • Publisher – drop-down list (Elsevier, Springer, OUP)
  • Monthly / yearly – select whether you want to look at monthly or yearly stats
  • Year – calendar years all, 2007, 2008; academic year 2007/08, or select a time period from & to

  25. 10. Benchmarking
  User informational text:
  • Compare your usage with others in the same JISC band
  Interface shows:
  • Publisher – drop-down list (all, Elsevier, Springer, OUP)
  • Year – calendar years all, 2007, 2008; academic year 2007/08, or select a time period from & to
  Data processing notes for prototype:
  • Use yearly JR1 totals (including aggregator) for each publisher. Highlight in some way the requesting library. Give the total for all libraries in the JISC band and the average.
  Notes:
  • Solely for the purposes of the prototype, the libraries are grouped into two bands A and B. These are NOT the true JISC Collections bands for these institutions.
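The band total and average described in the processing note reduce to grouping yearly JR1 totals by band. A sketch with invented libraries and figures (as the slide notes, bands A/B here are not real JISC Collections bands):

```python
# Illustrative yearly JR1 totals: library -> (band, requests).
totals = {"Lib1": ("A", 500), "Lib2": ("A", 300), "Lib3": ("B", 200)}

# Total and average for the requesting library's band.
band = "A"
band_figures = [n for b, n in totals.values() if b == band]
band_total   = sum(band_figures)
band_average = band_total / len(band_figures)
```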

  26. JISC Collections Benchmarking Survey – March 2010
  Usage Statistics Portal: benchmarking functionality. 76 institutions responded to our short survey on the usage statistics portal (benchmarking functionality). Our findings are detailed below.
  Question 1: How useful would it be for you to benchmark your institution’s journal usage for each individual NESLi2 publisher against that of other HE institutions? (76 responses)
  • 38 / 76 (50%) = Very useful
  • 36 / 76 (47.4%) = Somewhat useful
  • 2 / 76 (2.6%) = Not useful

  27. Question 5: Regarding questions 2-4 above, please indicate which would be your preferred choice regarding benchmarking (74 responses)
  • 37 / 74 (50%) = Named institution
  • 23 / 74 (31.1%) = Listed anonymously (same JISC band)
  • 14 / 74 (18.9%) = Average usage by institutions in the same JISC band

  28. Question 10: Regarding questions 7-9 above, which would be your preferred choice? (74 responses)
  • 37 / 74 (50%) = Being anonymised within my JISC band
  • 30 / 74 (40.5%) = Other institutions being able to see my institution's name
  • 7 / 74 (9.5%) = Being part of an average figure for the band I am in

  29. Question 6: Are there any other benchmarking criteria you would like to see?
  • Same ‘mission group’ – Russell Group / by type of institution / pre- and post-’92 / 94 Group
  • Select our own particular subset of named institutions
  • Similar size and structure
  • Usage, spend and budget for resources
  • Cost per download & cost per FTE – student and staff at department / subject level
  • SCONUL divisions (RLUK, old, new, collHE) and by area; Scotland / Wales would also be useful
  • Trend over a period of years

  30. Question 11: Please add any additional comments you would like to make
  • If OK with the licence then comparing named institutions would be best / Happy to be named if all institutions are named
  • Averages are not helpful unless accompanied by other institutional data. Anonymised usage figures would be more useful
  • Institutions within the same JISC band can vary widely (e.g. do they have a medical school, do they still have a chemistry dept?), so you really need the institution name to give any sort of useful benchmarking
  • Pulling in data like FTE and RAE would save us all from having to do that ourselves
  • Would be useful for NESLi2; however, the majority of our deals are outside NESLi2

  31. Q11 Continued • “The COUNTER code of practice release 3 provides for consortium level reports (with named institutions). • These reports are available to consortium managers and it seems to be the norm in consortia elsewhere to share this information freely within the consortium. • We are the consortium, remember! • Why this paranoia and coyness in the UK?”

  32. Requirements not addressed in prototype
  • Getting price information for journals: download the list price of journals as supplied on the publisher's website
  • Adding price information to journal lists: see your annual usage with information on the list price for each journal
  • ‘What titles are in the deal?’ Adding deal information to journal lists
  • Showing usage/non-usage of titles listed in the deal and titles not listed
  • Summary table showing usage/non-usage of titles listed in the deal and not listed in the deal
  • Summary table showing average and median use of titles listed in the deal and titles not listed
  • Download area 1: cost per request
  • Download area 2: usage of subscribed titles (tabular data)
  • Download area 3: charts and graphs

  33. Outstanding Issues
  • COUNTER R3 compliance as of Sept 2009 was low – now much higher
  • Upload of publisher price lists – lack of machine-readable sources (maybe ONIX Serials – SPS?)
  • Use a subset of a CrossRef journals list to populate the Journal and Supplier tables – nope
  • Subject categorisation of journals – nope
  • Identify which nil/low-use journals are not fully available within the deal – nope

  34. Where next?
  30th March 2010 – JISC accepted the proposal to move from prototype to production.
  Consortium: Evidence Base, Cranfield, Mimas and JISC Collections.
  Accelerated development required – fully operational by end 2011 – all libraries, all publishers & intermediaries.

  35. ‘To Do’ List
  • Production service
  • Scaling up: more libraries, more publishers
  • Implementing SUSHI
  • Further development of the database
  • Further exploration of ‘added value’ services, e.g. adding price and subject information, dealing with title changes, publisher transfers, etc.
  • Further assistance to libraries in analysing their own usage
  • Authentication
  • Benchmarking
  • COUNTER for eBooks

  36. Final Observations
  • Open source – available to institutions or other consortia
  • Complementary to, not in competition with, licensed software offerings

  37. Q&A Ross.MacIntyre@Manchester.ac.uk * With Apologies to CBS TV show “How I Met Your Mother.”
