
Evaluating the Usage of Networked Electronic Resources

Explore the importance of evaluating networked electronic resources, with data-driven insights for decision-making, quality assessment, and strategic planning. Learn about cost management strategies, vendor data challenges, and effective log file analysis methodologies.


Presentation Transcript


  1. Evaluating the Usage of Networked Electronic Resources • Terry Plum, Assistant Dean, Simmons GSLIS • Library Assessment • Technological Educational Institution, Thessaloniki, Greece • June 15, 2005

  2. Why Evaluate Usage of Digital Resources? • Data-driven decisions • Justification to patron groups • Budget justification to external funding sources • Collection development decisions • Outputs for performance assessment • Assessment of service quality • Outcomes assessment • Strategic planning

  3. Cost • Association of Research Libraries (ARL) members spent 215% more per serial unit in 2003 than they did in 1986. • The average expenditure for serial subscriptions for all serials (not just scholarly journals) in ARL academic libraries in 2003 was $5.46 million. • From 1984 to 2002, business and economics journals increased in price by 423.7%, chemistry and physics journals by 664%, and medical journals by 628.7%.

  4. Cost

  5. Vendor Supplied Data • Problems • Vendor reports do not provide sufficiently detailed information. • Vendor reports are inconsistent in their application of the definitions of variables. • Vendor reports are not commensurable with each other. • Some vendors do not report anything. • Practical solutions (see the sketch below) • Number of logins (sessions) to networked electronic resources • Number of queries (searches) in networked electronic resources • Number of items requested in networked electronic resources • Turnaways (simultaneous-use limit exceeded) • Reported monthly • Level of effort required, both by the vendor and by the library
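
One practical approach to the commensurability problem is to map each vendor's report onto the four measures above. A minimal sketch follows, assuming hypothetical vendor field names (`logins`, `ft_requests`, and so on); real reports differ, and each one must be mapped case by case.

```python
from dataclasses import dataclass

@dataclass
class MonthlyUsage:
    """The common monthly measures listed above."""
    vendor: str
    month: str        # e.g. "2005-06"
    sessions: int     # logins
    searches: int     # queries
    items: int        # items requested
    turnaways: int    # simultaneous-use limit exceeded

# Hypothetical field-name mappings; every vendor report labels
# these measures differently, which is the problem being solved.
FIELD_MAP = {
    "VendorA": ("logins", "queries", "ft_requests", "denials"),
    "VendorB": ("sessions", "searches", "items", "turnaways"),
}

def normalize(vendor: str, month: str, raw: dict) -> MonthlyUsage:
    s, q, i, t = FIELD_MAP[vendor]
    return MonthlyUsage(vendor, month,
                        raw.get(s, 0), raw.get(q, 0),
                        raw.get(i, 0), raw.get(t, 0))

print(normalize("VendorA", "2005-06",
                {"logins": 420, "queries": 1310, "ft_requests": 205, "denials": 3}))
```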

  6. Vendor Supplied Data • Project COUNTER – Counting Online Usage of Networked Electronic Resources • http://www.projectcounter.org/ • ICOLC – International Coalition of Library Consortia • http://www.library.yale.edu/consortia/ • ISO – International Organization for Standardization • ISO 11620 Library Performance Indicators • http://www.iso.org/ • NISO – National Information Standards Organization • NISO Z39.7 Library Statistics • http://www.niso.org/

  7. ARL E-Metrics • The ARL E-Metrics program, as summarized by Blixrud and Kyrillidou (2003), asks ARL libraries for the following data for measuring use of networked electronic resources, data which most libraries can only provide by collecting and analyzing vendor-supplied transaction data: • Number of logins (sessions) to networked electronic resources • Number of queries (searches) in networked electronic resources • Number of items requested in networked electronic resources

  8. Web Statistics • Web server log files • Each entry records a transaction between client and server • A technical representation of tasks performed by the server • Common log file fields • IP address of the requesting computer • Remote host: name of the computer accessing the web server • Name of remote user (usually blank) • Login of remote user (usually blank) • Date and time of the request

  9. Log Files • Referrer log file • URL of the referring page from which the request came • Agent log file • Browser • Operating system • Names of spiders or robots used to probe your web site • IP address of the requesting computer • Example (combined log format; parsed in the sketch below) • 127.0.0.1 - frank [10/Oct/2004:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
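
The example above is the Apache "combined" log format. A minimal parsing sketch follows, assuming that format; the group names in the regex are descriptive labels chosen here, not part of any standard.

```python
import re

# Regex for the Apache combined log format, with a named group per field.
LOG_RE = re.compile(
    r'(?P<ip>\S+) (?P<ident>\S+) (?P<user>\S+) \[(?P<date>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('127.0.0.1 - frank [10/Oct/2004:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"')

match = LOG_RE.match(line)
if match:
    fields = match.groupdict()
    print(fields["ip"], fields["status"], fields["referrer"])
    # 127.0.0.1 200 http://www.example.com/start.html
```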

  10. Log files generated by library proxy servers • Proxy servers, passthrough (clickthrough) servers, and firewalls are all based to some degree on an examination of request headers. • A proxy server can examine all requests that pass through it, so it is starting to make sense to put a proxy server in front of all library databases and ejournals. • Increasingly used as a data collection point for commensurable or comparable data (see the sketch below).
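
Because every request to a licensed resource passes through the proxy, a single proxy log can yield comparable per-vendor counts. A minimal sketch, assuming the proxy writes common-log-format lines whose request field holds the full vendor URL; the hostnames are hypothetical.

```python
from collections import Counter
from urllib.parse import urlparse

def vendor_counts(log_lines):
    """Count proxied requests per vendor host."""
    counts = Counter()
    for line in log_lines:
        url = line.split()[6]          # request-URL field in common log format
        host = urlparse(url).hostname or "unknown"
        counts[host] += 1
    return counts

sample = ['10.0.0.5 - - [15/Jun/2005:10:02:11 +0300] '
          '"GET http://search.vendor-a.example.com/article?id=42 HTTP/1.0" 200 5120']
print(vendor_counts(sample))   # Counter({'search.vendor-a.example.com': 1})
```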

  11. What do log files tell us? • Nothing, if they are not analyzed. • What pages are requested on your site • IP addresses of computers making requests • Dates and times of requests • Success or failure of file transfers • The last page a requester visited before coming to your site • The search terms that led someone to your site (extracted in the sketch below)
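
The last point relies on the referrer's query string. A minimal sketch, assuming the search engine passes the query in a `q=` parameter (as Google does); other engines use other parameter names.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

def search_terms(referrers):
    """Tally search terms found in referrer URLs."""
    terms = Counter()
    for ref in referrers:
        query = parse_qs(urlparse(ref).query)
        for term in query.get("q", []):   # "q" is Google's parameter name
            terms[term.lower()] += 1
    return terms

print(search_terms(["http://www.google.com/search?q=library+usage+statistics"]))
# Counter({'library usage statistics': 1})
```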

  12. More log files • Logs and reports from locally implemented journal article services • Logs and reports from locally implemented digital library projects • ILS log files and reports • Becoming more interesting with metasearch engines • OPAC

  13. ILS Log Files • OPAC search statistics • Number of searches attempted • By field • Search terms • Null (zero-hit) results, tallied in the sketch below • Print statistics, such as items checked out, holds placed, etc. • Difficult to track usage of MARC 856 (electronic location and access) links
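
A sketch of the null-results tally, assuming a hypothetical tab-delimited ILS search log with columns timestamp, index, query, and hit count; real ILS log formats vary by vendor.

```python
import csv
from collections import Counter

def null_results(path):
    """Tally zero-hit OPAC searches by search index."""
    nulls = Counter()
    with open(path, newline="") as f:
        # Assumed columns: timestamp, index, query, hit_count
        for timestamp, index, query, hits in csv.reader(f, delimiter="\t"):
            if int(hits) == 0:
                nulls[index] += 1
    return nulls

# e.g. null_results("opac_searches.log") -> Counter({'title': 12, 'subject': 7})
```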

  14. Log Analysis Software • Analog • http://www.analog.cx/ • Example output • http://www.statslab.cam.ac.uk/webstats/stats.html • http-Analyze • http://www.netstore.de/Supply/http-analyze/ • WebTrends • http://www.netiq.com/webtrends/default.asp

  15. Log Analysis Software

  16. Issues with web surveys • Non-probability samples • Entertainment surveys • Self-selected surveys • Volunteer panels • Probability samples • Intercept surveys (every nth visitor) • Surveys that obtain respondents from an e-mail request • Mixed-mode surveys where one of the options is a web survey • Pre-recruited panels of a particular population as a probability sample

  17. Issues with web surveys • Research design • Coverage error • Unequal access to the Internet • Internet users are different from non-users • Response rate • Response representativeness • Random sampling and inference • Non-respondents

  18. Issues with web surveys • Mistrust of web surveys • Vendor data is a census; a web survey is a sample • Web surveys are typically associated with user data, not usage data • Even when they measure usage, web surveys often collect predicted, intended, or remembered usage, not actual usage • Web survey forms may appear differently in different browsers

  19. Networked electronic resources and services – assessment environment • Resources are accessible from many different web pages and web servers • Bookmarks • The survey data must be collected and commensurable for all networked electronic resources. • Different authentication methods have to be accommodated, whether the institution uses IP address, password, referring URL, or an authentication and access gateway. • Remote usage has to be measured, regardless of the channel of communication, whether a locally implemented proxy server, modem pool, or other institutional service.

  20. MINES strategy • A representative sampling plan, including sample size, is determined at the outset. Typically, there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library (see the scheduling sketch below). • Random-moment, web-based surveys are employed at each site. • Participation is usually mandatory, negating non-respondent bias, and is based on actual use in real time. • Libraries with database-to-web gateways or proxy rewriters offer the most comprehensive networking solution for surveying all users of networked services during survey periods.
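
A sketch of drawing such a random-moment schedule: one two-hour block per month gives the 24 hours a year mentioned above for a main library. The service-hours window (08:00 to 20:00) is an illustrative assumption, not part of MINES.

```python
import calendar
import random

def survey_schedule(year, seed=2005):
    """Pick one random two-hour survey block per month (24 hours/year)."""
    rng = random.Random(seed)   # fixed seed so the plan is reproducible
    blocks = []
    for month in range(1, 13):
        day = rng.randint(1, calendar.monthrange(year, month)[1])
        start = rng.randint(8, 18)          # block must end by 20:00
        blocks.append((year, month, day, f"{start:02d}:00-{start + 2:02d}:00"))
    return blocks

for block in survey_schedule(2005)[:3]:
    print(block)
```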

  21. Web Survey Design Guidelines • Web survey design guidelines that MINES followed: • Presentation • Simple text for different browsers – no graphics • Different browsers render web pages differently • Few questions per screen, or simply few questions • Easy to navigate • Short and plain • No scrolling • Clear and encouraging error or warning messages • Every question answered in a similar, consistent way • Radio buttons, drop-downs • Introduction page or paragraph • Easy to read • Definitions (e.g., of sponsored research) must be visible • Can present questions in response to answers – for example, if sponsored research was chosen, a follow-up question set could be presented

  22. How to implement web surveys on library web sites • Because of the point-of-use requirement, libraries with a virtual gateway in their web architecture succeeded best (see the intercept sketch below). • Rewriting proxy server • Database-to-web solutions • Serials Solutions • Interestingly, OpenURL resolvers also function as a gateway.
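
A minimal sketch of a point-of-use intercept at the gateway, written as WSGI middleware that redirects every nth request to a survey form. The survey URL and the counting rule are illustrative assumptions, not the MINES implementation.

```python
SURVEY_URL = "http://library.example.edu/survey"   # hypothetical survey form

class SurveyIntercept:
    """Redirect every nth request through the gateway to a survey form."""

    def __init__(self, app, every_nth=10):
        self.app = app
        self.every_nth = every_nth
        self.count = 0

    def __call__(self, environ, start_response):
        self.count += 1
        if self.count % self.every_nth == 0:
            # Send the user to the survey, carrying the resource they wanted
            # so they can be forwarded on after answering.
            target = f"{SURVEY_URL}?return={environ.get('PATH_INFO', '/')}"
            start_response("302 Found", [("Location", target)])
            return [b""]
        return self.app(environ, start_response)   # pass through untouched
```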

  23. Library web architecture

  24. Digital Libraries

  25. Digital Libraries

  26. Pre-print and post-print servers

  27. Pre-print and post-print servers

  28. Open Access Journals

  29. Library web architecture

  30. What is the future of assessment of networked electronic services? • The library is responsible for many heterogeneous resources, not just subscriptions. • A library gateway could position the library to constantly assess usage of its resources. • This tool will be just one of many, along with LibQUAL+™ and other initiatives.
