Assessing Return on Investment for E-Resources: A Cross-Institutional Analysis of Cost-Per-Use Data
Patrick L. Carr
North Carolina Serials Conference, Chapel Hill
March 10, 2011
“Print was simpler.” This has been replaced by… (from Appendix B of the 2004 report of the Digital Library Federation’s Electronic Resources Management Initiative)
Indeed, print was simpler. But not when it comes to use-based evaluations.
We have far superior tools to assess use of e-resources.
Thomas E. Nisonger, Management of Serials in Libraries (Englewood, Colorado: Libraries Unlimited, 1998): 160-65.
Cost-Per-Use: Should always be considered in the context of qualitative measures of ROI.
Cost-Per-Use: A powerful tool for assessing return on investment. But are we using CPU data to its fullest potential?
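At its core, cost-per-use is a simple ratio: a resource's cost for a period divided by its recorded use for the same period (typically COUNTER full-text downloads). A minimal sketch in Python, with hypothetical figures that are not taken from the study:

```python
def cost_per_use(annual_cost: float, uses: int) -> float:
    """Cost-per-use: annual cost divided by recorded use
    (e.g., COUNTER full-text downloads for the same year)."""
    if uses <= 0:
        raise ValueError("Use count must be positive to compute cost-per-use.")
    return annual_cost / uses

# Hypothetical example: a $25,000 journal package with 10,000 downloads in 2009
print(f"${cost_per_use(25000, 10000):.2f} per use")  # -> $2.50 per use
```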
Chuck Hamaker’s idea: a cross-institutional analysis of CPU data & …
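A minimal sketch of what such a cross-institutional comparison might look like, assuming each participating library supplies its own cost and use figures for the same resource (all institution names and numbers below are hypothetical, not taken from the study):

```python
# Hypothetical cost and use figures reported by each institution for one resource.
reports = {
    "Institution A": {"cost": 30000, "uses": 15000},
    "Institution B": {"cost": 28000, "uses": 4000},
    "Institution C": {"cost": 31000, "uses": 9500},
}

# Compute cost-per-use for each institution and compare against the group.
cpu_by_institution = {
    name: figures["cost"] / figures["uses"] for name, figures in reports.items()
}
group_average = sum(cpu_by_institution.values()) / len(cpu_by_institution)

for name, cpu in sorted(cpu_by_institution.items(), key=lambda item: item[1]):
    flag = "above group average" if cpu > group_average else "at or below group average"
    print(f"{name}: ${cpu:.2f} per use ({flag})")
```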
Caveats:
• Data is generally from 2009 (complete 2010 data wasn’t yet available)
• The coverage ranges for cost and use data didn’t always overlap completely
• Institution-by-institution access sometimes differed
• Not all sources of use data were COUNTER compliant
So we can’t really use this study’s results to make sweeping conclusions.
Five Categories of Resources
• Publisher journal packages
 • Commercial publishers
 • Society publishers
 • University presses
• Full-text aggregators
• Site licenses to journals
• Indexing & abstracting databases
• Other stuff
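One way a category scheme like this can feed into the analysis is by summarizing cost-per-use within each category. A minimal sketch, assuming each resource has already been tagged with a category and a computed CPU (all resource names and figures below are hypothetical):

```python
from collections import defaultdict
from statistics import median

# Hypothetical resources tagged with the categories used in the analysis.
resources = [
    {"name": "Commercial journal package", "category": "Publisher journal packages", "cpu": 2.10},
    {"name": "Society journal package", "category": "Publisher journal packages", "cpu": 1.45},
    {"name": "Aggregator database", "category": "Full-text aggregators", "cpu": 0.60},
    {"name": "Single journal site license", "category": "Site licenses to journals", "cpu": 7.80},
    {"name": "A&I database", "category": "Indexing & abstracting databases", "cpu": 3.25},
]

# Median cost-per-use within each category.
cpu_by_category = defaultdict(list)
for resource in resources:
    cpu_by_category[resource["category"]].append(resource["cpu"])

for category, values in cpu_by_category.items():
    print(f"{category}: median CPU ${median(values):.2f} across {len(values)} resource(s)")
```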
Three Questions
• What does a cross-institutional CPU analysis actually tell us?
• How can we use what it tells us?
• Where do we go from here?
Questions/Comments
Patrick L. Carr
Head of Electronic & Continuing Resource Acquisitions
Joyner Library, East Carolina University
Greenville, North Carolina 27858
Email: carrp@ecu.edu
Phone: 252-328-2266