The E-metrics Project: Update and Project Completion
Charles R. McClure, Ph.D. (cmcclure@lis.fsu.edu), Wonsik “Jeff” Shim, Ph.D., and John Carlo Bertot, Ph.D.
Information Institute/School of Information Studies, Florida State University (www.ii.fsu.edu)
• ARL E-Metrics Project
• Data Collection Manual
• Institutional Outcomes Proposal
• Training and Dissemination
• Concluding Activities
• Issues and Challenges
ARL E-Metrics Project
• Project Co-Chairs: Rush Miller, Univ. of Pittsburgh; Sherrie Schmidt, Arizona State Univ.
• Project Logistics (ARL): Duane Webster, Martha Kyrillidou
• Project Directors (Florida State Univ.): Charles R. McClure, Wonsik “Jeff” Shim, and John Carlo Bertot
Project Participants (24 ARL Libraries)
Alberta, Arizona State, Auburn, Chicago, Connecticut, Cornell, Illinois-Chicago, Manitoba, Maryland, U. Mass, Nebraska, Notre Dame, Pennsylvania, Penn State, Pittsburgh, Purdue, Texas A&M, S. California, Virginia Tech, W. Ontario, Wisconsin, Yale, LC, NYPL
ARL E-Metrics Project
• Phase One (May-October 2000): Knowledge Inventory
  Methods: Surveys, Site Visits, Document Analysis
• Phase Two (November 2000-June 2001): Developing Measures and Testing
  Methods: Surveys, Site Visits, Focus Group Interviews, Document Analysis
• Phase Three (July 2001-December 2001): Developing Manuals and Tools
Need for network statistics and measures
• To demonstrate the use of digital collections in order to make a case for continued collection development and support;
• To enable libraries to compete for resources with other organizations and/or departments by documenting the range, extent, and impact of library-provided networked resources;
• To provide a decision-making framework for library staff, managers, and administrators to determine resource allocation strategies and meet other management needs;
Need for network statistics and measures (continued)
• To allow libraries to compare themselves to others regarding collection and service development, costs, provision of services, and use;
• To enable libraries to measure and track internal changes to library operations as well as uses and users of library resources and services; and
• To provide a means through which to measure aspects of effectiveness, efficiency, and quality of library services and resources in the networked environment.
Findings from Phase I
• Libraries are collecting some data, most often statistics related to patron-accessible resources and the cost of electronic databases.
• For use statistics, libraries depend almost solely on vendor reports.
• Libraries have little information about the users of networked services.
• Data are most frequently used for immediate decision making: licensing contracts and budget requests. They are less frequently used to paint a bigger picture of information usage patterns.
Findings from Phase I (continued)
• Librarians point to problems associated with data collection, especially vendor statistics: a lack of consistent, comparable, and detailed data; problems with interpreting and summarizing data; a lack of technology and personnel support; and an inability to link data to peer comparison, quality of service, and outcomes.
• There is little organizational infrastructure to support data collection, analysis, and reporting.
Development of Statistics and Measures for Field Testing
• November 2000 - June 2001
• Participants had significant and on-going input into the development of the proposed statistics and measures.
• A data collection instruction manual was also developed to aid field testing.
• Field testing conducted during May 2001; a separate field test of vendor statistics conducted during June 2001.
• Manual published October 2001.
Measures and Statistics for Research Library Networked Services: Procedures and Issues (Washington, DC: ARL)
Complete report and manual at: http://www.arl.org/stats/newmeas/emetrics/index.html
Statistics/Measures Described in the Manual
Patron Accessible Electronic Resources
  R1 Number of electronic full-text journals
  R2 Number of electronic reference sources
  R3 Number of electronic books
Use of Networked Resources and Services
  U1 Number of electronic reference transactions
  U2* Number of logins (sessions) to electronic databases
  U3* Number of queries (searches) in electronic databases
  U4* Items requested in electronic databases
  U5 Virtual visits to library’s website and catalog
  (* Tested in the vendor statistics field testing)
Expenditures for Networked Resources and Infrastructure
  C1 Cost of electronic full-text journals
  C2 Cost of electronic reference sources
  C3 Cost of electronic books
  C4 Library expenditures for bibliographic utilities, networks, and consortia
  C5 External expenditures for bibliographic utilities, networks, and consortia
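The vendor-supplied use statistics (U2-U4) have to be summed across databases and vendors before they can be reported. As a rough illustration only, the Python sketch below aggregates hypothetical monthly vendor report files; the CSV layout and column names are assumptions for this example, not a format defined by the E-Metrics manual.

```python
# Illustrative only: roll up hypothetical vendor usage reports into the
# U2 (sessions), U3 (searches), and U4 (items requested) statistics.
# The file layout and column names are assumptions, not a manual-defined format.
import csv
from collections import defaultdict

def aggregate_vendor_reports(paths):
    """Sum sessions, searches, and items requested per month across all report files."""
    totals = defaultdict(lambda: {"sessions": 0, "searches": 0, "items": 0})
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                month = row["month"]                                    # e.g. "2001-05"
                totals[month]["sessions"] += int(row["sessions"])       # U2
                totals[month]["searches"] += int(row["searches"])       # U3
                totals[month]["items"] += int(row["items_requested"])   # U4
    return dict(totals)

# Example: combine reports from two (hypothetical) vendors
# print(aggregate_vendor_reports(["vendor_a.csv", "vendor_b.csv"]))
```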
Statistics/Measures Described in the Manual (continued)
Library Digitization Activities
  D1 Size of library digital collection
  D2 Use of library digital collection
  D3 Cost of digital collection construction and management
Performance Measures
  P1 Percentage of electronic reference transactions of total reference
  P2 Percentage of virtual library visits of all library visits
  P3 Percentage of electronic books to all monographs
A total of 19 statistics and measures are described, each with data collection procedures.
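The performance measures P1-P3 are simple ratios of the underlying statistics. A minimal sketch, using hypothetical sample counts rather than project data, of how a library might compute them:

```python
# Illustrative only: compute the three performance measures from raw counts.
# All figures below are made-up sample values, not E-Metrics project data.

def percent(part, whole):
    """Return part as a percentage of whole, guarding against division by zero."""
    return 100.0 * part / whole if whole else 0.0

electronic_reference = 1_200    # U1: electronic reference transactions
total_reference = 4_800         # all reference transactions
virtual_visits = 95_000         # U5: virtual visits to website and catalog
all_library_visits = 310_000    # physical plus virtual visits
electronic_books = 2_500        # R3: electronic books
all_monographs = 1_750_000      # total monograph holdings

print(f"P1: {percent(electronic_reference, total_reference):.1f}% of reference is electronic")
print(f"P2: {percent(virtual_visits, all_library_visits):.1f}% of library visits are virtual")
print(f"P3: {percent(electronic_books, all_monographs):.2f}% of monographs are electronic")
```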
Institutional Outcomes • Definition: A clearly identified result or end product that occurs as a consequence of individual or combined activities from units at the institution. These outcomes represent a preferred or desired state and clarify specific expectations of what the institution should produce.
Institutional Outcomes: Next Steps • Refine the existing model • Integrate model into library planning and evaluation • Encourage libraries to design and develop the outcomes assessment process at their institution • Submit research proposal on outcomes assessment and the role of academic libraries in that assessment process.
Develop Training Modules • Re-work the statistics and measures manual to be a “stand-alone” product • Develop 2-3 instructional modules that support the manual to help train the trainers - Importance of evaluation, statistics, and measurement - Explanation of the statistics and measures and how to use them • Training workshop for member libraries to be held January 19, ALA New Orleans
Improving Organizational Resources to Support Evaluation Efforts • Need for organizational infrastructure of staff knowledge, staff time, resources, and organization for evaluation • On-going program of evaluation and assessment -- not a one-shot • Continue working directly with vendors and ICOLC • Organizing and coordinating efforts among vendors, ARL, ICOLC, NCLIS, and others.
Issues and Challenges • Incomplete, non-comparable data from multiple sources in the networked environment make it difficult to collect and use data. But we must start now. The statistics and measures proposed in the E-Metrics project will evolve. • Library community needs to work with database vendors, standards organizations on data definitions and reporting procedures. International collaboration should also continue.
Issues and Challenges (continued) • The degree to which libraries will be able to collect these data and report them is linked to the amount of resources they can commit. • Libraries have different operating environments, different data needs/expectations. They need to think about which statistics and measures would be best to use, strategically and politically, in their own settings.
Issues and Challenges (continued) • Conducting research for 24 ARL libraries with differing situations and objectives. • Coordinating and extending next steps for the New Measures initiative -- priorities and strategies? • Linking assessment approaches to work together and not as stand alone strategies.
Putting the Pieces Together
Inputs
• Invested resources
• Measured through library budget, personnel, other
• Captured through library statistics
• Question: What resources are we putting into the services?
Outputs
• Results of invested resources
• Measured through library MIS, organizational charts, other
• Captured through library statistics
• Question: What services do we get out of the invested resources?
Outcomes
• Anticipated service provider impacts of services
• User perceived impacts of library services
• Question: What difference did the services make?
Service Quality
• Anticipated service provider expectations of the quality of library services
• User expectations (anticipated and perceived) of library services
• Gaps between user and service provider expectations and perceptions
• Question: How good are the services?
Remember: Multiple Approaches are Available to Assess Networked Services • The inputs-outputs approach (as used in this project) for statistics and performance measures - Quantitative - Qualitative • LibQual • Service Quality • Quality Standards • Educational and Institutional Outcomes • And others...
Importance of the E-metrics Project • 19 new statistics and measures have been developed and field tested to describe use, users, and uses of networked resources • ARL members have a better understanding of the resources and organizational concerns needed to conduct effective evaluation efforts • We have better knowledge of the role of academic libraries in Institutional Outcomes • The project provides a springboard for additional research in measurement.
Comments and Questions? Charles R. McClure Francis Eppes Professor, and Director, Information Institute Florida State University cmcclure@lis.fsu.edu http://slis-two.lis.fsu.edu/~cmcclure/ http://www.arl.org/stats/newmeas/emetrics/index.html