Measurement-based Change in Libraries


Presentation Transcript


1. Measurement-based Change in Libraries: Case Studies From an Academic Library
Joan Stein, Head, Access Services, Carnegie Mellon University Libraries

2. Why Do We Collect Data?
• Imposed by the parent institution and/or accrediting boards.
• Required by the library's senior administrators.
• Required by organizational affiliation or political bodies.
• Gathered automatically by the library catalog and web logs; supplied by vendors, etc.
• To monitor workload.
• To document trends in collection & services activity.
• To track progress toward strategic goals.
• To document & understand user needs.
• To track user satisfaction.
• To get answers & information.

3. Scooping Out the Ocean With a Sieve
• Drowning in data
• Available data keeps multiplying
• Lack of data-analysis skills: "data rich, insight poor"
• Disconnect between collecting data and applying the results to create positive change
• Research-quality measures require education & an extensive commitment of staff time and resources
• Small steps may seem insignificant, yet the large ones seem overwhelming
• Lack of commitment from senior administrators
• Too much turf, not enough pasture: think library-wide; be user-focused

4. Practical Approaches: Case Studies
• Conducting a data census
• Operational measures
• Analyzing workflow
• Tracking & monitoring workload
• Measuring user satisfaction with services & collections
• Management data:
  • the strategic plan
  • other visible library commitments (e.g., advisory board recommendations)
• In each case, the goal is to transform data into information!

5. Library-wide Data Census
• Interview all library departments, using a needs-assessment instrument
• Record all data kept: why, how, and for what purpose
• Have department heads/administrators review the results and provide missing information
• Review existing standards & emerging measures
• Determine what else needs to be collected, by whom, how, & for what purpose

6. Needs Analysis
Current data gathering:
• What data do you gather or use now?
• How do you gather or use this data?
• How frequently do you gather or use this data?
• What problems do you encounter gathering or using this data?
• For what purpose(s) do you gather & use this data?
• How do you enter & access the data?
• How do you manipulate the data?
• Do you cross-correlate this data with other data? If so, with what?
• How do you present, or want to present, your data?
• How frequently do you generate reports on the data?
• What other issues, concerns, or needs do you have about data gathering and analysis?
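
The census records themselves can be kept in any structured form; what matters is that each interview answer maps to a field that can later be compared across departments. Below is a minimal sketch in Python, with field names invented here rather than taken from the original instrument, plus a helper that flags measures collected in more than one place (the duplicates slide 8 asks you to eliminate):

```python
from dataclasses import dataclass, field

@dataclass
class CensusRecord:
    """One department's answers from the needs-analysis interview.
    Field names are illustrative, not from the original instrument."""
    department: str
    measure: str                    # what data is gathered, e.g. "ILL turnaround time"
    method: str                     # how: manual tally, system report, vendor file, ...
    frequency: str                  # daily, monthly, per semester, ...
    purpose: str                    # why it is collected & how it is used
    problems: list[str] = field(default_factory=list)
    cross_correlated_with: list[str] = field(default_factory=list)

def find_duplicate_measures(records: list[CensusRecord]) -> dict[str, list[str]]:
    """Flag measures collected by more than one department: candidates
    for the duplicate & unnecessary measures to eliminate."""
    by_measure: dict[str, list[str]] = {}
    for r in records:
        by_measure.setdefault(r.measure.lower(), []).append(r.department)
    return {m: depts for m, depts in by_measure.items() if len(depts) > 1}
```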

7. Census Sample

8. Putting Census Results to Use
• Distribute & share the results widely
• Encourage review of the results
• Eliminate duplicate & unnecessary measures
• Create or revise your Management Information System to centralize data entry & facilitate discovery
• Automate manual data collection where possible
• Maintain manual efforts where needed
• Use the census to focus your measurement activity

9. Measuring User Satisfaction
• Understanding users' needs & expectations
• Designing services & building collections targeted to meet those needs
• Results vary among institutions based on local factors

10. Putting User Satisfaction Results to Work
• Puts the focus on the users, and off the process or the staff
• Example: Carnegie Mellon University Libraries' Resource Sharing Services

11. Example: Resource Sharing
• Focus groups with all user groups
• Focus group with resource-sharing staff
• SERVQUAL study of a random sample of service users (see the sketch below)
• Analyze feedback from users & staff
• Group feedback into meaningful categories
• Look for patterns & trends
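
A SERVQUAL study scores each service dimension twice, once for what users expect and once for what they perceive; the quality measure is the gap between the two, and negative gaps mark where the service falls short. A minimal sketch, with dimension names and Likert ratings invented for illustration rather than drawn from the CMU study:

```python
# SERVQUAL-style gap scoring: gap = mean(perception) - mean(expectation).
# Dimensions and 1-7 ratings below are illustrative, not actual survey data.
from statistics import mean

expectations = {
    "reliability":    [7, 6, 7, 6],
    "responsiveness": [6, 7, 6, 5],
}
perceptions = {
    "reliability":    [5, 6, 5, 4],
    "responsiveness": [6, 6, 7, 5],
}

for dim in expectations:
    gap = mean(perceptions[dim]) - mean(expectations[dim])
    print(f"{dim}: gap = {gap:+.2f}")   # negative = service falls short
```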

12. Resource Sharing Example (cont.)
• Comparing users' needs to the existing service
• Reengineering the service to meet users' stated needs
• Requesting needed support
• Re-measuring after changes are made to test the results

13. Value of Operational Measures
• Quantitative data from various sources:
  • book-cart tracking sheets
  • turnaround-time data (reserves, ILL, etc.)
• Taking baseline measures (see the sketch below)
• Sharing and discussing data with the relevant staff
• Identifying the current & desired states
• Brainstorming ways to get from the current state to the desired state
• Selecting your solution(s)
• Changing procedures according to plan
• Reallocating and/or requesting needed resources
• Testing the outcome: did things actually improve?
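
Taking a baseline turnaround measure is arithmetic once request timestamps exist: elapsed time from request placed to item filled, summarized over a sample. A minimal sketch, with dates invented for illustration; a median is used so one outlier request does not distort the baseline:

```python
# Baseline turnaround time from (placed, filled) date pairs.
# Sample requests are illustrative, not actual CMU data.
from datetime import date
from statistics import median

requests = [
    (date(2002, 3, 1), date(2002, 3, 9)),
    (date(2002, 3, 2), date(2002, 3, 6)),
    (date(2002, 3, 5), date(2002, 3, 19)),
]

days = [(filled - placed).days for placed, filled in requests]
print(f"baseline median turnaround: {median(days)} days")   # 8 days
print(f"worst case: {max(days)} days")                      # 14 days
```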

14. Example: Book-Cart Tracking Slips (initial version of the tracking slip)

15. Revised Tracking Slip

16. Turnaround-Time Studies
• Resource sharing example
• Reserves example

17. Work-flow Analysis
• Creating flowcharts of distinct activities
• Analyzing the documented process (ask "why?" three times)
• Articulating the current & desired states in measurable terms
• Timing the process (where relevant)
• Interviewing the staff who perform the work
• Identifying the bottlenecks

18. Work-flow Analysis (cont.)
• Benchmarking with similar institutions
• Brainstorming ways to go from the current state to the desired state
• Eliminating unnecessary steps
• Eliminating re-work
• Selecting & implementing a solution
• Testing the solution: did it eliminate the gap? (re-measure; see the sketch below)
• Revising the flowchart & using it as a training aid
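
Once the process is flowcharted and each step timed, the bottleneck falls out of the numbers: it is the step consuming the largest share of total process time. A minimal sketch, with step names and timings invented for illustration:

```python
# Bottleneck identification from step timings gathered during
# work-flow analysis. Step names and minutes are illustrative.
step_minutes = {
    "receive request":          5,
    "verify citation":         25,
    "search lending libraries": 15,
    "transmit request":         5,
}

total = sum(step_minutes.values())
bottleneck = max(step_minutes, key=step_minutes.get)
share = step_minutes[bottleneck] / total
print(f"bottleneck: {bottleneck} ({share:.0%} of process time)")
# -> bottleneck: verify citation (50% of process time)
```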

19. Track and Monitor Workload
• Counts of activities performed (ILL requests, reserve items, holds, etc.)
• Keeping track of resources used (e.g., microforms re-shelved)
• Creating graphic displays of the data
• Comparing activity over time: single instance or trend? (see the sketch below)
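
One simple way to answer "single instance or trend?" is to compare the latest count against the average of the preceding periods and flag large deviations for follow-up. A minimal sketch, with the monthly ILL counts and the 15% threshold invented for illustration:

```python
# Flag a month whose count deviates from the mean of preceding months
# by more than a chosen threshold. Counts are illustrative.
from statistics import mean

monthly_ill_requests = [410, 395, 430, 405, 420, 560]   # Jan..Jun

baseline = mean(monthly_ill_requests[:-1])               # 412.0
latest = monthly_ill_requests[-1]
deviation = (latest - baseline) / baseline
if abs(deviation) > 0.15:                                # threshold is a judgment call
    print(f"latest month is {deviation:+.0%} vs. baseline; watch whether it repeats")
```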

20. Example: Resource Sharing
• 3 ILL offices; service levels were uneven due to the distribution of workload
• Met to review workload figures
• Borrowing activity was concentrated primarily in one site
• Other duties were distributed evenly
• Brainstormed ways to redistribute work more equitably to better meet users' needs
• Built team skills among the staff performing the work
• Implemented workload leveling (sketched below)
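
"Workload leveling" here means routing new borrowing requests so that no single office carries a disproportionate share. One way to express the idea is greedy assignment to the least-loaded office, sketched below; the office names and backlog counts are hypothetical, not the actual CMU figures:

```python
# Greedy workload leveling: each new request goes to the ILL office
# with the lightest current load. Names and backlogs are illustrative.
import heapq

offices = [(42, "Office A"), (7, "Office B"), (19, "Office C")]  # (backlog, office)
heapq.heapify(offices)

for _ in range(10):                        # ten new borrowing requests arrive
    load, office = heapq.heappop(offices)  # pop the least-loaded office
    heapq.heappush(offices, (load + 1, office))

for load, office in sorted(offices):
    print(f"{office}: {load} open requests")
```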

21. Using Workload Analysis Results
• Redistribute work among a group/team (workload leveling)
• Use the results of data collection & analysis to lobby for resources (human, financial, equipment, etc.)
• Justify requests to reclassify existing staff positions, to request pay raises, etc.

22. Management Data
• The library's strategic plan is tied to the plan of the parent organization:
  • contains broad strategic goals
  • contains an action plan to reach the goals
• Advisory or accrediting board commitments (or those of another oversight group):
  • include recommendations for improvement
  • require goals set to meet the recommendations

23. Contributing to Management Data
• Review the strategic plan and/or other library commitments
• Consider how your department or division can contribute to specific, relevant strategic goals
• Set departmental goals accordingly, in alignment with the strategic plan
• Monitor progress toward the goals quarterly
• Examples:
  • Circulation
  • Resource sharing

24. Example: Circulation Goals
• Relevant library goal (from our Advisory Board Report): "The Libraries will move more aggressively into the electronic reserves arena to meet student and faculty demands."
• Departmental goal: provide electronic access to appropriate Reserve readings, such as journal articles, book chapters, lecture notes, sample tests, etc. At least 30% of all these readings will be available in electronic format by 2003.
• Measures:
  • % of Reserve readings that can be accessed electronically, out of total photocopies on Reserve library-wide (see the sketch below)
  • total number of Reserve readings in print vs. total readings scanned and available as a link through the library catalog
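
Monitoring a percentage goal like the 30% electronic-reserves target is plain arithmetic once the counts are in hand. A minimal sketch, with counts invented for illustration:

```python
# Electronic-reserves measure: share of Reserve readings available
# electronically, tracked against the 30% target. Counts are illustrative.
electronic_readings = 240
total_readings = 1100            # print photocopies + electronic, library-wide

share = electronic_readings / total_readings
target = 0.30
print(f"electronic share: {share:.1%} (target: {target:.0%} by 2003)")
print("on track" if share >= target else f"remaining gap: {target - share:.1%}")
```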

25. Example: Circulation Goals (cont.)
Relevant library goal (from the Advisory Board Report): make maximum use of available library automated systems.
• Automate and send all notices (overdues, fines, bills, holds available, and recalls) electronically, without staff intervention, via the library management system and electronic mail.
  • 95% of all notices are generated and sent electronically
  • percentage increase in the number of notices sent electronically
• Complete the assimilation of the video collection into the library system.
  • 100% of video holdings are catalogued in Unicorn (our LMS)
  • percentage increase in the number of videos catalogued
  • all videos are circulated through the library management system
  • all videos are acquired through the library management system

26. Example: Resource Sharing Goals
Relevant library goal: improve convenience and speed in the interlibrary loan system.
• Implement ILLiad, an interlibrary loan management system that will significantly facilitate the work of the department.
• Improve ILL turnaround time by 20% by 2003.
• Improve internal turnaround time for borrowing requests (those steps under our own control) by 50% by 2003.
• Increase on-demand scanning and desktop delivery of materials:
  • acquire and implement Ariel at each ILL location
  • percentage of total copies received that are delivered directly to the desktop
• 100% of ILL users will have direct electronic access to the status of their requests and to other ILL service-performance information:
  • percentage increase in the number of requests on systems that provide this type of information directly to users
  • 100% of requests will provide service users with this information reliably
• Eliminate 100% of paper record-keeping.

27. Tie Measures to Performance Evaluation
• Design evaluation forms to include goals, and performance measures for each goal.
• Goals must be tied to the library's strategic objectives.

28. Example: Employee Evaluation Form, Section IV – Goals
Goals that should be maintained or improved to make the greatest contribution to the department's goals, plus developmental goals that the staff member has identified. Include goals that address performance at the "needs improvement" level. Complete the following sections as appropriate; copy the template below to include additional goals.
• Goal:
• Statement:
• Employee Action Plan with Targeted Completion Dates:
• Outcome Measures (ways to evaluate or measure the results):
• 1st review – Date ______ Initials _______ ________
  Comments:
• 2nd review – Date ______ Initials _______ ________
  Comments:
• 3rd review – Date ______ Initials _______ ________
  Comments:

29. Uses for Performance Information
• To inform resource allocation
• To evaluate library employees and library management
• To determine the extent of the gap between the library's goals and reality (as defined by the user)
• To drive the reengineering of library processes
• To benchmark against best practices

30. Summary
• Don't reinvent the wheel
• Keep it simple
• Sample where possible
• Keep measures focused (no "portmanteau" measures)
• Automate whatever you can
• Present data graphically to turn data into information:
  • trend lines
  • charts & graphs
• Take results-oriented measures based on users' needs or established organizational goals
• Consistency facilitates comparison
• Turn results into actions

31. Conclusions
• Performance measurement is multi-faceted
• It encompasses operational and management data
• It is most effective when participation is full
• Departmental and individual contributions matter
• Focus on users' needs & expectations
• Good off-the-shelf software exists to help

32. Conclusions (cont.)
• Data collected for one purpose can be put to use in multiple ways.
• Various data elements can be combined to create new insights.
• Incremental change accumulates: change based on small measures adds up to noticeable results.
• Analyze existing data for its potential to drive change.
• Grass-roots efforts can succeed.
• Push your own limits.
• Learn the necessary skills.
• Learn from your successes & your failures.

33. Opportunities
• Network with like-minded colleagues.
• Publish your results.
• Read broadly in this area of specialty.
• Northumbria Lite at IFLA, Thursday, August 22nd.
• 5th Northumbria International Conference on Performance Measurement in Libraries & Information Services, August 2003, Durham.

34. Questions?
• Examples or ideas of your own?
