I Don't Do Research . . . But. Steve Hiller, Director, Assessment and Planning, University of Washington Libraries. hiller@u.washington.edu
I Do Use Research Methods as Part of Our Assessment and Planning Program for: • Understanding our user communities • How they work • Their library and information needs • How we can make them successful • Organizational improvement • Improving organizational performance, effectiveness and efficiency • Delivering services and programs that make a difference
Assessment: More than Numbers Library assessment is a structured process: • To learn about our communities • To respond to the needs of our users • To improve our programs and services • To support the goals of the communities
Why Assess? • Accountability and justification; demonstrating value • Improvement of services • Comparisons with others • Identification of changing patterns • Marketing and promotion • Opportunity to tell our own story • Using data, not assumptions, to make decisions • Assumicide!
What’s Driving the Agenda • Environmental Changes • Exploding growth in use and applications of technology • Increased customer expectations for services, including quality and responsiveness • “Competition” from other sources • Budgetary Constraints/Reductions • Justification for spending $$$ on libraries • Increasing competition for resources • Budget reductions and reallocations • Demonstrating Value • Accountability • How do we enable those in our community to succeed
Traditional Library Measures: Inputs Focus on how big/how much • Budget (staff, collections, operations) • Staff size • Collection size • Facilities • Other related infrastructure (hours, seats, computers) • Size of user communities and programs ARL “Investment Index” measures inputs related to expenditures and staff numbers
Traditional Library Measures: Outputs Focus on usage • Collections (print, electronic, ILL) • Reference services • Facilities (gate counts) • Instruction sessions • Discovery and retrieval • Other Web sessions Outputs may indicate whether "inputs" are used, but they don't tell us what users were able to accomplish as a result.
The Challenge for Libraries • Traditional statistics are no longer sufficient • Emphasize inputs/outputs – how big and how many • Do not tell the library's or customers' story • May not align with organizational goals and plans • Do not measure service quality or library impact • Need better outcome measures that demonstrate the difference the library makes and the value it adds • To the individual, community and the organization • "No longer what makes a good library but how much good does the library do" (Peter Brophy)
Assessing and Demonstrating the Library Contribution to the Institutional Mission • The library’s contribution to learning and research • Student learning (accreditation driven) • Externally funded research and scholarship • Value of the library to the community • Information resources/collections • Library as place • Current services • Changes in library and information needs and use • Organizational performance and effectiveness • Collaborations
Good Assessment Starts Before You Begin . . . Some Questions to Ask • Define the question – What’s Important • What do you need to know, why, and when • How will you use the information/results • Where/how will you get the information • Methods used • Existing data • New data (where or who will you get it from) • How will you analyze the information • Who will act upon the findings
Four Useful Assessment Assumptions • Your problem/issue is not as unique as you think • You have more data/information than you think • You need less data/information than you think • There are useful methods that are much simpler than you think Adapted from Douglas Hubbard, "How to Measure Anything" (2007)
Documenting Library Performance and Impact • Common library assessment methods • Surveys (satisfaction, needs, importance) • Usage and other library statistics • Qualitative information (interviews, focus groups, etc.) • Other statistical data • Institutional • Comparator (ARL, ACRL, peer groups, customized) • Government (NCES) • Collaborations • Value, Impact and Return on Investment • Lib-Value (IMLS grant to measure value and return on investment in academic libraries)
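The return-on-investment framing in the last bullet can be made concrete with a simple benefit-to-cost ratio. The sketch below uses invented figures and service categories; it is not the Lib-Value methodology, only a minimal illustration of the underlying arithmetic.

```python
# Illustrative only: a back-of-the-envelope return-on-investment ratio for
# library services. The figures and service categories below are invented;
# this is not the Lib-Value methodology, just the basic benefit/cost idea.

library_expenditures = 15_000_000  # hypothetical annual library budget ($)

# Hypothetical estimates of the dollar value of services actually used
estimated_benefits = {
    "journal article downloads": 2_500_000 * 8.00,  # downloads x est. value each
    "interlibrary loans":        60_000 * 25.00,    # requests x est. cost elsewhere
    "instruction sessions":      1_500 * 300.00,    # sessions x est. value to courses
}

total_benefits = sum(estimated_benefits.values())
roi_ratio = total_benefits / library_expenditures

print(f"Estimated benefits: ${total_benefits:,.0f}")
print(f"Expenditures:       ${library_expenditures:,.0f}")
print(f"Return ratio:       {roi_ratio:.2f} : 1")
```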
Choosing the Right Assessment Method • Criteria: Utility, Relevance/Importance, Stakeholder needs, Measurability, Cost, Timeliness • Tools: Usage data, Surveys (local & standardized), Standardized tests, Performance assessments, Qualitative methods, Rubrics
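One way to weigh these criteria against candidate tools is a simple weighted scoring matrix. The sketch below is a hypothetical example: the weights, the shortlist of tools, and the 1-5 ratings are all invented for illustration.

```python
# A minimal sketch of scoring candidate assessment tools against weighted
# criteria. All weights and 1-5 ratings are hypothetical.

criteria_weights = {
    "utility": 0.25,
    "relevance": 0.25,
    "measurability": 0.20,
    "cost": 0.15,        # higher rating = lower cost burden
    "timeliness": 0.15,
}

tool_ratings = {
    "usage data":   {"utility": 4, "relevance": 3, "measurability": 5, "cost": 5, "timeliness": 5},
    "local survey": {"utility": 5, "relevance": 5, "measurability": 4, "cost": 3, "timeliness": 2},
    "focus groups": {"utility": 4, "relevance": 5, "measurability": 2, "cost": 2, "timeliness": 3},
}

def weighted_score(ratings: dict) -> float:
    """Weighted average of a tool's criterion ratings."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

for tool in sorted(tool_ratings, key=lambda t: -weighted_score(tool_ratings[t])):
    print(f"{tool:12s} {weighted_score(tool_ratings[tool]):.2f}")
```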
Presenting Assessment Findings • Make sure data/results are: • Timely • Understandable • Usable • Identify important findings/key results • What’s important to know • What’s actionable • Present key/important results to: • Library administration/institutional administration • Library staff • Other libraries/interested parties/stakeholders
Success with Assessment • Use multiple assessment methods • Mine/repurpose existing data • Invest in staff training and resources • Focus on the customer and community • Learn from our users • Partner with other campus programs and institutions • Present assessment information so that it is understandable and usable
Association of Research Libraries (ARL) and Library Assessment ARL has played a major role in advancing assessment in academic libraries through: • ARL Statistics • New measures and standardized methods • LibQUAL+® user survey, MINES for libraries • Individual library consulting • Effective, Sustainable and Practical Assessment (ESP) • 42 libraries visited 2005-2010 to evaluate assessment needs and programs • Library Assessment Conference
ESP Insights • Uncertainty on how to establish and sustain assessment • Staff lack essential assessment/data analysis skills and knowledge • Lack of focus and assessment priorities; tenuous link to planning and decision making • Underutilization of campus assessment resources • More data collection than data utilization • Overreliance on surveys for user input • Organizational issues play a significant role in sustainable assessment
From Institutional Based Assessment to a Community of Practice THE NEED • Bring together library folks interested in assessment • Focus on effective and practical assessment • Establish an ongoing venue for presentation of library assessment issues, activities and results • Build a continuing education component (workshops) • Make it fun! AN ANSWER • Library Assessment Conference • Organized by ARL, U.Va and UW • Biennial conference first held in 2006
Using Assessment for Results at the University of Washington or How We Contribute to User Success • Assessment program established in 1991 • Focus on user needs • Information seeking behavior and use • Patterns of library use • Library contribution to learning and research • User satisfaction with services, collections, overall • Increasingly tied to strategic goals and priorities • Provides data to improve programs and services and to demonstrate the library contribution to user success
University of Washington LibrariesAssessment Methods Used • Large scale user surveys every 3 years since 1992 (“triennial survey”) • In-library use surveys every 3 years beginning 1993 • Focus groups/Interviews • User centered design • Observation (guided and non-obtrusive) • Usability • Usage statistics/data mining Information about assessment program available at: http://www.lib.washington.edu/assessment/
UW Libraries Triennial Survey • Started in 1992 with paper; web-based began in 2004 • Survey designed by library staff and asks about needs, importance, satisfaction, use patterns, and impact (comments valuable too) • Survey all faculty and a sample of students • Survey for each group is different and survey questions may change over time (although a core set remains the same over time and between groups) • Survey can help measure effectiveness of existing programs and provide direction for future ones Longest running cyclical survey in academic libraries
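For illustration, the kind of tabulation such a survey supports (mean ratings by respondent group across cycles, plus response rates) might look like the sketch below; the records, group names, and sample sizes are invented, not UW data.

```python
# A minimal sketch of tabulating survey results by group and cycle.
# All records and sample sizes below are invented for illustration.
from collections import defaultdict

# (survey year, respondent group, satisfaction rating on a 1-5 scale)
responses = [
    (2007, "faculty", 4), (2007, "faculty", 5), (2007, "undergrad", 4),
    (2010, "faculty", 5), (2010, "grad", 4),    (2010, "undergrad", 3),
]
surveys_sent = {(2007, "faculty"): 4, (2010, "faculty"): 3}  # invented sample sizes

ratings_by_cell = defaultdict(list)
for year, group, rating in responses:
    ratings_by_cell[(year, group)].append(rating)

for (year, group), ratings in sorted(ratings_by_cell.items()):
    mean = sum(ratings) / len(ratings)
    sent = surveys_sent.get((year, group))
    rate = f", response rate {len(ratings) / sent:.0%}" if sent else ""
    print(f"{year} {group:10s} mean satisfaction {mean:.2f} (n={len(ratings)}){rate}")
```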
Strategic Priorities 2007-2010 • Expand digital and physical delivery services • Enhance library contributions to research productivity • Raise visibility and effectiveness of librarian liaisons • Inform UW researchers/authors about good scholarly communications practices • Strengthen library role in undergraduate learning • Reshape library spaces to enhance user experiences • Ensure content needed is accessible and deliverable • Implement new models of service
What We Did 2007-2009 • Began pull and scan service; harmonized ILL • Implemented UW WorldCat as primary access point • Articulated service expectations for librarian liaisons • Expanded scholarly communication efforts • Began revisioning process for undergrad library space • Brought in consultant on teaching and learning • Participated in ARL Library Scorecard Pilot (2009-) • 2009: 12% budget reduction • Closed several branch libraries; cut hours; cut 29 positions • Reduced collections budget; cut serial subscriptions
Libraries 2010 Triennial Survey Highlights • Record number of faculty and graduate student responses • Satisfaction ratings highest ever for faculty and grads; slightly lower for undergrads (at all 3 campuses) • Library contributions to teaching, learning, research and overall success rated very high by faculty/grad students • Substantial increase in use and satisfaction with library delivery services (ILL, pull and scan) • Online access to and delivery of scholarly information, especially journals, are driving research and scholarship
UW Libraries Triennial Survey: Number of Respondents and Response Rate, 1992-2010 http://www.lib.washington.edu/assessment/
Library Services and Resources: Overall Importance to Work by Group (Scale of 1 "Not Important" to 5 "Very Important")
UW Libraries 2010 Triennial Survey: Libraries' Contribution to: (Scale of 1 "Minor" to 5 "Major")
Importance of Books & Journals by Academic Area (2010, Faculty, Scale of 1 "not important" to 5 "very important")
Subject Librarian Visibility and Satisfaction By Faculty College/School (Balanced Scorecard Metric)
Use Patterns: Frequency of In-Library Visits, 1998-2010 (Weekly or more often)
80% of the 400 comments from UWS Undergrads Dealt with Space and Hours • Open is one thing, space and available computers / tables with laptop plug-ins is whole other issue • More seating or computer areas, engineer a reduced noise level in Odegaard. • 1. More space between the computers 2. More quiet study areas 3. Spaces to eat, drink and take breaks • Suzzallo-Allen. Quiet, neat, clean, cool, beautiful, access to everything I need. Ode, on the other hand . . .
What People Do in Libraries by Group, 2008 (In-Library Use Survey respondents: 73% UG, 22% Grad, 5% Faculty)
Other Relevant Data • During the past five years at UWS: • Total number of weekly hours libraries were open dropped 26% • Number of library seats dropped 3% • Enrollment increased by 6% • Gate counts increased by 6%, or 250,000 more entrants
How UW Libraries Has Used AssessmentA Few Examples • Extend hours in Undergraduate Library (24/5.5) • Create more diversified student learning spaces • Enhance usability of discovery tools and website • Provide standardized service training for all staff • Review and restructure librarian liaison program • Consolidate and merge branch libraries • Change/reallocate collections budget • Change/reallocate staffing • Support budget requests to University
Integrated Organizational Performance Model The Balanced Scorecard • A model for measuring organizational performance developed in the 1990s by Kaplan and Norton that: • Helps identify the important statistics • Helps ensure a proper balance • Organizes multiple statistics into an intelligible framework • Clarifies and communicates the organization's vision • Provides a structured metrics framework for aligning assessment with strategic priorities & evaluating progress • ARL Library Scorecard Pilot in 2009-10 with 4 libraries • Johns Hopkins, McMaster, Virginia, Washington
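To make the framework concrete, the sketch below shows one way a scorecard could be represented: perspectives contain objectives, which contain measures with targets. The four perspective names follow Kaplan and Norton; the objectives, measures, and target values are hypothetical and are not the metrics used in the ARL pilot.

```python
# A minimal sketch of a balanced scorecard structure: perspectives hold
# objectives, objectives hold measures with targets. All objectives,
# measures, and values are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    target: float
    actual: float
    higher_is_better: bool = True

    def on_target(self) -> bool:
        return (self.actual >= self.target) if self.higher_is_better \
               else (self.actual <= self.target)

@dataclass
class Objective:
    name: str
    measures: list = field(default_factory=list)

scorecard = {
    "Customer": [Objective("Increase user satisfaction",
                           [Measure("Mean survey satisfaction (1-5)", 4.2, 4.4)])],
    "Internal Processes": [Objective("Speed up document delivery",
                                     [Measure("Median ILL turnaround (days)", 3.0, 2.5,
                                              higher_is_better=False)])],
    "Learning and Growth": [Objective("Build staff assessment skills",
                                      [Measure("Staff trained in data analysis (%)", 60, 45)])],
    "Financial": [Objective("Use collections budget effectively",
                            [Measure("Cost per article download ($)", 5.0, 4.1,
                                     higher_is_better=False)])],
}

for perspective, objectives in scorecard.items():
    for obj in objectives:
        for m in obj.measures:
            status = "on target" if m.on_target() else "below target"
            print(f"{perspective:20s} | {obj.name:35s} | {m.name}: {m.actual} ({status})")
```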
Goals of the ARL Pilot • Evaluate the Balanced Scorecard as a suitable performance model for academic research libraries • Value as a structured process to better integrate and strengthen strategy, planning and assessment • Encourage cross-library collaboration • Review objectives and measures for commonalities between libraries
Closing the Loop: Success with Assessment • Assess what is important • Keep expectations reasonable and achievable • Use multiple assessment methods; corroborate • Mine/repurpose existing data • Focus on users; how they work, find & use information • Use the data to improve and add customer value • Keep staff, customers and stakeholders involved and informed
Eye to the Future "Measuring performance is an exercise in measuring the past. It is the use of that data to plan an improved future that is all important." (Peter Brophy) • Data trends can inform the future • Strategic planning can frame the future • Organizational performance models can align ongoing operations with future aspirations • Understanding how customers work, how that work is changing, and the ways we can make customers and institutions successful is key to the future of libraries
In Conclusion Can You Answer These Questions? • What do we know about our communities and customers that enables us to provide services and resources to make them successful? • How do we measure the effectiveness of our services, programs and resources from the customer perspective? • What do our stakeholders need to know in order to provide the resources needed for a successful library?