New Tricks for Old Statistics Margaret Fain Head of Public Services Jennifer Hughes Head of Access Services Coastal Carolina University, Conway, SC ACRL 14th National Conference Pushing the Edge: Explore, Engage, Extend March 12-15, 2009 - Seattle, WA
Introduction • Why this workshop? • Audience: librarians new to using statistics for meaningful assessment. • Goal: to improve use of statistics for strategic planning and assessment.
Learning Outcomes • Identify new ways of using library statistics to achieve short- and long-term goals. • Learn to incorporate statistics as benchmarks into strategic planning and assessment to show administrators how the library is achieving its mission on campus. • Identify and learn to use NCES statistics and other public data for peer comparisons.
Assumptions • Interest in or need to use statistics for assessment/strategic planning/budgeting. • Need for inexpensive but effective methods of gathering statistics. • Lack of time/money/personnel for statistics. • No results = No budget.
Assessment Systematic collection, review, and use of information on programs undertaken for the purpose of improving student learning and development. • Ongoing • Sustainable
Direct and Indirect Measures: • Information Literacy tests • LibQUAL+ • Surveys • Users • Staff • Focus Groups • Statistics
Accreditation and Assessment Assessment may be characterized as the third element of a four-step planning-assessment cycle: 1. Defining clearly articulated institutional and unit-level goals; 2. Implementing strategies to achieve those goals; 3. Assessing achievement of those goals; and 4. Using the results of those assessments to improve programs and services and inform planning and resource allocation decisions. (Middle States)
Accrediting Bodies MSCHE http://www.msche.org/publications/Assessment_Expectations051222081842.pdf NEASC http://cihe.neasc.org/standards_policies/standards/standards_html_version NCACS http://www.ncahlc.org/index.php?option=com_content&task=view&id=37&Itemid=116 NWCCU http://www.nwccu.org/Standards%20and%20Policies/Accreditation%20Standards/Accreditation%20Standards.htm SACS http://www.sacscoc.org/pdf/2008PrinciplesofAccreditation.pdf WASC http://www.wascsenior.org/findit/files/forms/Handbook_of_Accreditation___July_2008.pdf
Assessment and Accountability • Focus on student learning outcomes. • Focus on improvements in services and programs (SACS). • Focus on demonstrating that a service or resource is effective, adequate, or appropriate.
Sample from SACS Core statement: The institution, through ownership or formal arrangements or agreements, provides and supports student and faculty access and user privileges to adequate library collections as well as to other learning and information resources consistent with the degrees offered. These collections and resources are sufficient to support all educational, research, and public service programs. Comprehensive statement: The institution provides facilities, services, and other learning/informational resources that are appropriate to support its teaching, research, and service mission.
SACS Supporting Documents Data concerning physical facilities for learning resources [devoted to learning and instructional resources]. Data concerning collections and electronic access at the institution and arrangements with other institutions or organizations. [Lists of instructional resources and services]. Data concerning other information resources available to students at their learning locations.
SACS Evidence Core: Description of the adequacy of learning resources for all credit coursework and programs that the institution offers. Comprehensive: Evidence that resources are appropriate and adequate.
Traditional Statistics Collected on Use of: • Services • Collections • Resources • Buildings
Exercise What are the statistics currently collected in your library or area of responsibility? • Audience responses
Statistics in a New Light Traditional • Quantitative data • No context • Demonstrates use but not impact • No measures of quality
Statistics in a New Light Assessment oriented • Quantitative AND Qualitative data • Contextual. • Shows impact of library services on users. • Demonstrates library’s contribution to own and institutional goals and objectives.
Courtesy Notices • Quantitative data • number of items overdue per semester • number of items renewed per semester • fine money owed per semester • fine money collected per semester
Courtesy Notices • Qualitative data: • Student survey responses and student and faculty comments indicated general unhappiness with the current system • Staff requests to reduce the amount of time spent renewing overdue materials (very labor intensive) • Objective: to reduce the number of overdue items, reduce fines paid by students, and improve overall satisfaction with the overdue process.
Courtesy Notices • Changes Made • Implemented email courtesy notices for all patrons. Traditional approach: we are done. New approach: we follow up and show that the change made a difference.
Courtesy Notices : Analysis of Results • Quantitative data showed • decrease in overdue items • increase in items renewed • decrease in fines owed and fines collected • decrease in number of fines forgiven/reduced. • Qualitative data showed decrease in number of complaints about overdues and fines by all users.
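The before-and-after comparison is plain arithmetic. A minimal Python sketch (the numbers are hypothetical, not the actual figures from this example) shows how percent change for each measure can be reported alongside the qualitative feedback:

```python
# Hypothetical semester totals before and after email courtesy notices began.
before = {"overdue items": 1240, "renewals": 310, "fines owed": 4800.00, "fines collected": 3100.00}
after = {"overdue items": 890, "renewals": 560, "fines owed": 2900.00, "fines collected": 2000.00}

# Report the percent change for each measure.
for measure in before:
    change = (after[measure] - before[measure]) / before[measure] * 100
    print(f"{measure}: {before[measure]} -> {after[measure]} ({change:+.1f}%)")
```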
What Are “New” Statistics? • Statistics with value added • Statistics combined with other data sources to “prove” results. • Statistics that show how patrons are using or not using resources/services.
Creative Applications of “New” Statistics • Example: expand hours with new staff, by combining • Building use stats • Circulation stats • Student surveys (3) re: hours of operation • Hours of operation • Current staff working hours
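One way to combine these sources is to line up hourly gate counts with checkout activity to show how heavily the building is used near closing. A minimal Python sketch, assuming hypothetical export files and column names (not a standard ILS or gate-counter format):

```python
import pandas as pd

# Hypothetical exports; the file and column names are assumptions for illustration.
gate = pd.read_csv("gate_counts.csv", parse_dates=["timestamp"])      # one row per gate-count reading
circ = pd.read_csv("circulation.csv", parse_dates=["checkout_time"])  # one row per checkout

gate["hour"] = gate["timestamp"].dt.hour
circ["hour"] = circ["checkout_time"].dt.hour

# Average building traffic and total checkouts by hour of day.
by_hour = pd.DataFrame({
    "avg_gate_count": gate.groupby("hour")["count"].mean(),
    "checkouts": circ.groupby("hour").size(),
}).fillna(0)

# Heavy use in the last open hours (here, 8 p.m. to 11 p.m.) is the evidence for extending hours.
print(by_hour.loc[20:23])
```

Pairing this table with the student survey comments turns two routine counts into an argument for staffing and hours decisions.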
Group Exercise Brainstorm “new” ways of using or combining traditional statistics to demonstrate that resources and services are appropriate and adequate.
Break 20 minutes
New Ways to Analyze and Collect Data • Excel • NCES
Excel Advanced Instructions @ http://www.coastal.edu/library/presentations/index.html
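For those who prefer scripting to spreadsheets, the same kind of pivot-table summary covered in the Excel instructions can be produced with Python's pandas. A sketch, assuming a hypothetical circulation export and illustrative column names:

```python
import pandas as pd

# Hypothetical export with one row per checkout; column names are assumptions.
circ = pd.read_csv("circulation_by_month.csv")

# Cross-tabulate checkouts by patron type and month, like an Excel pivot table.
pivot = circ.pivot_table(index="patron_type", columns="month",
                         values="items_checked_out", aggfunc="sum", fill_value=0)
print(pivot)
```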
NCES for Peer Comparisons • NCES Library Statistics Program http://nces.ed.gov/surveys/libraries/
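A peer comparison built from an NCES/IPEDS data export might look like the sketch below. The file name and column names are illustrative, not the official NCES variable names, and the peer list is hypothetical; normalizing by FTE enrollment makes libraries of different sizes comparable.

```python
import pandas as pd

# Hypothetical peer group and export file; adjust to your own NCES/IPEDS download.
peers = ["Coastal Carolina University", "Peer A", "Peer B", "Peer C"]

df = pd.read_csv("nces_academic_libraries.csv")
df = df[df["institution"].isin(peers)]

# Normalize totals by enrollment so institutions of different sizes can be compared.
df["expenditures_per_fte"] = df["total_expenditures"] / df["fte_enrollment"]
df["circulation_per_fte"] = df["total_circulation"] / df["fte_enrollment"]

print(df[["institution", "expenditures_per_fte", "circulation_per_fte"]]
      .sort_values("expenditures_per_fte", ascending=False)
      .to_string(index=False))
```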
Quick and Dirty Data Collection • Quick surveys (SNAP, Survey Monkey, Zoomerang) • Automated reports • Samples (data snapshots)
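Data snapshots scale up with simple arithmetic: count during a few sample periods, then project to the full term. A sketch with hypothetical counts:

```python
# Hypothetical snapshot: reference questions counted during two sample weeks.
sample_weeks = [142, 118]
weeks_in_semester = 15

# Project the average sample week across the whole semester.
avg_per_week = sum(sample_weeks) / len(sample_weeks)
estimate = avg_per_week * weeks_in_semester
print(f"Estimated reference questions per semester: {estimate:.0f}")
```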
Closing the Loop • Strategic Planning • Short-term / long-term objectives • Determine data needs up front • What you currently collect / have. • What will be needed to demonstrate the objective is accomplished. • Timeframe for collection
Closing the Loop • How will results be used? • How will you document changes made to show improvement? • How will you revise data collection for the next year?
Action Plans Using traditional statistics in new ways, draft a sample action plan that uses the “new” statistics as benchmarks to provide outcomes and indicators of changes made.
Template to Download: http://www.coastal.edu/library/presentations/index.htm Action Plan • Objective to be accomplished: • Data to be collected: • Sources of additional data: • Benchmark/Metric: • Expected outcomes: • Expected indicators of changes made:
Literature
Blake, J., and S. Schleper. “From Data to Decisions: Using Surveys and Statistics to Make Collection Management Decisions.” Library Collections, Acquisitions, & Technical Services 28 (2004): 460-464.
Cheng, R., S. Bischof, and A. Nathanson. “Data Collection for User-oriented Library Services: Wesleyan University Library’s Experience.” OCLC Systems & Services 18.4 (2002): 195-204.
Dilevko, J. “Inferential Statistics and Librarianship.” Library and Information Science Research 29 (2007): 209-229.
Intner, S. “Making Your Collections Work for You: Collection Evaluation Myths & Realities.” Library Collections, Acquisitions, & Technical Services 27 (2003): 339-350.
Luzius, J. “A Look at Circulation Statistics.” Journal of Access Services 2.4 (2004): 15-22.
Welch, J. “Who Says We’re Not Busy? Library Web Page Usage as a Measure of Public Service Activity.” Reference Services Review 33.4 (2005): 371-379.
Additional Resources • ARL: New Measures and Assessment Initiatives • http://www.arl.org/stats/initiatives/ • Library Research Service • http://www.lrs.org/index.php • IPEDS • http://nces.ed.gov/IPEDS/