Putting Statistics into Practice - Strategies for effective management. J Eric Davies & Claire Creaser
Options for measuring & managing J Eric Davies
Outline • Mission, vision, aims, objectives • Range of data types • Applications of data • Methods of acquiring data • General principles
Measuring with meaning • Every organisation, no matter what its mission or scope, needs three kinds of performance metrics: • to measure its success in mobilizing its resources, • its staff’s effectiveness on the job, and • its progress in fulfilling its mission. • McKinsey Quarterly, 2001 (2)
Mission with meaning STRATEGIC FOCUS • What organisation? NOW / LATER • What service? NOW / LATER • What direction? NOW / LATER • Mission / Vision • Aims / Objectives
Criteria AIMS/OBJECTIVES • Specific, Measurable, Acceptable, Realistic, Time-bound {SMART} • Consistent, Unambiguous, Testing, Empowering {CUTE}
How (im)possible is your mission? MISSION • Statement of purpose and functions – why the service exists, what it does, who it serves VISION • Statement of desired future state – where the service wants to be
How (im)possible is your mission? MISSION/VISION STATEMENTS ~ • MEANING • CREDIBILITY • ACCEPTABILITY • TESTABILITY
How (im)possible is your mission? MISSION/VISION STATEMENTS ~ Strathclyde University – Glasgow • A place of useful learning [1796] • The place of useful learning [2000]
Here’s one I prepared earlier! MISSION/VISION STATEMENTS ~ • Public Library - The mission of the Library is to serve as a cathedral of human knowledge— an accessible database of knowledge that serves as the community's memory—and as an information and knowledge safety net, while providing materials, programs, and services to the people of the community.
Digging deep for evidence What Kind of Evidence [Information]? • Statistics and Performance Indicators • [Quantitative + Qualitative] • Social Measures • [Soft Indicators] ‘DISTANCE TRAVELLED’
Who cares {or should do}? AUDIENCE: [stakeholders] • Funders • Managers/Staff • Users • Community • Vendors • Global
All kinds of measuring • Inputs • Outputs {SERVICE DOMAIN} • Outcomes • Impacts {USER RESPONSES}
Social dimensions Examples of ‘Soft’ Indicators:- • Attitudinal • Personal • Practical • Key Work Skills DISTANCE TRAVELLED
Social dimensions • Personal development - individual self-confidence, self-awareness, creativity, new skills and abilities • Social cohesion - impact on group/community identity • Community involvement and empowerment • Health - people feeling better, happier etc.
How does the evidence add up? APPLICATIONS ~ SERVICES & PROJECTS • Policies • Strategies • Tactics • Processes and Operations • Advocacy
What does the evidence answer? APPLICATIONS ~ SERVICES • How have we done? • How are we doing now? • How can we do better? • Where are we going? • How do we get there? • How are we making a difference? • How do we get the resources?
What does the evidence answer? APPLICATIONS ~ PROJECTS • Did we achieve what we were seeking to achieve? • Did we do what we said we would do? • How did we do it? • What did we use? • What did we get out? • What worked and what didn’t work? • What could we do differently? • What can we apply continuously? • What difference did it make that we did it? • Who benefited?
Managing and measuring Framework for Performance Measurement:- • integration • user satisfaction • effectiveness (delivery) • efficiency • economy Follett Report – academic libraries
Managing and measuring Three E’s • Economy in acquisition of resources • Efficiency in the use of resources • Effectiveness in the achievement of objectives UK Treasury FMI [1980s]; Sizer [1980s]
Comparing and changing • BENCHMARKING • Motorola + D.E.C. + Xerox • To make changes that lead to quantum and continuous improvements in products, processes and services that result in total customer satisfaction and competitive advantage
Comparing and changing BENCHMARKING – • Evaluate the level of performance of various services within an institution • Overall level of institutional performance • Compare against published standards • Compare performance over time • Compare with other institutions
Finding out Gathering Evidence - • What do you need to know? • Where is the information? • Who has the information? • How will you get it? • How accurate is it / do you need it to be? • How will you interpret it? • How will you act on it? • How will you present it?
Gathering evidence Techniques/Tools/Options for Gathering Data:- • MIS / Transaction Logs • Databases / Publications • Surveys: questionnaire, telephone, interview • Focus Groups / Graffiti boards • Observation / Diaries / Logs • Press/Media Coverage
Gathering evidence TOOLS:- • Which outcomes, dimensions and performance are to be measured? • reliable + valid • meaningful and precise
Gathering evidence Options for Gathering Data - • … if the only tool you have is a hammer, everything starts looking like a nail. F.W. Huibregston, Partner, McKinsey
Changing times; changing evidence UPDATING EVIDENCE • Service Evolution • New/Discontinued services - methods - technologies - clients • Diminishing Variance • Improvement - Gaming - Deception
Manager beware!! OVERDOING IT: • If you know everything, you know nothing. George Johnson: Fire in the Mind [1996] • … a world that never measures or counts is really beyond our control. The trouble is that we’re in danger of doing little else. David Boyle: RSA Lecture [2001]
How much evidence? ... data is not information. Information is data endowed with relevance and purpose. A company must decide what information it needs to operate its affairs, otherwise it will drown in data. Peter Drucker: Managing for the Future
Making sense of measuring Sumsion’s Law of Statistical Dullness ~ In comparative statistics the great majority of results are inherently close to the average and consequently dull. {Sumsion} LIRN 2001 (79) p.3.
Evidence for yesterday Statistics, being essentially historical, can only provide information after the event. {Sumsion} Int. Encyclopedia of Lib. and Info. Sci. (1997) p.432
Measuring and managing Information is a precondition for identifying choices, reducing uncertainty about their implications and facilitating their implementation. Center for Transnational Corporations: CTC Reporter 14, Winter 1983, p.34
Managing and measuring; comparing and changing • LISU: We’ve got the measure of information! • A skilled team of experienced Managers, Statisticians and Administrators all adding value to statistical data and providing authoritative and reliable information to support managers in culture, information and related environments.
Mission possibilities Does it have • Meaning? Does it actually mean anything? • Credibility? Do you believe it can be achieved? • Acceptability? Will all the stakeholders (funders, staff, users) ‘buy in’ to this mission? • Testability? How would you demonstrate you are achieving your mission?
Library and Information Statistics Unit lisu@lboro.ac.uk
Statistics for the faint-hearted Claire Creaser CStat
Introduction to statistics • Basics • What are statistics • Useful techniques • Sampling • Surveys and sample sizes • Questionnaire design • Analysis • Benchmarking • Presentation of results
What are statistics? • Numbers with context • 1,300 items issued last month • The average price paid for a CD is £12.50 • 25% of staff time is spent re-shelving books • Women borrow twice as many books on average as men • Serials cost three times as much as books • The average spend per user has increased less than general inflation over the last ten years
Where to start • What do you want to know? • Evidence of good management • Value for money • Advocacy • What data to collect? • Relevant • Useful • Current
Where do they come from? • Library management systems • Stock statistics, financial data, staff … • Regular surveys • User opinions, condition of stock … • Occasional surveys • Project evaluation
What do they look like? • Categorical • Gender; classmark; membership status • Ordinal • Stock condition; satisfaction ratings • Ratio or interval • Acquisitions; issues; expenditure
What can you do with them? • What do you want to know? • Descriptive statistics • Mean, range, distributions, proportions • graphical presentations • Inference from samples • Estimates, error levels • Advanced techniques • Correlation, regression, analysis of variance
Choosing the right technique • Keep it simple! • Categorical data • Proportions in each category • Comparisons • Ordinal data • Proportions in each category • Medians • Ratio data • Means
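A minimal sketch of how these choices play out in practice, in Python; the figures and category labels are invented for illustration only:

```python
from collections import Counter
from statistics import mean, median

# Categorical data: report the proportion in each category.
membership = ["student", "staff", "student", "external", "student", "staff"]
total = len(membership)
for category, n in Counter(membership).items():
    print(f"{category}: {n / total:.0%}")

# Ordinal data: proportions per category work here too, and the median
# is safe because the codes are ordered but not evenly spaced.
satisfaction = [4, 5, 3, 4, 2, 5, 4, 4]  # 1 = very dissatisfied .. 5 = very satisfied
print("median satisfaction:", median(satisfaction))

# Ratio data: the mean is meaningful.
daily_issues = [1300, 1250, 1410, 1180, 1350]
print("mean daily issues:", mean(daily_issues))
```

Reporting the median rather than the mean for the ordinal codes avoids treating the gap between “satisfied” and “very satisfied” as a measurable quantity.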
Sample surveys • Why sample? • Cost • Practicalities • Where to start? • Sampling frame • Sample design • Sample size
Types of sample • Simple random • Systematic • Stratification and clustering • Quota samples • Self-selected
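A rough sketch of how the first three designs might be drawn, assuming a numbered list of borrowers as the sampling frame; the population size, sample sizes and strata are invented:

```python
import random

population = list(range(1, 1001))  # e.g. 1,000 registered borrowers

# Simple random sample: every borrower has an equal chance of selection.
simple = random.sample(population, 50)

# Systematic sample: every k-th borrower, starting from a random point.
k = len(population) // 50
systematic = population[random.randrange(k)::k]

# Stratified sample: draw within each stratum (here, user group)
# in proportion to its size, so every group is represented.
strata = {
    "undergraduate": population[:600],
    "postgraduate": population[600:850],
    "staff": population[850:],
}
stratified = [member
              for group in strata.values()
              for member in random.sample(group, len(group) // 20)]
```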
How many? • Less than you think! • Depends on: • Level of detail • Desired margin of error • Expected response rate • Does not depend on population size • Unless small population • 400 will give accuracy of ± 5% • 1,000 for ± 3% • 2,500 for ± 2%
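These figures follow from the standard formula for estimating a proportion at 95% confidence, n = z²p(1−p)/e², taking the worst case p = 0.5; the slide’s 400 / 1,000 / 2,500 are those values rounded. A quick check:

```python
def sample_size(margin_of_error, z=1.96, p=0.5):
    """Sample size to estimate a proportion at 95% confidence.

    Assumes the worst case p = 0.5 and a large population
    (no finite-population correction applied).
    """
    return z ** 2 * p * (1 - p) / margin_of_error ** 2

for e in (0.05, 0.03, 0.02):
    print(f"margin ±{e:.0%}: n ≈ {sample_size(e):.0f}")
# margin ±5%: n ≈ 384
# margin ±3%: n ≈ 1067
# margin ±2%: n ≈ 2401
```

The “unless small population” caveat is the finite-population correction, which reduces the required n when the sample would be a substantial fraction of the population.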
Questionnaire design • Self-completion or interview? • Clear, unambiguous questions • Clear, easy to follow layout • As short as possible • Number of questions • Number of pages • Tick boxes or short answers • Data entry issues
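On the data-entry point: tick-box answers are easiest to analyse when response codes are agreed at design time. A minimal sketch, with invented codes and responses:

```python
import csv
from io import StringIO

# Codes fixed at design time keep data entry fast and analysis
# unambiguous (these labels and codes are only an example).
SATISFACTION = {"very dissatisfied": 1, "dissatisfied": 2,
                "neutral": 3, "satisfied": 4, "very satisfied": 5}

raw = StringIO("respondent,satisfaction\n"
               "1,satisfied\n"
               "2,very satisfied\n"
               "3,neutral\n")
coded = [SATISFACTION[row["satisfaction"]] for row in csv.DictReader(raw)]
print(coded)  # [4, 5, 3]
```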
Sampling times • One period, or several? • Periodicity