1. Assessment: What, Why, and How?
UM Libraries Learning Curriculum
Irma Dillon, ifd@umd.edu
Manager, Information Management Systems
August 2, 2005

Good Morning! This is an introduction to assessment. I am going to define assessment, then define and give examples of assessment types.
2. Assessment Module. IFD MIS 2004 Culture of Assessment
An organizational environment in which decisions are based on facts, research and analysis.
Services are planned and delivered to maximize positive customer outcomes.
A Culture of Assessment is an integral part of the process of change and the creation of a user-centered library. A culture of assessment is a component of a learning, team-based organization. It is important to learn whether what we are doing meets the needs of our user community, and to learn this we must assess and evaluate our efforts. Decisions should be based on facts, research, and analysis. We will be talking about ways to gather this information.
3. Definitions

Evaluation - the process of determining the value or worth of a program, course, or other initiative; the ultimate goal of evaluation is to make decisions about adopting, rejecting, or revising the current program or innovation.
Assessment - encompasses methods for measuring or testing performance on a set of competencies.

Evaluation should not be confused with assessment. Evaluation is the more inclusive term, often making use of assessment data in addition to many other data sources.
4. Overview of the Assessment Process
Know the goals of overall library program
Determine the goals of the assessment
Determine the assessment plan's long- and short-term goals
Determine population(s) to assess
Determine methods of assessment/evaluation
Design the assessment

Long-term goals may include looking at longitudinal data over a period of time, such as the growth of collections or the use of databases.
Short-term goals may include
Population: undergraduates, graduates, faculty
Methods: surveys, focus groups
5. Overview of the Assessment Process, cont.

Do it
Analyze the data gathered
Understand and determine how to use the results
Incorporate results into current and future programs and plans.
What new questions arise start the process again
6. Performance Measures

Outputs - quantify the work done (e.g., number of books circulated, number of reference questions answered).
Inputs - the resources the library has available and can contribute to provide programs (e.g., money, space, collection, equipment, staff).

Performance measures are a gauge used to assess the performance of a process or function of the organization. Performance measures provide a series of indicators, expressed in qualitative, quantitative, or other tangible terms, that indicate whether current performance is reasonable and cost effective. They show the progress of an action against the evaluation plan and indicate to what extent goals have been reached.
Examples of output measures
level of library use
Number of subject specialists establishing home pages
Number of students taught each year
Number of librarian/faculty partnerships established
Number of process improvement charges accomplished
Number of electronic knowledge products developed
Examples of input measures
Number of customers requesting service
Number of students enrolled in the information literacy program
Number of librarian consultation hours available
Number of ILL requests processed
Volumes added to the collection
Personnel per capita
7. Performance Measures, cont.

Outcomes - sometimes referred to as "indicators of impact" or "indicators," outcomes are the benefits to people as a result of programs and services: specifically, achievements or changes in skill, knowledge, attitude, behavior, condition, or life status for program participants.
Outcomes assessment focuses on measures of achievement identified as measurable through the library's goals.
Examples of Outcomes
Percent increase of students with enhanced information literacy skills, measured by pre-tests and post-tests
Improved access to information
Percent decrease of repeat customer complaints
Percent increase of key disciplines with new or improved access tools
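The first outcome above, a pre-/post-test comparison, can be sketched in a few lines of code. This is a hypothetical illustration: the function name, passing threshold, and score data are invented for the example, not drawn from an actual UM Libraries assessment.

```python
def percent_passing(scores, threshold=70):
    """Share of participants scoring at or above the passing threshold."""
    passing = sum(1 for s in scores if s >= threshold)
    return 100 * passing / len(scores)

# Illustrative scores for the same six participants before and after
# an information literacy workshop.
pre_test = [55, 62, 71, 48, 80, 66]
post_test = [72, 75, 88, 65, 91, 70]

before = percent_passing(pre_test)
after = percent_passing(post_test)
print(f"Passing rate rose from {before:.0f}% to {after:.0f}%")
# → Passing rate rose from 33% to 83%
```

The reported indicator would then be the change between the two rates, collected once before and once after the program.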
USES FOR PERFORMANCE MEASURES
There are many reasons to use performance measures.

TO ENHANCE THE QUALITY OF SERVICES. Performance measures inform the staff of customer needs and levels of satisfaction. They make it possible to identify actions to improve quality and reduce costs.
TO IMPROVE MANAGEMENT PRACTICES. Performance measures create an unbiased way to rate performance. Once measures are agreed upon, teams and individual staff can be freed to manage their own activities to achieve the desired result. This motivates employees, reduces the tendency to micro-manage, and makes everyone more accountable.
TO EVALUATE PROJECTS. Performance measures provide the basis to assess whether a project is working. They provide quantitative, fact-based information for policy and project decision making. Performance measures may be reviewed by the Auditor General when conducting performance audits.
TO AID IN BUDGET DEVELOPMENT AND REVIEW. Performance measures allow for more accurate assessment of the resources needed to support activities. They also help identify what level of product or service will be provided for the amount of funding available.
8. CRITERIA FOR GOOD PERFORMANCE MEASURES:

MEANINGFUL -- significant and directly related to the mission and goal.
RESPONSIBILITY LINKED -- matched to an organizational unit responsible for achieving the measure.
ORGANIZATIONALLY ACCEPTABLE -- valued by those within the organization.
CUSTOMER FOCUSED -- reflect the point of view of the customers and stakeholders.
COMPREHENSIVE -- include all key aspects of the Library's performance.
BALANCED -- include several types of measures, e.g., outcome, efficiency, and quality measures.
9. CRITERIA FOR GOOD PERFORMANCE MEASURES, cont.
TIMELY -- use and report data in a reasonable time-frame.
CREDIBLE -- based on accurate and reliable data.
COST EFFECTIVE -- based upon acceptable data collection and processing costs.
COMPATIBLE -- integrated with existing financial and operational systems.
COMPARABLE -- useful for making comparisons with other data over time.
SIMPLE -- easy to calculate and interpret.
10. Why Evaluation and Assessment at UM Libraries?

Information environments are changing rapidly
Widespread availability of other information sources
Libraries need to adapt to changing user needs
Library funding stagnant or declining
Need to ensure that our programs add value to user work

Technology continues to improve, allowing people to gather information from many sources easily. We need to stay on top of user needs and adapt as much as possible within the limitations of our funding.
11. Why Evaluation and Assessment at UM Libraries?, cont.

Improve organizational processes
Accountability to funding agencies
Institutional or program accreditation
Political benefits of user involvement
Evaluation and assessment focus on user outcomes

We need to continue to review our internal processes to ensure that we can meet the needs of staff and users. Funding agencies want to be assured that we are meeting our goals and objectives. Because evaluation and assessment focus on user outcomes and improving them, we can benefit politically if we involve the user.
12. Examples of What We Want to Know in a User Needs Assessment
Who are our customers (and potential customers)?
What are their teaching, learning and research interests?
What are their needs for library services and resources?
How aware are they of library services and resources?
13. Examples of What We Want to Know in a User Needs Assessment

How do they currently use library/information resources?
How would they prefer to do so?
How does the library add value to their work?
Why do they use other search engines, e.g., Yahoo, Google?
14. Sources of Information for Assessment and Evaluation

Library customers: faculty, students, staff
Automated Library System
Anecdotal data
Literature Searches
Historical data

Surveys, focus groups, complaints, requests for information
Provides data on collection usage and users
Information from staff on encounters with users
Research on what other libraries have assessed
Previous information
15. Collecting Evaluation Data: An Overview of Methods
Quantitative
Qualitative
Mixed Method

Quantitative methods are those which focus on numbers and frequencies rather than on meaning and experience. Quantitative methods (e.g., experiments, questionnaires, and psychometric tests) provide information which is easy to analyze statistically and fairly reliable. Quantitative methods are associated with the scientific and experimental approach and are criticized for not providing an in-depth description.
Qualitative methods are ways of collecting data which are concerned with describing meaning, rather than with drawing statistical inferences. What qualitative methods (e.g., case studies and interviews) lose in reliability they gain in validity: they provide a more in-depth and rich description.
Mixed methods combine qualitative and quantitative approaches, allowing statistically reliable information obtained from numerical measurement to be backed up by, and enriched with, information from the research participants' own explanations.
16. Quantitative Methods of Data Collection
User Surveys
Critical Incident Technique
Tests
Qualitative Methods of Data Collection
Focus Groups
Interviews
Observation
Content Analysis
User surveys are flexible tools for data collection, and they provide quantitative data; LibQUAL+ is one example. (Give examples of surveys.)
Types of surveys include questionnaires, email and internet surveys, and telephone surveys.
Advantages
Can be completed anonymously
Inexpensive to administer
Easy to compare and analyze
Can be administered to many people
Can gather lots of data
Many sample questionnaires already exist
Challenges
Might not get careful feedback
Wording can bias the client's responses
Impersonal
May need a sampling expert
Doesn't get the full story
Critical Incident Technique - UM Libraries have used this method with commuters.
Tests - We have participated in the SAILS project, which is designed to test the information literacy skills of students.
Focus Groups
Advantages
Quickly and reliably get common impressions
Can be an efficient way to get much range and depth of information in a short time
Can convey key information about programs
Disadvantages
Can be hard to analyze responses
Need a good facilitator for safety and closure
Difficult to schedule 6-8 people together
Interviews
Get the full range and depth of information
Develops a relationship with the client
Can take much time
Can be hard to analyze and compare
Can be costly
Interviewer can bias the client's responses
Can be flexible with the client
Observation
Can view operations of a program as they are actually occurring
Can adapt to events as they occur
Can be difficult to interpret observed behaviors
Can be complex to categorize observations
Can influence behaviors of program participants
Can be expensive
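As a concrete illustration of the quantitative end of this spectrum, a survey's Likert-scale responses might be tabulated into simple frequencies and percentages. This is a minimal sketch; the answer categories and responses below are invented, not from an actual library survey.

```python
from collections import Counter

# Illustrative responses to a single survey question,
# e.g. "The library's databases meet my research needs."
responses = ["agree", "strongly agree", "neutral", "agree",
             "disagree", "agree", "strongly agree"]

counts = Counter(responses)
total = len(responses)
scale = ["strongly agree", "agree", "neutral", "disagree", "strongly disagree"]

# Print each scale point with its count and share of all responses.
for answer in scale:
    n = counts.get(answer, 0)
    print(f"{answer:17s} {n:2d}  ({100 * n / total:.0f}%)")
```

A real survey analysis would repeat this per question and per population (undergraduates, graduates, faculty), which is exactly where sampling expertise becomes important.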
17. Planning, Assessment, Outcomes Assessment

Planning should be based on a mission statement and goals, often found in strategic plans.
Assessment should be comprehensive and involve all library users
Outcomes Assessment should address the accountability of the library. ACRL has developed standards for planning, assessment and Outcome Assessments. The bullets paraphrase the first sentence of each standard. I will be addressing what the library has done and can do to meet these standards.
The Libraries have a mission statement and a strategic plan, as well as other planning documents. We have conducted surveys based on goals in the plan, and we have continued to evaluate, update, and refine the plan.
Outcomes assessment can be an active mechanism for improving current library practices. Its focus is on the achievement of the outcomes or indicators the library has identified in its goals and objectives. (Give example here.)
Identifies performance measures that indicate how well the library is doing what it sets out to do.
18. Two Assessment Methods

Balanced Scorecard (BSC)
Outcomes Based Evaluation (OBE)
The Libraries are partnering with CLIS and a team of graduate students from its master's of information management program to develop the Balanced Scorecard for the Libraries, hopefully beginning in the fall. We are currently using OBE to evaluate the Learning Curriculum.
19. Balanced Scorecard
Customer Perspective
Financial Perspective
Learning and Growth Perspective
Internal Perspective

What is the Balanced Scorecard?
Balanced Scorecard (BSC) is a concept that helps organizations translate strategy into action. BSC begins with the company's vision and strategies and defines critical success factors. Measures are constructed that aid target-setting and performance measurement in areas critical to the strategies. The Balanced Scorecard is a performance measurement system, derived from vision and strategy, and reflecting the most important aspects of the business. The BSC concept supports strategic planning and implementation by federating the actions of all parts of an organization around a common understanding of its goals, and by facilitating the assessment and upgrading of strategy.
Traditional performance measurement, focusing on external accounting data, was quickly becoming obsolete, and something more was needed to provide information-age enterprises with efficient planning tools. For this purpose Kaplan & Norton introduced four different perspectives from which a company's activity can be evaluated:
- Financial perspective (how do we look to our shareholders?)
- Customer perspective (how do our customers perceive us?)
- Process perspective (in what processes should we excel to succeed?)
- Learning and innovation perspective (how will we sustain our ability to change and improve?)
Benefits

The benefits of applying the Balanced Scorecard can be summarized as follows:
- Balanced Scorecard helps align key performance measures with strategy at all levels of an organization.
- Balanced Scorecard provides management with a comprehensive picture of business operations.
- The methodology facilitates communication and understanding of business goals and strategies at all levels of an organization.
- The Balanced Scorecard concept provides strategic feedback and learning.
- Balanced Scorecard helps reduce the vast amount of information the company's IT systems process into essentials.
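One way to picture a scorecard's structure is as a small map from each perspective to its measures and targets. This is only a sketch: the measures and target values below are hypothetical examples for a library, not the Libraries' actual scorecard.

```python
# Each perspective maps to a list of (measure, target) pairs.
# All measures and targets are invented for illustration.
scorecard = {
    "Customer": [("LibQUAL+ satisfaction score", 7.5)],
    "Financial": [("cost per circulation ($)", 2.50)],
    "Internal": [("ILL turnaround (days)", 3)],
    "Learning and Growth": [("staff training hours per year", 20)],
}

# A simple report: one line per measure under its perspective.
for perspective, measures in scorecard.items():
    for name, target in measures:
        print(f"{perspective}: {name} (target: {target})")
```

In practice each measure would also carry actual values over time, so the scorecard can show progress against strategy rather than a single snapshot.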
20. Outcome Based Evaluation

Outcome Based Evaluation (OBE) is a systematic way to assess the extent to which a program has met its intended results.
OBE focuses on key questions such as:
How are the lives of the program participants better as a result of my program?
What were the measurable outcomes, results, and performance from the programs? Simply put, did the program work?
How has my program made a difference?
Outcomes evaluation looks at programs as systems that have inputs, activities/processes, outputs, and outcomes -- this systems view is useful in examining any program! IMLS requires outcomes-based evaluation as part of its grant request process, including a Logic Model form that captures these basic elements. I have a copy of one here for you to see.
Planning for outcome-based evaluation follows four basic steps:
1. Choosing Outcomes - what do you expect to happen as a result of the program or service?
Start with short-term outcomes (0-6 months)
Intermediate outcomes (3-9 months)
Long-term outcomes (6-12 months)
2. Selecting Indicators
Identify at least one indicator per outcome (note that sometimes indicators are called performance standards)
When selecting indicators, ask: What would I see, hear, or read about clients that would show progress toward the outcome?
3: Getting Data/Information
Identify Data Sources and Methods to Collect Data
For each indicator, identify what information you will need to collect/measure to assess that indicator.
Consider:
-- Current program records and data collection
-- Is it practical to get that data?
-- What will it cost?
-- Who will do it?
-- How can you make the time?
When to collect data? Depends on the indicator; consider before/after the program, 6 months after, and 12 months after.
Pretest your data collection methods (e.g., have a few staff quickly answer the questionnaires to ensure the questions are understandable).
4. Analyzing/Reporting Your Evaluation Results
Reporting should be based on your audience
Staff should have a chance to review results and discuss them before they are widely distributed.
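The four planning steps above can be sketched as a simple logic-model data structure: each outcome carries a timeframe and one or more indicators, and each indicator names its data source and collection timing. All names and values below are illustrative, not an actual IMLS Logic Model form.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    description: str
    data_source: str   # step 3: where the evidence comes from
    collect_when: str  # step 3: e.g. "before/after program"

@dataclass
class Outcome:
    description: str
    timeframe: str     # step 1: short-term, intermediate, or long-term
    indicators: list = field(default_factory=list)  # step 2: >= 1 per outcome

# A hypothetical workshop outcome with one indicator.
workshop = Outcome(
    description="Participants can evaluate sources for credibility",
    timeframe="short-term (0-6 months)",
)
workshop.indicators.append(Indicator(
    description="Percent scoring 80%+ on a source-evaluation exercise",
    data_source="post-workshop quiz",
    collect_when="before/after program",
))

# Step 4: report -- here, just one summary line per indicator.
for ind in workshop.indicators:
    print(f"{workshop.timeframe}: {ind.description} via {ind.data_source}")
```

Structuring the plan this way makes it easy to check the rule in step 2, that every outcome has at least one indicator, before any data collection begins.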
21. What to Do If You Are Ready for Assessment

The Management Information Systems Office (MIS) and the Library Assessment Review Committee (LARC) are available to assist you.
Irma Dillon, Manager, 405-9113
http://www.lib.umd.edu/STAFF/PAS/MIS/index.html
Contact Us!