Evaluating and Assessing Library Instruction Examples & Best Practices
Presenters • Martin Garnar – Regis University • Rhonda Gonzales & Eleni Adrian – CSU-Pueblo • Lorrie Evans – Auraria Library • Patrick McCarthy – CSU
Evaluating and Assessing Library Instruction Regis University Martin Garnar
ACRL Guidelines for Evaluation and Assessment • program evaluation plan • criteria for evaluation • learning outcomes and assessment methods • coordination of assessment with teaching faculty • data gathering and analysis
Evaluating and Assessing Library Instruction CSU-Pueblo Eleni Adrian & Rhonda Gonzales
CSU-Pueblo Library Instruction Evaluation Overview • Librarians at CSU-Pueblo administered a brief evaluation form after each instruction session. • Collected these evaluations from 1999 to 2004. • Entered results into a Microsoft Access database. • Data Clean-up • Statistical Analysis • Conclusions
CSU-Pueblo Library Instruction Evaluation Evaluation Form • Background questions • Satisfaction questions • Ranking question
CSU-Pueblo Library Instruction Evaluation Evaluation Forms • Collected Fall 1999 through Summer 2004 • 181 classes • 2,441 evaluation forms
CSU-Pueblo Library Instruction Evaluation Access Database • Entered several semesters of data • Revealed errors in our database design • New database structure • New input forms
CSU-Pueblo Library Instruction Evaluation Access Database (screenshots of the database structure and input forms)
CSU-Pueblo Library Instruction Evaluation Data Fixes: New Database Design • Defined the relationship between the classes and the evaluations tables • Accounted for unanswered background questions
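The one-to-many relationship between the classes and evaluations tables can be sketched as follows. This is an illustrative schema only, using SQLite in place of the Microsoft Access database the library actually used; the table and column names here are assumptions, not the real design:

```python
import sqlite3

# Illustrative schema only -- the original used Microsoft Access,
# and these table/column names are assumptions, not the actual design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE classes (
    class_id     INTEGER PRIMARY KEY,
    course       TEXT,
    semester     TEXT,
    session_date TEXT
);
CREATE TABLE evaluations (
    eval_id                INTEGER PRIMARY KEY,
    class_id               INTEGER NOT NULL REFERENCES classes(class_id),
    status                 TEXT,     -- year in school (background question)
    attended_previous      INTEGER,  -- remaining background questions
    specific_assignment    INTEGER,
    professor_participated INTEGER
);
""")

# Each class session owns many evaluation forms (one per student).
conn.execute(
    "INSERT INTO classes VALUES (1, 'ENG 101', 'Fall 1999', '1999-09-15')"
)
conn.executemany(
    "INSERT INTO evaluations (class_id, status) VALUES (?, ?)",
    [(1, "Freshman"), (1, "Freshman"), (1, "Sophomore")],
)
count = conn.execute(
    "SELECT COUNT(*) FROM evaluations WHERE class_id = 1"
).fetchone()[0]
print(count)  # 3
```

Keying every evaluation to its class record is what makes per-class and per-semester comparisons possible later.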
CSU-Pueblo Library Instruction Evaluation Data Fixes: Form Changes • Two questions added, two questions removed • Re-entered all old evaluation form data to match the final version of the form • Double-checked that the re-entered data accurately reflected the form changes
CSU-Pueblo Library Instruction Evaluation Data Fixes: Data Integrity • Duplicate Class records • Duplicate and blank Evaluations records • Typos in Status (year in school) • General data entry errors
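A cleanup pass of this kind might look like the sketch below, assuming hypothetical field names and an invented typo list (the original data lived in Access, not Python):

```python
# Hypothetical cleanup pass over exported evaluation records; the field
# names and the typo list are assumptions, not CSU-Pueblo's actual data.
raw = [
    {"eval_id": 1, "class_id": 1, "status": "Freshman"},
    {"eval_id": 1, "class_id": 1, "status": "Freshman"},  # duplicate record
    {"eval_id": 2, "class_id": 1, "status": "Sophmore"},  # typo in Status
    {"eval_id": 3, "class_id": 1, "status": ""},          # blank record
]

# Map known Status typos to canonical values.
STATUS_FIXES = {"Sophmore": "Sophomore", "Freshmen": "Freshman"}

seen = set()
clean = []
for rec in raw:
    if rec["eval_id"] in seen:  # drop duplicate Evaluations records
        continue
    if not rec["status"]:       # drop blank records
        continue
    seen.add(rec["eval_id"])
    rec["status"] = STATUS_FIXES.get(rec["status"], rec["status"])
    clean.append(rec)

print([r["status"] for r in clean])  # ['Freshman', 'Sophomore']
```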
CSU-Pueblo Library Instruction Evaluation Results – Background Questions Year in School
CSU-Pueblo Library Instruction Evaluation Results – Background Questions Attended Previous Session
CSU-Pueblo Library Instruction Evaluation Results – Background Questions Specific Assignment
CSU-Pueblo Library Instruction Evaluation Results – Background Questions Professor Participated
CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Start Research
CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Clear Presentation
CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Hands-on Experience
CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Respect
CSU-Pueblo Library Instruction Evaluation Results – Ranking Question Most Important Thing Learned
CSU-Pueblo Library Instruction Evaluation Comparisons (comparison charts)
CSU-Pueblo Library Instruction Evaluation Conclusions • Distribution of responses skewed towards the high end. • Student evaluations of faculty teaching across campus are not normally skewed this far towards the positive. • These results show that a large majority of students felt that our instruction sessions were helpful!
CSU-Pueblo Library Instruction Evaluation Conclusions - Meeting Our Goals? Since most of our sessions are introductory in nature, one of our primary goals is to show students how to start their research using our library. • 88.17% of students who took the survey indicated that they moderately or strongly agreed that the sessions helped them learn how to start their research. • 36.12% of students who took the survey responded that learning how to locate a journal article or book was the most important thing they learned.
CSU-Pueblo Library Instruction Evaluation Conclusions - Factors Affecting Results • What factors may have affected student responses? • Chi-square tests indicate that the following factors were statistically significant: • Having a specific assignment in conjunction with the presentation was a significant factor in the number of students who strongly agreed with A & C • Attending a prior session was a slightly significant factor in the number of students who strongly disagreed with D • Survey Design: • An even number of response choices forced students to choose positive or negative • Unclear wording – "My Professor participated in this session." • The lack of an N/A option forced students to choose a response even if it did not apply • Survey Administration: • Administered at the end of each session, when students were in a hurry to leave • Librarians may tend to skip the survey if a session hasn't gone well
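For context, a chi-square test of independence like those mentioned above compares observed counts against the counts expected if the two factors were unrelated. A minimal pure-Python sketch on a 2×2 table; the counts below are invented for illustration and are not the actual survey data:

```python
# Generic chi-square test of independence on a 2x2 contingency table.
# The counts are invented for illustration; they are not CSU-Pueblo data.
observed = [
    [90, 10],  # e.g. had a specific assignment: strongly agreed / did not
    [70, 30],  # e.g. no specific assignment:    strongly agreed / did not
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        # Expected count under independence of the two factors.
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

# Critical value for df = (2-1)*(2-1) = 1 at alpha = 0.05.
CRITICAL_05 = 3.841
print(round(chi2, 2), chi2 > CRITICAL_05)  # 12.5 True
```

In practice a library such as scipy (`scipy.stats.chi2_contingency`) would report an exact p-value rather than a critical-value comparison.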
CSU-Pueblo Library Instruction Evaluation Indications for Best Practices Role of Evaluation: combine learning assessment with program evaluation Survey Design: concise, clearly worded, avoid mild statements, allow neutral and N/A responses, define hypotheses prior to creating the form
Evaluating and Assessing Library Instruction Auraria Library Lorrie Evans
Evaluation of instruction at Auraria Library • Survey student learning in single-session library classes (random selection). • Survey students in full-semester, credit-bearing classes (assignments, final project). • Survey the faculty toward the end of the semester, after papers and assignments are turned in.
Challenges in the single-session library class • Do the students understand the most important concepts? • No other chance for a follow-up or review. • Once students leave the room, we will not see them again.
Why? • Look in the mirror: are we teaching what we think we are teaching? • Collaborate with campus faculty proactively. • Make library instruction a more integrated part of the academic culture. • Accreditation: assessment mandates driven externally.
The tool – short online survey In class – three to five questions mapped to learning goals. Customized for each class Broadcast results, review answers & main concepts. Students share concerns. • Students see their answer along with classmate responses. • Librarian can respond to student input. • Review main points.
Sample results • English 1020 • Student comments view • Psychology freshman seminar data • Graduate Political Science – single-session post-test • Graduate English Literature – semester course
Comments
Psych Freshman Seminar sample items
Political Science post-test
Faculty survey -- end of the semester • Program effectiveness as reflected in quality of student projects and assignments. • Satisfaction with content, scheduling and teaching.
Faculty Evaluation survey results (Zoomerang)
Feedback loop • In-class survey: instant feedback for both students and librarian. The librarian can go over confusing concepts. • Results discussed at departmental meetings and used as a source of information to make adjustments in teaching (more depth with less quantity). • Results made available to instructors and students. • End-of-year report submitted to the campus Outcomes Assessment Advisory Committee. The committee reviews the report and recommends improvements.
Evaluating and Assessing Library Instruction CSU – Fort Collins Patrick McCarthy
Evaluating Learning Outcomes at the Reference Desk - CSU • Why? • Determine if learning takes place as a result of interactions at the reference desk • Identify those skills most suitable to reference desk instruction • Establish reasonable reference desk learning outcome benchmarks • Recommend behaviors and methods that lead to student learning
Categories of Reference Assessment • Response accuracy rate • User satisfaction • Course/instructor opinion survey • Learning outcomes
Methodology? • Quantitative approach • Powerful, not practical • Qualitative approach • Powerful, versatile
Procedures • Patron completes a preliminary survey at the completion of the reference interaction • Researcher analyzes the preliminary surveys and selects relevant patrons for follow-up assessment • Selected patrons are interviewed on the specifics of the reference transaction • Interview recordings are transcribed and coded for assessment
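The final coding step lends itself to a simple frequency tally. A minimal sketch, assuming a flat list of code labels per transcribed segment; the label vocabulary here is invented for illustration and is not CSU's actual coding scheme:

```python
from collections import Counter

# Hypothetical codes assigned to transcribed interview segments;
# the labels are invented for illustration, not CSU's coding scheme.
coded_segments = [
    "recalled_search_strategy",
    "applied_skill_independently",
    "recalled_search_strategy",
    "no_learning_reported",
    "applied_skill_independently",
    "recalled_search_strategy",
]

# Tally how often each learning-outcome code appears across interviews.
tally = Counter(coded_segments)
for code, n in tally.most_common():
    print(f"{code}: {n}")
```

Even a coarse tally like this makes it easy to see which outcomes surface most often across interviews before any deeper qualitative analysis.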