2011 SAA Annual Meeting Engaged! Innovative Engagement and Outreach and Its Assessment Genya O’Gara SPECIAL COLLECTIONS RESEARCH CENTER
BACKGROUND SPEC Kit 317: Special Collections Engagement Examines activities meant to engage students, faculty, and other scholars/researchers with special collections. It investigates who coordinates these activities, where they are held, how they are promoted, and how they are evaluated.
BACKGROUND Special Collections in action
BACKGROUND The team
RESULTS …overall picture • Everyone (over 95%) is participating! • Survey themes include: • Subject knowledge, not job title, determines responsibility • Lack of formal plans or documentation • Unstructured assessment • Integration with broader community • Embracing new technologies
RESULTS Exhibits • 99% of respondents are creating exhibits • Only 19% have a dedicated person/position • EVERYONE (who responded) is doing both physical and virtual exhibits • 50% attempt to evaluate success – primarily through counts
RESULTS Events • 96% of respondents are organizing events (lectures, symposia, open houses, etc.) • 56% note that primary responsibility for an event varies, depending on subject expertise • The majority measure success through attendance, followed closely by anecdotal feedback • ~33% have no evaluation at all • The most commonly reported promotion method for events is one-on-one contact (92%), but only 24% consider it the most effective...
RESULTS Curricular Engagement • Everybody does it! • What is "it"? • work with faculty to develop courses • individual consultation • in-person class instruction • group consultation • course-related web pages • instructional videos • Rarely is there designated staff responsible for curricular engagement • Evaluation generally consists of use counts of materials
RESULTS Curricular Engagement themes • A desire for methods, other than counts, to assess outcomes more effectively • Demand for instruction is growing as staff is shrinking • Lack of a designated coordinator
RESULTS Most common barriers • 67% of respondents report barriers to providing effective engagement • insufficient staffing • lack of dedicated staff • Funding, limited hours, and space were grouped together as barriers • These are beyond our immediate control
RESULTS Survey conclusions • Everybody is actively participating in instructional outreach, exhibits, and public programming • Responsibilities for outreach are often distributed among many, making it difficult to approach outreach in a cohesive manner • Respondents desire to move beyond patron use counts and anecdotal feedback in evaluation methods • Despite roadblocks, there is widespread enthusiasm for, and understanding of the importance of, engagement activities within archives at every level
All is not lost (the survey also showed…) • multiple levels of collaboration • outside-the-box engagement • desire for better evaluation • enhancement of traditional outreach through technology
Engagement ideas • Collaboration and early involvement seem to be key. Examples include: • Early contact with graduate student instructors • Working with subject and instruction librarians and incorporating special collections into their classes • Awarding undergraduate research projects that make extensive use of collections • Creating virtual and physical exhibits with students to highlight materials • Involving students, through jobs, internships, and practicums, in all aspects of department work • Cross-departmental collaboration
Assessment Examples • Reviewing citations in student papers to evaluate the effectiveness of instruction • Using student focus groups to evaluate video tutorials • Monitoring books and articles published, performances given, and theses written • Tracking the number and value of grants received • Examining web server statistics • Collecting feedback forms and surveys • Monitoring the number of graduate and practicum students using the collections • Soliciting and compiling one-on-one feedback from professors and students
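One way to operationalize the "web server statistics" item above is a simple count-based tally of online exhibit page views. The sketch below is illustrative only, not part of the SPEC Kit survey: it assumes a combined-format access log at a hypothetical path (access.log) and a hypothetical /exhibits/ URL prefix for virtual exhibit pages.

```python
# A minimal sketch, assuming an Apache/Nginx combined-format access log and an
# /exhibits/ URL prefix for virtual exhibits (both hypothetical). It produces
# the kind of count-based measure most survey respondents describe.

import re
from collections import Counter

LOG_PATH = "access.log"          # hypothetical log file path
EXHIBIT_PREFIX = "/exhibits/"    # hypothetical prefix for exhibit pages

# Matches the timestamp and request path of a combined-format log line, e.g.:
# 127.0.0.1 - - [12/Aug/2011:10:15:32 -0400] "GET /exhibits/wolfpack HTTP/1.1" 200 ...
LINE_RE = re.compile(r'\[(\d{2})/(\w{3})/(\d{4}):[^\]]+\]\s+"(?:GET|HEAD)\s+(\S+)')

views_by_month = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        _day, month, year, path = match.groups()
        if path.startswith(EXHIBIT_PREFIX):
            views_by_month[f"{year}-{month}"] += 1

# Simple monthly report of exhibit page views.
for month, count in sorted(views_by_month.items()):
    print(f"{month}: {count} exhibit page views")
```

Counts like these are easy to automate, but, as the survey respondents note, they are most meaningful when paired with stated engagement goals and qualitative feedback.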
Assessing our efforts How are evaluation and assessment being articulated and understood in the context of special collections? Special collections must clearly state engagement goals in order for any type of evaluation to be meaningful. We don't have to reinvent the wheel – there are tools and examples out there: • Archival Metrics • China Missions Project • Talk to library instruction colleagues and faculty
There is no one-size-fits-all solution... Targeted assessment is as important as targeted engagement
The End Questions or Feedback? Genya O’Gara Special Collections Research Center (919) 513-2605 genya_ogara@ncsu.edu