Project SAILS: Facing the Challenges of Information Literacy Assessment
Julie Gedeon, Carolyn Radcliff, Rick Wiggins
Kent State University
EDUCAUSE 2004 Conference, Denver, Colorado
What is information literacy? • The ability to locate, access, use, and evaluate information efficiently and effectively. • Guiding document: “Information Literacy Competency Standards for Higher Education” – Association of College & Research Libraries (http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm)
Our questions • Does information literacy make a difference to student success? • Does the library contribute to information literacy? • How do we know if a student is information literate?
The Idea of SAILS • Perceived need – no suitable tool available • Project goal – build a tool that: • Supports program evaluation • Is valid • Is reliable • Enables cross-institutional comparison • Is easy to administer for wide delivery • Is acceptable to university administrators
Project parameters • A test • Systems design approach • Measurement model – Item Response Theory • Tests cohorts of students (not individuals) • A name – Standardized Assessment of Information Literacy Skills
The project structure • Kent State team • Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS) • IMLS National Leadership Grant • Association of Research Libraries partnership • www.projectsails.org
Technical components • Environment • Item builder • Survey builder • Survey generator • Report generation • Challenges
Environment • Linux (Red Hat) • Apache • MySQL • PHP
Survey process • Create survey questions (items) • Create survey for this phase • Add schools for this phase • Schools create web front-end • Collect data
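The slides do not show the underlying data model, but the survey process above maps naturally onto a few relational tables in the MySQL/PHP environment listed earlier. The sketch below is a minimal, assumed schema – table and column names are hypothetical, not the actual SAILS design – for items, the per-phase survey, participating schools, and student responses.

```php
<?php
// Minimal, assumed schema sketch for the survey process above.
// Table and column names are hypothetical, not the actual SAILS design;
// PDO is used for brevity on the PHP/MySQL stack described earlier.

$db = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec("CREATE TABLE IF NOT EXISTS item (
    item_id     INT AUTO_INCREMENT PRIMARY KEY,
    stem        TEXT NOT NULL,            -- question text
    skill_set   VARCHAR(100) NOT NULL,    -- ACRL standard / skill set it measures
    choices     TEXT NOT NULL,            -- answer options
    correct_key VARCHAR(10) NOT NULL      -- keyed correct answer
)");

$db->exec("CREATE TABLE IF NOT EXISTS survey (
    survey_id INT AUTO_INCREMENT PRIMARY KEY,
    phase     VARCHAR(20) NOT NULL        -- e.g. 'Phase III'
)");

$db->exec("CREATE TABLE IF NOT EXISTS survey_item (
    survey_id INT NOT NULL,
    item_id   INT NOT NULL,
    PRIMARY KEY (survey_id, item_id)      -- items selected for this phase's survey
)");

$db->exec("CREATE TABLE IF NOT EXISTS school (
    school_code VARCHAR(20) PRIMARY KEY,
    auth_code   VARCHAR(40) NOT NULL      -- shared secret used in the redirect
)");

$db->exec("CREATE TABLE IF NOT EXISTS response (
    student_id  VARCHAR(64) NOT NULL,     -- unique identifier passed by the school
    school_code VARCHAR(20) NOT NULL,
    item_id     INT NOT NULL,
    answer      VARCHAR(10) NOT NULL,
    PRIMARY KEY (student_id, item_id)
)");
```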
Redirection to SAILS web site • Parameters passed: • Unique student identifier • School code • Authorization code
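A minimal sketch of the hand-off, assuming a hypothetical endpoint URL, parameter names, and hashing step; only the three values themselves (student identifier, school code, authorization code) come from the slide. The school's locally built front end authenticates the student, then redirects the browser to the SAILS site:

```php
<?php
// Illustrative sketch only: a school's web front end handing a student
// off to the SAILS site. Endpoint URL, parameter names, and the hashing
// step are assumptions; the three passed values come from the slide.

$sailsUrl   = 'https://www.projectsails.org/survey';   // hypothetical endpoint
$studentId  = hash('sha256', 'campus-username-12345'); // anonymized unique student identifier
$schoolCode = 'KENTSTATE';                              // school code assigned for this phase
$authCode   = 'shared-secret-issued-by-sails';          // authorization code

header('Location: ' . $sailsUrl . '?' . http_build_query([
    'student_id' => $studentId,
    'school'     => $schoolCode,
    'auth'       => $authCode,
]));
exit;
```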
Report process • Send schools unique identifiers • Upload demographics • Scan & upload paper surveys • Generate entire dataset file • Offline IRT analysis • Upload IRT results • Generate reports
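As a rough illustration of the "generate entire dataset file" step, the sketch below reuses the assumed tables from the earlier schema sketch, plus an assumed demographic table populated by the school uploads, and flattens scored responses into a CSV file that the offline IRT analysis can consume. None of the names here are the actual SAILS implementation.

```php
<?php
// Illustrative sketch of building the flat dataset file for offline IRT
// analysis. Table, column, and file names are assumptions carried over
// from the schema sketch above, not the actual SAILS implementation.

$db = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$out = fopen('sails_dataset.csv', 'w');
fputcsv($out, ['student_id', 'school_code', 'class_standing', 'item_id', 'correct']);

// Score each response (1 = matches the keyed answer) and attach the
// self-reported demographics uploaded by the school.
$rows = $db->query("
    SELECT r.student_id,
           r.school_code,
           d.class_standing,
           r.item_id,
           (r.answer = i.correct_key) AS correct
    FROM response r
    JOIN item i ON i.item_id = r.item_id
    LEFT JOIN demographic d ON d.student_id = r.student_id
");

foreach ($rows as $row) {
    fputcsv($out, [
        $row['student_id'],
        $row['school_code'],
        $row['class_standing'],
        $row['item_id'],
        (int) $row['correct'],
    ]);
}
fclose($out);
```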
Technical challenges • Creation of the front-end • Customizations for schools • Automating the data analysis • Supporting different languages
Data analysis • Item Response Theory • Measures ability levels • Looks at patterns of responses • For test-takers • For items (questions) • Based on standards and skill sets • Shows areas of strength and areas of weakness
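The slides do not name a specific IRT model, but the simplest member of the family, the one-parameter logistic (Rasch) model, shows how IRT relates a response pattern to both test-taker ability and item difficulty: the probability that a test-taker answers an item correctly depends only on the gap between the test-taker's ability (θ) and the item's difficulty (b).

```latex
% One-parameter logistic (Rasch) model, shown only as an example of the
% IRT family; the slides do not specify which model SAILS uses.
% \theta_j = ability of test-taker j, b_i = difficulty of item i.
P(X_{ij} = 1 \mid \theta_j, b_i) = \frac{e^{\theta_j - b_i}}{1 + e^{\theta_j - b_i}}
```

Estimating these parameters from the full response matrix is the offline analysis step in the report process above; reporting results by standard and skill set is what lets cohorts see areas of strength and weakness.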
Status • Instrument • 126 items developed, tested, and in use • Web-based and paper-based administration • Grant Project - IMLS • Phase I complete - 6 institutions • Phase II complete - 34 institutions • Phase III began June 2004 - 77 institutions
Next steps for SAILS • Analyze data and other input • Administrative challenges • Self-reported demographic data • Testing environment • Report generation • Does the instrument measure what we want it to? • Are institutions getting what they need?
Summary • Vision: • Standardized, cross-institutional instrument that measures what we think it does • To answer the questions: • Do students gain information literacy skills? • Does information literacy make a difference to student success?
For more information • www.projectsails.org • sails@kent.edu • Julie Gedeon, jgedeon@kent.edu • Carolyn Radcliff, radcliff@kent.edu • Rick Wiggins, rwiggins@lms.kent.edu • Mary Thompson, project coordinator, mthomps1@kent.edu; 330-672-1658