Using a Real World Scenario to Assess Information Competency Learning Outcomes
Lisa Kammerlocher
Arizona State University’s West campus
2002-03: Opportunity Knocks
• A one-hour non-credit course targeted to the assessment of graduating seniors is approved for the Criminal Justice and Criminology Program
• The library liaison is given permission to develop a learning outcomes assessment for information competency to administer during the course
Project Goals
• Identify and assess learning outcomes of importance to faculty and the library
• Design a performance-based measure that requires students to demonstrate both lower- and higher-order skills
• To the degree possible, infuse the principles of assessing learning “in situ” by using a real-world scenario
• See Jean Lave, Cognition in Practice: Mind, Mathematics and Culture in Everyday Life
What to Measure? Considerations
• Information skills and knowledge students are exposed to in the CJC program
• Skills and knowledge students need to use in their professional workplace
• Faculty input: students should be able to access and use empirical research articles, overviews written for a scholarly audience, laws, and statistics
• Standards of information competency that exist for ASU West and the profession
• Testing environment (time available)
The Assessment – Spring 2004
• Scenario-based: students are gathering information to write a grant to develop a DUI prevention program
• Locating and using statistics
• Locating and using statutes
• Identifying and locating 2 current overviews
• Using a citation style
• Identifying and locating 2 empirical research articles
• Managing database records electronically
• All answers available through the Library web page, and some via the Web
Spring 2004 Methods and Scoring
• Assessment and scoring rubric developed in consultation with an Ed Psych faculty member at ASU’s Tempe campus
• 2 sessions in an electronic classroom w/ 2 monitors
• 28 students
• 4 versions of the assessment; each student answered 2 questions (14 respondents per question)
• One librarian assisted in testing and refining the rubric
• 2 librarians scored the results (a sketch of how scorer agreement could be quantified follows this list)
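An aside on the scoring setup: when two librarians score the same open-ended responses against a rubric, their agreement can be quantified with a chance-corrected statistic such as Cohen’s kappa. The sketch below is purely illustrative and not part of the original project; the 0–2 rubric scale, the scores, and the variable names are all hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items the raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal score distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric scores (0 = missed, 1 = partial, 2 = full credit)
# from two librarians scoring the same 14 responses.
librarian_1 = [2, 1, 2, 0, 1, 2, 2, 1, 0, 2, 1, 1, 2, 0]
librarian_2 = [2, 1, 1, 0, 1, 2, 2, 0, 0, 2, 1, 2, 2, 0]
print(f"kappa = {cohens_kappa(librarian_1, librarian_2):.2f}")  # ~0.67 here
```

Kappa near 1 means the rubric is being applied consistently; values well below that signal the kind of inter-rater reliability trouble noted later in the Spring 2004 issues slide.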
Let’s Review the Assessment
• See Pink Sheets
RESULTS – SPRING 2004: Locating Two Quality Overviews on DUI, n=14
• 93% located one quality overview on topic
• 57% located two quality overviews on topic
• 100% located one source published 2000 or later*
• 71% located two sources published 2000 or later*
*not all of these sources were evaluated as being “quality”
RESULTS – SPRING 2004: Citing Materials Using APA Style, n=14
• 84% cited at least one item w/ 2 or fewer mistakes in APA style
• 43% cited two items w/ 2 or fewer mistakes
RESULTS – SPRING 2004: Locating General, Classification, and Sentencing Statutes for DUI in AZ, n=14
• 36% located classification and sentencing statutes using a primary source
• 14% located the general DUI statute using a primary source
• 21% located information about the general statute using a secondary source
RESULTS – SPRING 2004: Empirical Articles and Information Management via E-mail, n=14
• 85% identified at least one empirical article on topic
• 64% identified two
• 79% could export their records via e-mail
Outcomes
• Graduating students demonstrated some success in identifying quality overviews, identifying empirical articles, and managing database records via e-mail
• Results imply that students:
  • Employed effective search strategies
  • Chose an appropriate database or search engine to find the information needed
  • Evaluated search results to select items relevant to a topic
  • Applied criteria for identifying quality sources
  • Recognized clues from a database record to determine if an article is empirical
Issues Emerging from Spring 2004 Assessment
• Questions about students’ search processes largely went unanswered
• Was the poor performance on statistics and law due to student deficiencies, or to the construction or difficulty of the questions?
• Uncertain why students could find one overview or empirical article but not two
• Time-consuming to score and difficult to achieve inter-rater reliability
• The “double whammy”: needed to minimize instances in which one question was dependent on another
• Was a sample of 14 too small? (see the margin-of-error sketch after this list)
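On the sample-size question just raised: a quick margin-of-error check shows how imprecise a percentage based on 14 students is. A minimal sketch, assuming a standard 95% Wilson score interval (my choice of method, not the project’s), applied to the 57% (8 of 14) who located two quality overviews:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

low, high = wilson_ci(8, 14)  # 8 of 14 students located two quality overviews
print(f"8/14 = 57%, 95% CI roughly {low:.0%} to {high:.0%}")  # ~33% to ~79%
```

An interval nearly 46 points wide suggests that, yes, percentages from n=14 should be read as rough indicators rather than precise estimates.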
Fall 2004: Changes to Assessment
• To increase the reliability of results, the assessment was modified so that it could be completed by all students
  • Reduced the number of items students had to find
  • Took out the statutes question because faculty indicated that finding statutes was not relevant to their program
• To increase inter-rater reliability, the questions were changed to elicit more finite responses that were easier to score
  • Ex.: defined what we meant by “quality overview”
Fall 2004 Methods and Scoring
• Assessment and scoring rubric developed in consultation with an Ed Psych faculty member at ASU’s Tempe campus
• 1 session in an electronic classroom w/ 2 monitors
• 26 students
• 4 versions of the assessment; each student answered all questions
• 3 librarians scored the results
Let’s Review the Assessment
• See Blue Sheets
RESULTS – FALL 2004: Generating Search Terms; Locating a Quality 2000+ Overview, n=26
• 93% were able to identify alternative search terms for DUI
• 77% were able to identify alternative search terms for recidivism
• 20% were able to locate a quality overview published 2000 or later
RESULTS – FALL 2004: Citation in APA; Empirical Articles, n=26
• 65% identified the parts of a database record needed for an APA citation
• 54% generated an APA citation from the database record with 1 or fewer mistakes
• 62% located a record for an empirical article on topic
• 65% managed the record via e-mail
RESULTS – FALL 2004: Statistics, n=26
• 42% located current, relevant data
• 12% located pre-2000 but relevant data
• 19% used a primary source of data
• 35% used a secondary source
RESULTS – FALL 2004: Locating Resources, n=26
• 89% could ID the locations of a book in the ASU System and provide the call number
• 50% could specify that an online article was available at ASU
• 46% were able to print the first page of the article
• 27% could accurately ID the locations of a print journal article in the ASU System and provide the call number
Spring 2005 Modifications
• The simplified Fall 2004 design made the assessment too quiz-like and less active for students
• Desire to have the assessment reflect a more realistic situation that would occur in the workplace
Spring 2005 Methods and Scoring
• 1 session in an electronic classroom w/ 3 monitors
• Eliminated the “double whammy” problem by telling students to ask for a copy of the article or study if needed
• 55 students
• 1 version of the assessment; each student answered all questions
• 3 librarians scored the assessment w/ a rubric
Let’s Review the Assessment
• See Yellow Sheets
RESULTS – SPRING 2005, n=55
• 93% found the online copy of the article described in the scenario
• 98% identified the government agency that authored a study on hard-core drinking drivers (HCDD); a copy of the report was provided if they could not find it
• 13% located the recent study referred to in the trade article
• 27% extracted a correct definition of HCDD from the article or study
• 80% extracted the correct percentage of fatal crashes from DUI involving HCDD from the article or study
RESULTS – SPRING 2005 (continued), n=55
• 42% found an empirical article on a DUI prevention program
• 83% of those who found an empirical article could identify the prevention program being described in the record or article (n=23; the arithmetic is unpacked below)
• 49% correctly identified (1 or fewer mistakes) the parts of a record for a book that would be used in an APA citation
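A note on reading the 83% figure above: it is conditional on the 42% subgroup, so the absolute numbers are easy to misread. Assuming straightforward rounding, the arithmetic (percentages taken from the slide; the snippet itself is only illustrative) works out as follows:

```python
n = 55
found_article = round(0.42 * n)                   # 42% of 55 -> the 23 students above
identified_program = round(0.83 * found_article)  # 83% of those 23 -> about 19 students
print(found_article, identified_program, f"{identified_program / n:.0%}")  # 23 19 35%
```

So roughly 35% of all 55 students both found an empirical article and identified the prevention program it described.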
Future Issues
• Reintroduce multiple versions of the assessment to prevent cheating
• Assess the “research needs analysis” component of information competency
• Engage distracted faculty in the results and better integrate outcomes into the curriculum
• Increase student accountability for their performance
• Change the environment of the assessment, perhaps to a capstone course
• Scalability
• Try another program: the Ethnicity, Race and First Nations Studies degree program has potential opportunities
My Dream
• True “in situ” assessment environment
  • Internship
  • Workplace
• Problem-based assessment
• Individualized
• Programmatic IC integration
  • Consistent cooperation with faculty and department chairs
  • Results influence decision-making about teaching and learning
• Empirically-based measurement of information competency
What’s Your Dream?
• Use the PURPLE paper at the back of this packet to share your ideas!
Acknowledgements
The following colleagues have kindly contributed their time and expertise to this project:
• Dr. Sarah Brem, ASU College of Education
• Mr. Dennis Isbell, ASUWc Fletcher Library
• Ms. Leslee Shell, ASUWc Fletcher Library
• Ms. Marisa Duarte, ASUWc Fletcher Library
• Ms. Kathrine Henderson, ASUWc Fletcher Library
• Mr. Joseph Buenker, ASUWc Fletcher Library
• Mr. Richard Rivera, ASUWc Research Consulting Center
Contact Information
Lisa Kammerlocher
Fletcher Library, MC: 0152
Arizona State University’s West campus
P.O. Box 37100
Phoenix, AZ 85069-7100
602.543.8510
lk@asu.edu
http://westcgi/libcontrib/loex05/assessment/lkloex05.ppt