Discover the process of evaluating cross-database search tools at UC and the interesting vendor and customer responses. Learn about the final product and the lessons learned throughout the evaluation process.
Evaluating Cross-database Search Tools
Catherine Soehner, Head
Christy Hightower, Engineering Librarian
Science & Engineering Library, University of California, Santa Cruz
October 4, 2003 • LITA National Forum
University of California
• 10 campuses
• Hundreds of libraries
• California Digital Library
Our Users Want
• Easy, intuitive, fast
• All in one place
• Instant full text
Cross-database Searching at UC
• UC San Diego
  • Database Advisor
• California Digital Library (CDL)
What We Wanted
• Customized for each campus
• Customized for each library
• Better functionality (sketched below)
  • Duplicate removal
  • Sorting
  • Integration with full text and ILL
• Panic Button
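Below is a minimal Python sketch of the "better functionality" we were asking for: merging result sets from several databases, removing duplicates, and sorting the combined list. The record fields and the matching rule (normalized title plus year) are illustrative assumptions, not any vendor's actual method.

# Merge results from several databases, drop duplicates, sort newest first.
# The matching key (normalized title + year) is a hypothetical example.

def normalize(title: str) -> str:
    """Normalize a title so near-identical records match as duplicates."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def merge_results(result_sets):
    """Combine lists of records, remove duplicates, sort by year (newest first)."""
    seen = set()
    merged = []
    for records in result_sets:              # one list per database searched
        for rec in records:
            key = (normalize(rec["title"]), rec.get("year"))
            if key in seen:                  # duplicate of a record already kept
                continue
            seen.add(key)
            merged.append(rec)
    return sorted(merged, key=lambda r: r.get("year", 0), reverse=True)

if __name__ == "__main__":
    db_a = [{"title": "Cross-Database Searching", "year": 2002}]
    db_b = [{"title": "Cross-database searching!", "year": 2002},
            {"title": "Federated Search Tools", "year": 2003}]
    for rec in merge_results([db_a, db_b]):
        print(rec["title"], rec["year"])

Matching on a normalized title rather than an exact string is what lets the same item, described slightly differently by two databases, be recognized as a duplicate.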
The Panic Button Idea
• Designed to search (see the sketch below):
  • Full text
  • All subjects
  • Small number of databases
  • Undergraduate level
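As a way to picture the Panic Button, here is a small Python sketch of a preset search profile: one click runs a broad, full-text, undergraduate-level search against a small fixed list of databases. The database names and field names are invented for illustration and are not the actual UC configuration.

# Hypothetical "panic button" preset: broad scope, full text, few databases.

PANIC_BUTTON_PROFILE = {
    "full_text_only": True,        # only databases that deliver full text
    "subjects": "all",             # no subject limit
    "level": "undergraduate",      # general-interest sources
    "databases": [                 # deliberately short list, for speed
        "Academic Search",
        "General Reference Center",
        "Newspaper Index",
    ],
}

def panic_search(query: str, profile=PANIC_BUTTON_PROFILE):
    """Return the per-database requests a one-click 'panic' search would issue."""
    return [
        {"database": db, "query": query, "full_text": profile["full_text_only"]}
        for db in profile["databases"]
    ]

if __name__ == "__main__":
    for request in panic_search("global warming"):
        print(request)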
SearchLight Reloaded
A project to evaluate commercial software to replace SearchLight.
Process: Vendors
• Reviewed Web documentation
• Contacted each vendor
• Set up demos
• CDL Checklist
• Questionnaire
Interesting Vendor Responses
• Features R&D
  • Personalization, alert services, vocabulary assistance, deduping, merging
• Technical infrastructure & programming R&D
Process: Customers
• Contacted customers
• Questionnaire
Interesting Customer Responses
• Wish List
• Interface Control
• Vendor Responsiveness
Final Product
• 30-page report evaluating 5 products
• Summary chart with the top 18 features
• Final decision
• Prototypes from the top 3 vendors
7 Interesting Features
• IP address authentication
• Number of databases searched
• Number of database categories allowed
• Merging and de-duping
• Sorting
• Auto-check for non-functioning databases (sketched below)
• Communication
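The auto-check feature is the one most easily pictured in code. Below is a minimal Python sketch, assuming each database exposes a URL that can be probed: each source gets a trivial request on a schedule, and anything that fails or times out is flagged so it can be dropped from the search list. The URLs are placeholders; a real product would use each database's own search protocol.

# Hypothetical auto-check: probe each database and report which are down.

import urllib.error
import urllib.request

DATABASES = {
    "Database A": "https://example.org/db-a/ping",   # placeholder URLs
    "Database B": "https://example.org/db-b/ping",
}

def check_databases(targets, timeout=10):
    """Return a dict mapping database name to True (responding) or False (down)."""
    status = {}
    for name, url in targets.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                status[name] = response.status == 200
        except (urllib.error.URLError, TimeoutError):
            status[name] = False
    return status

if __name__ == "__main__":
    for name, ok in check_databases(DATABASES).items():
        print(f"{name}: {'OK' if ok else 'NOT RESPONDING'}")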
Lessons Learned
• Raise expectations
  • New market
  • Technology ready to meet our imagination
• Ask pointed questions, such as:
  • "Can your software do X today?"
• Keep asking!