Explore the elements of good interface design for digital libraries, with a focus on video and audio file libraries. Consider examples from usability evaluation and learn how to display large volumes of data effectively. Evaluation methods, user-centered design, and usability inspection techniques will be discussed.
Goals
• Discover elements of good interface design for digital libraries of various sorts
• Consider examples from DL usability evaluation as sources of insight
• Look at the distinct requirements of interfaces to libraries of video and audio files
Caveat
• We have a whole course in User System Interface, and everything in that class is relevant to user interfaces for digital libraries.
• One evening will not replace that course, nor will it capture all of the relevant factors.
Note - to do later
• At the end of the class, you will be asked to reflect on the points raised and to summarize the most important characteristics of a well-developed DL interface.
• As you continue your DL projects, be sure to apply the relevant components of these elements.
The challenge
• A user interface for digital libraries must display large volumes of data effectively.
• Typically the user is presented with one or more overlapping windows that can be resized and rearranged.
• In digital libraries, a large amount of data spread across a number of resources necessitates intuitive interfaces for users to query and retrieve information.
• The ability to change the user’s perspective from high-level summary information down to a specific paragraph of a document or scene from a film remains a challenge to user interface researchers.
Source: http://cimic.rutgers.edu/ieee_dltf.html
Expectations of Digital Libraries
• Provide at least those services available in traditional libraries … and more.
• A system is successful “only to the degree to which the vast majority of its intended users are able to use its intended functionality”
Source: Hill 97
User-centered design
• “User-centered design for a digital library must include not only systems evaluation but also an understanding of the process of information seeking and use.”
• The ideal is compared to a “self-evident door handle”: once you see it, you know what it does and how to use it. No instruction is necessary.
Source: Hill 97
Methods of evaluation
• Surveys
  • Target user groups
• Focus groups from the intended audiences
  • Another recommendation: faux focus groups. When it is not practical to run a real focus group for a while, the developers do some role playing and pretend to be users. What do you think of this approach?
• Ethnographic studies
• Audio/video-taped sessions of users
• Analysis of feedback and comments
• Demographic analysis of beta-tester registration data
• Log analysis (a sketch follows below)
Source: Hill 97
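As a concrete illustration of the log-analysis method, here is a minimal sketch in TypeScript. The record shape, the field names (sessionId, resultCount), and the two metrics are assumptions chosen for illustration, not anything prescribed by Hill 97.

    // A minimal log-analysis sketch. Assumed format: one record per query,
    // tagged with a session id and the number of results returned.
    interface QueryLogRecord {
      sessionId: string;    // hypothetical field name
      query: string;
      resultCount: number;  // hypothetical field name
    }

    // Two common usability signals: how often queries return nothing,
    // and how many queries a typical session needs.
    function summarize(records: QueryLogRecord[]) {
      const perSession = new Map<string, number>();
      let zeroHits = 0;
      for (const r of records) {
        perSession.set(r.sessionId, (perSession.get(r.sessionId) ?? 0) + 1);
        if (r.resultCount === 0) zeroHits++;
      }
      const n = records.length;
      return {
        totalQueries: n,
        zeroHitRate: n ? zeroHits / n : 0,
        meanQueriesPerSession: perSession.size ? n / perSession.size : 0,
      };
    }

A high zero-hit rate or long query chains per session would point toward the wording and search-functionality problems discussed later in this deck.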
Usability inspection of Digital Libraries
To produce a product with high usability:
• Client and user interviews
• Task analysis
• User class definitions
• Usage scenarios
• Iterative usability design
• Prototyping
• Design walk-throughs
• Usability evaluation
Unfortunately, developers often look at usability analysis as something to do at the end of the development process as a final test, rather than as a part of the design process.
Source: Hartson 04
Your plans
• How will you evaluate the usability of your digital library?
• What is ideal?
• What is practical?
• What do you plan to do?
Evaluation
• Evaluation for any purpose has two major components:
• Formative
  • During development, spot-check how things are progressing
  • Identify problems that may prevent goals from being achieved
  • Make adjustments to avoid the problems and get the project back on track
• Summative
  • After development, see how well it all came out
  • Lessons learned may be applicable to future projects, but are too late to affect the current one
  • Needed for reporting back to project sponsors on the success of the work
Usability evaluation
• Lab-based formative evaluation
  • Real and representative users
  • Benchmark tasks
  • Qualitative and quantitative data
  • Leads to redesign where needed
• After deployment
  • Real users doing real tasks in daily work
  • Summative with respect to the deployed system
  • Useful for later versions
Usability inspection
• Lower-cost option than full lab-based testing
• Applies to early designs, well-developed designs, and deployed systems
• Does not employ real users; expert-based
  • Usability engineering practitioners
• May be guided by typical user tasks
• Seeks to predict the usability problems that users will encounter
Source: Hartson 04
Inspection categories
• User classes: know your user
  • Example from the cited study:
    • Scientific researchers in computer science
    • Administrators (do not use the regular interface, so not evaluated)
• User tasks
  • Search for technical reports on a set of criteria
  • Browse the collection
  • Register
  • Submit
  • Harvest
Source: Hartson 04
Search expanded
• Search options (see the sketch below)
  • Simple search
    • All bibliographic fields
    • Group results by archive
    • Sort
  • Advanced search
    • Focus on specific fields with filter options
Source: Hartson 04
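One way to see the relationship between the two modes is to model them in code. The sketch below uses invented field names (title, author, abstract); it is not how NCSTRL is implemented.

    // Sketch: the two search modes as one data type.
    type SimpleSearch = { kind: "simple"; text: string };  // all bibliographic fields
    type AdvancedSearch = {
      kind: "advanced";
      fields: Partial<Record<"title" | "author" | "abstract", string>>;
      archive?: string;  // optional filter, e.g. restrict to one archive
    };
    type Search = SimpleSearch | AdvancedSearch;

    // A simple search is an advanced search applied to every field.
    // Treating it that way keeps the two modes consistent for users.
    function toAdvanced(s: Search): AdvancedSearch {
      if (s.kind === "advanced") return s;
      return {
        kind: "advanced",
        fields: { title: s.text, author: s.text, abstract: s.text },
      };
    }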
Results - 1
• Submit and Harvest tasks not evaluated
  • Specialized domain requirements
  • Need evaluation with real users to do meaningful testing
• Report on problems found
  • Usability problem types: wording, consistency
  • Functionality: search and browse functionality
• Problem = anything that impacts the user’s task performance or satisfaction
Source: Hartson 04
Categories of Problems
• General to most applications and GUIs
  • Wording
  • Consistency
  • Graphic layout and organization
  • User’s model of the system
• Digital library functionality
  • Browsing
  • Filtering
  • Searching
  • Document submission functions
Source: Hartson 04
Wording
• About 36% of the problems in the case described in the paper
• “Precise use of words in user interfaces is one of the most important design considerations for usability”
• Clear, complete, correct
  • Button and tab labels
  • Menu choices
  • Web links
• Crucial to help users learn and understand functionality
• The easiest problems to fix if someone with the right skills is on the team
Source: Hartson 04
Search and Browse functionality
• Pretty basic to what a DL does!
• 18% of the problems were in this area
• Designers consider these separate functions; users see them as extremely closely related aspects of finding the desired resource
• They should be designed together
Source: Hartson 04
“Usual Suspects”
• Digital libraries are prone to the same design faults as other interactive systems
• Consistency
  • In the example, “group” and “archive” were used interchangeably
  • Different labels for the same concept used in different places: “Simple search” on the tab, “Search all bibliographic fields” at the function location
  • Multiple terms referring to the same concept confuse users and slow learning
• Standardize terminology and check it carefully (a checking sketch follows below)
Source: Hartson 04
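The “check it carefully” step can even be partly automated. Below is a hypothetical sketch of a label lint; the synonym table is invented for illustration and would in practice come from the team’s terminology standard.

    // Sketch: lint UI labels against a canonical-term table.
    // Table entries are illustrative, drawn from the problems in this deck.
    const canonicalFor: Record<string, string> = {
      group: "archive",  // "group" and "archive" were used interchangeably
      hits: "matches",   // "hits" is flagged as slang on a later slide
    };

    function lintLabels(labels: string[]): string[] {
      const warnings: string[] = [];
      for (const label of labels) {
        for (const [term, canonical] of Object.entries(canonicalFor)) {
          if (new RegExp(`\\b${term}\\b`, "i").test(label)) {
            warnings.push(`"${label}": prefer "${canonical}" over "${term}"`);
          }
        }
      }
      return warnings;
    }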
Usual Suspects - 2
• Problems with feedback
  • Clearly indicate where the user is in the overall system
  • Clicking a tab does not result in highlighting or any other feedback about which tab is the currently active choice
  • The selected institution (archive) is highlighted when chosen, but the highlight is not maintained after some other actions
Source: Hartson 04
Usual suspects - 3
• Wording
  • Use of jargon or slang, or unclear or missing labels, is a challenge for users
  • Example in NCSTRL: several dates are used, and the labels do not clearly describe what each represents
    • “Discovery date” is different from “accession date”
    • Discovery date is probably a developer’s term, and not likely to be of interest to the user
• Use terms that are meaningful to users without explanation whenever possible. Resist presenting data that is not useful for user purposes.
Source: Hartson 04
Usual suspects - 4
• Wording, continued
  • Example: “Submit to CoRR” tab
    • Could be “Submit Technical Report(s) to CoRR”
  • Example: “Search all bibliographic fields”
    • Could be “Simple Search: Search all bibliographic fields in selected archive (or for selected institution)”
• Other examples of unclear labels
  • Archive’s Set (a technical term from OAI-PMH)
  • DateStamp
  • Discovery Date
• Label for the user, not the developer
Source: Hartson 04
Usual Suspects - 5
• Incorrect or inappropriate wording
  • “Search results” used as the label for browsing results
    • “hits (1-n)” or “total xxx hits” displayed
    • These are not search results, just reports available for browsing
    • Apparently caused by shared code for browse and search
• Label results appropriately, even scrupulously, for their real meaning
Source: Hartson 04
Usual suspects - 6
• Appropriate terms
  • Use of “hits” for individual search (or browse) results
    • Commonly used, but inappropriate slang according to usability experts
    • Considered unattractive, even slightly offensive
  • Recommended: something like “Matches with search term”
• Cosmetic considerations can have a positive effect on the user’s impression of the site
Source: Hartson 04
Layout and design
• The whole point of a graphical user interface is to convey more information to the user in a short time
• The GUI must support the user’s needs
• Example problems in the NCSTRL evaluation
  • Menu choices in no logical order
  • Reorganize by task or functionality
  • Organize task interfaces by categories to present a structured system model and reduce cognitive workload
Source: Hartson 04
Layout example
• Instead of randomly ordered tabs, group them (see the configuration sketch below):
  • Information links
    • About NCSTRL
    • About CoRR
    • OAI
    • Help
  • User tasks
    • Simple search
    • Advanced search
    • Browse
    • Register
    • Submit technical reports to CoRR
Source: Hartson 04
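A regrouped navigation like this is easy to express as data that the UI renders directly, which also makes the grouping hard to violate as the system grows. The shape below is an assumption for illustration, not the Hartson 04 proposal rendered in code.

    // Sketch: the proposed grouping as a render-ready structure.
    interface NavGroup { label: string; tabs: string[]; }

    const navigation: NavGroup[] = [
      { label: "Information",
        tabs: ["About NCSTRL", "About CoRR", "OAI", "Help"] },
      { label: "Tasks",
        tabs: ["Simple search", "Advanced search", "Browse", "Register",
               "Submit technical reports to CoRR"] },
    ];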
Graphical design
• Proximity of elements suggests association and relatedness
  • Example: the Search button sits very close to the OR radio button
• This applies equally to all parts of the dialog
• Consider the implications of placement and association of graphical elements
Source: Hartson 04
Start off right
• Any application should have a home page that explains what the site is about and gives the user a sense of the overall site capability and use.
• NCSTRL starts with the Simple Search page, with no introduction.
DL-specific problems
• Searching, filtering, browsing
  • User view: all are aspects of finding a needed resource
  • Developer view: differences based on what must go into an index to support searching, how filtering is combined with searching to form a new query, etc.
• Usability suggestion: combine search, browse, and filter into one selection and navigation facility (sketched below)
• Give users the power to combine these elements to serve their needs
Source: Hartson 04
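One way to realize that suggestion is a single “find” request type in which browsing is just the empty query plus a facet, and filtering just adds constraints. The names here are illustrative assumptions, not from Hartson 04.

    // Sketch: search, browse, and filter as one request type.
    interface FindRequest {
      text?: string;                                // omit to browse
      facets?: { field: string; value: string }[];  // constraints to filter by
    }

    // Browsing, filtering, and searching become combinations, not modes:
    const browseAll: FindRequest = {};
    const browseArchive: FindRequest = {
      facets: [{ field: "archive", value: "MIT" }],
    };
    const filteredSearch: FindRequest = {
      text: "usability",
      facets: [{ field: "archive", value: "MIT" }],
    };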
Iterative search
• Search is often implemented as a one-shot function, but users want to iterate on their query string to improve results
  • NCSTRL does not show the query that produced the given results
• Users want to prune the result set by applying a subsequent query to just those results (see the sketch below)
  • Not available in NCSTRL
  • Examples where it is available?
Source: Hartson 04
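Supporting search-within-results mostly means letting the previous result set, rather than the whole collection, serve as the corpus for the next query. A minimal sketch, with an invented Doc shape and a naive substring match:

    // Sketch: iterative refinement by searching within prior results.
    interface Doc { title: string; abstract: string; }

    function search(corpus: Doc[], query: string): Doc[] {
      const q = query.toLowerCase();
      return corpus.filter(d =>
        d.title.toLowerCase().includes(q) ||
        d.abstract.toLowerCase().includes(q));
    }

    // The first query runs over the collection; refinements run over
    // the prior results:
    //   const first = search(collection, "digital library");
    //   const pruned = search(first, "usability");

The interface would also need to display the query that produced each result set, the other NCSTRL gap noted above, so users can see what they are refining.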
Browsing
• NCSTRL allows browsing only by institution (archive)
• Other possibilities
  • Date
  • Author
  • Subject
• Allow user activity that will serve user needs. Try to find out what users want before making decisions about services offered.
Portal
• “A portal [is] a single point of access to distributed systems that provides services to support user needs to search, browse, and contribute content, often linking to shared existing functionality at other sites.”
• The portal pass-through problem: does the portal add service, or just provide a link to a collection of other sites?
Source: Hartson 04
Portal - submission
• NCSTRL submission to CoRR
  • The link opens another page rather than directly offering the opportunity to submit
  • Disconnect for the user between the original page and the promised action
• Link directly to the service offered, without intermediate pages unless they are needed in support of the service
Source: Hartson 04
Summary for NCSTRL case
• The system exhibited many typical problems with user interfaces
• The investigation also illuminated some issues specific to digital libraries and other systems for retrieving information
Ensemble – an early page design
Your initial thoughts?
First serious revision
Thoughts?
Recommendations
• Based on our earlier review of usability characteristics, what is your advice to the Ensemble team?
Video Digital Libraries
• Video digital libraries present additional challenges for interface design
• Information attributes are more complex
  • Visual, audio, and other media
• Indicators and controlling widgets
  • Start, stop, reverse, jump to beginning/end, seek a particular frame or a frame with a specified characteristic
Source: Lee 02
Video Interface Features
• Cataloging
  • Semi-automatic tool
  • Manual tool
  • Threshold adjustable before automatic segmentation
• Textual query
  • Natural language (or keyword)
  • Category or keyword list browsing
• Audio information for indexing and browsing
• Intelligent frame selection
• Video browsing
  • Text description
  • Transcript
  • Single keyframe
  • Storyboard
    • Option re granularity of keyframe set
  • Interactive hierarchical keyframe browser
  • Keyframe slide show
  • Video summary playing
• Playback
  • Transcript + playback synchronization (see the sketch below)
  • Keyframe + playback synchronization
  • Text search + playback and/or keyframe synchronization
Source: Lee 02
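Transcript + playback synchronization, for instance, reduces to finding which timestamped transcript segment contains the player’s current time. A sketch with assumed types; none of the names come from Lee 02.

    // Sketch: map playback time to the transcript segment to highlight.
    // Segments are assumed sorted by start time and non-overlapping.
    interface Segment { startSec: number; endSec: number; text: string; }

    function segmentAt(transcript: Segment[], timeSec: number): Segment | undefined {
      let lo = 0, hi = transcript.length - 1;
      while (lo <= hi) {
        const mid = (lo + hi) >> 1;
        const s = transcript[mid];
        if (timeSec < s.startSec) hi = mid - 1;
        else if (timeSec >= s.endSec) lo = mid + 1;
        else return s;
      }
      return undefined;  // time falls in a gap between segments
    }

    // Wired to an HTML5 player: on each "timeupdate" event, highlight
    // segmentAt(transcript, video.currentTime).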
Common features for Video DLs
• Most systems use a textual querying interface; few provide any form of visual query interface, probably indicating the need for further development in this area
• Most systems use keyframe(s) as their video browsing method
• Playback is provided in all listed systems, indicating that playback is regarded as a most important interface feature
• Although most systems provide more than one video browsing method (often transcript + playback and/or keyframe + playback), browsing aids such as synchronization between the different browsing methods are not often provided
Source: Lee 02
Stages of Information Seeking in Video Digital Libraries
1. Browsing and then selecting video programs (as a collection)
2. Querying within a video program (content querying)
3. Browsing the content of a video program
4. Watching (part of) a video program
5. Re-querying the video digital library and/or within a video program
Source: Lee 02
[Table omitted: the stages of information seeking and the interface elements that support them, as described in four researchers’ work. Source: Lee 02]
Granularity in Video Browsing
• Abstraction: reducing the information available to a manageable, usable subset
• Traditional video and audio browsing
  • Sequential, with a single access point, because of the linear nature of the medium
  • Constrained by time
  • Fast forward makes it difficult to see the content
  • Need to return to the beginning to repeat a search
Source: Lee 02
A scenario
• A directory with 100 (or 1000 or …) video files
• No information except the file name
  • Maybe a reasonable name, but not very descriptive
• You want to find a particular clip from a party, a ceremony, or some other event
• What are your options? What would you like to have available?
Spend a bit of time now talking about this.
Video Abstraction
• Levels to present (from Shneiderman 98):
  • Overview first
  • Zoom and filter
  • Details on demand
• Example levels (from Christel 97):
  • Title: text format, very high-level overview
  • Poster frame: a single frame taken from the video
  • Filmstrip: a set of frames taken from the video (see the sampling sketch below)
  • Skim: multiple significant bits of video sequences
• Time reference
  • Significant in video
  • Options include a simple timeline, a text specification of the time of the current frame, and the depth of the browsing unit
Source: Lee 02
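For the filmstrip level, the simplest possible frame selection is evenly spaced sampling, sketched below. Note that this is only an assumption-level baseline; Christel 97’s skim relies on selecting significant frames, which this deliberately does not attempt.

    // Sketch: timestamps for a filmstrip of `frames` evenly spaced frames.
    // Takes the midpoint of each equal interval; e.g. a 60 s video and
    // 4 frames gives 7.5, 22.5, 37.5, 52.5.
    function filmstripTimes(durationSec: number, frames: number): number[] {
      return Array.from({ length: frames },
        (_, i) => ((i + 0.5) * durationSec) / frames);
    }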