This presentation discusses the challenges of designing user interfaces for universal computing devices and proposes an approach using multimodal interaction and context-based design. It also highlights the importance of prototyping tools for designing effective user interfaces.
Informal Tools for Multimodal, Context-based User Interface Design James A. Landay HCC Retreat, July 7, 1999 http://guir.berkeley.edu
Natural Tides of Innovation [Chart: successive waves of innovation plotted on a log scale over time: mainframe, minicomputer, workstation/server, personal computer, and now (7/99) universal computing, each wave moving from innovation to integration]
Universal 1 : including or covering all or a whole collectively or distributively without limit or exception 2 a : present or occurring everywhere b : existent or operative everywhere or under all conditions <universal cultural patterns> 3 a : embracing a major part or the greatest portion (as of mankind) <a universal state> <universal practices> b : comprehensively broad and versatile <a universal genius> 4 a : affirming or denying something of all members of a class or of all values of a variable b : denoting every member of a class <a universal term> 5 : adapted or adjustable to meet varied requirements (as of use, shape, or size)
Away From the “Average Device” • Powerful, personal capabilities from specialized devices • small, highly mobile or embedded in the environment • “Intelligence” + immense storage and processing in the infrastructure • Everything connected [Diagram: specialized devices contrasted with today’s laptops and desktops]
HCI Challenges • Universal computing devices will not have the same UI as “dad’s PC” • a wide range of devices • often w/ small or no screens & alternative I/O • e.g., pens, speech, vision • special purpose to particular applications • “information appliances” • lots of devices per user • all working in concert
HCI Challenges (cont.) • Design of good appliances will be hard • single device design is easy • hard to design the same “application” in a consistent manner across many devices • e.g., calendar app: one speech-based & one GUI-based • hard to make different devices work together • which device is used when? • multiple UIs & modes per device, which to “display”? • building awareness of the context of use into the design is the key to some of these issues • multimodal input is assumed, but there is little design support for creating multimodal interfaces
Our Approach • Build • novel applications on existing appliances • e.g., NotePals on the Palm PDA & CrossPad • new information appliances • Evaluate appliances in realistic settings • Iterate • use the resulting experience to build • more interesting appliances • better design tools & analysis techniques
Outline • HCI Challenges for Universal Computing • Multimodal interaction • Why is building MM UIs hard? • Best practices for designing GUIs • Proposed approach to building MM UIs • Using context to our advantage
Multimodal Interaction • When communicating with people we use more than one mode at a time! • gesture & speak • sketch & speak • etc. • Computers would be easier to use and more useful if they also worked like this
Benefits of Multimodal (MM) Interaction on Heterogeneous Devices • Computers could be used in • more situations • when hands are full or vision used for something else • more places • e.g., walking, car, factory floor, etc. • Interfaces would be easier to use • use innate perceptual, motor, & cognitive skills • More people would be able to use computers • including users w/ vision or motor impairments • MM UIs likely to become predominant
Why is building MM UIs hard? • Often require “recognition” technology • speech, handwriting, sketches, gesture, etc. • Recognition technology is immature • finally “just good enough” • single mode toolkits just appearing now • no prototyping tools • Hard to combine recognition technologies • still requires experts to build systems • few toolkits or prototyping tools! • This was the state of GUIs in 1980
Best Practices for Designing GUIs • Iterative design [cycle: Design → Prototype → Evaluate] • Prototyping tools are key to this success
Early Stage UI Design • Brainstorming • put designs in a tangible form • consider different ideas rapidly • Incomplete designs • do not need to cover all cases • illustrate important examples • Present several designs to client or design team • No need at this stage for “coding”
Prototyping Tools for Multimodal UI Design Should Support • Iterative design methodology • Informal techniques designers currently use in the early stage of UI design • sketching • storyboarding • “Wizard of Oz” [Image: the “Landay Dictation Machine” mock-up transcribing “Um…. I’ll see you in the morning.”]
Our Approach: Sketches, Models & Context-aware Design Tools • Infer models from design “sketches” or other informal representations • model is an abstraction of the app’s UI design • model for representing contexts & their UI implications • Use models to • semi-automatically generate UIs on diverse platforms • dynamically adapt a particular appliance UI to changing context of use
How to Specify Events • We have a good idea how to do this for visual UIs • visually! • But how about speech or gestures?
Specifying Non-Visual Events • How do designers do this now? • speech • scripts or grammars (advanced designers only) • flowcharts on the whiteboard • “Wizard of Oz” -> fake it! • gestures • give an example & then tell the programmer what it does • We can do the same by demonstration (PBD) • demonstrate an example of the act (e.g., a spoken phrase) • demonstrate the result • the system infers the program • just a prototype, so it doesn’t have to be too general
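A minimal sketch of the demonstration idea, under assumed names: the helper infer_template and the slot notation are invented for illustration, not from the talk or any real toolkit. The designer gives one example utterance and marks which word is variable; the prototype generalizes it into a pattern it can match later.

```python
# Hypothetical sketch: infer a speech-event template from a single demonstration.
import re

def infer_template(example_utterance, example_slots):
    """Replace the demonstrated slot values with named placeholders,
    yielding a reusable pattern for the prototype's recognizer."""
    pattern = re.escape(example_utterance)
    for slot_name, value in example_slots.items():
        pattern = pattern.replace(re.escape(value), rf"(?P<{slot_name}>\w+)")
    return re.compile(pattern, re.IGNORECASE)

# The designer demonstrates one example: the spoken act and which word is variable.
template = infer_template("when does the flight from Boston arrive",
                          {"DEP_CITY": "Boston"})

# The prototype can now match other utterances of the same form.
match = template.match("When does the flight from Denver arrive")
print(match.group("DEP_CITY") if match else "no match")   # -> Denver
```

Because the result only needs to drive a prototype, this kind of shallow generalization is often enough; it does not have to handle every phrasing.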
Specifying Non-Visual Events [Image: the result of demonstrating a pen gesture for delete]
Combining the Visual & Non-Visual • How do you see what the system inferred? • necessary for editing • generate a visual representation • flowchart seems like a start (common in speech UIs) • appropriate? what should it look like?
Specifying Non-Visual Events [Flowchart representing the inferences made from the demonstration of a flight arrival time application: system response (speech) “Please name the departure city” → user stimulus (speech) DEP_CITY → system response (speech) “DEP_CITY departures arriving to which city?” → user stimulus (speech) ARR_CITY → computation: look up ARRIVAL_TIME in “flight times” using DEP_CITY, ARR_CITY → system response (speech) “The flight from DEP_CITY to ARR_CITY will arrive at ARRIVAL_TIME”]
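One way to picture what such an inferred flowchart could look like as data: a minimal sketch in which the node encoding, the flight-times table, and the canned-reply runner are all invented stand-ins (in a real prototype a recognizer or a Wizard would supply the user turns).

```python
# Hypothetical encoding of the flight-arrival flowchart; node kinds mirror the slide
# (system response, user stimulus, computation), everything else is illustrative.

FLIGHT_TIMES = {("Boston", "Oakland"): "3:45 PM"}   # stand-in for the "flight times" data

flow = [
    ("system",  "Please name the departure city"),
    ("user",    "DEP_CITY"),
    ("system",  "{DEP_CITY} departures arriving to which city?"),
    ("user",    "ARR_CITY"),
    ("compute", lambda ctx: ctx.update(
        ARRIVAL_TIME=FLIGHT_TIMES[(ctx["DEP_CITY"], ctx["ARR_CITY"])])),
    ("system",  "The flight from {DEP_CITY} to {ARR_CITY} will arrive at {ARRIVAL_TIME}"),
]

def run(flow, canned_replies):
    ctx, replies = {}, iter(canned_replies)
    for kind, payload in flow:
        if kind == "system":
            print("SYSTEM:", payload.format(**ctx))
        elif kind == "user":
            ctx[payload] = next(replies)   # a Wizard or recognizer would fill this slot
        else:
            payload(ctx)                   # computation node

run(flow, ["Boston", "Oakland"])
```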
Combining the Visual & Non-Visual • How do you see what the system inferred? • necessary for editing • generate a visual representation • flowchart seems like a start (common in speech UIs) • appropriate? what should it look like? • Combining visual & non-visual events • e.g., end user dragging a truck while saying “fast” • use a visual language that combines a visual storyboard of the GUI w/ a flowchart for the non-visual • the VL had better be simple...
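A hypothetical sketch of the truck-and-“fast” example: one simple way to fuse a drag gesture with a spoken modifier is to pair events whose timestamps fall within a short window. The event fields and the 1.5-second window are assumptions for illustration only.

```python
# Illustrative time-window fusion of a visual event (drag) with a spoken modifier.
FUSION_WINDOW = 1.5   # seconds; an assumed threshold, not from the talk

def fuse(gesture_events, speech_events, window=FUSION_WINDOW):
    """Attach to each gesture any spoken word uttered within `window` seconds of it."""
    fused = []
    for g in gesture_events:
        nearby = [s for s in speech_events if abs(s["t"] - g["t"]) <= window]
        fused.append({**g, "modifiers": [s["word"] for s in nearby]})
    return fused

gestures = [{"t": 10.2, "type": "drag", "object": "truck", "to": (120, 80)}]
speech   = [{"t": 10.9, "word": "fast"}]

print(fuse(gestures, speech))
# -> [{'t': 10.2, 'type': 'drag', 'object': 'truck', 'to': (120, 80), 'modifiers': ['fast']}]
```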
Supporting Heterogeneous Devices • Consider sketches as an abstraction • Infer a “model” from the sketches • Use methods from model-based UI tools to • generate UIs for multiple devices • generate alternative modes for a single spec. on one device • Hard problems • how to abstract? • how do you generate a “good” UI? • keep the designer in the loop
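To make the model-based idea concrete, a minimal illustrative sketch: one abstract description of a calendar “add appointment” task rendered two ways, as labeled GUI fields and as spoken prompts. The field names and renderers are invented; a real model-based tool would do far more, and keeping the designer in the loop is exactly what this toy version leaves out.

```python
# Illustrative sketch: one abstract interface model, two renderings.
calendar_model = {
    "task": "add appointment",
    "inputs": [
        {"name": "title", "type": "text"},
        {"name": "date",  "type": "date"},
        {"name": "time",  "type": "time"},
    ],
}

def render_gui(model):
    # Screen device: one labeled widget per input.
    return [f"[{i['type']} field] {i['name'].title()}:" for i in model["inputs"]]

def render_speech(model):
    # Speech device: one prompt per input, asked in order.
    return [f"Say the {i['name']} of the appointment." for i in model["inputs"]]

print(render_gui(calendar_model))
print(render_speech(calendar_model))
```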
Take Advantage of Context: Monitor Environment & Actions to Improve Interaction • Which devices are present & available? • there is a wall display -> use it for my wearable • device discovery • What is occurring in the environment? • people are talking -> don’t rely on speech I/O • speech sensing • What is the state of the user? • hands using tools -> use speech input & visual output • tangible tools or vision processing • Solution: UI design tools that understand context as well as multiple devices & modalities
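Rules like these can be stated very directly; a hypothetical sketch, in which the context fields and the rules themselves are assumptions chosen to mirror the examples above rather than anything from the talk:

```python
# Illustrative context-to-modality rules; field names and defaults are assumed.
def choose_modes(context):
    modes = {"input": "gui", "output": "screen"}
    if context.get("wall_display_available"):
        modes["output"] = "wall display"   # device discovery found a bigger display
    if context.get("people_talking"):
        modes["input"] = "pen"             # don't rely on speech when others are talking
    elif context.get("hands_busy"):
        modes["input"] = "speech"          # hands on tools -> speak, glance at visual output
    return modes

print(choose_modes({"hands_busy": True, "wall_display_available": True}))
# -> {'input': 'speech', 'output': 'wall display'}
```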
Design Goals • Let designers rapidly produce “rough cuts” • doesn’t need to handle all cases • Allow end-user testing & fast modification • Generate code that can help start the UI for multiple devices • designer adds more detail & improves interaction • programmers add necessary code
What We’ve Accomplished So Far • Informal tools for UI design • sketch-based tools for GUI / Web design • built & tested 1st generation, building next generation now • informal tool for speech UI design • designed; implementation in progress • Automatic generation of simple control UIs • First-cut designs for multimodal UI design tool & appliance (SpeechCorder w/ ICSI) • Experience w/ appliances & simple context • NotePals
Take Home Ideas • Universal Computing is about supporting people • Success will require the design & evaluation of new appliances (device + app + UI) that • take advantage of natural modes of input • especially multimodal input! • take advantage of context • are used in realistic settings • Experience, new architectures, and new tools will make this design problem easier
Research Plan • Finish implementation of informal tools • study usage (especially of speech UI design) • use results to design the multimodal design tool • Develop algorithms for extracting the app model • Build context-aware applications w/o tools • two testbeds to create & study • wirelessly networked PDAs in classroom/learning • extraction of tacit context using social networking • build a taxonomy of contexts • how should they affect the UI?
Research Plan (Cont.) • Implement tool for multimodal UI design • extracts a model & generates UIs for two diverse platforms • uses simple context cues • Develop algorithms for capturing context • Evaluate usage (apps & tools) in target settings • Extend multimodal UI design tool • generate multi-platform UIs that dynamically adapt • allow context to be fully integrated in decisions
HCI Goals of Universal Computing • Some of the roots of Universal Computing are in the ideas of Mark Weiser • Ubiquitous/Pervasive/Invisible/Calm Computing • “... does not live on a personal device of any sort [PDA or dynabook], but is in the woodwork everywhere.” • “you don’t want personal technology, you want personal relationships” • Universal Computing is about • supporting people’s tasks • most often includes working with other humans • making people’s lives easier • just creating ubiquitous technology does not solve this
Computers Support Human-Human Communication (HHC) [Diagram: artifacts flowing between people, e.g., e-mail, reports, design ideas, presentations]
Traditional Software Interfaces • Force translations to formal representations • sometimes we want this (e.g., conference slides) • sometimes we don’t (e.g., creative tasks)
Traditional Representations • Rigid and unambiguous • hard to mix (e.g., few tools support rough sketches) • warp perceptions of the viewer and user • Increase time • encourage precision • Inhibit creativity • “tunnel vision” “Put me in a room with a pad & a pencil and set me up against a hundred people with a hundred computers -- I’ll outcreate every goddamn sonofabitch in the room.” -- Ray Bradbury, Wired 6.10
Informal Communication Styles • Speaking • Writing • Gesturing • Sketching Informal UIs do not immediately translate natural input into a formal representation, allowing users to work more naturally