Context-aware / Multimodal UI Breakout Summary
James A. Landay et al.
HCC Retreat, July 7, 2000
Participants
James Landay, Anoop Sinha, Jimmy Lin, Trevor Perring, Greg Heinzinger, Chris Long, Ed Chi, Christine Halverson, Gian Gonzaga, Ken Fishkin, John Lowe, Adam Janin, Russell Eames, Elin Pedersen
Applications
• Alert management (see the sketch after this list)
  • sites beacon context
    • “this is a quiet place, no interruptions please”
    • e.g., movie theater or restaurant
  • devices use context to avoid interruptions
    • wearable t-shirt that jams local cell phones!
• “Elvis has left the meeting”
  • easily share documents from meetings
  • beam tokens of documents to participants, or
  • use shared context to find docs later
    • e.g., I was in a meeting with Ken at Lake Tahoe, find docs
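A minimal sketch of the alert-management idea above, assuming a hypothetical `ContextBeacon` message broadcast by a site and a device-side policy; every name and field here is invented for illustration, not from the breakout:

```python
from dataclasses import dataclass

@dataclass
class ContextBeacon:
    """Context broadcast by a site, e.g., a movie theater or restaurant."""
    place: str
    quiet: bool  # "this is a quiet place, no interruptions please"

class Device:
    def __init__(self):
        self.current_beacon = None

    def on_beacon(self, beacon: ContextBeacon):
        # Remember the most recent site context we heard.
        self.current_beacon = beacon

    def deliver_alert(self, message: str):
        # Use the beaconed context to decide whether to interrupt the user.
        if self.current_beacon and self.current_beacon.quiet:
            print(f"[deferred] {message}")  # queue silently instead of ringing
        else:
            print(f"[ring] {message}")

phone = Device()
phone.on_beacon(ContextBeacon(place="movie theater", quiet=True))
phone.deliver_alert("Call from Anoop")  # -> [deferred] Call from Anoop
```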
Context Events
• Signal changes
  • like a windowing system
• Can use as triggers to cause other actions (sketch below)
  • change my phone forwarding when I change locations
• Can be immediate or logged for later tacit information mining
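Reading “signal changes like a windowing system” as an event/callback API, the trigger idea might look like the sketch below; `ContextBus`, `subscribe`, and `publish` are hypothetical names, and the log stands in for the “logged for later mining” path:

```python
from collections import defaultdict
from datetime import datetime

class ContextBus:
    """Hypothetical event bus: context changes are signaled like window events."""
    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = []  # kept around for later tacit information mining

    def subscribe(self, key, handler):
        self.handlers[key].append(handler)

    def publish(self, key, old, new):
        self.log.append((datetime.now(), key, old, new))  # logged path
        for handler in self.handlers[key]:                # immediate path
            handler(old, new)

bus = ContextBus()

def update_forwarding(old_loc, new_loc):
    # Trigger: change phone forwarding when the user changes location.
    print(f"Forwarding calls to the phone nearest {new_loc} (was {old_loc})")

bus.subscribe("location", update_forwarding)
bus.publish("location", old="office 523", new="conference room")
```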
Context Implementation Issues
• Apps need to share context easily
  • built in to apps, like cut & paste
  • context dial tone or infrastructure
• Global file system (sketch below)
  • easier to share context & not have to transfer it
  • just use pointers
• How to search / browse
  • computers are good at searching large spaces
  • humans are good at making associations
  • Why not search with Google instead of browser history?
    • Google is easier to get at & seems to work well
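A minimal sketch of the “share pointers, not context” idea, assuming a single shared store playing the role of the context dial tone; `ContextStore` and its methods are invented for illustration:

```python
class ContextStore:
    """Hypothetical shared store: apps pass small pointers, not context values.

    With a globally shared store there is no need to transfer context
    between apps; handing over a key is enough, much like cut & paste
    hands over a clipboard reference.
    """
    def __init__(self):
        self._entries = {}
        self._next_id = 0

    def put(self, value) -> int:
        self._next_id += 1
        self._entries[self._next_id] = value
        return self._next_id  # apps share this pointer, not the value

    def get(self, pointer: int):
        return self._entries[pointer]

store = ContextStore()

# App A records a meeting context and hands only the pointer to App B.
ptr = store.put({"event": "meeting", "with": ["Ken"], "place": "Lake Tahoe"})

# App B (e.g., a document finder) dereferences the pointer later.
ctx = store.get(ptr)
print(f"Find docs from the {ctx['event']} with {ctx['with'][0]} at {ctx['place']}")
```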
Context Toolkits / APIs / Refs
• Bill Schilit’s Columbia / PARC Ph.D.
• GA Tech GVU (Anind Dey)
• IBM (Maria Ebling)
• MIT (?)
• ESPRIT projects have looked at context
  • German? project, according to Elin Pedersen
Interface Between Context & Multimodal UIs
• “Context is just another kind of input” (sketch below)
  • different to the user, but similar to the system
  • user input is caused by an EXPLICIT user action
  • context is IMPLICIT or DERIVED
• Context and multimodal UIs have similar privacy problems
  • natural inputs are human “readable”
  • may not want to share context or my input
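One way to make “context is just another kind of input” concrete is a unified input event tagged with its source; the types below are hypothetical, with the `shareable` flag standing in for the shared privacy concern:

```python
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    EXPLICIT = "explicit"   # caused by a deliberate user action
    IMPLICIT = "implicit"   # sensed from the environment
    DERIVED = "derived"     # inferred from other inputs

@dataclass
class InputEvent:
    """To the system, context is just another input; only the source differs."""
    name: str
    value: object
    source: Source
    shareable: bool = False  # privacy: default to not sharing natural inputs

events = [
    InputEvent("pen.stroke", "(x=10, y=42)", Source.EXPLICIT, shareable=True),
    InputEvent("location", "room 405", Source.IMPLICIT),
    InputEvent("activity", "in a meeting", Source.DERIVED),
]
for e in events:
    print(f"{e.name}: {e.value} [{e.source.value}, share={e.shareable}]")
```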
Interface Between Context & Multimodal UIs
• Context to choose output modality (sketch below)
  • e.g., user is in a meeting, don’t use speech
• Context to disambiguate input(s)
  • help fusion: there is noise, don’t rely on speech
  • “the clutching problem” – infer user intent
• Modality used to help infer context
  • e.g., talking to device -> user is alone?
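A minimal sketch of both directions of the interface, in the spirit of the bullets above: context picks the output modality and reweights input fusion. The context dictionary, function names, and weights are all invented illustrations, not from the breakout:

```python
def choose_output_modality(context):
    """Pick an output channel from context (keys here are illustrative)."""
    if context.get("in_meeting"):
        return "visual"  # user is in a meeting: don't use speech output
    return "speech"

def fuse_inputs(speech_hypothesis, pen_hypothesis, context):
    """Use context to disambiguate: if it is noisy, don't rely on speech."""
    speech_weight = 0.2 if context.get("noisy") else 0.8
    scored = [
        (speech_weight, speech_hypothesis),
        (1.0 - speech_weight, pen_hypothesis),
    ]
    return max(scored)[1]  # take the higher-weighted hypothesis

ctx = {"in_meeting": True, "noisy": True}
print(choose_output_modality(ctx))                           # -> visual
print(fuse_inputs("delete", "circle around 'delete'", ctx))  # favors pen
```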
Initial Design for Multimodal UI Design Tool
• Create “rough cuts”
  • informal (sketching / “Wizard of Oz”)
  • iterative design (user testing / fast mods)
• Infer models from design (sketch below)
  • designer can augment the model over time
• Generate initial prototypes
  • UIs for multiple devices
  • designer adds detail / improves the UI
  • or even removes detail
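A rough sketch of the “infer a model, then generate per-device prototypes” step. The abstract model and the per-device rendering rules are invented for illustration; the real tool would infer the model from informal sketches and Wizard-of-Oz sessions rather than hard-code it:

```python
# Hypothetical abstract model, as might be inferred from a rough design.
abstract_model = [
    {"kind": "input", "label": "Destination"},
    {"kind": "choice", "label": "Travel mode", "options": ["car", "walk"]},
    {"kind": "action", "label": "Go"},
]

def generate_ui(model, device):
    """Generate an initial prototype for one device from the shared model."""
    widgets = []
    for element in model:
        if device == "pda":
            # Graphical UI: one widget per model element.
            widgets.append(f"{element['label']} [{element['kind']} widget]")
        elif device == "phone":
            # Speech UI: same model, different rendering; detail removed.
            widgets.append(f"prompt: '{element['label']}?'")
    return widgets

for device in ["pda", "phone"]:
    print(device, "->", generate_ui(abstract_model, device))
```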