User Interfaces for Ubiquitous Computing II • Novel Interactions Some materials inspired by Stanford CS376 and presentations by N. Roussel and M. Bernstein
Overview • Computing by the inch, foot, & yard • Away from the PC • Implications of Ubicomp technologies for interaction • Questions for Ubicomp: attention, sensing, challenges... • Technicalities
Tabs • “Tabs are the smallest components of embodied virtuality. Because they are interconnected, tabs will expand on the usefulness of existing inch-scale computers such as the pocket calculator and the pocket organizer. Tabs will also take on functions that no computer performs today.” • M. Weiser • Smallest components • Interconnected • Embedded applications
Pads • “Pads differ from conventional portable computers in one crucial way. Whereas portable computers go everywhere with their owners, the pad that must be carried from place to place is a failure. Pads are intended to be ‘scrap computers’ (analogous to scrap paper) that can be grabbed and used anywhere; they have no individualized identity or importance.” • M. Weiser
Boards • “We have built enough Liveboards to permit casual use: they have been placed in ordinary conference rooms and open areas, and no one need sign up or give advance notice before using them. By building and using these boards, researchers start to experience and so understand a world in which computer interaction casually enhances every room.” • M. Weiser Xerox’s Liveboard with Tivoli application http://www.vaeggen.copenhagen.dk/
Technical challenges • What is different from a PC? • Input • Output • Activities and context
“The future is already here. It is just not uniformly distributed” • —William Gibson
Beyond mouse and keyboard • Mobile devices • Augmented reality • Surface computing • Ambient technologies • Tangibles and Wearables Natural?*
Air traffic control • Let’s make it digital?
Paper flight strips • Are flexible • Allow rule-breaking • Support communication • Do not interrupt
The Myth of the Paperless Office • Why haven’t we thrown out all of our paper? • As information technology has grown, so has the amount of paper • We use paper in conjunction with technology; we’re not replacing it
Pen and Paper Computing • A wide range of technology • tablets • e-ink and flexible displays • Anoto/Livescribe • Stepping back • Situated action: Go beyond planned activities; let users decide how to act in unforeseen circumstances.
Mobile UI • Attention • One-handed interaction • Limited space • Stepping back: • Rhythm and routines: Take advantage of routine activities and spatial patterns to help users integrate the system into their daily lives. Rob Haitani from Palm
Example : Foursquare • User control • Plausible deniability • Gaming • Must handle users’ behavior
Tabletops and multitouch interaction • Slide from N. Roussel
Interactive surfaces NYU - Perceptive Pixel (FTIR) • Focus (+ context?) • Lack of overview • Shared display • Collaboration • Space control • Input management • Stepping back • Distributed cognition: Consider how other people or objects in the world can reduce the cognitive load for memory or communication tasks
Augmented reality • Mixing the digital and the physical • capture • consistency • Stepping back • Embodied interaction: ready-at-hand / ready-to-hand • Wellner’s DigitalDesk • Urp
Tangible interfaces • Ishii’s bottles • Radically new but always familiar • Everyday (no actuation) • Actuated • Ambient • Stepping back • Co-adaptation: Expect users to re-interpret and customize technology; help them capture and share those customizations
Ambient technologies • Moving between • background and • foreground of attention • Stepping back • Peripheral awareness: Design for center and periphery; allow users to vary their degree of engagement.
November 11-12 • Sketching in hardware • back to the basics with amateur electronics • links between the digital and the physical
Input-Output loop • Modes: visual, auditory, haptic... • Scale: wearable, mobile, office, city, ambient...
Ubicomp technologies and Collaboration • Since most Ubicomp systems are distributed, they also involve collaboration. Wikipedia. Johansen, 1988, in Baecker, R.M. et al. (1995). Readings in Human-Computer Interaction: Toward the Year 2000. Morgan Kaufmann Publishers.
Multimodality • Definitions: • Multimodal generally refers to an interface that can accept input from two or more combined modes. • Multimedia generally refers to an interface that produces output in two or more modes. • Multimodal vs. sensor fusion? David West, Aaron Quigley, Judy Kay. Memento: A Digital Physical Scrapbook for Memory Sharing, in Journal of Personal and Ubiquitous Computing, Special Issue
Modalities • Input: • mouse • pen • speech • audio (non-speech) • tangible object manipulation • gaze, posture, body-tracking • Output • Visual displays • Haptics: Force Feedback • Audio • Smell • Taste
Motivations • Hands busy / eyes busy • Mutual disambiguation • Faster input • More “natural” • Anthropomorphism?
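Not from the slides: mutual disambiguation can be made concrete with a minimal late-fusion sketch in Java (class and method names are illustrative, not a real toolkit API). Each recognizer produces an n-best list of scored interpretations, and the fusion step keeps only cross-modal hypotheses that are semantically compatible, so a weak hypothesis in one modality can be reinforced or vetoed by the other.

// Hypothetical sketch of late fusion / mutual disambiguation.
// Names and the scoring rule (multiplying confidences) are illustrative.
import java.util.*;

class Hypothesis {
    final String meaning;   // e.g. "delete", "move", "zoom"
    final double score;     // recognizer confidence in [0, 1]
    Hypothesis(String meaning, double score) { this.meaning = meaning; this.score = score; }
}

class LateFusion {
    // Combine speech and gesture n-best lists; compatible pairs multiply scores.
    static Hypothesis fuse(List<Hypothesis> speech, List<Hypothesis> gesture) {
        Hypothesis best = null;
        for (Hypothesis s : speech) {
            for (Hypothesis g : gesture) {
                if (!s.meaning.equals(g.meaning)) continue;   // compatibility test
                double combined = s.score * g.score;
                if (best == null || combined > best.score) {
                    best = new Hypothesis(s.meaning, combined);
                }
            }
        }
        return best;  // may be null if the modalities disagree completely
    }

    public static void main(String[] args) {
        List<Hypothesis> speech = Arrays.asList(
            new Hypothesis("delete", 0.6), new Hypothesis("repeat", 0.4));
        List<Hypothesis> gesture = Arrays.asList(
            new Hypothesis("delete", 0.7), new Hypothesis("move", 0.3));
        Hypothesis h = fuse(speech, gesture);
        System.out.println(h.meaning + " (" + h.score + ")");  // delete (0.42)
    }
}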
Input-Output loop • Modes: visual, auditory, haptic • Scale: wearable, mobile, office, city, ambient • How to move from one device to the next? e.g. Augmented Surfaces, recombinant computing, plasticity
Seamless From M. Weiser, UIST’94, Building Invisible Interfaces • Is a seamless building one in which you never notice as you move from place to place? • Making everything the same is easy; • Hard is letting everything be itself, with other things • Goal: seamful systems, with beautiful seams see Chalmers, M. and MacColl, I. (2003), Seamful and seamless design in ubiquitous computing
Considerations for Ubicomp & calm technologies • Ubicomp is about everyday distributed computation and connectedness. • The original idea was to get calm technology. • Not about fridges that order for you • Not about connecting your iPhone to your water faucet
Ubiquitous computing design principles • Bliss • Distraction • Cognitive flow • Manuals • Transparency • Modelessness • Fear of interaction • Notifications (see cognitive flow) • Calming • Defaults Textbook p.245
Making Sense of Sensing Systems • When I address a system, how does it know I am addressing it? • When I ask a system to do something, how do I know it is attending? • When I issue a command (such as save, execute, or delete), how does the system know what it relates to? • How do I know the system understands my command and is correctly executing my intended action? • How do I recover from mistakes? Bellotti et al., CHI 2002 from Stanford CS376
Challenges of ubicomp • Challenge One: The “Accidentally” Smart Environment • Challenge Two: Impromptu Interoperability • Challenge Three: No Systems Administrator • Challenge Four: Social Implications of Aware Technologies • Challenge Five: Reliability • Challenge Six: Inference in the Presence of Ambiguity Edwards & Grinter, Ubicomp 2001
Novel Interactions • Let’s get technical
Implications for UI input handling • Event-based • Focus on widgets (or interactive elements); tendency to manage presentation + behavior • State machines • Focus on interactions; ability to handle multiple interactions simultaneously • Reactive / Flow-based • Cascading reactive devices, input reconfiguration; real-time settings (see the sketch below)
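The event-based and state-machine styles are illustrated by the architecture examples that follow; for the reactive / flow-based style, here is a minimal sketch, not from the slides and with illustrative names: devices are sources in a dataflow graph, and reconnecting the graph is how input is reconfigured.

// Hypothetical reactive / flow-based sketch (not an existing toolkit API):
// raw device samples flow through processing nodes to whatever consumes them.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

class Node<T> {
    private final List<Consumer<T>> downstream = new ArrayList<>();
    void connect(Consumer<T> next) { downstream.add(next); }
    void emit(T value) { downstream.forEach(c -> c.accept(value)); }
}

class FlowExample {
    public static void main(String[] args) {
        Node<int[]> mouse = new Node<>();        // raw (x, y) samples from a device
        Node<int[]> smoothed = new Node<>();     // low-pass filtered positions

        // Processing node: simple smoothing, then forward to the next stage.
        final int[] last = {0, 0};
        mouse.connect(p -> {
            last[0] = (last[0] + p[0]) / 2;
            last[1] = (last[1] + p[1]) / 2;
            smoothed.emit(new int[] { last[0], last[1] });
        });

        // Sink: whatever consumes positions (a cursor, a tabletop, a wall display).
        // Swapping the mouse for another device only means rewiring the graph.
        smoothed.connect(p -> System.out.println("cursor at " + p[0] + "," + p[1]));

        mouse.emit(new int[] { 10, 10 });
        mouse.emit(new int[] { 20, 20 });
    }
}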
Architecture example I • Event oriented • most GUIs • The Event Heap from the iRoom [Johanson & Fox, 2002] • Paper toolkit [Yeh et al., UIST’08]
Architecture example I
public class SimplePaperApp {
  private static int numStrokes = 0;
  public static void main(String[] args) {
    Application app = new Application();
    Sheet sheet = app.createSheet(8.5, 11);           // 8.5 x 11 inch sheet
    Region inkReg = sheet.createRegion(1, 1, 4, 4);   // ink-capture region
    Region tapReg = sheet.createRegion(1, 6, 1, 1);   // tap "button" region
    final Device remote = app.createRemoteDevice();   // final so the inner classes can use it
    inkReg.addEventHandler(
      new InkHandler() {
        public void handleInkStroke(InkEvent e) {
          numStrokes++;
        }});
    tapReg.addEventHandler(
      new ClickAdapter() {
        public void clicked(PenEvent e) {
          remote.displayMessage(numStrokes);
        }});
    app.run(); // starts event dispatch loop
  }
}
Architecture example II • State machines • Instrumental interaction and state machines • State machines in Qt 4.6 • VIGO: Instrumental Interaction in Multi-Surface Environments
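As an illustration, not taken from Qt or VIGO and with hypothetical class and method names, a press-drag-release interaction can be written as an explicit state machine in the same Java style as the previous example. Because each interaction owns its own state, several such machines can run in parallel, e.g. one per touch point.

// Hypothetical state-machine sketch of a drag interaction.
class DragStateMachine {
    enum State { IDLE, PRESSED, DRAGGING }

    private State state = State.IDLE;
    private int startX, startY;

    void press(int x, int y) {
        if (state == State.IDLE) {
            startX = x; startY = y;
            state = State.PRESSED;
        }
    }

    void move(int x, int y) {
        // Only start dragging once the pointer has moved past a small threshold.
        if (state == State.PRESSED && (Math.abs(x - startX) > 3 || Math.abs(y - startY) > 3)) {
            state = State.DRAGGING;
        }
        if (state == State.DRAGGING) {
            System.out.println("drag to " + x + "," + y);
        }
    }

    void release(int x, int y) {
        if (state == State.DRAGGING) {
            System.out.println("drop at " + x + "," + y);
        } else if (state == State.PRESSED) {
            System.out.println("click at " + x + "," + y);
        }
        state = State.IDLE;
    }
}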
Instrumental interaction • Definition: • “An instrument is a mediator or two-way transducer between the user and domain objects. The user acts on the instrument, which transforms the user’s actions into commands affecting relevant target domain objects.”
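To make the definition concrete, a minimal sketch (illustrative names, not an existing toolkit API): the instrument receives the user's physical action and turns it into a command on whatever domain object it is currently attached to.

// Hypothetical sketch of an instrument mediating between user and domain object.
interface DomainObject {
    void scrollTo(double position);    // domain command, e.g. on a document view
}

class Scrollbar {                      // the instrument
    private DomainObject target;

    void attach(DomainObject target) { this.target = target; }

    // Physical action on the instrument (dragging its thumb) ...
    void dragThumb(double fraction) {
        // ... is transformed into a command on the domain object.
        if (target != null) {
            target.scrollTo(fraction);
        }
    }
}

The same domain object could be driven by other instruments (a scroll wheel, a pan gesture) without changing it, which is the point of separating instruments from domain objects.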