
User Interfaces for Ubiquitous Computing II


Presentation Transcript


  1. User Interfaces for Ubiquitous Computing II • Novel Interactions • Some materials inspired from Stanford CS376 + presentations from N. Roussel and M. Bernstein

  2. Overview • Computing by the inch, foot, & yard • Away from the PC • Implications of Ubicomp technologies on interaction • Questions for Ubicomp : attention, sensing, challenges... • Technicalities

  3. Computing by the inch, foot, & yard

  4. Scale - continuity

  5. Tabs • “Tabs are the smallest components of embodied virtuality. Because they are interconnected, tabs will expand on the usefulness of existing inch-scale computers such as the pocket calculator and the pocket organizer. Tabs will also take on functions that no computer performs today.” • M. Weiser • Smallest components • Interconnected • Embedded applications

  6. Pads • “Pads differ from conventional portable computers in one crucial way. Whereas portable computers go everywhere with their owners, the pad that must be carried from place to place is a failure. Pads are intended to be ‘scrap computers’ (analogous to scrap paper) that can be grabbed and used anywhere; they have no individualized identity or importance.” • M. Weiser

  7. Boards • “We have built enough Liveboards to permit casual use: they have been placed in ordinary conference rooms and open areas, and no one need sign up or give advance notice before using them. By building and using these boards, researchers start to experience and so understand a world in which computer interaction casually enhances every room.” • M. Weiser • Xerox’s Liveboard with the Tivoli application • http://www.vaeggen.copenhagen.dk/

  8. Technical challenges • What is different from a PC? • Input • Output • Activities and context

  9. “The future is already here. It is just not uniformly distributed” • —William Gibson

  10. Beyond mouse and keyboard • Mobile devices • Augmented reality • Surface computing • Ambient technologies • Tangibles and Wearables • Natural?*

  11. Introduction to socio-technical phenomena

  12. Air traffic control • Let’s make it digital?

  13. Paper flight strips • Are flexible • Allow rule-breaking • Support communication • Do not interrupt

  14. The Myth of the Paperless Office • Why haven’t we thrown out all of our paper? • As information technology has grown, so has the amount of paper • We use paper in conjunction with technology; we’re not replacing it

  15. Pen and Paper Computing • A wide range of technology • tablets • e-ink and flexible displays • Anoto/Livescribe • Stepping back • Situated action: go beyond planned activities; let users decide how to act in unforeseen circumstances.

  16. Mobile UI • Attention • 1-handed interaction • Limited space • Stepping back: • Rhythm and routines: take advantage of routine activities and spatial patterns to help users integrate the system into their daily lives. • Rob Haitani from Palm

  17. Example: Foursquare • User control • Plausible deniability • Gaming • Must handle users’ behavior

  18. Tabletops and multitouch interaction • Slide from N. Roussel

  19. Interactive surfaces • NYU - Perceptive Pixel (FTIR) • Focus (+ context?) • Lack of overview • Shared display • Collaboration • Space control • Input management • Stepping back • Distributed cognition: consider how other people or objects in the world can reduce the cognitive load for memory or communication tasks

  20. Augmented reality • Mixing the digital and the physical • capture • consistency • Stepping back • Embodied interaction: ready at hand, ready to hand • Wellner’s Digital Desk, Urp

  21. Tangible interfaces • Ishii’s bottles • Radically new but always familiar • Everyday (no actuation) • Actuated • Ambient • Stepping back • Co-adaptation: expect users to re-interpret and customize technology; help them capture and share those customizations

  22. Ambient technologies • Moving between background and foreground of attention • Stepping back • Peripheral awareness: design for center and periphery; allow users to vary their degree of engagement.

  23. Wearables

  24. November 11-12 • Sketching in hardware • Back to the basics with amateur electronics • Links between the digital and the physical

  25. Implications of ubicomp technologies on interaction

  26. Input-Output loop • Modes: visual, auditory, haptic... • Scale: wearable, mobile, office, city, ambient...

  27. Ubicomp technologies and Collaboration • Since most ubicomp systems are distributed, they also involve collaboration • Source: Wikipedia; Johansen, 1988, in Baecker, R. M. et al. (1995). Readings in Human-Computer Interaction: Toward the Year 2000. Morgan Kaufmann Publishers.

  28. Multimodality • Definitions: • Multimodal generally refers to an interface that can accept input from two or more combined modes. • Multimedia generally refers to an interface that produces output in two or more modes. • Multimodal vs. sensor fusion? • David West, Aaron Quigley, Judy Kay. Memento: A Digital Physical Scrapbook for Memory Sharing. Journal of Personal and Ubiquitous Computing, special issue.
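A minimal sketch of what combining two input modes can look like in code (hypothetical class and method names, not an existing toolkit API): a spoken command is paired with the most recent pointing event that falls inside a short time window, in the spirit of “put that there”.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical sketch: fusing speech and pointing input by time window.
    public class MultimodalFusion {
        record PointEvent(double x, double y, long timeMs) {}
        record SpeechEvent(String command, long timeMs) {}

        private static final long WINDOW_MS = 1500;  // how long a pointing gesture stays valid
        private final Deque<PointEvent> recentPoints = new ArrayDeque<>();

        public void onPoint(PointEvent p) {
            recentPoints.addLast(p);  // remember where the user pointed last
        }

        public void onSpeech(SpeechEvent s) {
            // The pointing gesture supplies the referent for the spoken command.
            PointEvent target = recentPoints.peekLast();
            if (target != null && s.timeMs() - target.timeMs() <= WINDOW_MS) {
                System.out.printf("%s object at (%.1f, %.1f)%n", s.command(), target.x(), target.y());
            } else {
                System.out.println("No recent pointing event for command: " + s.command());
            }
        }

        public static void main(String[] args) {
            MultimodalFusion fusion = new MultimodalFusion();
            fusion.onPoint(new PointEvent(120.0, 80.0, 1000));
            fusion.onSpeech(new SpeechEvent("delete", 1800));  // prints: delete object at (120.0, 80.0)
        }
    }

The window length and the pairing rule are exactly the kind of design choices the “multimodal vs. sensor fusion” question above points at.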

  29. Modalities • Input: • mouse • pen • speech • audio (non-speech) • tangible object manipulation • gaze, posture, body-tracking • Output • Visual displays • Haptics: Force Feedback • Audio • Smell • Taste

  30. Motivations • Hands busy / eyes busy • Mutual disambiguation • Faster input • More “natural” • Anthropomorphism?

  31. Input-Output loop • Modes: visual, auditory, haptic; • Scale: wearable, mobile, office, city, ambient; • How to move from one device to the next? e.g., augmented surfaces, recombinant computing, plasticity

  32. Integration

  33. Questions for Ubicomp technologies

  34. Seamless • From M. Weiser, UIST’94, Building Invisible Interfaces • Is a seamless building one in which you never notice as you move from place to place? • Making everything the same is easy; • Hard is letting everything be itself, with other things • Goal: seamful systems, with beautiful seams • See Chalmers, M. and MacColl, I. (2003), Seamful and Seamless Design in Ubiquitous Computing

  35. Considerations for Ubicomp & calm technologies • Ubicomp is about everyday distributed computation and connectedness. • The original idea was to get calm technology. • Not about fridges that order for you • Not about connecting your iPhone to your water faucet

  36. Ubiquitous computing design principles • Bliss • Distraction • Cognitive flow • Manuals • Transparency • Modelessness • Fear of interaction • Notifications (see cognitive flow) • Calming • Defaults • Textbook, p. 245

  37. Natural User Interfaces?

  38. On intuitive and natural

  39. On learning

  40. Making Sense of Sensing Systems • When I address a system, how does it know I am addressing it? • When I ask a system to do something how do I know it is attending? • When I issue a command (such as save, execute or delete), how does the system know what it relates to? • How do I know the system understands my command and is correctly executing my intended action? • How do I recover from mistakes? Bellotti et al., CHI 2002 from Stanford CS376

  41. Challenges of ubicomp • Challenge One: The “Accidentally” Smart Environment • Challenge Two: Impromptu Interoperability • Challenge Three: No Systems Administrator • Challenge Four: Social Implications of Aware Technologies • Challenge Five: Reliability • Challenge Six: Inference in the Presence of Ambiguity Edwards & Grinter, Ubicomp 2001

  42. Novel Interactions • Let’s get technical

  43. Implications for UI input handling • Event-based • Focus on widgets (or interactive elements); tendency to manage presentation + behavior • State machines • Focus on interactions; ability to handle multiple interactions simultaneously • Reactive / flow-based • Cascading reactive devices, input reconfiguration; real-time settings

  44. Architecture example I • Event-oriented • most GUIs • The Event Heap from the iRoom [Johanson & Fox, 2002] • Paper Toolkit [Yeh et al., UIST’08]

  45. Paper Toolkit

  46. Architecture example I

    public class SimplePaperApp {
      private static int numStrokes = 0;

      public static void main(String[] args) {
        Application app = new Application();
        Sheet sheet = app.createSheet(8.5, 11);
        Region inkReg = sheet.createRegion(1, 1, 4, 4);
        Region tapReg = sheet.createRegion(1, 6, 1, 1);
        Device remote = app.createRemoteDevice();
        inkReg.addEventHandler(new InkHandler() {
          public void handleInkStroke(InkEvent e) {
            numStrokes++;
          }
        });
        tapReg.addEventHandler(new ClickAdapter() {
          public void clicked(PenEvent e) {
            remote.displayMessage(numStrokes);
          }
        });
        app.run(); // starts event dispatch loop
      }
    }

  47. Architecture example II • State machines • Instrumental interaction and state machines • State machines in Qt 4.6 • VIGO: Instrumental Interaction in Multi-Surface Environments

  48. Instrumental interaction • Definition: • “An instrument is a mediator or two-way transducer between the user and domain objects. The user acts on the instrument, which transforms the user’s actions into commands affecting relevant target domain objects.”
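A minimal sketch of the state-machine style, in the same Java style as the Paper Toolkit example above (hypothetical class and method names, not the Qt or VIGO APIs): a press-drag-release interaction is modelled as explicit states with transitions driven by input events, independently of any particular widget.

    // Hypothetical sketch: a press-drag-release interaction as an explicit state machine.
    public class DragInteraction {
      enum State { IDLE, PRESSED, DRAGGING }

      private State state = State.IDLE;
      private double startX, startY;

      public void onPress(double x, double y) {
        if (state == State.IDLE) {
          startX = x;
          startY = y;
          state = State.PRESSED;        // arm the interaction
        }
      }

      public void onMove(double x, double y) {
        if (state == State.PRESSED || state == State.DRAGGING) {
          state = State.DRAGGING;       // first move turns a press into a drag
          System.out.printf("dragging by (%.1f, %.1f)%n", x - startX, y - startY);
        }
      }

      public void onRelease(double x, double y) {
        if (state == State.DRAGGING) {
          System.out.println("drop at (" + x + ", " + y + ")");
        } else if (state == State.PRESSED) {
          System.out.println("click at (" + x + ", " + y + ")");  // press without movement
        }
        state = State.IDLE;             // any release returns to the idle state
      }

      public static void main(String[] args) {
        DragInteraction d = new DragInteraction();
        d.onPress(10, 10);
        d.onMove(20, 25);
        d.onRelease(30, 40);
      }
    }

Because each interaction is its own explicit machine, several such machines can run side by side, which is one way to handle multiple simultaneous interactions, as noted on the earlier slide.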
