A Toolkit for Evaluating Peripheral Awareness Displays

Presentation Transcript


  1. A Toolkit for Evaluating Peripheral Awareness Displays
     Jennifer Mankoff, Carnegie Mellon
     Tara Matthews, UC Berkeley
     Awareness Workshop, CHI 2005

  2. Peripheral Displays
     • Allow continuous awareness of info while performing another activity
     • Well suited to helping people maintain awareness of others

  3. PD Evaluation is Difficult
     “How do you evaluate a peripheral display? You can’t do a typical lab usability study. We had to implement a working prototype and deploy it in people’s workplace. If we had found that it was all wrong, we had to throw away all that work.”
     “Fundamentally, most technology… is about maximizing efficiency… A lot of ambient display stuff is not about maximizing efficiency…. Paradigms from Jakob Nielsen… basically say, how easy is it to do this thing, how efficient is it. And when [efficiency] is not necessarily a fundamental metric, I think you have to reevaluate those systems of evaluation.”
     “With a successful ambient display – how are you going to prove [it is successful], except if people after two years are still using it.”

  4. Toolkit Support for PD Evaluation
     • Peripheral Display Toolkit (PTK)
       • Support for managing user attention in PD development
       • Enables quick prototype creation
     • Add support for evaluating PTK prototypes
       • Automate data gathering in the field
       • Enable evaluation early in the design cycle (Design → Prototype → Evaluate)

  5. Toolkit Support for PD Evaluation
     Measure metrics specific to PDs:
     • awareness: amount of displayed info that users are able to recall, understand, or use
     • distraction: amount of attention the display attracts away from a user’s primary action
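The awareness metric above can be operationalized as the fraction of displayed items a user later recalls correctly. A minimal sketch of that scoring, with hypothetical names (this is illustrative, not the PTK's actual API):

```python
def awareness_score(displayed_items, recalled_items):
    """Fraction of displayed info the user could correctly recall."""
    if not displayed_items:
        return 0.0
    correct = set(recalled_items) & set(displayed_items)
    return len(correct) / len(displayed_items)

# Example: 4 items were shown; the user recalls 3 of them (plus one guess).
score = awareness_score(
    ["bus arriving", "email from Ann", "rain later", "server load high"],
    ["bus arriving", "rain later", "server load high", "meeting moved"],
)
```

Incorrect recollections simply fail to match displayed items, so guesses do not inflate the score.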

  6. Methods: Measuring Awareness & Distraction
     • Context-aware experience sampling
     • Image experience sampling
     • Audio experience sampling
     • Logging of PD task & primary task
     • Data analysis

  7. Context-Aware Experience Sampling
     • Gather in situ feedback from users during field studies: ask questions throughout the day
       • Contextual feelings
       • Knowledge questions
     • PTK support:
       • Graphical pop-up questionnaires
       • Context-aware: asked at key moments, based on knowledge of event contents, notification levels, and interruptibility
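A context-aware trigger of the kind described above might combine an event's notification level with the user's current interruptibility. A hypothetical sketch (the class and function names are illustrative, not the PTK API):

```python
from dataclasses import dataclass

@dataclass
class DisplayEvent:
    content: str
    notification_level: int  # e.g. 0 = ignore ... 4 = interrupt

def should_sample(event: DisplayEvent, user_interruptible: bool,
                  min_level: int = 2) -> bool:
    """Pop up a questionnaire only at a key moment: the event was salient
    enough to plausibly register, and the user can be interrupted."""
    return user_interruptible and event.notification_level >= min_level
```

Gating on both conditions keeps questionnaires tied to moments when an answer is informative, instead of polling users on a fixed schedule.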

  8. Image & Audio Experience Sampling
     • Intille et al. [CHI ’02]: an image is captured during a field study and later presented to the user for reflection
     • PTK support:
       • Images & audio
       • Multiple cameras & mics
       • Context-aware
     • Benefits:
       • Users are not interrupted mid-activity
       • Works anywhere in the environment
       • The moment a user sits at the computer is between tasks: a good time for interruptions
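The key mechanism above is deferral: a capture happens in the moment, but the question is presented only at a natural break, such as when the user next sits down at the computer. A minimal queue sketch under that assumption (hypothetical names, not the PTK API):

```python
import time
from collections import deque

class DeferredSampleQueue:
    """Buffer captured media samples; release them only at task boundaries."""

    def __init__(self):
        self._pending = deque()

    def capture(self, media_path: str):
        # Called when a context trigger fires; the user is NOT interrupted.
        self._pending.append((time.time(), media_path))

    def on_task_boundary(self):
        # Called at a natural break (e.g. the user sits at the computer):
        # return all buffered samples for reflection, oldest first.
        samples = list(self._pending)
        self._pending.clear()
        return samples
```

Because `capture` never prompts the user, any number of cameras or mics can feed the queue without adding interruptions.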

  9. Logging
     • PD task: input, output, notification levels
     • Primary & displaced tasks:
       • keystrokes
       • mouse movement & clicks
       • active application window
     • Helps explain user behavior:
       • measure changes in behavior
       • determine effects of the PD on the primary task
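Aligning PD activity with primary-task activity is easiest if every logged event is a timestamped record tagged with its source stream. A hypothetical record format (not the PTK's actual log schema):

```python
import json
import time

def log_event(log, source, kind, detail):
    """Append one timestamped record; `source` distinguishes the PD task
    from the primary task so the two streams can be aligned later."""
    log.append({"t": time.time(), "source": source,
                "kind": kind, "detail": detail})

log = []
log_event(log, "primary", "keystroke", "a")
log_event(log, "pd", "output", "new bus ETA shown")
log_event(log, "primary", "window", "focus: email client")

# Records serialize cleanly for offline analysis.
line = json.dumps(log[1])
```

Sorting the merged records by `t` recovers the interleaving of display output and user behavior for later analysis.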

  10. Data Analysis
     • Were users aware of displayed info?
       • Experience sample awareness answers: checked for correctness against input logs
       • Primary task logging: do task switches correspond to newly displayed info?
     • Were users distracted?
       • Primary task logging: slower keystrokes, pauses in activity
       • Categorize experience samples and primary task logs by notification level to see if answer/behavior trends emerge
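One distraction signal named above is slowed keystrokes after a display event. A hypothetical analysis sketch comparing the median inter-keystroke interval before and after an event (names and window size are illustrative assumptions):

```python
from statistics import median

def slowdown_ratio(key_times, event_time, window=5.0):
    """Ratio of median inter-keystroke interval after vs. before a display
    event; a ratio > 1 suggests the display drew attention away."""
    def intervals(ts):
        return [b - a for a, b in zip(ts, ts[1:])]

    before = [t for t in key_times if event_time - window <= t < event_time]
    after = [t for t in key_times if event_time <= t <= event_time + window]
    iv_before, iv_after = intervals(before), intervals(after)
    if not iv_before or not iv_after:
        return None  # not enough keystrokes around this event
    return median(iv_after) / median(iv_before)
```

Binning these ratios by the triggering event's notification level would support the trend analysis described in the last bullet.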

  11. Validation
     • Iteratively design a sound visualization for the deaf (Ho-Ching, CHI ’03)
     • Conduct an in-lab study & a long-term field study
     • Use the PTK to measure awareness & distraction
     • Compare our results with previous study results

  12. Conclusion
     • Goal: lower the cost of PD field studies
     • Plan: automate data gathering in the field
     • Focus: user awareness & distraction
     • Methods:
       • context-aware experience sampling
       • image experience sampling
       • audio experience sampling
       • logging of PD task & primary task
       • data analysis
     • Validate in a study of sound visualization for the deaf

  13. Questions?
     For more info: tmatthew@cs.berkeley.edu
     http://www.eecs.berkeley.edu/~tmatthew/projects/ptk.html
