CS298: HCI Design Clinics
Wizard of Oz Prototyping
03/08/10
Dr. Steven Dow
University of California, Berkeley
Design → Prototype → Evaluate

Why Do We Prototype?
• Get feedback on our design faster
• Experiment with alternative designs
• Fix problems before code is written
• Keep the design centered on the user
• Communicate with other designers, engineers, and clients
• Build confidence in solutions
• Stay motivated to persevere
Prototyping tip #1: Avoid over-investing in prototypes • Sunk-cost reasoning makes design changes difficult • You buy advance tickets for Wicked the Musical for $50 each • Later you find out the show has received mixed reviews • On the day of the show it’s raining and you have a headache • What would you rather do? • A) Go see the show, even though you may not enjoy it • B) Skip the show, stay home, and watch Wizard of Oz
Topics • Intro to Wizard of Oz prototyping • Exercise: wizarding a wake-up service • Dimensions of WOz prototyping • Exercise: wizarding a context-aware application
Wizard of Oz (WOz) methods in HCI • Make an application operate without (much) code • Must take less time/money than building the real thing • Get feedback on the user interface (fidelity matters) • Hi-fidelity interfaces can make users think it’s working • Low-fidelity gives users more license to suggest changes • Interface is just a façade for a working application • Fake the interaction (like paper prototyping, but the digital interface appears “real”) • An operator uses a separate (sometimes remote) control interface to facilitate user interaction • Iterate, iterate, iterate
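The façade idea above, a user-facing interface whose every response is actually chosen by a hidden operator through a separate control channel, can be sketched in a few lines of Python. All screen ids and function names below are hypothetical, not from the slides:

```python
# Minimal sketch of a WOz facade: the participant's view only renders
# whatever screen the hidden wizard selects; no application logic exists
# behind the interface.

SCREENS = {
    "home":    "Welcome! Say a command to begin.",
    "confirm": "Got it. Is this correct? (yes/no)",
    "done":    "All set. Goodbye!",
    "error":   "Sorry, I didn't catch that.",
}

def render_for_user(screen_id: str) -> str:
    """What the participant sees; it looks 'real' but is wizard-driven."""
    return SCREENS.get(screen_id, SCREENS["error"])

def wizard_session(wizard_choices):
    """Replay a sequence of screens chosen live by the wizard operator."""
    return [render_for_user(choice) for choice in wizard_choices]
```

In a live study the wizard would pick screens from a separate (possibly remote) console; here the choices are passed in as a list so the flow can be rehearsed and iterated on quickly.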
Examples of WOz methods • Natural language dialog – Travel Agent example • Dahlbäck, Jönsson, Ahrenberg, 1993
Examples of WOz methods • Speech User Interfaces – SUEDE • Klemmer, Sinha, Landay, 2000
How to make a WOz Prototype Put together an interface “skeleton” • Invest minimal time • Create “hooks” in the code where the wizard will provide input (e.g., selecting the next screen, entering text, entering a zone, recognizing speech); it must be possible to replace these with real computation later Map out scenarios and application flow • Plan what should happen in response to any user action Rehearse wizard functions with a colleague as the user • Make sure the wizard can perform the task
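A "hook" can be as small as one function whose body is the wizard today and real technology later. A minimal Python sketch, with hypothetical names:

```python
# Hypothetical sketch of a wizard "hook": the application calls
# recognize_speech() as if a recognizer existed. During WOz testing the
# hidden wizard supplies the answer; later the same call site is wired
# to real recognition code with no other changes to the app.

def make_wizard_recognizer(wizard_input):
    """wizard_input: callable the operator uses to type what was heard."""
    def recognize_speech(prompt):
        # During a study this would block on the wizard's console.
        return wizard_input(prompt)
    return recognize_speech

def make_real_recognizer():
    def recognize_speech(prompt):
        raise NotImplementedError("replace the wizard with a real recognizer")
    return recognize_speech

# The app depends only on the hook, so swapping implementations is one line:
recognize_speech = make_wizard_recognizer(lambda prompt: "wake me at 7 am")
```

Because the rest of the application never knows which implementation is behind the hook, the transition from wizard to working code is incremental rather than a rewrite.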
Prototyping tip #2: Manage users to get desired feedback • Users can respond very differently depending on how the study is conducted • Experiment on multiple UI designs: Users provide more “damaging” and valuable feedback when given alternatives Tohidi, Buxton, Baecker, Sellen, 2006
How to test a WOz Prototype What you tell the user matters • Typically users are not told about the human operator (more authentic, but considered unethical in some situations) Various evaluation methods may work • Think-aloud (users speak freely while performing tasks) • Retrospective (best when thinking aloud takes away from the app experience) • Heuristic evaluation (works with experts too) Give users tasks or a clear indication of what they should be doing • Hand-written tasks on paper • Debrief users; reveal the wizard
Wake-up Call Service Objective • Create and test a service that allows hotel guests to set, adjust, or cancel a wake-up call Rules • Form teams of 2 people • Map out the dialog tree and output statements • Handle all possible scenarios • Pull 2 people from other teams to “test” the service • Revise and fix any problems that are discovered • 20 minutes to write the script, 10 minutes to test with users
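One way to map out the dialog tree is as a table of states, each pairing the wizard's spoken line with transitions keyed by the wizard's on-the-fly classification of the caller's intent. The states, lines, and intent labels below are a hypothetical sketch, not the exercise's required script:

```python
# Hypothetical dialog tree for the wake-up call exercise: each node holds
# the wizard's output statement and the transitions available from it.
# The "other" edge is the fallback so every caller turn is handled.

DIALOG = {
    "greet":          ("Front desk, how can I help you?",
                       {"set": "ask_time", "adjust": "ask_time",
                        "cancel": "confirm_cancel", "other": "clarify"}),
    "ask_time":       ("What time would you like your wake-up call?",
                       {"time_given": "confirm_time", "other": "clarify"}),
    "confirm_time":   ("Your wake-up call is set. Anything else?",
                       {"no": "goodbye", "other": "greet"}),
    "confirm_cancel": ("Your wake-up call is cancelled. Anything else?",
                       {"no": "goodbye", "other": "greet"}),
    "clarify":        ("Sorry, could you repeat that?", {"other": "greet"}),
    "goodbye":        ("Have a good stay!", {}),
}

def run_dialog(intents, start="greet"):
    """Walk the tree given the wizard's classification of each caller turn."""
    node, transcript = start, []
    for intent in intents:
        line, edges = DIALOG[node]
        transcript.append(line)
        node = edges.get(intent, edges.get("other", node))
    transcript.append(DIALOG[node][0])
    return transcript
```

Rehearsing with `run_dialog(["set", "time_given", "no"])` before testing with real users makes it easy to spot scenarios the tree does not yet cover.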
Mechanical Turk • 18th Century Chess Playing Machine
Mechanical Turk • Leveraging human intelligence
Wizard design choices (dimensions) • Exposure of wizard • What does the user know about the hidden wizard? • Task • What part of an application does the wizard emulate? • Responsibility • How much should the system rely on the wizard? • Number of wizards • Can the wizard task be distributed across multiple operators? • Expertise of wizards • What does the wizard need to know? Can they be amateurs? • Location of wizards • Are the wizards co-located or remote? • Timing and setting • When and where are Wizard of Oz methods appropriate?
Prototyping tip #3: Get the design right and the right design • Wizard methods can facilitate development throughout a user-centered design process Dow, MacIntyre, Lee, Oezbek, Bolter and Gandy, 2005
Tradeoffs in Wizard of Oz prototyping • Advantages • Faster, and thus cheaper and more iterative, prototypes • More “real” than paper prototyping • Identifies bugs and problems with the current design • Places the user at the center of development • Can envision applications that would be challenging to build • Designers learn by playing the wizard • Disadvantages • Simulations may misrepresent otherwise imperfect technology • May simulate technologies that do not exist (and may never) • Wizards may need to be trained and can be inconsistent • Playing the wizard can be exhausting • Some system features (and limitations) cannot be simulated • May not be appropriate in certain venues (e.g., the home)
Summary of Wizard of Oz Methods • Use “Wizard of Oz” prototypes to design (and get feedback!) throughout the development process • Learn what users like and dislike, discover problems, and continually evolve the application • Transition toward a fully functional, user-centered application (adjusting wizard roles and configuration as needed) • Account for potential downsides: inconsistent wizard performance, mismatch between simulation and actual technology, inappropriateness for some testing environments, etc.
What is “context-aware”? System senses something about users (or the environment) • Location – where is the user? • Identity – who is the user? • Proximity – who or what is near the user? • Activity – what is the user doing? • Situational – what’s happening in the user’s environment? • Multi-modal – how is the user interacting with the system? System then reacts to or prompts the user, or acts on the user’s behalf How is context sensed? • Cameras with image processing • Accelerometers for motion sensing, rotation • GPS and other indoor location sensors • Light/heat sensors and other environmental sensors • Microphones for sounds; smell sensors; web sensors; etc.
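Because the application only ever sees sensed values, a wizard can stand in for any of these sensors behind a common read interface. A hypothetical Python sketch for location sensing (class and function names are illustrative):

```python
# Sketch of wizard-simulated context sensing: the app reads location the
# same way whether a GPS unit or a hidden wizard provides the value, so
# unbuilt or unreliable sensors can be tested with users today.

class WizardLocationSensor:
    """The wizard watches the participant and updates the zone by hand."""
    def __init__(self, initial_zone="lobby"):
        self._zone = initial_zone

    def wizard_set(self, zone):
        # Wizard-side control channel, hidden from the participant.
        self._zone = zone

    def read(self):
        # App-side interface, identical to what a real sensor would expose.
        return self._zone

def react_to_context(sensor):
    """The application reacts to sensed context, not to its source."""
    return f"Showing screen for zone: {sensor.read()}"
```

Swapping `WizardLocationSensor` for a real GPS-backed class later requires no change to `react_to_context`, which is exactly the transition path WOz prototyping is meant to enable.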
WOz methods for context • Cook’s Collage – Kitchen memory aid Tran, Calcaterra, Mynatt, 2005
WOz methods for context • Topiary – Location-based applications Li, Hong, Landay, 2004
Create a context-aware application Objective • Create and test an application with paper or digital screens that reacts to users using various sensors Rules • Form teams of 3-4 people • Choose a context and sketch scenarios • Map out application flow, create screens to respond to events • Pull people from other teams to “test” the service • 30 min to write application, create screenshots • 10 min per group to test • Show and Tell at 5:40 (you must justify your “sensors”)