
Some Usability Engineering Methods

This presentation covers usability engineering methods: discussions of good and bad web designs, heuristic evaluations, and end-user testing. It shows how empirical, iterative, user-centered design can be brought to bear on product development, and explains when, why, and how to employ methods such as remote testing, usability walkthroughs, and heuristic evaluations so that you can advocate for users and create intuitive interfaces.


Presentation Transcript


  1. Some Usability Engineering Methods Randolph Bias 10/8/2012

  2. Today • Divide the day into thirds: • RB on usability evaluation methods • Discussion of your good and bad web designs • HE exercise

  3. Remember . . . • Our approach, within usability engineering of web sites and other user interfaces, is: • Empirical • Iterative • User-Centered Design

  4. The Methods of Usability Engineering . . . • Are employed in order to enable you to bring user data (empiricism) to bear on the emerging product design. • You (the usability engineer) become an advocate for the user in the product development process.

  5. One big problem • Cost

  6. Three Methods to address cost • Remote End-user testing (lab testing) • Heuristic Evaluations • Usability Walkthroughs Today let’s talk about WHY and WHEN we employ one method or another, and HOW to carry them out.

  7. Gold Standard: End-user Testing • Also called “lab testing” • Can be done on paper-and-pencil design, prototype, early code, existing product

  8. EUT - Benefits • Gather performance and satisfaction data • Performance data: time on task, error rates, # calls to the help desk, # references to the documentation, . . . . • Satisfaction data: End-of-test questionnaire • Can be find-and-fix or benchmarking • Ensure coverage of certain parts of the UI – we have good control over the tasks
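
To make the performance measures concrete, here is a minimal sketch of how per-participant EUT results might be summarized. The session records and field layout are hypothetical, not a prescribed format.

```python
# Minimal sketch: summarizing performance data from an end-user test.
# The session records below are hypothetical examples.
from statistics import mean

# (participant, task, seconds on task, error count, completed?)
sessions = [
    ("P1", "book-trek", 142, 1, True),
    ("P2", "book-trek", 210, 3, False),
    ("P3", "book-trek", 95, 0, True),
]

times = [seconds for _, _, seconds, _, _ in sessions]
completion_rate = sum(1 for s in sessions if s[4]) / len(sessions)
errors_per_session = sum(s[3] for s in sessions) / len(sessions)

print(f"Mean time on task:  {mean(times):.0f} s")
print(f"Completion rate:    {completion_rate:.0%}")
print(f"Errors per session: {errors_per_session:.1f}")
```

Satisfaction data from the end-of-test questionnaire would be tallied separately, alongside these performance figures.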

  9. EUT - Limitations • Artificial situation • Successful test doesn’t “prove” the product works • Aside – It’s ALL about confidence. • Need representative users! • Ease of learning vs. ease of use • Hard to test longitudinally

  10. EUT -- What to test? • Can rarely cover all the UI. • I like to test: • critical tasks • frequent tasks • nettlesome tasks

  11. Rubin’s 4 Types • Exploratory (working on the “skeleton” – maybe the Information Architecture) • Assessment test (working on the “meat and flesh” of the UI) • Validation test (does it meet the objectives?) • Comparison test (compare two competing designs) • Rubin, J. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. Wiley: New York, NY, 1994. (Superseded by Rubin and Chisnell.)

  12. Set up • Create environment (ambient setting, HW, SW) • Identify participants • Establish test roles (test monitor/administrator, data logger, timer, video operator, product experts (SMEs), other observers) • Create test plan • Prepare observers • Prepare test materials

  13. What materials? • Instructions • Informed consent form • NDA • Permission to videotape • Test scenarios • Questionnaire(s)

  14. Conduct the Test • Welcome the test participant • Communicate that this is not a test of THEM, and that they can leave at any time • The scenarios should match the real-world setting • Ask the test participants to "think aloud" so you can better understand their intent • Offer a post-test questionnaire, and debrief

  15. After the Test • Quick data to product team • Assign severities and build recommendations • Build archival report • Serve as change agents!

  16. Ah, but REMOTE • Saves tons of travel money • Allows you to get otherwise hard-to-get test participants. • Allows them to be in their own environments. • Might allow product designers/developers to watch from their own office. • But . . . lose some fidelity of the test environment (video?) • Some added set-up cost (time)

  17. What is a Heuristic Evaluation? • Evaluators systematically inspect the application interface to check for compliance with recognized usability guidelines (heuristics). (Thus, an INSPECTION method.) • Identifies major and minor usability problems • Conducted by three to five experienced usability engineers (or one!) • Each problem is reported along with the heuristic it violates

  18. Problems Identified • Probability of one evaluator finding a major usability problem: 42%* • Probability of one evaluator finding a minor usability problem: 32%* • More evaluators = more problems identified • *From www.useit.com
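
The figures above behave as the usual problem-discovery curve predicts. Assuming the commonly cited Nielsen-Landauer model, where the share of problems found by n independent evaluators is 1 - (1 - L)^n for a single-evaluator detection rate L, a quick sketch shows why three to five evaluators are usually enough:

```python
# Expected share of problems found by n evaluators, assuming the commonly
# cited Nielsen-Landauer discovery model: found(n) = 1 - (1 - L)**n,
# where L is the single-evaluator rate quoted on the slide above.
def share_found(single_rate: float, n_evaluators: int) -> float:
    return 1 - (1 - single_rate) ** n_evaluators

for n in (1, 3, 5):
    major = share_found(0.42, n)  # major usability problems
    minor = share_found(0.32, n)  # minor usability problems
    print(f"{n} evaluator(s): ~{major:.0%} of major, ~{minor:.0%} of minor problems")
```

With five evaluators this works out to roughly 93% of major and 85% of minor problems, which is the basis for the "more evaluators = more problems identified" point.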

  19. Strengths of an HE • Done by people experienced in usability, not just "dumb" users • Can identify both major and minor usability problems • Can be done relatively quickly and inexpensively • UNlike EUT, can sometimes cover every corner of a UI or web site

  20. Weaknesses of an HE • If done at the end of design, designers may be resistant to changes • Some designers/developers may be unmoved by "just opinions" • Experienced usability evaluators may miss content problems that actual users would find • Can HELP address this issue by using SMEs

  21. Typical Methodology • Interface is exercised: a 1st pass to develop the big picture, a 2nd pass to accomplish typical tasks • Each problem is reported along with the heuristic it violates • Comments are consolidated • Severity levels: Critical, Major, Moderate, Minor

  22. Nielsen’s Usability Heuristics • Visibility of system status • The system should always keep users informed about what is going on, through appropriate feedback within reasonable time. • Match between system and the real world • The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order. • User control and freedom • Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo. • Consistency and standards • Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions. • Error prevention • Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

  23. Nielsen’s Heuristics (cont’d.) • Recognition rather than recall • Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate. • Flexibility and efficiency of use • Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions. • Aesthetic and minimalist design • Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility. • Help users recognize, diagnose, and recover from errors • Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution. • Help and documentation • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

  24. Severity Levels Used • Critical Usability Issue: loss of user data, system shutdown, abandoned task • Major Usability Issue: completed task, but with considerable frustration or extra steps • Moderate Usability Issue: moderate workaround or multiple attempts • Usability Suggestion

  25. Sample Summary of Results

  TYPE               CRITICAL   MAJOR   MINOR
  Installation           0        0       3
  Project Creation       0        0       5
  Visual Layout          0        3      16
  Code Editor            0        1       1
  Run Project            0        0       1
  Debug                  0        0       1
  Main IDE               0        0       7
  Icons                  0        0       2
  Help                   0        0       2
  Totals                 0        4      38
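
A summary like the table above can be assembled mechanically once individual findings are consolidated. Here is a minimal sketch; the finding records and field names are illustrative, not a prescribed format.

```python
# Minimal sketch: tallying consolidated heuristic-evaluation findings by
# UI area and severity, as in the sample summary above. The findings here
# are hypothetical examples.
from collections import Counter

# (UI area, severity, heuristic violated)
findings = [
    ("Visual Layout", "Major", "Consistency and standards"),
    ("Visual Layout", "Minor", "Aesthetic and minimalist design"),
    ("Help", "Minor", "Help and documentation"),
    ("Icons", "Minor", "Recognition rather than recall"),
]

tally = Counter((area, severity) for area, severity, _ in findings)
for (area, severity), count in sorted(tally.items()):
    print(f"{area:15s} {severity:8s} {count}")
```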

  26. How to . . . • http://www.useit.com/papers/heuristic/heuristic_evaluation.html

  27. Other Heuristics • http://www.stcsig.org/usability/topics/articles/he-checklist.html

  28. Welcome . . . • . . . to today’s Usability Walkthrough. • You are here to evaluate the Zara Tours web site. • This is a test of the site, not of you.

  29. Usability Walkthroughs Purpose • To collect user data -- from multiple users at one time -- to help drive the design of a user interface

  30. Context • This is just one step in a series of “User-Centered Design” methods we’re employing: • Heuristic evaluation (professional judgment) • Usability walkthrough (YOU ARE HERE) • End-user testing in the lab

  31. Key Characteristics • Three types of people in the room • We’re going to go through some task flows, as a group. • You’ll have hard-copy packets, where you will write the actions you would take if you were online, carrying out a certain prescribed task. • No discussion until all have written a response. • We’ll announce the “correct” action (according to the current design). • We’ll discuss the page -- representative users first.

  32. Please note . . . • Design • is • HARD • SW development teams budget time to debug the code; we’re debugging the design!

  33. This is a test of the DESIGN! • Not a test of YOU. • If you have trouble finding the “right” answers, then WE have a problem with the UI. • The site design team is being very bold, to expose their design to users early like this -- but they’re doing it because they realize the benefit, and they want to get it right!

  34. And so . . . • Given the input we hear today, the site design team (assisted by a usability professional) will redesign the interface.

  35. The Flow of the Day • We’ll hand you a piece of paper with a scenario description on it. • We’ll hand you a packet of screen shots, in order. DO NOT LOOK AHEAD, please. • We’ll ask you to write down on the page what action you’d take, to accomplish the task in the scenario.

  36. The Flow (cont’d.) • We’ll announce the “right” answer. • I’ll ask you to indicate if you got it right. When you did not, I’ll welcome discussion. • I’ll welcome the designers and developers to jump in, as discussion winds down. • I’ll let SOME redesign go on, real-time. • Write any further comments beneath the screen shot on the page.

  37. Yet more flow . . . • Then you’ll be asked to turn the page. • Now, you’ll have to “reset,” and assume you got to this new page (somehow). • Then, what action would you take on THIS page, while still trying to accomplish the task? • After a scenario, we’ll give you a questionnaire to complete.

  38. Benefits • Lots of data early in the design cycle. • Usability of individual screens, terminology, SOME task flow. • Collaborative redesign on the fly.

  39. Limitations • Can’t get some data (e.g., time on task). • You can’t browse, as you might online -- tendency to “lose your place.” • Feel free to turn BACK in your packet, but not ahead.

  40. Any questions? • We’ll try to finish up by 2:30 or so. • Then have some more informal time to discuss the interface. • We may get more informal at points. • THANK YOU very much for being here. • Absolutely NO ONE should be embarrassed -- not the users, not the developers.

  41. OK, first task: • Task 1: Find how many days it takes to make the trek up Mt. Kilimanjaro via the Machame Route.

  42. Which, Why, When?

  43. Check it out • This is a pretty good representation: http://www.youtube.com/watch?v=O94kYyzqvTc&feature=related

  44. The Wiki • http://courses.ischool.utexas.edu/rbias/wiki/ • Username: seven • Password: plusorminustwo

  45. Next Week • Book reviews • Remember – 4-minute presentations • If you wish to have a ppt accompany your presentation, please send it to Dan by noon, Friday. • It’s gonna be like we read 18 books!! • White papers due in 2 weeks. • Test plans due in 4 weeks.
