
Presentation Transcript


  1. Evidence of Bias in the Production of User Test Lists by Software Analysts, and Proposed Mitigation Strategy
  Leonel Morales - URL / Arturo Rivera - UNIS

  2. Context
  • User tests highly effective for detecting usability issues
  • HCI professionals unavailable
  • Task list created by developers or analysts
  • Lists include hints and messages that reduce value of results

  3. First Indications of Bias
  • Software Engineering II course
  • Students developed task lists for an application
  • Students taped the tests and presented them as an assignment
  • Lists contained terminology and references to controls in the application

  4. Experiment in Industry
  • Three companies from the Guatemalan Export Software Commission (Sofex) volunteered to participate in the study
  • Basic introduction: the 5 E’s
  • Analysts created and e-mailed task lists for a particular application
    • CRM
    • Badge issuance
    • Inventory

  5. Sample Issues Found
  • Application-specific terminology
    • “Create personal incident” / “Create company incident”
  • Forcing the application’s mental model
    • “Create product family”, “Create sub-family”, “Create sub-sub-family”
  • Tasks imposed by the system and naturally meaningless to the user
    • “Mark the checklist”
    • “Save changes and close”

  6. Sample Issues Found (2)
  • Task is a means to an end
    • “Print the report”
    • “Query by cost”
  • Artificial order of tasks
    • “1. Insert text ‘xxx’” / “2. Insert text ‘yyy’” / “3. Add background image in folder z:\”
  • Tasks too generic: no information to perform an actual test
    • “Read messages”, “Send messages”
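The issue types on these two slides can be read as the start of the categorization mentioned later under Future Work. Purely as an illustration, the Python sketch below encodes those categories and annotates two of the sample tasks quoted in the deck; the enum and class names are assumptions made here, not something proposed in the presentation.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import List

    class ListIssue(Enum):
        # Categories taken from slides 5 and 6; the identifiers are illustrative.
        APPLICATION_TERMINOLOGY = auto()   # "Create personal incident"
        IMPOSED_MENTAL_MODEL = auto()      # "Create sub-sub-family"
        SYSTEM_IMPOSED_TASK = auto()       # "Mark the checklist"
        MEANS_TO_AN_END = auto()           # "Print the report"
        ARTIFICIAL_ORDER = auto()          # numbered, order-dependent steps
        TOO_GENERIC = auto()               # "Read messages"

    @dataclass
    class TaskReview:
        # One entry from an analyst-written task list, annotated during review.
        task_text: str
        issues: List[ListIssue]

    # Hypothetical usage: flag sample tasks quoted in the deck.
    reviews = [
        TaskReview("Create company incident", [ListIssue.APPLICATION_TERMINOLOGY]),
        TaskReview("Read messages", [ListIssue.TOO_GENERIC]),
    ]
    for entry in reviews:
        print(entry.task_text, "->", [issue.name for issue in entry.issues])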

  7. Other Issues
  • Analysts tend to think in abstract terms
  • Task lists require specifics
  • Analysts have a mental model of the application
  • They attempt to impose this model on the user
  • Lists show tasks that should be tested, even if artificial
    • Change password
    • Exit the application

  8. Proposed Mitigation Strategy
  • Develop scenarios, “stories”
    • Cases that exemplify what users will find in real life
    • Should be realistic
    • Contain all the information that the user would have available
  • Example
    • Provide a sample badge design
    • Ask the participant to reproduce the design
    • Issue a badge for Mr. John Doe, DOB June 3, 1975, from Marketing, with employee ID #12345
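To make the contrast concrete, here is a minimal sketch of a scenario-style test item placed next to an analyst-style task, using the badge example from this slide. The Scenario structure and its field names are assumptions made for illustration; only the badge data (John Doe, June 3, 1975, Marketing, employee ID 12345) comes from the deck.

    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class Scenario:
        # A realistic "story" handed to the test participant (illustrative structure).
        goal: str                 # what the user wants to accomplish, in the user's own terms
        context: str              # the situation as the user would encounter it
        data: Dict[str, str] = field(default_factory=dict)  # everything the user would have at hand

    # A hypothetical analyst-style task, for contrast: it names controls and
    # imposes the application's vocabulary on the participant.
    biased_task = "Open the badge module, click 'New badge' and mark the checklist"

    # Scenario-style replacement built from the badge example on this slide.
    badge_scenario = Scenario(
        goal="Prepare an employee badge for a new hire, matching the sample design provided",
        context="You are given a printed sample badge and the new employee's details.",
        data={
            "name": "John Doe",
            "date of birth": "June 3, 1975",
            "department": "Marketing",
            "employee ID": "12345",
        },
    )

The design point is simply that the scenario carries the real-world information the participant needs, instead of naming the application’s screens and controls.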

  9. Future Work
  • Field-test the mitigation strategy
  • Let developers know about the problem
  • Evaluate the results
  • Categorize as many list issues as we can
