
Deciding How to Measure Usability: How to Conduct a Successful User Requirements Activity

This comprehensive guide delves into the strategies for measuring usability, from understanding what can be measured to setting quantitative criteria. It explores the significance of matching measures to your usability goals, concerns, and the product's development stage. Learn how to conduct successful user requirements activities and select performance measures effectively. The text covers tools like data logging software and criteria selection, emphasizing the importance of user feedback and streamlined testing processes.


Presentation Transcript


  1. Deciding How to Measure Usability • How to conduct a successful user requirements activity?

  2. Deciding How to Measure Usability • Understanding what you can measure • Matching measures to your goals and concerns • Matching measures to the product's stage of development • Setting quantitative criteria for each measure and each task

  3. Understanding what you can measure • In a usability test you collect both performance measures and subjective measures.

  4. Understanding what you can measure • Performance measures: counts of behaviors or actions you can observe. Always quantitative, e.g., how many errors people make and how many times they repeat the same error.

  5. Understanding what you can measure • Subjective measures: people's perceptions, opinions, and judgments. Either quantitative or qualitative, e.g., give people a 5- or 7-point scale and ask them to rate how difficult the product is to use.
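
As an illustration only (not part of the slides), here is a minimal Python sketch of how the two kinds of measures might be tallied from a small test; the participant data and the 7-point scale values are hypothetical.

```python
# Illustrative sketch: one performance measure (observed error counts)
# and one subjective measure (a 7-point difficulty rating).
from statistics import mean

# Hypothetical data for three participants
errors_per_participant = {"P1": 3, "P2": 0, "P3": 5}   # performance: observable counts
difficulty_ratings_7pt = {"P1": 2, "P2": 1, "P3": 6}   # subjective: 1 = very easy, 7 = very hard

print("Total errors:", sum(errors_per_participant.values()))
print("Mean errors per participant:", mean(errors_per_participant.values()))
print("Mean difficulty rating (1-7):", mean(difficulty_ratings_7pt.values()))
```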

  6. Deciding How to Measure Usability • Understanding what you can measure • Matching measures to your goals and concerns • Matching measures to the product's stage of development • Setting quantitative criteria for each measure and each task

  7. Deciding How to Measure Usability • Understanding what you can measure • Matching measures to your goals and concerns • Matching measures to the product's stage of development • Setting quantitative criteria for each measure and each task

  8. Matching Measures to Your Goals and Concerns The performance measures you choose should be directly related to the quantitative usability goals and the concerns behind the usability test. How do you collect performance data? With data logging software.

  9. Data logging codes (example): S = Stop the test, B = Take a break, T = Stop the task, A = Assist, M = Menu error, H = Help desk, S = Select-from-list error, F = Frustration, E = Other error, N = Observations. Example logged event: O 06:21:33 Comment: Looking for online help for To: field

  10. Points to consider in building a data logging program • Include preset codes for events that happen in every test • Keep each code short • Pair each code with a short description • Timestamp every event • Allow easy backup and movement of data into a file What if you don't have data logging software? Should you measure positive behavior?
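
As an illustration only, here is a minimal Python sketch of a logger that follows these points; the event codes are hypothetical (loosely modeled on the legend in slide 9), and this is not any specific tool named in the presentation.

```python
# Minimal data-logging sketch: preset short codes, each paired with a
# short description, every event timestamped, and easy export of the
# log to a CSV file for backup and later analysis.
import csv
from datetime import datetime

# Preset codes for events that happen in every test (hypothetical set,
# loosely modeled on the legend in slide 9).
EVENT_CODES = {
    "S": "Stop the test",
    "T": "Stop the task",
    "B": "Take a break",
    "A": "Assist",
    "M": "Menu error",
    "H": "Help desk",
    "F": "Frustration",
    "E": "Other error",
    "O": "Observation",
}

log = []  # in-memory event log: (timestamp, code, description, comment)

def record(code: str, comment: str = "") -> None:
    """Timestamp the event and append it to the log."""
    timestamp = datetime.now().strftime("%H:%M:%S")
    log.append((timestamp, code, EVENT_CODES[code], comment))

def export(path: str) -> None:
    """Move the logged data into a file for backup and analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time", "code", "event", "comment"])
        writer.writerows(log)

# Example session
record("O", "Looking for online help for To: field")
record("M")
export("session_log.csv")
```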

  11. Deciding How to Measure Usability • Understanding what you can measure • Matching measures to your goals and concerns • Matching measures to the product's stage of development • Setting quantitative criteria for each measure and each task

  12. Matching Measures to the Product's Stage of Development It is very important to consider where the product is in the development cycle when planning performance measures. • Example: testing a prototype of a manual (without an index)

  13. Deciding How to Measure Usability • Understanding what you can measure • Matching measures to your goals and concerns • Matching measures to the product's stage of development • Setting quantitative criteria for each measure and each task. Choose the performance measures you want to count.

  14. Selecting Performance Measures General concerns: • Ease of use for users who have never used email • Ease of use for users who have used other email programs • Will the online help be useful? • Will new users be able to select items from screens quickly and easily?

  15. Selecting Performance Measures Specific concerns: • Will users be able to read a specific piece of mail and skip over mail they don't want to read? • Will new users be able to find and select people's addresses to send them mail? • Will users be able to find the right menu path to read/write/send a message?

  16. Setting Quantitative Criteria for Each Measure and Task How do you select criteria for performance measures? Typical criteria levels for performance measures: • Excellent • Acceptable or OK • Unacceptable Base your criteria on users.
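
As a purely illustrative sketch (the task and thresholds below are hypothetical, not taken from the presentation), a measured task completion time could be classified against these three levels like this:

```python
# Illustrative sketch: classify one task completion time against the
# three typical criteria levels named on the slide. Thresholds are
# hypothetical and would be chosen per task, based on real users.
def classify_task_time(seconds: float, excellent_max: float, acceptable_max: float) -> str:
    """Return the criteria level for one task completion time."""
    if seconds <= excellent_max:
        return "Excellent"
    if seconds <= acceptable_max:
        return "Acceptable"
    return "Unacceptable"

# Example: hypothetical criteria for a "send a message" task
for t in (45, 90, 200):
    print(t, "seconds ->", classify_task_time(t, excellent_max=60, acceptable_max=120))
```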

  17. Setting Quantitative Criteria for Each Measure and Each Task

  18. Are the measures the same for all tasks in a given test? No. Are the performance criteria the same for all tasks? No.

  19. Do you take the test situation into account when selecting criteria? Yes. Should you count system response time when setting the criteria? Yes.

  20. What we have discussed so far: the tasks that participants will do during tests, and how to measure participants' performance with the product.

  21. Deciding How to Measure Usability • How to conduct a successful user requirements activity?

  22. How to conduct a successful user requirements activity? • Welcoming your participants • Dealing with late and absent participants • Warm-up exercises • Inviting observers • Introducing your think-aloud protocol • Moderating your activity • Recording and note taking • Dealing with awkward situations • Conclusion

  23. Welcoming Your Participants • Ask participants to arrive about 15 minutes early. • Make introductions. • Put up welcome signs. • Play CDs while participants wait. Don't leave participants alone.

  24. Dealing with Late and Absent Participants Despite your best efforts, some participants will be late. • The late participant • You can't wait any longer • Including late participants • The no-show

  25. Warm-up Exercises • Start with light conversation. • Introduce yourself. • Provide nametags. Don't spend too much time on the warm-up.

  26. Inviting Observers • Developers who observe get to know user requirements better. • Tell observers to come early and remain quiet. Don't allow managers to observe.

  27. Introducing Your Think-Aloud Protocol Ask participants to verbalize: • their thought process while doing the task • the steps in the task • their expectations and evaluation statements. Provide some examples to participants, e.g., thinking aloud while using a stapler. Watch that participants speak loudly enough.

  28. Moderating Your Activity • Have personality • Ask questions • Stay focused • You are not a participant • Keep the activity moving • No critiquing • Everyone should participate • No one should dominate • Practice makes perfect

  29. Take Notes • You get data immediately and can start analysis right away • Participants feel they are saying important things Problems: • You can get wrapped up in being a stenographer • If the discussion pace is fast, not all information can be captured

  30. Video and Audio Recording • Captures nuances and body language of participants (which you can't get in notes) • You can listen to the recording later and take notes • Video is better than audio recording Problems: • Can make participants uncomfortable

  31. What is the best approach? Combine video/audio recording with note taking.

  32. Dealing with Awkward Situations Some uncomfortable situations and ways to deal with them: • Participant issues • Observer issues

  33. Participant Issues • A participant is called away in the middle of a test • A participant's cell phone rings continuously • The wrong participant was recruited • A participant thinks he is on a job interview • A participant refuses to be videotaped and wants to leave (contd.)

  34. Participant Issues • A participant is confrontational with other participants • A participant dominates the group • A participant is not truthful about his identity • A participant refuses to sign the consent form • The fire alarm sounds in the middle of the test and the participant still wants to continue

  35. Product Team/Observer Issues • The team changes the product mid-test • An observer turns on the light in the control room • Observers talk loudly during an activity

  36. Conclusion To conduct any of your user requirements activities effectively: • Handle participant arrivals well. • Encourage participants to think creatively. • Moderate the activity. • Instruct your participants to think aloud. • Deal with any awkward situations that may arise during testing.
