Ch 13. Asking Users & Experts • Team 3: Jessica Herron, Lauren Sullivan, Chris Moore, Steven Pautz
Asking Users and Experts: Introduction • We want to find out what users do, want to do, like, or don’t like • Several tools & techniques are available • Interviews • Questionnaires • Heuristic Evaluation • Cognitive Walkthroughs • Strengths/Weaknesses of these tools
Interview Basics • Remember the DECIDE framework • Determine goals, Explore questions, Choose the evaluation paradigm and techniques, Identify practical issues, Decide on ethical issues, and Evaluate the data • Goals and questions guide all interviews • Two types of questions: closed questions (a fixed set of possible answers) and open questions (the interviewee answers freely)
Avoid Interview Pitfalls • Long questions • Compound sentences - split them into two • Jargon and language the interviewee may not understand • Leading questions that make assumptions, e.g., "Why do you like ...?" assumes the interviewee actually likes the product • Be aware of unconscious biases, e.g., gender stereotypes
The Interview Process • Dress in a similar way to participants • Check recording equipment in advance • Devise a system for coding names of participants to preserve confidentiality. • Be pleasant • Ask participants to complete an informed consent form
Group interviews • Also known as focus groups • Typically 3-10 participants • Provide a diverse range of opinions • Need to be managed to ensure that everyone contributes, that the discussion isn't dominated by one person, that the agenda of topics is covered, and that digressions are kept in check
Other Sources • Telephone interviews: can't see the participant's body language or other nonverbal cues • Online interviews: asynchronous (e.g., e-mail) or synchronous (e.g., chat room) • Retrospective interviews: check with the participant to be sure the interviewer correctly understood their remarks
Analyzing interview data • Depends on the type of interview • Structured interviews can be analyzed like questionnaires • Unstructured interviews generate data like that from participant observation • It is best to analyze unstructured interviews as soon as possible to identify topics and themes from the data
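Where closed questions can be tabulated directly, unstructured interview data has to be coded into themes first. As a rough illustration, the sketch below tallies how often a few hypothetical theme keywords appear in transcript excerpts; the codes and excerpts are invented, and real qualitative coding is done by an analyst reading the data, not by keyword matching.

```python
from collections import Counter

# Hypothetical codes an analyst might assign while reading transcripts;
# in practice the themes emerge from the data, not from a fixed keyword list.
THEME_KEYWORDS = {
    "navigation": ["menu", "back button", "lost"],
    "performance": ["slow", "loading", "crash"],
    "terminology": ["confusing label", "jargon"],
}

def tally_themes(transcripts):
    """Count how many interview excerpts touch on each theme."""
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(k in lowered for k in keywords):
                counts[theme] += 1
    return counts

if __name__ == "__main__":
    excerpts = [
        "I got lost trying to find the back button.",
        "The search page was really slow to load.",
        "Some of the labels use jargon I didn't recognize.",
    ]
    for theme, n in tally_themes(excerpts).most_common():
        print(f"{theme}: {n} excerpt(s)")
```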
Links • UPA (Usability Professionals' Association) • Discovering User Needs • Sage Services: user research and usability techniques available through Sage
Questionnaires • Questions can be closed or open • Closed questions are easiest to analyze, and the analysis may be done by computer • Can be administered to large populations • Paper, e-mail, and the web are used for dissemination • An advantage of electronic questionnaires is that data goes straight into a database and is easy to analyze • Sampling can be a problem when the size of the population is unknown, as is common online
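To illustrate why closed questions are easy to analyze by computer, here is a minimal sketch that summarizes responses to a single 5-point Likert question; the question wording and the response values are invented for the example.

```python
from statistics import mean, median
from collections import Counter

# Hypothetical responses to a closed, 5-point Likert question such as
# "The software was easy to use" (1 = strongly disagree, 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]

print("n        :", len(responses))
print("mean     :", round(mean(responses), 2))
print("median   :", median(responses))
print("breakdown:", dict(sorted(Counter(responses).items())))
```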
Questionnaire Style • Questionnaire formats can include: • 'yes'/'no' checkboxes • checkboxes that offer many options • Likert rating scales • semantic differential scales • open-ended responses
Designing Questionnaires • Make questions clear and specific. • When possible, ask closed questions and offer a range of answers. • Consider including a 'no-opinion' option for questions that seek opinions. • Think about the ordering of questions. • Avoid complex multiple questions.
Designing Questionnaires • When scales are used, make sure the range is appropriate and does not overlap. • Make sure the ordering of scales is intuitive and consistent. • Avoid jargon. • Provide clear instructions on how to complete the questionnaire. • Strike a balance between white space and the need to keep the questionnaire compact.
Encouraging a good response • Ensure the questionnaire is well designed so that participants do not get annoyed. • Provide a short overview section. • Include a stamped, self-addressed envelope. • Contact respondents through a follow-up letter. • Offer incentives. • Explain why you need the questionnaire to be completed.
Online Questionnaires • Advantages: • Responses are received quickly. • Copying and postage costs are lower. • Data can be transferred immediately into a database. • The time required for analysis is reduced. • Errors in the questionnaire can be corrected easily.
Online Questionnaires • Disadvantages: • Sampling is problematic if the population size is unknown. • It is hard to prevent individuals from responding more than once. • Individuals have also been known to alter the questions in e-mail questionnaires.
Developing a Web-based Questionnaire • Steps: • Produce an error-free, interactive electronic version from the original paper-based one. • Make the questionnaire accessible from all common browsers and readable on different-sized monitors and from different network locations. • Make sure information identifying each respondent is captured and stored confidentially. • User-test the survey with pilot studies before distributing it.
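As a sketch of the steps above, the example below serves a one-question web survey and illustrates capturing respondent identifiers confidentially by storing only a one-way hash. Flask, the form fields, and the CSV data file are assumptions made for the illustration, not tools the chapter prescribes.

```python
import csv
import hashlib
from flask import Flask, request

app = Flask(__name__)

# Illustrative single-question form; a real survey would have many items.
FORM = """
<form method="post">
  <p>How easy was the product to use? (1 = very hard, 5 = very easy)</p>
  <input type="number" name="ease" min="1" max="5" required>
  <input type="email" name="email" placeholder="Email (kept confidential)" required>
  <button type="submit">Submit</button>
</form>
"""

@app.route("/survey", methods=["GET", "POST"])
def survey():
    if request.method == "POST":
        # Store a one-way hash instead of the raw email so responses in the
        # data file cannot be linked back to a named respondent.
        respondent_id = hashlib.sha256(request.form["email"].encode()).hexdigest()[:12]
        with open("responses.csv", "a", newline="") as f:
            csv.writer(f).writerow([respondent_id, request.form["ease"]])
        return "Thank you for completing the questionnaire."
    return FORM

if __name__ == "__main__":
    app.run(debug=True)
```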
Asking experts • Experts use their knowledge of users & technology to review software usability • Expert critiques (crits) can be formal or informal reports • Heuristic evaluation is a review guided by a set of heuristics • Walkthroughs involve stepping through a pre-planned scenario noting potential problems
Heuristic evaluation • Developed by Jakob Nielsen in the early 1990s • Based on heuristics distilled from an empirical analysis of 249 usability problems • These heuristics have been revised for current technology, e.g., HOMERUN for the web • Heuristics are still needed for mobile devices, wearables, virtual worlds, etc. • Design guidelines form a basis for developing heuristics
Nielsen’s heuristics • Visibility of system status • Match between system and real world • User control and freedom • Consistency and standards • Help users recognize, diagnose, recover from errors • Error prevention • Recognition rather than recall • Flexibility and efficiency of use • Aesthetic and minimalist design • Help and documentation
HOMERUN • High-quality content • Often updated • Minimal download time • Ease of use • Relevant to users' needs • Unique to the online medium • Netcentric corporate culture
Discount evaluation • Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. • Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
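The 75-80% figure can be illustrated with the Nielsen-Landauer model, in which the proportion of problems found by i independent evaluators is 1 - (1 - lambda)^i, where lambda is the chance that a single evaluator finds a given problem. The sketch below uses an illustrative lambda of 0.26, chosen so that five evaluators fall in the quoted range; the actual value of lambda varies from study to study.

```python
LAMBDA = 0.26  # illustrative per-evaluator detection rate; varies by study

def proportion_found(evaluators, lam=LAMBDA):
    """Nielsen-Landauer model: share of problems found by a group of evaluators."""
    return 1 - (1 - lam) ** evaluators

for i in range(1, 11):
    print(f"{i:2d} evaluator(s): {proportion_found(i):.0%} of problems found")
```

With this illustrative lambda, five evaluators find roughly 78% of the problems, and adding more evaluators yields diminishing returns.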
3 stages for doing heuristic evaluation • Briefing session to tell experts what to do • Evaluation period of 1-2 hours in which each expert works separately, taking one pass to get a feel for the product and a second pass to focus on specific features • Debriefing session in which experts work together to prioritize problems
Heuristic evaluation of websites • MEDLINEplus, a medical information website • Keith Cogdill, commissioned by the National Library of Medicine (NLM), evaluated this website • The evaluation was based on the following heuristics: • Internal consistency • Simple dialog • Shortcuts • Minimizing the user's memory load • Preventing errors • Feedback • Internal locus of control
MEDLINEplus suggestions for improvement • Arrangement of health topics: arrange topics alphabetically as well as in categories • Depth of navigation menu: a higher "fan-out" in the navigation would enhance usability, i.e., many short menus rather than a few deep ones
Turning guidelines into heuristics for the web • Navigation, access, and information design are the three categories for evaluation • Navigation • Avoid orphan pages • Avoid long pages with excessive white space (they force scrolling) • Provide navigation support • Avoid narrow, deep hierarchical menus • Avoid non-standard link colors • Provide a consistent look and feel
Turning guidelines into heuristics for the web (cont.) • Access • Avoid complex URLs • Avoid long download times that annoy users, e.g., from large pictures, .wav audio files, and large software files • Information Design • Have good graphical design • Avoid outdated or incomplete information • Avoid excessive color usage • Avoid gratuitous use of graphics and animation • Be consistent
Advantages and problems • Few ethical and practical issues to consider • Can be difficult and expensive to find experts • The best experts have knowledge of both the application domain and the users • Biggest problems: important problems may get missed, and many trivial problems are often identified
Walkthroughs • Alternative approach to heuristic evaluation • Expert walks through task, noting problems in usability • Users are usually not involved • Several forms: • Cognitive Walkthrough • Pluralistic Walkthrough
Cognitive Walkthroughs • Involves simulating the user’s problem-solving process at each step in the task • Focus on ease of learning • Examines user problems in detail, without needing users or a working prototype • Can be very time-consuming and laborious • Somewhat narrow focus • Adaptations may offer improvements
Cognitive Walkthrough Steps • One or more experts walk through the design description or prototype with a specific task or scenario • For each task, they answer three questions: • Will users know what to do? • Will users see how to do it? • Will users understand from feedback whether the action was correct or not?
Cognitive Walkthrough Steps • A record of critical information is compiled • Assumptions about what would cause problems and why, including explanations of why users would face difficulties • Notes about side issues and design changes • A summary of the results • The design is revised to fix the problems identified
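One lightweight way to compile that record is to note, for each step, the answers to the three walkthrough questions plus any side issues. The structure below is an illustrative sketch, not a format defined in the chapter; the task steps and notes are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    """One step of a cognitive walkthrough of a task scenario."""
    action: str
    knows_what_to_do: bool      # Will users know what to do?
    sees_how_to_do_it: bool     # Will users see how to do it?
    understands_feedback: bool  # Will feedback confirm the action was correct?
    notes: str = ""

    @property
    def problematic(self) -> bool:
        # Any "no" answer flags a potential usability problem at this step.
        return not (self.knows_what_to_do and self.sees_how_to_do_it
                    and self.understands_feedback)

# Hypothetical record for a two-step scenario.
record = [
    WalkthroughStep("Open the export dialog", True, False, True,
                    notes="Export is hidden under a 'More' menu."),
    WalkthroughStep("Choose the PDF format", True, True, True),
]

for step in record:
    flag = "PROBLEM" if step.problematic else "ok"
    detail = f" - {step.notes}" if step.notes else ""
    print(f"[{flag}] {step.action}{detail}")
```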
Pluralistic Walkthroughs • Variation of Cognitive Walkthrough • Users, developers, and experts work together to step through a scenario of tasks • Strong focus on users’ tasks • Provides quantitative data • May be limited by scheduling difficulties and time constraints
Pluralistic Walkthrough Steps • Scenarios are developed in the form of a series of screens • Panel members individually determine the sequence of actions they would use to move between screens • Everyone discusses their sequence with the rest of the panel • The panel moves to the next scenario
Asking Users and Experts: Summary • Various techniques exist for getting user opinions • Structured, unstructured, and semi-structured • Interviews, focus groups, questionnaires • Expert evaluations: heuristic evaluation and walkthroughs • Various techniques are often combined for added accuracy and usefulness
Interview: Jakob Nielsen • “Father of Web Usability” • Pioneered heuristic evaluation • Demonstrated cost-effective methods for usability testing & engineering • Author of several acclaimed design books • www.useit.com • www.nngroup.com