REMOTE TESTING 39TUR 2012/2013
Software for X • Example: software needs to be tested for the market in country X. • Possibilities: • Invite 10 people from X to the Czech Republic • Air tickets, accommodation, visas • Not their own environment • Go to X • Use a local recruitment agency • Rent a usability lab • Vaccination • Is it always necessary? • Use remote testing
Traditional Methods vs. Remote Testing • Traditional methods • Participants sit in the lab • Testers physically observe & record • Remote testing • Participants sit in their office/home • Testers observe their screen over a network connection & record
Hierarchy of remote testing methods • Same place, same time: classic usability testing (covered up until now; not included in this course) • Different place, same time: remote testing via teleconferencing • Different place, different time: surveys, on-line evaluation tools, … • Same place, different time: data collected on site and picked up later
Same time, different place • Synchronous • People connected via teleconferencing • (Diagram: moderator and stakeholders connected to the participant via teleconferencing)
Remote Testing • The testers observe the participants remotely • Via telephone • Via videoconferencing • Via screen capturing and streaming software • Could be a combination of a remote desktop (VNC, …) + a screen grabber (Camtasia, …) – a simple scripted setup is sketched below • Methodology is similar to that of classic usability tests • With certain differences
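A combination like this can also be scripted on the moderator's machine. The following is a minimal sketch, assuming a Linux/X11 setup with a VNC viewer and ffmpeg installed; the host name, display, capture geometry, and output file are illustrative assumptions, not part of the original slides.

```python
import subprocess

# Hypothetical recording helper on the moderator's machine: open a VNC
# viewer to the participant's computer and capture the local X11 display
# with ffmpeg. Host, display and geometry are illustrative assumptions.
PARTICIPANT_HOST = "participant.example.org"

viewer = subprocess.Popen(["vncviewer", PARTICIPANT_HOST])

recorder = subprocess.Popen([
    "ffmpeg",
    "-f", "x11grab",            # grab the local X11 screen
    "-video_size", "1280x720",  # capture geometry (assumed)
    "-framerate", "15",
    "-i", ":0.0",               # display to capture
    "session_recording.mp4",
])

viewer.wait()          # when the viewer window is closed...
recorder.terminate()   # ...stop the screen recording
```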
Quality Comparison • Selvaraj & Houck-Whitaker: • Remote tests have at least the same effectiveness as the “traditional” methodology • Benefits • Time and cost savings • You and your participants don’t need to spend time traveling • Realistic context of use • You reach people in their own environment (which might also be a drawback) • Geographic representation • Different portions of the globe can be covered • Access to professionals • It’s easier to ask a professional earning “over 9000” to take part, because the test will take less of their time
Quality Comparison • Limitations • Lack of nonverbal cues • Communication delay • Low resolution of the video, or perhaps no video link at all • No control over the participant’s conditions • No way to check that the software is properly installed • No way to make sure the participant is not being disturbed • The moderator can’t assist the users on site • The users are on their own when using the system • A higher level of IT literacy is expected of the users • Cannot test with novice users • Computer security measures may interfere
Quality Comparison • Limitations • Will the users trust our application? • People are afraid of spyware • Privately owned vs. corporate computers • Will the stakeholders believe that the test is not faked? • There is no observer room
Remote Testing Overview • Very similar to the “classic usability testing” • Define Objectives & Target Audience • Set up Test Scenario • Recruit Test Users • Carry out Tests • Analyze Findings • Design Report & Brief Stakeholders
Remote Testing Overview • What are you testing? • Who are you testing? • Representative Tasks • Within time-limits & user capabilities • In line with test objectives • Methods of data collection • Screen capture • Questionnaires, interviews
Remote Testing: Test Preparation • Consult the objectives with the project stakeholders • Develop instructions for the participants • Run a pilot test with home users • Apply the changes suggested by the results of the pilot test
Remote Testing: Recruitment • Define the user profile & recruitment criteria • Set up a recruitment screener • The screener can be filled out on the web • Questionnaires feed a database of potential participants • Selection from the database (a simple selection sketch is shown below) • Telephone screener • Very low success rate (as with telephone marketing) • Decide on the incentives
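A minimal sketch of the “selection from the database” step, assuming the screener answers are already stored as simple records; the field names and criteria below are illustrative, not taken from the slides.

```python
# Hypothetical screener database: one record per potential participant.
candidates = [
    {"name": "A", "country": "X", "age": 34, "uses_product_weekly": True},
    {"name": "B", "country": "X", "age": 19, "uses_product_weekly": False},
    {"name": "C", "country": "Y", "age": 41, "uses_product_weekly": True},
]

def matches_profile(candidate):
    """Recruitment criteria derived from the user profile (illustrative)."""
    return (candidate["country"] == "X"
            and candidate["age"] >= 21
            and candidate["uses_product_weekly"])

shortlist = [c for c in candidates if matches_profile(c)]
print([c["name"] for c in shortlist])   # -> ['A']
```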
Remote Testing: Recruitment • Recruitment channels • Web • Social networks, mailing lists, job portals • Traditional: Newspapers, ads • With a URL to enter • Unlikely to succeed • Recruitment agency • May be important when testing in an unknown market • Perhaps better targeted participants • Web – advanced services • ethnio.com • clicktale.com
ethn.io • Recruiting people directly from a website • Procedure: • Set up a screener in your ethnio.com profile • Set up your website to display the screener • A website visitor will see the screener • If they respond, you are notified immediately • You contact the person by telephone / e-mail • Such participants are likely to be in your target group.
Remote Testing: Recruitment • Specific requirements • The users must be able to install: • The software that is to be tested • The tools used for the test • The task sheet for the participants must be more specific • There is no moderator at their location • Consent solicitation • Participants can’t physically sign a sheet of paper • By voice, saying “Yes, I agree.” • By clicking “I agree” on the screener form
Remote Testing: Technology to use • Teleconferencing • Skype • Screen capture and streaming • VNC • Remote Desktop in MS Windows
Carry out the Test • During the test: • Confirm user profile eligibility • Ask for permission to record session • Limit moderator intrusion • Encourage thinking aloud • Take notes • Deliver incentive/payment • Have fun
Analysis & Reports • During tests: track all usability issues • After each test: compare notes & analyze • After all tests: summarize patterns & major problems • Set up report & sample videos • Communicate to all stakeholders
Same place, different time • Data are physically acquired on site • Data are picked up later on • Examples: • Customer satisfaction survey books • Elections • (Geocaching)
Different time, different place • Asynchronous • Messages are passed between the testers and the participants • The whole test can take a considerable amount of time due to communication delays between the testers and the participants • Testers provide instructions • Through a website / e-mail message • Participants provide data • By answering a questionnaire • By monitored interaction with the product • The data are aggregated automatically
Different time, different place • Features • Can be done automatically • Good for quantitative data collection • Good when there are a lot of participants (25–100) • Drawback • We control the conditions even less
Questionnaire-based Testing • Questionnaire • A set of questions • With defined responses ([yes][no], [1][2][3][4][5], …) • Open-ended questions • The same questionnaire is administered to all participants • Easy to administer • Point to a web form • Send a structured e-mail • Easy to process (see the sketch below) • Automatic processing of the web forms • Automatic processing of returned e-mails
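To illustrate “easy to process”, here is a minimal sketch that aggregates closed-ended (1–5) answers exported from a web form into per-question means; the file name and column layout are assumptions.

```python
import csv
from collections import defaultdict

# Hypothetical export from the web form: one row per participant,
# one column per question, closed-ended answers on a 1-5 scale.
answers = defaultdict(list)

with open("responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        for question, value in row.items():
            if value.isdigit():          # skip open-ended answers
                answers[question].append(int(value))

for question, values in answers.items():
    mean = sum(values) / len(values)
    print(f"{question}: mean={mean:.2f}  n={len(values)}")
```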
Questionnaire-based Testing • Not many people respond to questionnaires • Need to “market” the study well • How to aim at a specific target group? • The questionnaire should contain some screening questions (an embedded screener) • Danger of … • … self-selection!
SUMI • Software Usability Measurement Inventory • Measuring software quality from the user’s point of view • “Quality of Use” • Input: • The software or its prototype must exist • 10 users minimum • Output: • Five “grades”: Efficiency, Affect, Helpfulness, Control, Learnability • Based on existing database of gathered questionnaires • Kept by the authors of SUMI
SUMI: Use • How it can be used: • Assess new products during product evaluation • Make comparisons between products or versions of products • Set targets for future application developments • Able to test verifiable goals for quality of use • Track achievement of targets during product development • In a quantitative manner • Source: http://www.ucc.ie/hfrg/questionnaires/sumi/whatis.html
SUMI: Scales • Efficiency • Tasks are completed by the user in a direct and timely manner • Affect • How much the product captures the user’s emotional response • Helpfulness • The product seems to assist the user • Control • Users feel that they set the pace, not the product (they are in control) • Learnability • Ease with which the user can learn to use the software and/or its new features
SUMI: Questionnaire • 50 fixed and predefined questions, such as: • “This software responds too slowly to inputs” • “I would recommend this software to my colleagues” • “The instructions and prompts are helpful” • “I sometimes wonder if I am using the right command” • “I think this software is consistent” • Responses to these questions: • “Yes” • “No” • “Undecided”
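The actual mapping of items to scales and the scoring key are proprietary (see the next slide), so the sketch below only shows how the raw “Yes / Undecided / No” answers might be coded before being sent off for scoring; the numeric coding is an assumption.

```python
# Hypothetical coding of SUMI-style answers; the real scoring key is not
# public, so this merely prepares the raw responses for submission.
CODING = {"Yes": 2, "Undecided": 1, "No": 0}

responses = {
    "This software responds too slowly to inputs": "No",
    "I would recommend this software to my colleagues": "Yes",
    "The instructions and prompts are helpful": "Undecided",
}

coded = {item: CODING[answer] for item, answer in responses.items()}
print(coded)
```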
SUMI: Processing • The assignment of questions to scales is not disclosed. • SUMI is a commercial service • Know-how of the authors • Procedure: • Participants try the tested system • SUMI questionnaires are administered to the participants by the testers • Responses to the questionnaires are sent to the authors of SUMI • By e-mail, web form, … • Testers receive the grades from SUMI • For a nominal fee (hundreds of USD)
SUMI: Example Evaluation • (Chart: scores on the five scales plotted against the reference value of 50)
SUMI: Evaluation • The results are reported with respect to the corpus of previously gathered data • The values show the usability of the system compared to the reference score (50 on each scale) • The data can be used to compare two different systems • Better score vs. worse score
SUMI: Evaluation • Enough to provide unbiased and objective results? • YES • Enough to give insight into particular problems? • NO … we only have 5 numbers as output • We know nothing about the sources of errors
Automating Usability Testing • Usability testing: a prototype or the final application is provided to a set of users, and the evaluator collects and analyzes usage data; it can be based on a set of predetermined tasks • What can be automated in such a method? • Capture of usage data • Analysis based on predefined metrics or a model • Usability evaluation of: navigation, functionalities • (Slides by Federico M. Facca)
Capturing Data • Information can be: • Easy to record but difficult to interpret (e.g., keystrokes) • Meaningful but difficult to label correctly (e.g., when can a task be considered completed?) • Method types: • Performance logging (events and their time of occurrence are recorded, no evaluator; see the sketch below) • Remote testing (an assigned task is performed by the user and monitored by evaluators)
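A minimal sketch of the performance-logging variant described above: the instrumented application appends timestamped events to a log file, with no evaluator involved. The event names and file path are illustrative.

```python
import json
import time

LOG_PATH = "usage_events.jsonl"   # illustrative storage location

def log_event(event, **details):
    """Append one timestamped usage event as a JSON line."""
    record = {"t": time.time(), "event": event, **details}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

# Events an instrumented application might emit during a task:
log_event("app_started")
log_event("menu_opened", menu="File")
log_event("task_completed", task_id=3, duration_s=42.7)
```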
Capturing Data – the Web – Server-side Logging • Web servers commonly log each user request to the server • Available information: IP address, request time, requested page, referrer • We can derive: number of visitors, breakdown by countries, coverage by robots, … (see the sketch below)
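A minimal sketch of deriving such figures from a standard access log; the file name and regular expression assume an Apache/NGINX “combined” log format and are not part of the slides.

```python
import re
from collections import Counter

# Apache/NGINX "combined" log format (simplified pattern).
LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                  r'"(?P<method>\S+) (?P<page>\S+) [^"]*" '
                  r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)"')

visitors, pages = set(), Counter()

with open("access.log") as f:
    for line in f:
        m = LINE.match(line)
        if not m:
            continue
        visitors.add(m["ip"])          # rough proxy for "number of visitors"
        pages[m["page"]] += 1

print("unique IPs:", len(visitors))
print("most requested pages:", pages.most_common(5))
```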
Server-side Logging • Pro • Huge quantity of easily available data • Does not require “ideal” users • Typical questions that we can answer: • “Which content is interesting?” • “Do people reach all content?” • “Is all content necessary?” … which is not the same as: • “Is the navigation good?” • “Does the new design keep people longer on the site?” • “Does the new design make people buy more?”
Server-side Logging • Disadvantages: • Highly quantitative method • Almost no data on the user’s exact interaction with the interface
Client-side Logging • Requires dedicated tools and settings • The web client must be enhanced to log information on the interaction • The client pushes the information into a repository on the testers’ server (a minimal collector is sketched below) • Available information: IP address, request time, requested page, referring page, mouse position on the screen, clicked links, back button, … • Pro • Actual data of the user’s exact interaction with the interface • Sessions are automatically reconstructed • Against • The participant must use this enhanced browser
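On the testers’ side, the repository can be as simple as an HTTP endpoint that stores whatever events the enhanced client pushes. Below is a minimal sketch using only the Python standard library; the port and storage file are assumptions.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

EVENT_LOG = "client_events.jsonl"   # illustrative storage location

class CollectorHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON event pushed by the instrumented web client
        # (mouse position, clicked link, back button, ...).
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8")
        with open(EVENT_LOG, "a") as f:
            f.write(body + "\n")
        self.send_response(204)       # no content to return
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), CollectorHandler).serve_forever()
```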
Tools: “Formal” Client-side User Tracking/Analysis • Commercial tools • ETHNIO (http://www.ethnio.com/) • Ulog/Observer (http://www.noldus.com) • UserZoom (http://www.userzoom.com) • ClickTale (http://www.clicktale.com/) • Usabilla (http://www.usabilla.com) • Nielsen Eye Tracking (example in the next slides) • Other tools (some are a bit old) • WebQuilt (http://guir.berkeley.edu/projects/webquilt/) • SCONE/TEA (http://www.scone.de/docus.html) • NIST WebMetrics (http://zing.ncsl.nist.gov/WebTools/, not only for tracking and the related analysis)
Tools: “Informal” Client-side Tracking/Analysis • Commercial tools • Google Analytics (http://www.google.com/analytics/) • Fireclick (http://www.fireclick.com/) • SiteCatalyst (http://www.omniture.com/products/web_analytics) • Hitslink (http://os.hitslink.com/) • Crazy Egg (http://crazyegg.com) – nice example • Usabilla (http://usabilla.com) • … and many more • Free tools • Search with Google and you will find some • Server-side analysis • Again, many solutions!