This study evaluates the use of eye-tracking technology to assess the visibility and effectiveness of key questionnaire elements such as routing instructions, reminder bubbles, and alpha-numeric boxes.

What the eye doesn’t see: Evaluating a paper-based questionnaire using eye-tracking technology

Lyn Potaka, Statistics NZ

Introduction
• Eye-tracking technology is a potential tool for questionnaire evaluation
• Primarily used for web development
• Potentially useful for paper questionnaire development (Redline & Lankford, 2001)
• Feasibility study in the NZ context

Eye-tracking study
• Small-scale study due to limited funding
• NZ Census (2006) project
• In collaboration with the Access Testing Centre (Australia)

How the technology works
• Infra-red light reflecting off the eye illuminates areas of the retina important to vision
• A camera captures eye movements
• Software can then map the points at which the eye is resting on the questionnaire (a sketch of this mapping step follows below)

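As a rough illustration of that mapping step, here is a minimal sketch in Python. The region names, coordinates and gaze samples are all invented for the example, and real eye-tracking software also performs calibration and fixation filtering before this stage; nothing here reflects the actual software used in the study.

```python
# Minimal sketch: classify gaze samples by the questionnaire region they fall in.
# Regions and samples are hypothetical, for illustration only.

# Each region of the form is a named rectangle: (left, top, right, bottom) in mm.
REGIONS = {
    "banner":      (0, 0, 210, 25),
    "question_1":  (10, 30, 200, 60),
    "question_2":  (10, 65, 200, 95),
    "route_instr": (150, 65, 200, 75),  # routing instruction inside question 2
}

def locate(x, y):
    """Return the names of all regions containing the gaze point (x, y)."""
    return [name for name, (l, t, r, b) in REGIONS.items()
            if l <= x <= r and t <= y <= b]

# Hypothetical gaze samples: (x, y, timestamp in ms).
samples = [(100, 10, 0), (60, 45, 50), (160, 70, 100), (60, 200, 150)]

for x, y, t in samples:
    hits = locate(x, y) or ["off-form"]  # empty result means the gaze left the form
    print(f"{t:4d} ms -> {', '.join(hits)}")
```

Aggregating these classified points over time is what yields the dwell times and navigational paths discussed in the findings.
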
Key objectives
• Primary objective:
  • To assess eye-tracking as a tool for questionnaire evaluation
• Secondary objectives:
  • To evaluate the visibility of key elements on the form
  • In particular: routing instructions, reminder bubbles and alpha-numeric boxes

Routing instructions
• Bracketed response options with a single routing instruction
• Shorter line lengths
• Concerns regarding errors of commission

Reminder bubbles
• Bubbles to remind respondents to mark correctly or to look for more information
• Bubbles appearing outside of the main navigational path

Alpha-numeric boxes
• Concerns that boxes would prevent respondents from seeing options appearing underneath
• Two versions tested (right-aligned boxes & indented boxes)

Method
• 16 respondents interviewed:
  • New Zealand residents
  • Split of male and female
  • Aged 18–55 years
• Half-hour interviews
• 4-page Census questionnaire (47 questions)

Findings: General observations
• Respondents typically observed information presented in the banner but didn’t dwell there
• Respondents spent less time looking at questions in the lower-right regions of the form
• Respondents didn’t always read all of the information presented before answering questions

Findings: Routing instructions
• No errors of omission observed
• Some errors of commission recorded
• Some respondents making errors of commission had observed the routing instruction but did not skip
• Suggests respondents who do not act on routing instructions immediately will often fail to recall them
• Indicated that individual routing instructions at the end of each response option would be a better design

Findings: Reminder bubbles
• Bubbles were often missed
• Some bubbles were more likely to be missed than others
• Characteristics of questions may have had an impact (e.g. position on page, complexity of question)
• Indicated bubbles should be used only for non-essential information

Findings: Alpha-numeric boxes
• Respondents sometimes failed to observe options which appeared below the alpha-numeric boxes
• This occurred for both versions of the questionnaire
• Respondents were less likely to miss options if they were actively seeking out an answer
• Indicated alpha-numeric boxes would pose a greater risk for particular question types

What did we learn?
• Study confirmed the importance and impact of visual design on data quality
• Supported existing knowledge and research on visual design
• Small sample size limited the conclusions:
  • Not appropriate to compare formats
  • Further work required to identify the question characteristics most likely to influence results

Disadvantages
• Required quite a lot of time (large amount of data to integrate and analyse)
• Dependent on the expertise and knowledge of technology specialists
• Cost (?)
• Technology had limitations (e.g. data loss when respondents turned the page or leaned in too close)

Advantages
• Dwell times and navigational patterns helped to identify difficult questions (see the dwell-time sketch below)
• Provided an objective measure, which was convincing for clients
• Gave indications of why mistakes were occurring (e.g. routing errors)
• Helped us to identify improvements (e.g. position of routing instructions)

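As an illustration of how dwell times can flag difficult questions, here is a minimal sketch in Python. The fixation data, question identifiers and the 1.5× threshold are assumptions made for the example, not the analysis actually used in the study.

```python
from collections import defaultdict

# Hypothetical fixations: (question_id, fixation duration in ms).
fixations = [
    ("q1", 220), ("q1", 180),
    ("q2", 300), ("q2", 450), ("q2", 500), ("q2", 260),
    ("q3", 150),
]

# Sum fixation durations per question to get total dwell time.
dwell = defaultdict(int)
for question, duration in fixations:
    dwell[question] += duration

# Flag questions whose dwell time is well above the average: a crude
# proxy for "this question may be causing difficulty".
mean_dwell = sum(dwell.values()) / len(dwell)
for question, total in sorted(dwell.items()):
    flag = "  <-- unusually long" if total > 1.5 * mean_dwell else ""
    print(f"{question}: {total} ms{flag}")
```

In practice a flagged question would then be examined alongside navigational patterns and error data to understand why respondents lingered there.
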
What did we conclude?
• A useful tool for the design of paper questionnaires
• Individual projects (which questions are being read, which instructions are being missed, etc.)
• Potential to expand questionnaire design knowledge generally (e.g. characteristics of visual design that work best)
• Provides additional information to complement other evaluation strategies

What would we do differently?
• Consider the analysis carefully before beginning, to maximise learning
• Consider the sample carefully (number and key characteristics required)
• Allow more time

Planned research
• Analysis of ONS Census forms
• Using more advanced technology
• Building on the Stats NZ project to look at specific question characteristics that may affect results