Lecture 8: An evaluation framework and Observing users. Ref: Ch. 11-12
The aims • Explain key evaluation concepts & terms. • Describe the evaluation paradigms & techniques used in interaction design. • Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations. • Introduce the DECIDE framework.
Evaluation paradigm • Any kind of evaluation is guided explicitly or implicitly by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an ‘evaluation paradigm’.
User studies • User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.
Four evaluation paradigms • ‘quick and dirty’ • usability testing • field studies • predictive evaluation
Quick and dirty • ‘quick & dirty’ evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked. • Quick & dirty evaluations can be done at any time. • The emphasis is on fast input to the design process rather than carefully documented findings.
Usability testing • Usability testing involves recording typical users’ performance on typical tasks in controlled settings. Field observations may also be used. • As the users perform these tasks they are watched & recorded on video & their key presses are logged. • This data is used to calculate performance times, identify errors & help explain why the users did what they did. • User satisfaction questionnaires & interviews are used to elicit users’ opinions.
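As an aside, the logged key presses and timings mentioned above are typically reduced to simple performance measures. The sketch below shows one way this might look in Python; the log format, event names, and task are illustrative assumptions, not the output of any particular logging tool.

```python
from datetime import datetime

# Each record is (user, task, event, ISO timestamp) -- an assumed format.
log = [
    ("u1", "book_ticket", "task_start", "2024-05-01T10:00:00"),
    ("u1", "book_ticket", "error",      "2024-05-01T10:01:12"),
    ("u1", "book_ticket", "task_end",   "2024-05-01T10:03:40"),
]

def summarize(records):
    """Compute completion time (seconds) and error count per (user, task)."""
    stats = {}
    for user, task, event, ts in records:
        entry = stats.setdefault((user, task),
                                 {"start": None, "end": None, "errors": 0})
        t = datetime.fromisoformat(ts)
        if event == "task_start":
            entry["start"] = t
        elif event == "task_end":
            entry["end"] = t
        elif event == "error":
            entry["errors"] += 1
    for (user, task), e in stats.items():
        duration = (e["end"] - e["start"]).total_seconds()
        print(f"{user}/{task}: {duration:.0f}s, {e['errors']} error(s)")

summarize(log)  # prints: u1/book_ticket: 220s, 1 error(s)
```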
Field studies • Field studies are done in natural settings. • The aim is to understand what users do naturally and how technology impacts them. • In product design, field studies can be used to: - identify opportunities for new technology - determine design requirements - decide how best to introduce new technology - evaluate technology in use.
Predictive evaluation • Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. • Another approach involves theoretically based models. • A key feature of predictive evaluation is that users need not be present • Relatively quick & inexpensive
Overview of techniques • observing users, • asking users their opinions, • asking experts their opinions, • testing users’ performance, • modeling users’ task performance
DECIDE: A framework to guide evaluation • Determine the goals the evaluation addresses. • Explore the specific questions to be answered. • Choose the evaluation paradigm and techniques to answer the questions. • Identify the practical issues. • Decide how to deal with the ethical issues. • Evaluate, interpret and present the data.
Determine the goals • What are the high-level goals of the evaluation? • Who wants it and why? • The goals influence the paradigm for the study • Some examples of goals: • Identify the best metaphor on which to base the design. • Check to ensure that the final interface is consistent. • Investigate how technology affects working practices. • Improve the usability of an existing product.
Explore the questions • All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. • For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: - What are customers’ attitudes to these new tickets? - Are they concerned about security? - Is the interface for obtaining them poor? • What questions might you ask about the design of a cell phone?
Choose the evaluation paradigm & techniques • The evaluation paradigm strongly influences the techniques used, how data is analyzed and presented. • E.g. field studies do not involve testing or modeling
Identify practical issues For example, how to: • select users • stay on budget • stay on schedule • find evaluators • select equipment
Decide on ethical issues • Develop an informed consent form • Participants have a right to: - know the goals of the study - know what will happen to the findings - privacy of personal information - not be quoted without their agreement - leave when they wish - be treated politely
Evaluate, interpret & present data • How data is analyzed & presented depends on the paradigm and techniques used. • The following also need to be considered: - Reliability: can the study be replicated? - Validity: is it measuring what you thought? - Biases: is the process creating biases? - Scope: can the findings be generalized? - Ecological validity: is the environment of the study influencing it? e.g. the Hawthorne effect
Pilot studies • A small trial run of the main study. • The aim is to make sure your plan is viable. • Pilot studies check:- that you can conduct the procedure- that interview scripts, questionnaires, experiments, etc. work appropriately • It’s worth doing several to iron out problems before doing the main study. • Ask colleagues if you can’t spare real users.
Key points • An evaluation paradigm is an approach that is influenced by particular theories and philosophies. • Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. • The DECIDE framework has six parts: - Determine the overall goals - Explore the questions that satisfy the goals - Choose the paradigm and techniques - Identify the practical issues - Decide on the ethical issues - Evaluate, interpret & present the data • Do a pilot study
The aims • Discuss the benefits & challenges of different types of observation. • Describe how to observe as an onlooker, a participant, & an ethnographer. • Discuss how to collect, analyze & present observational data. • Examine think-aloud, diary studies & logging. • Provide you with experience in doing observation and critiquing observation studies.
What and when to observe • What: • Goals & questions determine the paradigms and techniques used. • Observation is valuable any time during design. • Observers: • outsiders looking on • participants, i.e., participant observers • ethnographers • When: • Any time during product development
Approaches to observation • Quick & dirty observations early in design • Observation in the field (i.e., field studies) • Observation in controlled environments (i.e., usability studies)
Frameworks to guide observation in the field • A simple framework: - The person. Who? - The place. Where? - The thing. What? • The Goetz and LeCompte (1984) framework: - Who is present? - What is their role? - What is happening? - When does the activity occur? - Where is it happening? - Why is it happening? - How is the activity organized?
The Robinson (1993) framework • Space. What is the physical space like? • Actors. Who is involved? • Activities. What are they doing? • Objects. What objects are present? • Acts. What are individuals doing? • Events. What kind of event is it? • Goals. What do they seek to accomplish? • Feelings. What is the mood of the group and of individuals?
Before going into the field, you need to consider • Goals & questions • Which framework & techniques to use • How to collect data and record events • Which equipment to use • How to gain acceptance and trust • How to handle sensitive issues • Whether and how to involve informants • How to analyze the data and your notes • Consider working as a team • Plan to look at the situation from different perspectives
Observing as an outsider • As in usability testing • More objective than participant observation • In a usability lab the equipment is already in place • Recording is continuous • Analysis & observation are almost simultaneous • Care is needed to avoid drowning in data • Analysis can be coarse or fine grained • Video clips can be powerful for telling the story • (As the Chinese saying goes, ‘the onlooker sees most clearly’.)
Participant observation & ethnography • Debate about differences • Participant observation is key component of ethnography • Must get co-operation of people observed • Informants are useful • Data analysis is continuous • Interpretivist technique • Questions get refined as understanding grows • Reports usually contain examples
Data collection techniques • Notes & still camera: - flexible - difficult to observe and write at the same time • Audio & still camera: - less intrusive than video - mobile and flexible - no visual record - onerous to transcribe the data • Video: - captures both visual and audio data - intrusive - attention is focused on what the camera captures - data analysis is time-consuming (it can take up to 100 times the duration of the recording)
Tracking users • Diaries: participants record when events happened, what they did, and what they thought about it. - Advantage: useful when users are hard to reach - Disadvantages: relies on participants’ diligence, and they may remember events as better or worse than they really were • Interaction logging: key presses, mouse and other device movements, all time-stamped. - Advantages: unobtrusive; can capture large volumes of data - Disadvantages: may still raise ethical concerns; powerful tools are needed to analyze the data
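To make the interaction-logging idea concrete, here is a minimal sketch of a time-stamped event logger in Python. The class name, event types, and JSON-lines file format are illustrative assumptions; a real study would use whatever instrumentation the platform provides.

```python
import json
import time

class InteractionLogger:
    """Append time-stamped UI events (key presses, mouse moves, ...) to a file."""

    def __init__(self, path):
        self.path = path

    def log(self, event_type, **details):
        # One JSON object per line: cheap to append, easy to parse later.
        record = {"t": time.time(), "event": event_type, **details}
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")

logger = InteractionLogger("session.log")
logger.log("key_press", key="Enter")
logger.log("mouse_move", x=120, y=348)
```

Appending one record per line keeps the logging unobtrusive to the running application and lets large volumes of data be processed incrementally afterwards.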
Data analysis • Qualitative data - interpreted & used to tell the ‘story’ about what was observed. • Qualitative data - categorized using techniques such as content analysis. • Quantitative data - collected from interaction & video logs. Presented as values, tables, charts, graphs and treated statistically.
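For the quantitative side, a small sketch of the kind of statistical treatment mentioned above; the completion times are made-up illustrative values, not real study data.

```python
from statistics import mean, median, stdev

# Hypothetical task-completion times in seconds, e.g. from an interaction log.
times = [220, 185, 240, 198, 305, 210]

print(f"n={len(times)}  mean={mean(times):.1f}s  "
      f"median={median(times):.1f}s  sd={stdev(times):.1f}s")
```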
Main activities of qualitative analysis to tell a story • Review the data. • Record the themes in a coherent yet flexible form. • Record the date and time of each data analysis session. • Check your understanding with the people you observe. • Iterate the above processes. • Report your findings.
Interpretive data analysis • Look for key events that drive the group’s activity • Look for patterns of behavior • Test data sources against each other • Report findings in a convincing and honest way • Produce ‘rich’ or ‘thick’ descriptions • Include quotes, pictures, and anecdotes • Software tools can be useful, e.g., NUD*IST
Qualitative analysis for categorization • Look for incidents or patterns, e.g.: - being obviously stuck - comments - silence - puzzlement • Analyze the data into categories. Content analysis is a fine-grained way of analyzing video. Key point: the categories must not overlap in any way. • Analyzing discourse: conversation analysis focuses on dialogue, pays close attention to context, and is fine-grained.
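A minimal sketch of the counting step behind this kind of categorization: each observed incident is coded into exactly one of the non-overlapping categories listed above, and frequencies are then tallied. The incident codes here are illustrative.

```python
from collections import Counter

# Hypothetical incident codes from a video transcript; each incident is
# assigned to exactly one category, so the categories do not overlap.
incidents = [
    "obviously_stuck", "comment", "silence", "comment",
    "puzzlement", "obviously_stuck", "comment",
]

counts = Counter(incidents)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category:>15}: {n} ({100 * n / total:.0f}%)")
```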
Key points • Observe from outside or as a participant • Analyzing video and data logs can be time-consuming. • In participant observation collections of comments, incidents, and artifacts are made. Ethnography is a philosophy with a set of techniques that include participant observation and interviews. • Ethnographers immerse themselves in the culture that they study.