Evaluation Paradigms & Techniques


Presentation Transcript


  1. Evaluation Paradigms & Techniques IS 588 Spring 2008 Dr. D. Bilal

  2. Overview • Evaluation is performed to determine how well a product design meets user needs. • Need to decide what to evaluate • Guided by goals, theory, a model, etc. • What to evaluate determines how the evaluation is done

  3. Evaluation Paradigms • Quick & Dirty • Usability testing • Field studies • Predictive evaluation

  4. Quick & Dirty • Informal • Designers or evaluators meet informally with users • Gather information about product design • Gather suggestions for design improvements • Inexpensive • Not time consuming

  5. Usability Testing • Formal assessment • Measures user performance on predefined tasks • Tasks structured based on the purpose of the evaluation • Controlled by the evaluator • Performance is observed and/or captured • EXAMPLES? • Tasks are based on the questions guiding the usability test (i.e., what the evaluator wants to find out)

  6. Usability Testing • Typically quantitative • Interviews and questionnaires can yield qualitative assessments • User comments, quotes of likes/dislikes, etc. • A mixed-methods approach is ideal • WHY?
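
Quantitative measures like these are straightforward to compute once test sessions are logged. Below is a minimal Python sketch, assuming a hypothetical list of per-task session records; the field names (task_id, seconds, completed, errors) are illustrative, not from the slides.

```python
from statistics import mean

# Hypothetical per-task records from a usability test log.
# Field names are illustrative assumptions.
sessions = [
    {"task_id": 1, "seconds": 42.0, "completed": True,  "errors": 0},
    {"task_id": 1, "seconds": 95.5, "completed": False, "errors": 3},
    {"task_id": 2, "seconds": 30.2, "completed": True,  "errors": 1},
]

def task_metrics(records, task_id):
    """Quantitative measures for one predefined task: completion rate,
    mean time on task (successful attempts only), and mean error count."""
    attempts = [r for r in records if r["task_id"] == task_id]
    done = [r for r in attempts if r["completed"]]
    return {
        "completion_rate": len(done) / len(attempts),
        "mean_time_s": mean(r["seconds"] for r in done) if done else None,
        "mean_errors": mean(r["errors"] for r in attempts),
    }

print(task_metrics(sessions, 1))
# {'completion_rate': 0.5, 'mean_time_s': 42.0, 'mean_errors': 1.5}
```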

  7. Usability Testing • Not performed in a naturalistic setting • Activities can be captured/recorded using software (e.g., Morae, HyperCam, Camtasia) or videotape • The evaluator may take observational notes while activities are being captured
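
Where a dedicated capture tool such as Morae or Camtasia is unavailable, even a small timestamped event log can supplement observational notes. A minimal sketch, assuming the evaluator or an instrumented prototype calls a hypothetical log_event function at each user action; the file name and event labels are illustrative.

```python
import csv
import time

LOG_PATH = "session_log.csv"  # illustrative file name

def log_event(participant, event, detail=""):
    """Append one timestamped user action to the session log."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), participant, event, detail])

# Example: events recorded while a participant works through a task.
log_event("P01", "task_start", "task 1: find the course syllabus")
log_event("P01", "click", "nav menu > Courses")
log_event("P01", "task_end", "success")
```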

  8. Field Studies • Naturalistic setting • Users interact with the system as part of their daily routine • No tasks given by the evaluator • The evaluator observes and records activities, OR uses software to capture activities, OR… • Can be qualitative and quantitative • HOW?

  9. Predictive Evaluation • Experts place themselves in the users’ shoes to predict usability problems • Guided by heuristics • Quick, inexpensive • Limitations • WHAT ARE THEY?
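
Expert predictions become easier to compare when each suspected problem is recorded against a named heuristic with a severity rating and then tallied. A minimal Python sketch, assuming Nielsen-style heuristic names and a 0-4 severity scale; the findings themselves are invented for illustration.

```python
from collections import Counter

# Each predicted problem: (heuristic violated, severity 0-4, note).
# The problems below are invented examples.
findings = [
    ("Visibility of system status", 3, "no feedback after Submit"),
    ("Consistency and standards", 2, "two different labels for Search"),
    ("Visibility of system status", 4, "long load with no indicator"),
]

by_heuristic = Counter(h for h, _, _ in findings)
worst = max(findings, key=lambda f: f[1])

print(by_heuristic.most_common())
print("Most severe:", worst)
```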

  10. Evaluation Techniques • Observe users • Gather user opinions • Gather expert opinions • Test user performance • Model user performance • Mixed methods (two or more techniques)

  11. DECIDE Framework • Determine the goals • Explore/set the questions to be answered • Choose suitable paradigms and techniques • Identify practical issues (e.g., how to recruit participants)

  12. DECIDE Framework • Decide how to deal with ethical concerns (e.g., use of human subjects, privacy) • Evaluate, interpret, and present the data
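
The framework can double as a planning checklist. A minimal sketch that represents one evaluation plan as a Python dictionary keyed by the six DECIDE steps; all sample entries are illustrative, not from the slides.

```python
# One evaluation plan, keyed by the six DECIDE steps; entries are illustrative.
decide_plan = {
    "Determine goals": "Assess how well the redesigned catalog meets user needs",
    "Explore questions": ["Can users find a known item?", "Where do they stall?"],
    "Choose paradigms/techniques": ["usability testing", "questionnaire"],
    "Identify practical issues": ["recruit 8 participants", "book the lab"],
    "Decide ethical concerns": ["IRB approval", "informed consent", "privacy"],
    "Evaluate/interpret/present": "Report metrics plus quotes of likes/dislikes",
}

for step, detail in decide_plan.items():
    print(f"{step}: {detail}")
```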
