
Processes of Design, Fifth lecture: Evaluation, 24 October 2003


Presentation Transcript


  1. Processes of Design. Fifth lecture: Evaluation. 24 October 2003. William Newman

  2. Evaluation: What do we Mean? In the dictionary: Determining or estimating the value of… In interactive system design: • Identifying faults • Testing performance • Rating quality, e.g. from 1 to 10

  3. Evaluation in the Inner Loop • Assessing design options • The ATM design session D. Well, one of the things you could do is, presumably you could just whisk your card through, rather than have to put it in the slot and get it back, and that might be quicker than waiting for it to come out. And I don’t see why they don’t do that always, anyway. I suppose it’s so they can chew it up if they want to. J. Yes. • Evaluating by simulating.

  4. Alternatives to simulation • Evaluating with theory-based laws, e.g. Fitts’ Law: Tpos = K log₂(A / W + 1) • Evaluating with an informal rule or heuristic • The quest for accuracy and completeness.
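  To make the slide’s formula concrete, here is a minimal sketch in Python of evaluating a design option analytically with Fitts’ Law. The constant k and the amplitude/width values below are hypothetical illustrations, not from the lecture; in practice K is fitted empirically for each input device.

```python
import math

def fitts_positioning_time(amplitude, width, k=0.1):
    """Predicted pointer-positioning time (seconds) via Fitts' Law:
    Tpos = K * log2(A / W + 1), where A is the movement amplitude and
    W is the target width (in the same units). The default k of ~100
    ms/bit is a hypothetical value; measure it for the actual device."""
    return k * math.log2(amplitude / width + 1)

# Example (assumed values): moving 120 mm to a 15 mm-wide on-screen button
print(round(fitts_positioning_time(120, 15), 3))  # -> 0.317 seconds
```

  A paper evaluation like this is much cheaper than building and testing a simulation, which is the trade-off the next slide takes up.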

  5. Two ways to evaluate • Empirical: conduct an experiment • Analytical: evaluate on paper • Why two ways? • Cost of simulation.

  6. What do we simulate? • The technology • The user • The context • The task • Why are these familiar to us?

  7. Evaluating in terms of requirements • The problem statement: • Design a camera-based text-capture system to enable students in libraries to copy text from paper to word processor in less time than by typing. • A concise statement of requirements • As we expand requirements, we refine our evaluations

  8. Simulating the technology • How to avoid building the complete and final system? • Working and non-working simulations • What’s missing from non-working simulations? • Wizard of Oz methods.

  9. Xerox Star • Designed in the late 1970s • Technology performance problems • Escalating costs • Never fully evaluated

  10. Simulating the User • You’re not them! • Knowing the user • Xerox Star again

  11. Simulating the Context • Taking account of: • Other systems (Airport control, Aegis) • Other people (GPs) • Physical environment

  12. Simulating the Task • Sets up the process of evaluation • Easy? We design the task • Not so easy? The user can redesign it (the computer game) • and may never learn it.

  13. The Limits of Evaluation
