
Automated Usability Evaluation during Model-based Interactive System Development




Presentation Transcript


  1. Automated Usability Evaluation during Model-based Interactive System Development Sebastian Feuerstack, Marco Blumendorf, Maximilian Kern, Michael Kruppa, Michael Quade, Mathias Runge, Sahin Albayrak

  2. Agenda • Motivation • Introduction of MeMo (Mental Models) • Introduction of MASP (Multi-Access Service Platform) • MeMo meets MASP • Case Study “4-Star Cooking Assistant” • Evaluation Results “4-Star Cooking Assistant” • Conclusion 01.09.2008

  3. Motivation • Conventional usability evaluation (UE) methods are expensive in terms of time & money • Reduce modelling effort for UE • Consider multiple contexts-of-use • Evaluate usability in an early development stage • Evaluate usability automatically

  4. Introduction of MeMo [Architecture diagram with components: Modeling, Simulator, Reporting; artifacts: widgets, dialogs, screenshots, metadata, workflow]

  5. Introduction of MeMo • The UIM analyses the user interface on a semantic level • The UIM assigns an initial probability to each possible interaction based on the user task • A rule engine applies rules that lower or raise the assigned probabilities • An interaction is then chosen randomly according to these probabilities
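The selection step above can be sketched as weighted random choice. The following is a hypothetical sketch, not MeMo's actual API — the function names, the rule shape (each rule returns a multiplier), and the example probabilities are all illustrative:

```python
import random

def apply_rules(candidates, rules):
    """Adjust each interaction's initial probability by every rule's
    multiplier, then renormalize so the weights sum to 1."""
    weights = {}
    for name, p in candidates.items():
        for rule in rules:
            p *= rule(name)  # a rule lowers (<1) or raises (>1) the probability
        weights[name] = p
    total = sum(weights.values())
    return {name: p / total for name, p in weights.items()}

def choose_interaction(weights, rng=random):
    """Pick one interaction at random according to the adjusted weights."""
    names = list(weights)
    return rng.choices(names, weights=[weights[n] for n in names], k=1)[0]

# Initial probabilities derived from the user task, plus one illustrative
# rule that penalizes a hard-to-perceive "help" link.
candidates = {"next": 0.5, "back": 0.3, "help": 0.2}
low_contrast_penalty = lambda name: 0.25 if name == "help" else 1.0
weights = apply_rules(candidates, [low_contrast_penalty])
chosen = choose_interaction(weights)
```

Because the choice is random, repeated simulation runs explore different interaction paths, which is what lets the simulator expose usability problems off the ideal task path.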

  6. The Problem • How to evaluate applications using dynamic data? • What about context-adaptive applications? • Different user roles? • Different platforms? • Different environments? • How to evaluate such user interfaces?

  7. Introduction of MASP [Architecture diagram with components: Runtime Models, Interpreting, UI Generation, Interactions; artifacts: widgets, dialogs, screenshots, metadata, workflow]

  8. MeMo meets MASP [Integration diagram: the MeMo simulator delivers simulated input to the MASP; interaction states are mapped between the two systems]

  9. Case Study “4-Star Cooking Assistant” • Key Features • Search for recipes • Adjust number of persons • Create shopping list • Step-by-step instructions for preparing the meal

  10. Evaluation Results “4-Star Cooking Assistant” • Rule fired: low contrast of font and background • Rule fired: non-conformingly coded link • Rule fired: small font • The UIM often made mistakes here (missing hint “next”) • Screen: “Generate Shoppinglist”
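A rule like “low contrast of font and background” can be sketched against the WCAG 2.x contrast-ratio formula, which is a standard, publicly specified computation. Whether MeMo uses this exact formula or threshold is an assumption here, so treat the function names and the 4.5:1 AA threshold as illustrative:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as 0-255 channels."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio (L_lighter + 0.05) / (L_darker + 0.05), range 1-21."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def low_contrast_rule(fg, bg, threshold=4.5):
    """Fire the rule when the ratio falls below the WCAG AA threshold
    for normal-size text (threshold choice is an assumption)."""
    return contrast_ratio(fg, bg) < threshold
```

For example, light grey text (153, 153, 153) on a white background yields a ratio of about 2.8 and would trigger the rule, while black on white yields the maximum ratio of 21 and would not.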

  11. Evaluation Results “4-Star Cooking Assistant” • Late feedback (in state CookingAssistant) on a wrongly selected menu item (in state RecipeDetails)

  12. Conclusions • The models used are suitable for interoperability • Automated usability evaluation (AUE) is feasible in an early development phase • Runtime models can be utilized for fully automated UE • Straightforward UE of various contexts-of-use • Future work • Better support for multimodality • Extend context information • Enrich the perception of the simulated user • Give concrete hints for usability improvement (critique level)

  13. Automated Usability Evaluation during Model-based Interactive System Development Thank you for your attention! Questions? maximilian.kern@dai-labor.de More information including videos, papers and software online at: http://masp.dai-labor.de
