Event-based Synchronization of Model-Based Multimodal User Interfaces
Marco Blumendorf, Sebastian Feuerstack, Prof. Dr. Sahin Albayrak
Agenda
• Background & Motivation
• The Multi-Access Service Platform
• Multi-level Event Propagation
• Conclusion
Motivation
• The Virtual Cook: smart devices in the kitchen provide new interaction capabilities through multimodal user interfaces
• The problem: the user interface should be
  • multimodal (graphics, voice, gestures)
  • flexible (different modalities at different times)
The Multi-Access Service Platform
• Flexible channels connect the different modalities to the platform
• Loosely coupled channels are synchronized via coordination topics in the MASP architecture
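The slides do not show code, but a minimal sketch may clarify how coordination topics can loosely couple channels: each channel publishes its events to a topic, and the topic forwards them to every other subscribed channel so all modalities stay synchronized. All names here (`CoordinationTopic`, `Channel`, `onUpdate`) are hypothetical and not the MASP API.

```java
import java.util.ArrayList;
import java.util.List;

// A channel renders one modality (HTML, voice, ...) and reacts to updates.
interface Channel {
    void onUpdate(String event);
}

// A coordination topic loosely couples channels: a channel publishes an
// interaction event, and the topic forwards it to all other subscribers.
class CoordinationTopic {
    private final List<Channel> subscribers = new ArrayList<>();

    void subscribe(Channel channel) { subscribers.add(channel); }

    // Forward the event to every channel except the one that sent it.
    void publish(Channel sender, String event) {
        for (Channel c : subscribers)
            if (c != sender) c.onUpdate(event);
    }
}

public class TopicDemo {
    public static void main(String[] args) {
        CoordinationTopic selection = new CoordinationTopic();
        Channel html  = e -> System.out.println("HTML channel updates: " + e);
        Channel voice = e -> System.out.println("Voice channel updates: " + e);
        selection.subscribe(html);
        selection.subscribe(voice);
        // The voice channel reports a user selection; the HTML view follows.
        selection.publish(voice, "recipe=pasta");
    }
}
```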
Multi-level Event Propagation
• The idea is straightforward: we have a multi-level UI model, so we use multi-level event propagation (a sketch follows this list).
• Advantages
  • interpretation of events in the context they appear in
  • abstraction of interaction events for the different levels
  • reification of update events for the different levels
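A hedged sketch of the idea, assuming a simple chain of UI model levels following the Cameleon levels: interaction events are abstracted upward level by level, and update events are reified downward. The class and method names (`Level`, `propagateUp`, `propagateDown`) are illustrative only.

```java
// Illustrative sketch (not the MASP implementation): a chain of UI model
// levels where interaction events ascend via abstraction and update
// events descend via reification.
public class EventPropagation {
    static class Level {
        final String name;
        Level higher, lower;
        Level(String name) { this.name = name; }

        void link(Level above) { this.higher = above; above.lower = this; }

        // Abstraction: interpret the event in this level's context,
        // then hand it on to the next-higher level.
        void propagateUp(String event) {
            System.out.println(name + ": interpreting " + event);
            if (higher != null) higher.propagateUp("abstracted(" + event + ")");
        }

        // Reification: make the update concrete for the next-lower level.
        void propagateDown(String update) {
            System.out.println(name + ": applying " + update);
            if (lower != null) lower.propagateDown("reified(" + update + ")");
        }
    }

    public static void main(String[] args) {
        Level task = new Level("Task model");
        Level aui  = new Level("Abstract UI");
        Level cui  = new Level("Concrete UI");
        Level fui  = new Level("Final UI");
        fui.link(cui); cui.link(aui); aui.link(task);

        fui.propagateUp("click(nextStep)");  // user interaction ascends
        task.propagateDown("setTask(stir)"); // model update descends
    }
}
```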
An Example
• Hierarchical multi-level event propagation using the Cameleon Reference Architecture (figure)
Conclusion and Future Work
• Multi-level event propagation as a solution for the coordination of distributed UIs
• The Multi-Access Service Platform as a framework for the delivery and management of distributed multimodal UIs
• A Virtual Cook application demonstrating the use of flexible and changing interaction capabilities
• However, our current approach requires one HTML-based main UI supported by additional modalities
• We have not yet sufficiently solved the fusion of complementary/conflicting events from different channels
• We also currently consider only a limited number of gestures and restricted voice commands
The End …
• Thank you for your attention.
• Marco.Blumendorf@DAI-Labor.de
• www.dai-labor.de