David Merrill & Pattie Maes, MIT Media Lab
Augmenting Looking, Pointing and Reaching Gestures to Enhance the Searching and Browsing of Physical Objects
Approach
Objects in the real world have metadata, but this metadata can be hard to extract.
Current devices are not hands-free or socially acceptable (Rekimoto).
An IR-based ring and headset were built (a tag-lookup sketch follows).
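The paper describes the hardware (an IR-tag ring and headset), not its software in detail. A minimal sketch, assuming a hypothetical catalog and hypothetical tag IDs reported by the IR receiver, of how a sensed tag might be resolved to object metadata:

```python
# Minimal sketch (not the authors' code): resolving a sensed IR tag ID
# to locally stored object metadata. The tag IDs and catalog entries
# below are invented for illustration.

CATALOG = {
    0x01A4: {"name": "engine part", "info": "described in the Engine-info scenario"},
    0x01A5: {"name": "shelf product", "info": "described in the MyShoppingGuide scenario"},
}

def resolve_tag(tag_id: int) -> dict | None:
    """Map an IR tag ID reported by the ring/headset to its object record."""
    return CATALOG.get(tag_id)

if __name__ == "__main__":
    record = resolve_tag(0x01A4)
    if record is not None:
        print(f"Looking at: {record['name']} ({record['info']})")
    else:
        print("Unknown object")
```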
Application
Engine-info scenario and MyShoppingGuide scenario.
See and interact with an object while simultaneously accessing information about it.
Audio (spoken dialogue) and visual (LED) feedback.
Track discussed and un-discussed items (a bookkeeping sketch follows).
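A minimal sketch of the bookkeeping the slides describe: remembering which tagged items have already been discussed and choosing audio versus LED feedback. The class and method names below are assumptions for illustration, not the authors' implementation.

```python
# Sketch (assumed API, not the authors' code): track discussed items and
# pick a feedback channel when the user looks at or points to an object.

from dataclasses import dataclass, field

@dataclass
class BrowsingSession:
    discussed: set[int] = field(default_factory=set)

    def on_object_selected(self, tag_id: int, name: str) -> str:
        """Return a feedback action for the item the user is attending to."""
        if tag_id in self.discussed:
            # Already covered: a brief LED confirmation suffices.
            return f"LED: blink (already discussed {name})"
        self.discussed.add(tag_id)
        # New item: trigger the spoken-dialogue description.
        return f"AUDIO: describe {name}"

session = BrowsingSession()
print(session.on_object_selected(0x01A4, "engine part"))  # AUDIO: describe engine part
print(session.on_object_selected(0x01A4, "engine part"))  # LED: blink (already discussed ...)
```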
Experiment & Results
Task: find products in a store that satisfy given constraints.
The ring was significantly faster in one task; the other tasks showed no significant differences.
Reliance on the technology varied between users: some first filtered candidates by common sense and used the ring only to confirm.
Advocate
User-centered information access.
Interacting with physical objects yields digital information about them in an intuitive way.
Efficient, personalized, in-situ feedback.
Works alongside traditional searching and browsing behavior.
Critic
The results are not that significant.
How do you do I/O on these devices?
More compelling application areas could have been proposed.
How would they compare to handheld, visual devices?