LEAD REVIEW ON THE PAPER "Search Vox: Leveraging Multimodal Refinement and Partial Knowledge for Mobile Voice Search" by Tim Paek et al.
Ivan Elhart - ECE 992 Ubiquitous Computing, University of New Hampshire, 10/09/2008
Problems
• Mobile settings often contain non-stationary noise that cannot be easily canceled
• Speakers tend to adapt to surrounding noise in acoustically unhelpful ways
Goal
• A new multimodal interface that helps voice search applications recover from speech recognition errors
Hypothesis
• A multimodal interface for mobile voice search that combines speech with touch and text may increase search recovery rates
Background
• Yu, D., Ju, Y.C., Wang, Y.Y., Zweig, G., & Acero, A. (2007). Automated directory assistance system: From theory to practice. Proc. of Interspeech.
• Ainsworth, W.A., & Pratt, S.R. (1992). Feedback strategies for error correction in speech recognition systems. International Journal of Man-Machine Studies, 36(6), 833-842.
• Hsu, P., Mahajan, M., & Acero, A. (2005). Multimodal text entry on mobile devices. Proc. of ASRU.
Approach
• Present an n-best list whenever recognition is less than perfect (see the sketch below)
• Couple speech with touch and text
• Leverage any partial knowledge the user has
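A minimal sketch of that fallback logic, assuming a recognizer that returns scored hypotheses (the `Hypothesis` type and the 0.85 threshold are illustrative, not from the paper): commit to the top hypothesis when recognition is confident, otherwise hand the whole n-best list to the multimodal refinement UI.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    text: str
    confidence: float  # recognizer score in [0, 1]; illustrative

def handle_recognition(nbest: list[Hypothesis], threshold: float = 0.85):
    """Commit to the top hypothesis only when recognition is confident;
    otherwise show the whole n-best list for touch/text refinement."""
    top = max(nbest, key=lambda h: h.confidence)
    if top.confidence >= threshold:
        return ("commit", top.text)
    return ("refine", [h.text for h in nbest])  # rendered as a tappable list
```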
Approach
• Word palette: allows users to select any word of a recognized phrase (a sketch follows)
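A sketch of how such a palette could be assembled, assuming the recognizer returns an n-best list of phrase strings (the function and the example phrases are illustrative, not the paper's implementation): every distinct word across the hypotheses becomes a tappable token the user can compose into a corrected query.

```python
def build_word_palette(nbest: list[str]) -> list[str]:
    """Collect the distinct words across all n-best hypotheses, in
    first-seen order, so each can be offered as a tappable token."""
    seen: set[str] = set()
    palette: list[str] = []
    for phrase in nbest:
        for word in phrase.lower().split():
            if word not in seen:
                seen.add(word)
                palette.append(word)
    return palette

# Illustrative n-best list for an uncertain recognition:
print(build_word_palette(["kung ho grill", "kung po grill", "kung ho grille"]))
# ['kung', 'ho', 'grill', 'po', 'grille']
```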
Approach
• Text hints: users resort to speech whenever the search entry is too long to type, or once enough text hints have been provided (a sketch of hint filtering follows)
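One plausible reading of text hints, sketched below under the assumption that a hint is a typed prefix of some word in the intended listing (the matching rule is my guess at an implementation, not the paper's): a candidate survives only if every hint matches.

```python
def matches_hints(phrase: str, hints: list[str]) -> bool:
    """Keep a candidate only if every text hint is a prefix
    of at least one word in the candidate phrase."""
    words = phrase.lower().split()
    return all(any(w.startswith(h.lower()) for w in words) for h in hints)

def filter_by_hints(candidates: list[str], hints: list[str]) -> list[str]:
    return [c for c in candidates if matches_hints(c, hints)]

listings = ["home depot", "office depot", "home goods"]
print(filter_by_hints(listings, ["ho", "dep"]))  # ['home depot']
```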
Approach
• Verbal wildcards: spoken placeholders for parts of the query the user does not know (a sketch follows)
• Partial knowledge queries
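A sketch of how a partial-knowledge query with a verbal wildcard might be matched against a listing index. The filler word ("something") and the regex translation are assumptions for illustration; the paper does not prescribe this implementation.

```python
import re

def wildcard_to_regex(query: str, filler: str = "something") -> re.Pattern:
    """Translate a spoken query such as 'something pizza' into a regex
    where the filler word matches any non-empty run of words."""
    parts = [r"\S+(?:\s\S+)*" if w == filler else re.escape(w)
             for w in query.lower().split()]
    return re.compile(r"^" + r"\s".join(parts) + r"$")

listings = ["papa john's pizza", "pizza hut", "round table pizza"]
pattern = wildcard_to_regex("something pizza")
print([l for l in listings if pattern.match(l)])
# ["papa john's pizza", 'round table pizza']
```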
Approach
• Search Vox architecture (shown as a diagram in the original slide)
Results
• Simulation experiments
• Utterances collected from Microsoft Live Search Mobile
• Service domains: automated directory assistance, maps, driving directions, movie times, local gas prices
Results
• Word palette and text hints (simulation results presented as figures in the original slides)
LEAD REVIEW ON THE PAPER "'It's Mine, Don't Touch!': Interactions at a Large Multi-Touch Display in a City Centre" by Peter Peltonen et al.
Ivan Elhart - ECE 992 Ubiquitous Computing, University of New Hampshire, 10/07/2008
Problem
• How does an outdoor public tangible interface support simultaneous participation and interaction by multiple users?
Goal
• To provide first insights into how users approach, participate, and interact at a large multi-touch display in a public space
Hypotheses
• Observational studies in urban environments can help in understanding how multi-touch screens affect and support social interactions
• Public interactive multi-touch displays can potentially restructure the way people experience and use the space around them
Background
• Semi-public displays
• Public displays
Approach
• City Wall installation in Helsinki
• Direct manipulation
• Non-modality
Approach
• Data collection
  • A continuous interaction log was recorded
  • Interactions were recorded with a web camera at 640x480 resolution
  • Twelve on-site interviews were conducted
• Data analysis (see the session-segmentation sketch below)
  • The video was combined with the interaction log
  • First-time and returning users
  • Sessions (delimited by a ten-second gap between interactions)
  • Duration
  • Number of active users
  • Number of passive bystanders
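A sketch of the ten-second session rule stated above: time-ordered log events are grouped into one session until a gap longer than ten seconds occurs. The log format (a list of event timestamps in seconds) is an assumption; the paper only states the gap rule.

```python
def segment_sessions(timestamps: list[float], gap: float = 10.0) -> list[list[float]]:
    """Split a time-ordered interaction log into sessions, opening a new
    session whenever more than `gap` seconds pass between events."""
    sessions: list[list[float]] = []
    for t in sorted(timestamps):
        if sessions and t - sessions[-1][-1] <= gap:
            sessions[-1].append(t)
        else:
            sessions.append([t])
    return sessions

log = [0.0, 3.2, 5.1, 20.0, 24.5, 60.0]          # seconds since start
sessions = segment_sessions(log)
print(len(sessions))                              # 3
print([(s[0], s[-1] - s[0]) for s in sessions])   # (start, duration) per session
```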
Results
• Findings on how the City Wall was used and how users interacted with each other at the screen
• Over 8 days of interaction, the display was in use 8.8% of the time, with 1199 users, 516 sessions, and 202 passive bystanders
Results
• Noticing the display (when the wall was used): rain shelter, stepwise approach
• Multi-user interactions (fun to use with others and friends): parallel use, team work
Results
• Conflict management
• Social configurations (or roles): social interaction, withdrawal, comedian, teacher, leaving a mark
Conclusion
• Noticing the City Wall installation
• Parallel use
• Conflicts