Touch through Sound
Johan F. Hoorn, 2003
Audio Perception and Audio Experience
Overview: Ear - Sound - Noise - Meaning - Design guidelines
Ear
Different frequencies set off different neuron groups in the auditory system, which relay the signal to the brain. Why is the inner ear filled with a water-like fluid?
Anatomy labels (from the diagram): semicircular canals, eardrum, hammer, anvil, stirrup.
Three representations of a cochlear implant (Cochlear's Nucleus 24 Contour implant):
- Study design drawing. Purpose: show the overall design.
- 3-D CAD/CAM model of Cochlear's self-curling Contour electrode array. Purpose: rapid prototyping, comparing models, saving time and money.
- Virtual Reality digital model. Purpose: explain the physiological context.
http://www.ee.duke.edu/research/lcollins/images/implant.gif
http://www.cadinfo.net/editorial/cochlear.htm, Feb. 19, 2003
Sound
- Amplitude: measured in decibels (dB), a logarithmic scale.
- Frequency: measured in hertz (Hz), cycles (peaks) per second.
- A pure tone can be decomposed into sine waves; one cycle consists of a dense phase (compression) and a thin phase (expansion).
Example product: Intel® Play™ Computer Sound Morpher
http://www.intel.com/technology/itj/q42001/articles/art_5.htm, Feb. 19, 2003
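The two descriptors above (logarithmic amplitude in dB, frequency in Hz) and the decomposition of a tone into sine waves can be illustrated with a minimal numpy sketch. This is not from the slides; the sample rate and tone frequency are arbitrary illustrative choices.

```python
import numpy as np

SR = 8000  # sample rate in samples per second (arbitrary choice)
t = np.arange(SR) / SR  # one second of time stamps

# A pure tone is a single sine wave: here 440 Hz at half amplitude.
tone = 0.5 * np.sin(2 * np.pi * 440 * t)

# Amplitude on the logarithmic decibel scale, relative to full scale:
# a half-amplitude sine has an RMS of 0.5 / sqrt(2), about -9 dBFS.
rms = np.sqrt(np.mean(tone ** 2))
level_db = 20 * np.log10(rms)

# "Decomposition into sine waves": the FFT recovers the components;
# for a pure tone there is a single peak, at 440 Hz.
spectrum = np.abs(np.fft.rfft(tone))
freqs = np.fft.rfftfreq(len(tone), d=1 / SR)
peak_hz = freqs[np.argmax(spectrum)]

print(round(peak_hz))      # 440
print(round(level_db, 1))  # -9.0
```

A complex sound is simply a sum of such sine components, each with its own amplitude and frequency.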
Noise
- Very high frequencies, up in the megahertz range (millions of cycles/sec.), cannot be heard by humans. Why is temperature measured? Molecule jitter (thermal motion) affects sound conduction. How can such noise be filtered out?
- Human hearing range: roughly 20 Hz - 20 kHz (often reduced to about 15 kHz in adults). At low frequencies, deviations as small as 1.5 Hz can be discriminated.
- Humans adapt by focusing attention: while hearing music, you can focus on one instrument.
http://www.rawood.com/LabVIEW/Noise_fig_testp.jpg, Feb. 19, 2003
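The filtering question raised above can be answered with a crude frequency-domain low-pass filter: anything above the audible range is simply zeroed out in the spectrum. This is a minimal numpy sketch (not from the slides); the 48 kHz sample rate and the 21 kHz "inaudible" component are illustrative assumptions.

```python
import numpy as np

SR = 48000  # sample rate high enough to represent a 21 kHz component
t = np.arange(SR) / SR  # one second

# An audible 1 kHz tone plus an inaudible 21 kHz component (above ~20 kHz).
signal = np.sin(2 * np.pi * 1000 * t) + 0.5 * np.sin(2 * np.pi * 21000 * t)

# Crude FFT-based low-pass filter: zero all bins above 20 kHz.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / SR)
spectrum[freqs > 20000] = 0
filtered = np.fft.irfft(spectrum, n=len(signal))

# Only the audible 1 kHz component survives the filter.
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(filtered)))]
print(peak_hz)  # 1000.0
```

Real audio systems use smoother filters (e.g., Butterworth designs) to avoid ringing, but the principle of discarding inaudible bands is the same.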
Meaning
Basic processes:
- Detection: is it there?
- Attention: what's up?
- Localization: where from?
- Relative discrimination: are there more?
- Identification: it is this one!
Examples: a warning such as a System Error bleep; a signal indicating interaction changes; F1 Help (Utopia Question.wav).
An artificial earcon (a melody) is more easily distinguished from surrounding sounds than the natural sound of an auditory icon (Sikora et al., 1995).
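An earcon, as contrasted with an auditory icon above, is a short abstract melody rather than a recorded natural sound. A minimal numpy sketch of synthesizing one (not from the slides; the three-note rising motif and its note frequencies are hypothetical choices):

```python
import numpy as np

SR = 16000  # sample rate (arbitrary choice)

def note(freq_hz, dur_s=0.15, amp=0.4):
    """One pure-tone note, shaped by a Hann window to avoid clicks."""
    t = np.arange(int(SR * dur_s)) / SR
    return amp * np.hanning(len(t)) * np.sin(2 * np.pi * freq_hz * t)

# A hypothetical "new message" earcon: a rising three-note motif
# (C5, E5, G5) that is easy to tell apart from ambient sounds.
earcon = np.concatenate([note(523.25), note(659.25), note(783.99)])
print(len(earcon) / SR)  # 0.45 seconds of audio
```

Because the melody is arbitrary and abstract, its meaning must be learned; an auditory icon (e.g., a recorded crumpling-paper sound for "delete") carries meaning more directly but blends in with real environmental sounds.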
Meaning
Complex processes: semantic priming by context cues.
A visual prime determines how the same sound (click here for Sound528.wav) is interpreted. Its basic meaning is constant (attention), but its higher meaning depends on the priming context: "You've got a message" versus "You've got a warning", relative to the original context of Sound528.wav.
Meaning
Complex processes:
- Auditory signals have less meaning of their own than visuals and language. This has to do with the rank of human channels for data input: visual > auditory > haptic > olfactory > taste. Language covers all; text > speech.
- People prefer interpreting representations with an information-rich focus: a poor addition to a rich focus, rather than a rich addition to a poor focus. Hence "screaming colors" rather than "colorful screams" (asymmetric synaesthesia).
Meaning
Complex processes:
Thus, sound is complementary to visuals rather than vice versa (cf. Lashina, 2001). So which is the better representation?
- Acoustic interface with visual backup (e.g., Sound528.wav with an image): a rich addition to a poor focus?
- Graphic interface with auditory backup: a poor addition to a rich focus?
However, if you have to prime, a rich cue is better than a poor one. The most informative option, then, is to have visual additions to text (rich added to richest).
http://www.tekey.com/images/trrick_optics/get_fp001.jpg, Feb. 21, 2003
Meaning
Law of Signification:
(1) The more a signal has autonomous meaning (context-free, high-frequency use), the more it enhances the corresponding meaning of other signals.
(1a) The more a lower-level sign(al) has autonomous meaning, the more it enhances the corresponding meaning of higher-level sign(al)s.
(1b) The more a lower-level sign(al) has autonomous meaning, the better it can do without semantic priming by context cues.
http://netschaap.nl/images/justitia.gif, Feb. 21, 2003
Design guidelines
Use sound:
- In speech applications (of course).
- If the signal should simulate a real-life experience (e.g., button clicks on a touchscreen).
- As a warning under conditions of 'no see, no touch' (e.g., security patrol).
- As a complement under conditions of visual overload (e.g., fighter pilot).
http://www.stoopy.com/cgi-bin/pocketil/files/pictures2/UT_Everyday_Speak.jpg, Feb. 21, 2003
Design guidelines
Sanders & McCormick (1992). Human Factors in Engineering and Design, 7th edition. Singapore: McGraw-Hill.
- Compatibility: signals should correspond to natural sound experience.
- Approximation: two-stage signals should be clear in their message: 1) basic meaning (e.g., attention); 2) higher meaning (where, what for, how).
- Parsimony: a signal should provide only the necessary information.
- Avoid extremes: high-intensity signals cause distraction or disruption of response.
- Do not overload the auditory system.
- Environment: comply with the signal-to-noise ratio (how noisy is the interface or work environment? Can users hear your sounds?).
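The environment guideline above asks whether users can hear your sounds over background noise; that question is usually quantified as a signal-to-noise ratio in dB. A minimal numpy sketch (not from the slides; the alert tone and the synthetic "office noise" are hypothetical stand-ins for real recordings):

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in dB, from mean power of each recording."""
    p_signal = np.mean(np.asarray(signal, dtype=float) ** 2)
    p_noise = np.mean(np.asarray(noise, dtype=float) ** 2)
    return 10 * np.log10(p_signal / p_noise)

rng = np.random.default_rng(0)
t = np.arange(8000) / 8000
alert = 0.5 * np.sin(2 * np.pi * 880 * t)       # the interface sound
office = 0.05 * rng.standard_normal(t.size)     # hypothetical ambient noise

snr = snr_db(alert, office)  # around 17 dB for this synthetic pair
```

In practice one would measure the noise floor in the actual work environment and then pick a signal level that keeps the SNR comfortably positive without violating the avoid-extremes guideline.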
Design guidelines
Sanders & McCormick (1992), continued. Design trade-off:
- Invariance: a type of signal should be constant in its meaning (cf. Sound528.wav used for different purposes, which is bad).
- Use interrupted or variable signals: humans adapt to signals and might react less to them in the future.
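The interrupted-signal guideline above can be made concrete by pulsing a tone on and off instead of playing it continuously. A minimal numpy sketch (not from the slides; the burst and gap durations are illustrative assumptions):

```python
import numpy as np

SR = 8000  # sample rate (arbitrary choice)

def pulsed_alert(freq_hz=1000, on_s=0.2, off_s=0.2, repeats=3):
    """Interrupted beep: on/off bursts resist habituation better
    than a steady tone of the same total duration."""
    t = np.arange(int(SR * on_s)) / SR
    burst = 0.5 * np.sin(2 * np.pi * freq_hz * t)
    gap = np.zeros(int(SR * off_s))
    return np.concatenate([np.concatenate([burst, gap])
                           for _ in range(repeats)])

alert = pulsed_alert()
print(len(alert) / SR)  # 1.2 seconds total
```

Varying the pulse rate or frequency between repetitions (a variable signal) pushes in the same direction: each onset is a fresh event for the auditory system, counteracting adaptation.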
Your problem in 'sonification' of the interface: there is no straightforward similarity between interface features and everyday sounds, and no 'interface-sound language' that defines such meanings a priori. In Sound and Touch.pdf (www.cs.vu.nl/mmc/tbr > extra) you will find many questions and exercises for sonifying the interface nonetheless.