Chapter 12: Auditory Localization and Organization
Auditory Localization
• Auditory space - surrounds an observer and exists wherever there is sound
• Researchers study how sounds are localized in space by using:
  • Azimuth coordinates - position left to right
  • Elevation coordinates - position up and down
  • Distance coordinates - position from the observer
Auditory Localization - continued
• On average, people can localize sounds:
  • Directly in front of them most accurately
  • To the sides and behind their heads least accurately
• Unlike vision, where location is given by the place of stimulation on the retina, location cues are not contained in the receptor cells; the location of a sound must therefore be calculated.
Binaural Cues for Sound Localization
• Binaural cues - location cues based on the comparison of the signals received by the left and right ears
• Interaural time difference (ITD) - difference between the times a sound reaches the two ears
  • When the distance to each ear is the same, there is no difference in time.
  • When the source is to the side of the observer, the times will differ.
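As a rough illustration of how small these time differences are, here is a minimal sketch (not from the slides) using a simple path-difference approximation; the ear spacing of 0.18 m and speed of sound of 343 m/s are assumed values chosen only for the example.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, approximate speed of sound in air (assumed)
EAR_SPACING = 0.18       # m, approximate distance between the ears (assumed)

def itd_seconds(azimuth_deg):
    """Approximate interaural time difference for a distant source.

    Uses the simple path-difference approximation d * sin(azimuth) / c,
    where azimuth is measured from straight ahead (0 deg) toward the
    right ear (90 deg). Positive values mean the sound reaches the
    right ear first.
    """
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

if __name__ == "__main__":
    for az in (0, 30, 60, 90):
        print(f"azimuth {az:3d} deg -> ITD = {itd_seconds(az) * 1e6:6.1f} microseconds")
```

With these assumed values the ITD is zero for a source straight ahead and grows to roughly half a millisecond for a source directly opposite one ear, consistent with the point that arrival times differ only when the source is off to one side.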
Binaural Cues for Sound Localization - continued
• Interaural level difference (ILD) - difference in sound pressure level reaching the two ears
  • A reduction in intensity at the far ear occurs for high-frequency sounds because the head casts an acoustic shadow.
  • This effect does not occur for low-frequency sounds.
• Cone of confusion - a cone-shaped set of locations that produce the same ITD and ILD, so these binaural cues alone cannot distinguish among them
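Continuing the same hedged approximation from the previous sketch, the short example below shows why binaural cues leave a front-back ambiguity: a source 60 degrees to the front-right and its mirror location 60 degrees to the back-right produce the same ITD, so an extra cue is needed to tell them apart. All numbers are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed)
EAR_SPACING = 0.18      # m (assumed)

def itd_seconds(azimuth_deg):
    """Path-difference approximation of ITD (see previous sketch)."""
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source 60 degrees to the front-right (azimuth 60) and its mirror image
# 60 degrees to the back-right (azimuth 120) yield the same ITD, so this
# cue alone cannot tell front from back - the cone of confusion.
front = itd_seconds(60)
back = itd_seconds(120)
print(f"front-right ITD: {front * 1e6:.1f} us")
print(f"back-right  ITD: {back * 1e6:.1f} us")
print("identical ->", math.isclose(front, back))
```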
Monaural Cue for Sound Location
• Monaural cue - uses information from one ear
• The pinna and head affect the intensities of frequencies.
• Measurements have been performed by placing small microphones in the ears and comparing the intensities of frequencies with those at the sound source.
• This is a spectral cue, since the information for location comes from the spectrum of frequencies.
The Physiology of Auditory Localization
• Auditory nerve fibers synapse in a series of subcortical structures:
  • Cochlear nucleus
  • Superior olivary nucleus (in the brain stem)
  • Inferior colliculus (in the midbrain)
  • Medial geniculate nucleus (in the thalamus)
• Signals then reach the auditory receiving area (A1, in the temporal lobe).
The Physiology of Auditory Localization - continued
• Hierarchical processing occurs in the cortex.
• Neural signals travel through the core, then the belt, followed by the parabelt area.
• Simple sounds cause activation in the core area.
• Belt and parabelt areas are activated in response to more complex stimuli made up of many frequencies.
The Physiological Representation of Auditory Space
• Jeffress model for narrowly tuned ITD neurons
  • These neurons receive signals from both ears.
  • Coincidence detectors fire only when signals arrive from both ears simultaneously.
  • Other neurons in the circuit fire to locations corresponding to other ITDs.
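A toy sketch of the Jeffress-style idea described above: an array of coincidence detectors, each paired with a different internal delay, responds most when its internal delay cancels the external ITD. The sample rate, spike probabilities, and delay values below are invented for illustration; this is not a physiological model.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 100_000        # samples per second (assumed)
TRUE_DELAY = 30     # right-ear input lags by 30 samples = 300 us (assumed)

# Shared "sound-driven" spike train; the far (right) ear gets a delayed copy.
n = FS // 10                     # 100 ms of activity
drive = rng.random(n) < 0.02     # sparse random spikes
left = drive
right = np.roll(drive, TRUE_DELAY)

# Each coincidence detector applies a different internal delay to the left
# input and counts how often the two inputs arrive together.
internal_delays = np.arange(-60, 61, 5)          # in samples
coincidences = [np.sum(np.roll(left, d) & right) for d in internal_delays]

best = internal_delays[int(np.argmax(coincidences))]
print(f"best-matching internal delay: {best / FS * 1e6:.0f} us "
      f"(true ITD {TRUE_DELAY / FS * 1e6:.0f} us)")
```

The detector whose built-in delay matches the actual ITD accumulates the most coincidences, which is the sense in which each detector in the circuit corresponds to one location.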
Broad ITD Tuning Curves in Mammals
• Broadly tuned ITD neurons
  • Research on gerbils indicates that neurons in the left hemisphere respond best to sound from the right, and vice versa.
  • The location of a sound is indicated by the ratio of the responses of the two types of neurons.
  • This is a distributed coding system.
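A hedged sketch of this two-channel, distributed read-out: two broadly tuned populations with opposite preferences, where the relative activity of the channels, rather than any single labeled detector, indicates the side of the sound. The sigmoid tuning curves and the slope value are invented for illustration.

```python
import math

def channel_response(itd_us, preferred_side):
    """Broadly tuned response to ITD (in microseconds).

    A sigmoid that rises for sounds favoring the channel's preferred side;
    the slope (1/200 us) is an arbitrary illustrative choice.
    """
    sign = 1.0 if preferred_side == "right" else -1.0
    return 1.0 / (1.0 + math.exp(-sign * itd_us / 200.0))

def decode_side(itd_us):
    """Read out location from the ratio of the two channels' responses."""
    left_hemisphere = channel_response(itd_us, "right")   # responds best to right-side sounds
    right_hemisphere = channel_response(itd_us, "left")   # responds best to left-side sounds
    ratio = left_hemisphere / right_hemisphere
    return "right" if ratio > 1 else ("left" if ratio < 1 else "center")

if __name__ == "__main__":
    # Positive ITD = sound reaches the right ear first (assumed convention).
    for itd in (-400, -100, 0, 100, 400):
        print(f"ITD {itd:+5d} us -> decoded side: {decode_side(itd)}")
```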
Auditory Where (and What) Pathways
• The what, or ventral, stream starts in the anterior portion of the core and belt and extends to the prefrontal cortex.
  • It is responsible for identifying sounds.
• The where, or dorsal, stream starts in the posterior core and belt and extends to the parietal and prefrontal cortices.
  • It is responsible for locating sounds.
• Evidence from neural recordings, brain damage, and brain scanning supports these findings.
Hearing Inside Rooms
• Direct sound - sound that reaches the listener's ears straight from the source
• Indirect sound - sound that is reflected off environmental surfaces before reaching the listener
• When a listener is outdoors, most sound is direct; inside a building, however, the listener receives both direct and indirect sound.
Perceiving Two Sounds That Reach the Ears at Different Times
• Experiment by Litovsky et al.
  • Listeners sat between two speakers: a lead speaker and a lag speaker.
  • When sound comes from the lead speaker followed by the lag speaker with a long delay, listeners hear two sounds.
  • When the delay is decreased to 5-20 msec, listeners hear the sound as coming only from the lead speaker - the precedence effect.
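Below is a small sketch of how a lead/lag stimulus of this kind could be constructed digitally. The sample rate, click shape, and 10 ms delay are assumed values, and the code only builds the two speaker channels; it does not model the percept itself.

```python
import numpy as np

FS = 44_100       # sample rate in Hz (assumed)
DELAY_MS = 10     # lag-speaker delay, within the 5-20 ms range from the slide

def click(fs, dur_ms=1.0):
    """A brief raised-cosine click (shape chosen arbitrarily for illustration)."""
    n = int(fs * dur_ms / 1000)
    return 0.5 * (1 - np.cos(2 * np.pi * np.arange(n) / n))

def lead_lag_stimulus(fs=FS, delay_ms=DELAY_MS, total_ms=100):
    """Two speaker channels: the lag speaker plays a delayed copy of the lead click."""
    total = int(fs * total_ms / 1000)
    lead = np.zeros(total)
    lag = np.zeros(total)
    c = click(fs)
    delay = int(fs * delay_ms / 1000)
    lead[:len(c)] += c                 # lead speaker fires first
    lag[delay:delay + len(c)] += c     # lag speaker fires delay_ms later
    return np.stack([lead, lag])

stim = lead_lag_stimulus()
print("stimulus shape (speakers, samples):", stim.shape)
```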
Architectural Acoustics
• The study of how sounds are reflected in rooms
• Factors that affect perception in concert halls:
• Reverberation time - the time it takes sound to decrease to 1/1000th of its original pressure
  • If it is too long, sounds are "muddled."
  • If it is too short, sounds are "dead."
  • Ideal times are around two seconds.
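The slide defines reverberation time perceptually (a drop to 1/1000th of the original pressure, i.e. a 60 dB decay). The sketch below uses the standard Sabine approximation RT60 = 0.161 * V / A, which is not given in the slides; the room volume and absorption coefficients are invented for illustration.

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine estimate of reverberation time (seconds).

    surfaces: list of (area_m2, absorption_coefficient) pairs.
    RT60 ~ 0.161 * V / sum(area * alpha): the time for the sound level to
    fall 60 dB, i.e. for pressure to drop to 1/1000th of its original value.
    """
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# A made-up small concert hall: 12,000 m^3, with illustrative absorption values.
hall_surfaces = [
    (2400.0, 0.30),   # occupied seating area (alpha assumed)
    (3000.0, 0.05),   # plaster walls and ceiling (alpha assumed)
    (800.0, 0.10),    # wooden stage and floor (alpha assumed)
]
print(f"estimated RT60: {sabine_rt60(12_000.0, hall_surfaces):.1f} s")
```

With these assumed numbers the estimate lands near the roughly two-second value cited as ideal for concert halls.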
Architectural Acoustics - continued
• Factors that affect perception in concert halls:
• Intimacy time - time between when sound leaves its source and when the first reflection arrives
  • Best time is around 20 ms.
• Bass ratio - ratio of low to middle frequencies reflected from surfaces
  • High bass ratios are best.
• Spaciousness factor - fraction of all the sound received by a listener that is indirect
  • High spaciousness factors are best.
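A hedged sketch of the geometry behind intimacy time: the first reflection travels a longer path than the direct sound, and the extra path length divided by the speed of sound gives the delay. The stage, listener, and wall positions below are invented values.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s (assumed)

def intimacy_time_ms(source, listener, reflecting_point):
    """Delay between the direct sound and the first reflection, in ms."""
    direct = math.dist(source, listener)
    reflected = math.dist(source, reflecting_point) + math.dist(reflecting_point, listener)
    return (reflected - direct) / SPEED_OF_SOUND * 1000

# Made-up 2-D layout (metres): stage at the origin, listener 20 m away,
# first reflection off a side wall 8 m to the listener's left.
source = (0.0, 0.0)
listener = (20.0, 0.0)
side_wall_point = (10.0, 8.0)
print(f"intimacy time: {intimacy_time_ms(source, listener, side_wall_point):.1f} ms")
```

With these assumed positions the first reflection arrives about 16 ms after the direct sound, near the roughly 20 ms value described as best.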
Acoustics in Classrooms
• Ideal reverberation time in classrooms is:
  • 0.4 to 0.6 second for small classrooms
  • 1.0 to 1.5 seconds for auditoriums
• These times maximize the ability to hear voices.
• Most classrooms have reverberation times of one second or more.
• Background noise is also problematic.
  • The signal-to-noise ratio should be +10 to +15 dB or more.
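A small sketch of the signal-to-noise criterion mentioned above: when the voice and the background noise are both expressed as levels in decibels, the signal-to-noise ratio is simply their difference. The example levels are invented for illustration.

```python
def snr_db(speech_level_db, noise_level_db):
    """Signal-to-noise ratio in dB: the difference between the two levels."""
    return speech_level_db - noise_level_db

def meets_classroom_criterion(speech_level_db, noise_level_db, required_db=15):
    """Check against the +10 to +15 dB (or more) guideline, using +15 here."""
    return snr_db(speech_level_db, noise_level_db) >= required_db

# Made-up example: a teacher's voice at 65 dB SPL against 48 dB SPL of ventilation noise.
print("SNR:", snr_db(65, 48), "dB ->",
      "OK" if meets_classroom_criterion(65, 48) else "too noisy")
```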
Auditory Organization: Scene Analysis
• Auditory scene - the array of all sound sources in the environment
• Auditory scene analysis - the process by which sound sources in the auditory scene are separated into individual perceptions
• This separation does not happen at the cochlea, since simultaneous sounds are combined in the pattern of vibration of the basilar membrane.
Auditory Organization: Scene Analysis - continued
• Heuristics that help to perceptually organize stimuli:
  • Onset time - sounds that start at different times are likely to come from different sources
  • Location - a single sound source tends to come from one location and to move continuously
  • Similarity of timbre and pitch - similar sounds are grouped together
Separating the Sources
• Compound melodic line in music is an example of auditory stream segregation.
• Experiment by Bregman and Campbell
  • Stimuli were alternating high and low tones.
  • When the stimuli are played slowly, listeners hear the high and low tones alternating.
  • When the stimuli are played quickly, listeners hear two separate streams, one high and one low.
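A hedged sketch of how an alternating high/low tone sequence like this stimulus could be synthesized. The frequencies, tone durations, and rates are assumed values, and whether one stream or two is heard is a property of the listener, not of the code.

```python
import numpy as np

FS = 44_100   # sample rate in Hz (assumed)

def tone(freq_hz, dur_s, fs=FS):
    """A plain sine-wave tone."""
    t = np.arange(int(fs * dur_s)) / fs
    return np.sin(2 * np.pi * freq_hz * t)

def alternating_sequence(high_hz=2000, low_hz=400, tone_dur_s=0.1, repeats=10, fs=FS):
    """High and low tones in strict alternation; shorter tone_dur_s = faster rate."""
    pair = np.concatenate([tone(high_hz, tone_dur_s, fs), tone(low_hz, tone_dur_s, fs)])
    return np.tile(pair, repeats)

slow = alternating_sequence(tone_dur_s=0.4)    # slow rate: tends to be heard as one alternating stream
fast = alternating_sequence(tone_dur_s=0.08)   # fast rate: tends to split into high and low streams
print("slow sequence:", len(slow) / FS, "s   fast sequence:", len(fast) / FS, "s")
```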
Separating the Sources - continued
• Experiment by Deutsch - the scale illusion or melodic channeling
  • Stimuli were two sequences of tones alternating between the right and left ears.
  • Listeners perceive two smooth sequences by grouping the sounds by similarity in pitch.
  • This demonstrates the perceptual heuristic that sounds similar in frequency come from the same source, which is usually true in the environment.
Separating the Sources - continued
• Proximity in time - sounds that occur in rapid succession usually come from the same source
  • This principle was illustrated in auditory streaming.
• Auditory continuity - sounds that stay constant or change smoothly are usually from the same source
Separating the Sources - continued
• Experiment by Warren et al.
  • Tones were presented interrupted by gaps of silence or by noise.
  • In the silence condition, listeners perceived that the sound stopped during the gaps.
  • In the noise condition, the perception was that the sound continued behind the noise.
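A hedged sketch of the two stimulus conditions in this experiment: a tone interrupted by silent gaps versus the same tone with the gaps filled by noise. The frequency, durations, and noise level are assumed values; the continuity percept itself is not modeled.

```python
import numpy as np

FS = 44_100   # sample rate in Hz (assumed)
rng = np.random.default_rng(0)

def segment(kind, dur_s, freq_hz=1000, fs=FS):
    """One segment of the stimulus: 'tone', 'silence', or 'noise'."""
    n = int(fs * dur_s)
    if kind == "tone":
        return np.sin(2 * np.pi * freq_hz * np.arange(n) / fs)
    if kind == "silence":
        return np.zeros(n)
    return rng.normal(scale=0.5, size=n)     # broadband noise burst

def interrupted_tone(filler):
    """Tone segments with gaps filled either by 'silence' or by 'noise'."""
    pattern = ["tone", filler, "tone", filler, "tone"]
    return np.concatenate([segment(kind, 0.3) for kind in pattern])

silence_condition = interrupted_tone("silence")   # tone heard as stopping during the gaps
noise_condition = interrupted_tone("noise")       # tone heard as continuing behind the noise
print("each stimulus lasts", len(noise_condition) / FS, "s")
```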
Separating the Sources - continued
• Effect of past experience
• Experiment by Dowling
  • The melody "Three Blind Mice" is played with its notes alternating between octaves.
  • Listeners find it difficult to identify the song.
  • After they hear the normal melody, however, they can hear it in the modified version by using a melody schema.
Connections Between Hearing and Vision
• Visual capture, or the ventriloquist effect - an observer perceives a sound as coming from the visual location rather than from its actual source
• Experiment by Sekuler et al.
  • Balls moving without sound appeared to move past each other.
  • When a "click" was added, the balls appeared to collide.
Hearing and Vision: Physiology
• The interaction between vision and hearing is multisensory in nature.
• Thaler et al. (2011) - had expert blind echolocators produce clicking sounds and observed how these signals activated the brain.