Neural Correlates in the Processing of Auditory Spatial Cues and Stimuli Direction

2007 ◽  
Author(s):  
Marco Sperduti ◽  
Ralf Veit ◽  
Andrea Caria ◽  
Paolo Belardinelli ◽  
Niels Birbaumer ◽  
...  

Author(s):  
Thomas Z. Strybel ◽  
Jan M. Boucher ◽  
Greg E. Fujawa ◽  
Craig S. Volp

The effectiveness of auditory spatial cues on visual search performance was examined in three experiments. Auditory spatial cues were more effective than abrupt visual onsets when the target appeared in the peripheral visual field or when the contrast of the target was degraded. The duration of the auditory spatial cue did not affect search performance.


2011 ◽  
Vol 130 (4) ◽  
pp. 2313 ◽  
Author(s):  
Graham Naylor ◽  
S. Gert Weinrich

2014 ◽  
Vol 111 (22) ◽  
pp. E2339-E2348 ◽  
Author(s):  
M. W. H. Remme ◽  
R. Donato ◽  
J. Mikiel-Hunter ◽  
J. A. Ballestero ◽  
S. Foster ◽  
...  

2019 ◽  
Author(s):  
Daniel P. Kumpik ◽  
Connor Campbell ◽  
Jan W.H. Schnupp ◽  
Andrew J. King

Sound localization requires the integration in the brain of auditory spatial cues generated by interactions with the external ears, head and body. Perceptual learning studies have shown that the relative weighting of these cues can change in a context-dependent fashion if their relative reliability is altered. One factor that may influence this process is vision, which tends to dominate localization judgments when both modalities are present and induces a recalibration of auditory space if they become misaligned. It is not known, however, whether vision can alter the weighting of individual auditory localization cues. Using non-individualized head-related transfer functions, we measured changes in subjects’ sound localization biases and binaural localization cue weights after ~55 minutes of training on an audiovisual spatial oddball task. Four different configurations of spatial congruence between the visual cue and the two auditory spatial cues, interaural time differences (ITDs) and frequency-dependent interaural level differences (interaural level spectra, ILS), were used. When visual cues were spatially congruent with both auditory spatial cues, we observed an improvement in sound localization, as shown by a reduction in the variance of subjects’ localization biases, which was accompanied by an up-weighting of the more salient ILS cue. However, if the position of either one of the auditory cues was randomized during training, no overall improvement in sound localization occurred. Nevertheless, the spatial gain of whichever cue was matched with vision increased, with different effects observed on the gain for the randomized cue depending on whether ITDs or ILS were matched with vision. As a result, we observed a similar up-weighting in ILS when this cue alone was matched with vision, but no overall change in binaural cue weighting when ITDs corresponded to the visual cues and ILS were randomized. Consistently misaligning both cues with vision produced the ventriloquism aftereffect, i.e., a corresponding shift in auditory localization bias, without affecting the variability of the subjects’ sound localization judgments, and no overall change in binaural cue weighting. These data show that visual contextual information can invoke a reweighting of auditory localization cues, although concomitant improvements in sound localization are only likely to accompany training with fully congruent audiovisual information.
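
The reweighting described in this abstract is often modeled as reliability-weighted (inverse-variance) cue combination. The sketch below is purely illustrative and is not the analysis used in the study: the function, the hypothetical lateral-angle estimates, and the variance values are assumptions, chosen only to show how inflating the variance of one cue (as when its position is randomized during training) shifts weight toward the other.

    # Illustrative inverse-variance (reliability-weighted) cue combination.
    # Estimates are hypothetical lateral angles in degrees; variances are placeholders.
    def combine_cues(itd_estimate, ils_estimate, itd_var, ils_var):
        """Return the combined location estimate and the weight given to each cue."""
        w_itd = (1.0 / itd_var) / (1.0 / itd_var + 1.0 / ils_var)
        w_ils = 1.0 - w_itd
        combined = w_itd * itd_estimate + w_ils * ils_estimate
        return combined, w_itd, w_ils

    # Equally reliable cues receive equal weight.
    print(combine_cues(-10.0, -6.0, itd_var=4.0, ils_var=4.0))

    # Randomizing the ITD cue during training is akin to inflating its variance,
    # which shifts weight toward the ILS cue that remained matched with vision.
    print(combine_cues(-10.0, -6.0, itd_var=25.0, ils_var=4.0))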


Author(s):  
Thomas Z. Strybel

Techniques for the production of externalized, “3-dimensional” sound images for acoustic signals presented via headphones were developed in the past decade. These 3-D sound systems simulate both interaural time and intensity cues, and cues based on the action of the pinnae on incoming sound sources (e.g. Wenzel, Wightman and Foster, 1988). It has been anticipated that these 3-D sound systems would be useful in the cockpit and other work settings because they provide a natural method of directing an operator to some event in the environment. This symposium is a progress report on research that has either examined potential applications of 3-D sound systems in the workplace, or attempted to understand how auditory spatial cues direct visual attention. Researchers at NASA Ames Research Center and Wright Patterson Air Force Base have identified cockpit tasks that can benefit from auditory spatial cueing. Some of these tasks include gate identification, blunder avoidance, and traffic identification of approaching and receding targets. The benefits of auditory spatial cueing are usually measured by determining the reduction in search latency that is realized when searching for targets with, as opposed to without, auditory spatial cues. These benefits can be explained by the findings that both simple detection and identification times are faster and more constant across the frontal hemifield when auditory spatial cues are presented with the target. Furthermore, for targets presented in the central visual field, auditory spatial cues can either supplement or substitute for abrupt visual onsets in directing visual attention.
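
As a purely illustrative aside, the two binaural cues such systems simulate can be sketched in a few lines. The code below applies a fixed interaural time difference (delay) and interaural level difference (attenuation) to a mono signal; it is a minimal sketch under assumed placeholder values, not the HRTF-based convolution used by the systems described here (e.g. Wenzel, Wightman and Foster, 1988).

    import numpy as np

    # Illustrative rendering of a mono signal to two ears using only an interaural
    # time difference (delay) and an interaural level difference (attenuation).
    # Parameter values are placeholders; real 3-D audio systems convolve the signal
    # with measured head-related transfer functions.
    def render_binaural(mono, fs, itd_s=0.0004, ild_db=6.0, source_on_right=True):
        """Return an (N, 2) stereo array with the source lateralized by ITD/ILD cues."""
        delay = int(round(itd_s * fs))
        near = mono
        far = np.concatenate([np.zeros(delay), mono])[: len(mono)]  # delayed at the far ear
        far = far * 10.0 ** (-ild_db / 20.0)                        # quieter at the far ear
        left, right = (far, near) if source_on_right else (near, far)
        return np.stack([left, right], axis=1)

    fs = 44100
    t = np.arange(fs) / fs
    tone = 0.5 * np.sin(2 * np.pi * 500.0 * t)   # 1-second, 500 Hz test tone
    stereo = render_binaural(tone, fs)
    print(stereo.shape)                          # (44100, 2)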


PLoS ONE ◽  
2010 ◽  
Vol 5 (4) ◽  
pp. e10396 ◽  
Author(s):  
Ilana B. Witten ◽  
Phyllis F. Knudsen ◽  
Eric I. Knudsen
