Auditory motion processing after early blindness

2014 ◽  
Vol 14 (13) ◽  
pp. 4-4 ◽  
Author(s):  
F. Jiang ◽  
G. C. Stecker ◽  
I. Fine

2007 ◽  
Vol 45 (3) ◽  
pp. 523-530 ◽  
Author(s):  
A. Brooks ◽  
R. van der Zwan ◽  
A. Billard ◽  
B. Petreska ◽  
S. Clarke ◽  
...  

2013 ◽  
Vol 109 (2) ◽  
pp. 321-331 ◽  
Author(s):  
David A. Magezi ◽  
Karin A. Buetler ◽  
Leila Chouiter ◽  
Jean-Marie Annoni ◽  
Lucas Spierer

Following prolonged exposure to adaptor sounds moving in a single direction, participants may perceive stationary-probe sounds as moving in the opposite direction [direction-selective auditory motion aftereffect (aMAE)] and be less sensitive to the motion of any probe sounds that are actually moving (motion-sensitive aMAE). The neural mechanisms of aMAEs, and notably whether they arise from adaptation of direction-selective motion detectors as found in vision, are presently unknown; resolving this question would provide critical insight into auditory motion processing. We measured human behavioral responses and auditory evoked potentials to probe sounds following four types of moving-adaptor sounds: leftward and rightward unidirectional, bidirectional, and stationary. Behavioral data replicated both direction-selective and motion-sensitive aMAEs. Electrical neuroimaging analyses of auditory evoked potentials to stationary probes revealed no significant difference in either global field power (GFP) or scalp topography between the leftward and rightward conditions, suggesting that aMAEs are not based on adaptation of direction-selective motion detectors. By contrast, the bidirectional and stationary conditions differed significantly in stationary-probe GFP at 200 ms post-stimulus onset, without concomitant topographic modulation, indicative of a difference in response strength between statistically indistinguishable intracranial generators. The magnitude of this GFP difference was positively correlated with the magnitude of the motion-sensitive aMAE, supporting the functional relevance of the neurophysiological measures. Electrical source estimations revealed that the GFP difference followed from a modulation of activity in predominantly right-hemisphere frontal-temporal-parietal brain regions previously implicated in auditory motion processing. Collectively, our results suggest that auditory motion processing relies on motion-sensitive but, in contrast to vision, non-direction-selective mechanisms.
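For readers unfamiliar with the measure used above, global field power is simply the standard deviation of the voltage across all electrodes at each time point, a reference-free index of response strength. A minimal NumPy sketch of that computation follows; the montage size, sampling rate, and placeholder data are our assumptions, not the authors' pipeline.

```python
import numpy as np

def global_field_power(erp):
    """Global field power (GFP) of an evoked potential.

    GFP at each time point is the standard deviation of the voltage
    across electrodes (a reference-free measure of response strength).

    erp : ndarray of shape (n_electrodes, n_times), average-referenced.
    """
    return erp.std(axis=0)

# Hypothetical usage: compare the GFP of two probe conditions around 200 ms.
n_electrodes = 64                              # assumed montage
times = np.arange(-100, 500) / 1000.0          # seconds from probe onset, 1 kHz sampling assumed
erp_bidirectional = np.random.randn(n_electrodes, times.size)  # placeholder data
erp_stationary = np.random.randn(n_electrodes, times.size)     # placeholder data

gfp_diff = (global_field_power(erp_bidirectional)
            - global_field_power(erp_stationary))
idx_200ms = np.argmin(np.abs(times - 0.200))
print(f"GFP difference at 200 ms: {gfp_diff[idx_200ms]:.3f} (arbitrary units)")
```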


2014 ◽  
Vol 40 (3) ◽  
pp. 265-272 ◽  
Author(s):  
L. B. Shestopalova ◽  
E. A. Petropavlovskaia ◽  
S. Ph. Vaitulevich ◽  
N. I. Nikitin

2018 ◽  
Author(s):  
Ceren Battal ◽  
Mohamed Rezk ◽  
Stefania Mattioni ◽  
Jyothirmayi Vadlamudi ◽  
Olivier Collignon

ABSTRACT
The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving leftward, rightward, upward, and downward, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human Planum Temporale (hPT). Using an independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were nevertheless significantly distinct. Altogether, our results demonstrate that hPT codes for both auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.

SIGNIFICANCE STATEMENT
In comparison to what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human Planum Temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex, by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital hMT+/V5 region for computing visual motion.
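The cross-condition decoding mentioned in the abstract, training a classifier on one condition and testing it on the other, can be sketched with scikit-learn as below. The data shapes, label coding, and choice of a linear SVM are illustrative assumptions rather than the authors' actual pipeline; with real hPT voxel patterns, above-chance transfer accuracy would indicate shared pattern geometries between moving and static sounds.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Placeholder hPT voxel patterns: rows = trials, columns = voxels.
rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200                       # assumed dimensions
X_motion = rng.standard_normal((n_trials, n_voxels))
y_motion = rng.integers(0, 4, n_trials)            # moving-sound directions (left/right/up/down)
X_static = rng.standard_normal((n_trials, n_voxels))
y_static = rng.integers(0, 4, n_trials)            # static-sound locations (left/right/up/down)

# Cross-condition decoding: fit on moving sounds, test on static sounds.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_motion, y_motion)
transfer_accuracy = clf.score(X_static, y_static)
print(f"Motion -> static transfer accuracy: {transfer_accuracy:.2f} (chance = 0.25)")
```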


Author(s):  
Alexandra A. Ludwig ◽  
Rudolf Rübsamen ◽  
Gerd J. Dörrscheidt ◽  
Sonja A. Kotz

2010 ◽  
Vol 77 (3) ◽  
pp. 328-329
Author(s):  
L.B. Shestopalova ◽  
E.A. Petropavlovskaia ◽  
S.Ph. Vaitulevich ◽  
Y.A. Vasilenko

2019 ◽  
Vol 116 (20) ◽  
pp. 10081-10086 ◽  
Author(s):  
Elizabeth Huber ◽  
Fang Jiang ◽  
Ione Fine

Previous studies report that human middle temporal complex (hMT+) is sensitive to auditory motion in early-blind individuals. Here, we show that hMT+ also develops selectivity for auditory frequency after early blindness, and that this selectivity is maintained after sight recovery in adulthood. Frequency selectivity was assessed using both moving band-pass and stationary pure-tone stimuli. As expected, within primary auditory cortex, both moving and stationary stimuli successfully elicited frequency-selective responses, organized in a tonotopic map, for all subjects. In early-blind and sight-recovery subjects, we saw evidence for frequency selectivity within hMT+ for the auditory stimulus that contained motion. We did not find frequency-tuned responses within hMT+ when using the stationary stimulus in either early-blind or sight-recovery subjects. We saw no evidence for auditory frequency selectivity in hMT+ in sighted subjects using either stimulus. Thus, after early blindness, hMT+ can exhibit selectivity for auditory frequency. Remarkably, this auditory frequency tuning persists in two adult sight-recovery subjects, showing that, in these subjects, auditory frequency-tuned responses can coexist with visually driven responses in hMT+.
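Frequency selectivity of the sort reported here is often summarized as a winner-take-all tonotopic map, in which each voxel is assigned the stimulus frequency that evoked its largest response. The sketch below illustrates only that labeling step; the frequency bands, region size, and placeholder response amplitudes are assumptions for illustration, not the authors' analysis.

```python
import numpy as np

# Placeholder response amplitudes (e.g., GLM betas): voxels x frequency conditions.
center_freqs_hz = [250, 500, 1000, 2000, 4000, 8000]       # assumed frequency bands
rng = np.random.default_rng(1)
betas = rng.standard_normal((5000, len(center_freqs_hz)))   # assumed ROI size

# Winner-take-all tonotopy: each voxel is labeled with the frequency
# condition that evoked its largest response.
preferred = np.asarray(center_freqs_hz)[np.argmax(betas, axis=1)]

# A tonotopic map would display `preferred` on the cortical surface;
# here we only summarize the distribution of best frequencies.
for f in center_freqs_hz:
    print(f"{f:>5d} Hz: {np.mean(preferred == f):.2%} of voxels")
```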


2001 ◽  
Vol 85 (1) ◽  
pp. 23-33 ◽  
Author(s):  
Neil J. Ingham ◽  
Heledd C. Hart ◽  
David McAlpine

We examined responses from 91 single neurons in the inferior colliculus (IC) of anesthetized guinea pigs to auditory apparent motion in the free field. Apparent motion was generated by presenting 100-ms tone bursts, separated by 50-ms silent intervals, at consecutive speaker positions in an array of 11 speakers positioned in an arc spanning ±112.5° around the midline. Most neurons demonstrated discrete spatial receptive fields (SRFs) to apparent motion in the clockwise and anticlockwise directions. However, SRFs showed marked differences for apparent motion in opposite directions. In virtually all neurons, the mean best azimuthal position of the SRF for each direction occurred at earlier positions in that direction's motion sweep, producing receptive fields for the two directions of motion that only partially overlapped. Despite this, overall spike counts for the two directions were similar at equivalent angular velocities. Responses of 28 neurons were recorded to stimuli with silent intervals of different durations between speaker presentations, mimicking different apparent angular velocities. Increasing the stimulus off time increased neuronal discharge rates, particularly in the later portions of the apparent-motion sweep, and reduced the differences between the SRFs for opposite motion directions. Consequently, SRFs for both directions broadened and converged with decreasing motion velocity. This expansion was most obvious on the outgoing side of each SRF. Responses of 11 neurons were recorded to short (90°), partially overlapping apparent-motion sweeps centered at different spatial positions. Nonoverlapping response profiles were recorded in 9 of the 11 neurons tested, confirming that responses at each speaker position depended on the preceding response history. Together, these data are consistent with the suggestion that a mechanism of adaptation of excitation contributes to the apparent sensitivity of IC neurons to auditory motion cues. In addition, the data indicate that sequential activation of a speaker array to produce apparent auditory motion may not be an optimal stimulus paradigm for separating the temporal and spatial aspects of auditory motion processing.
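The stimulus arithmetic described above follows directly from the speaker geometry and timing: 11 speakers spanning ±112.5° gives 22.5° steps, and a 100-ms burst plus a 50-ms silent interval gives 150 ms per step, i.e., an apparent velocity of 150°/s, with longer silent intervals mimicking slower motion. A small sketch of that calculation is given below; the function name and defaults are ours, not the authors'.

```python
import numpy as np

def apparent_motion_schedule(n_speakers=11, arc_deg=225.0,
                             burst_ms=100.0, off_ms=50.0):
    """Onset times, azimuths, and apparent velocity for one motion sweep.

    Speakers are assumed evenly spaced over an arc centred on the midline
    (±112.5° for the defaults); velocity follows from the burst duration
    plus the silent interval between successive speakers.
    """
    azimuths = np.linspace(-arc_deg / 2, arc_deg / 2, n_speakers)
    step_deg = azimuths[1] - azimuths[0]                  # 22.5 deg with the defaults
    onsets_ms = np.arange(n_speakers) * (burst_ms + off_ms)
    velocity_deg_per_s = step_deg / ((burst_ms + off_ms) / 1000.0)
    return azimuths, onsets_ms, velocity_deg_per_s

# Longer silent intervals mimic slower apparent motion, as in the study.
for off in (50.0, 200.0, 500.0):
    _, _, v = apparent_motion_schedule(off_ms=off)
    print(f"off interval {off:>5.0f} ms -> apparent velocity {v:6.1f} deg/s")
```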

