Representation of auditory motion directions and sound source locations in the human planum temporale

2018 ◽  
Author(s):  
Ceren Battal ◽  
Mohamed Rezk ◽  
Stefania Mattioni ◽  
Jyothirmayi Vadlamudi ◽  
Olivier Collignon

ABSTRACT
The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing, however, is poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to leftward-, rightward-, upward-, and downward-moving sounds as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in the bilateral human planum temporale (hPT). Using an independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.

SIGNIFICANCE STATEMENT
In comparison to what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds new light on how the location and direction of sounds are computed in the human auditory cortex by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital hMT+/V5 region for computing visual motion.
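The cross-condition decoding used here can be sketched with scikit-learn: a classifier trained on motion-direction patterns is tested on static-location patterns, so above-chance accuracy indicates partially shared pattern geometry. The data below are simulated for illustration only; the voxel count, trial count, and noise level are arbitrary assumptions, not values from the study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_voxels, n_trials = 100, 40

# Simulated hPT voxel patterns: a shared spatial code ("left" vs "right")
# drives both moving and static sounds, plus condition-specific noise.
shared_code = rng.normal(size=(2, n_voxels))          # one pattern per side
labels = rng.integers(0, 2, size=n_trials)            # 0 = left, 1 = right
motion = shared_code[labels] + 0.5 * rng.normal(size=(n_trials, n_voxels))
static = shared_code[labels] + 0.5 * rng.normal(size=(n_trials, n_voxels))

# Cross-condition decoding: train on moving sounds, test on static sounds.
# Accuracy well above chance (0.5) implies a shared pattern geometry.
clf = SVC(kernel="linear").fit(motion, labels)
accuracy = clf.score(static, labels)
print(f"cross-condition accuracy: {accuracy:.2f}")
```

Because the simulated conditions share the same underlying spatial code, the classifier generalizes across them; condition-specific (non-shared) responses would drive this accuracy back toward chance.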

2021 ◽  
Author(s):  
Ceren Battal ◽  
Ane Gurtubay-Antolin ◽  
Mohamed Rezk ◽  
Stefania Mattioni ◽  
Giorgia Bertonati ◽  
...  

How does blindness affect the brain network supporting spatial hearing? We used a combined functional and diffusion MRI approach to study the impact of early blindness on the brain networks typically coding for audio-visual motion and location. Whole-brain functional univariate analysis revealed a preferential response to auditory motion in a dorsal network including the planum temporale (hPT) as well as the anterior portion of the middle temporal cortex (hMT+/V5) in both sighted and congenitally blind participants (male and female). Blind participants showed an additional preferential response to auditory motion in the posterior region of hMT+/V5. Importantly, multivariate decoding analysis revealed motion direction information that was higher in hMT+/V5 and lower in hPT of blind relative to sighted people. Decoding sound source location showed a similar pattern of results, although decoding accuracies were generally lower than those obtained for motion directions. Diffusion MRI revealed that the macrostructure (trajectory and connectivity index) of hMT+/V5–hPT connectivity did not differ between groups, while the microstructure of the connections was altered in blind people. These results suggest that early visual deprivation triggers a network-level reorganization that enhances the recruitment of occipital areas in conjunction with a release in the computational workload of temporal regions typically dedicated to spatial hearing. This functional reorganization is accompanied by white-matter microstructural alterations in related occipito-temporal connections.


2014 ◽  
Vol 111 (1) ◽  
pp. 112-127 ◽  
Author(s):  
L. Thaler ◽  
J. L. Milne ◽  
S. R. Arnott ◽  
D. Kish ◽  
M. A. Goodale

We have shown in previous research (Thaler L, Arnott SR, Goodale MA. PLoS One 6: e20162, 2011) that motion processing through echolocation activates temporal-occipital cortex in blind echolocation experts. Here we investigated how the neural substrates of echo-motion are related to the neural substrates of auditory source-motion and visual motion. Three blind echolocation experts and twelve sighted echolocation novices underwent functional MRI scanning while they listened to binaural recordings of moving or stationary echolocation or auditory source sounds located in either left or right space. Sighted participants' brain activity was also measured while they viewed moving or stationary visual stimuli. For each of the three modalities separately (echo, source, vision), we then identified motion-sensitive areas in temporal-occipital cortex and in the planum temporale. We then used a region of interest (ROI) analysis to investigate cross-modal responses, as well as laterality effects. In both sighted novices and blind experts, we found that temporal-occipital source-motion ROIs did not respond to echo-motion, and echo-motion ROIs did not respond to source-motion. This double dissociation was absent in planum temporale ROIs. Furthermore, temporal-occipital echo-motion ROIs in blind, but not sighted, participants showed evidence for a contralateral motion preference. Temporal-occipital source-motion ROIs did not show evidence for a contralateral preference in either blind or sighted participants. Our data suggest a functional segregation of the processing of auditory source-motion and echo-motion in human temporal-occipital cortex. Furthermore, the data suggest that the echo-motion response in blind experts may represent a reorganization rather than an exaggeration of the response observed in sighted novices. This reorganization may involve the recruitment of "visual" cortical areas.
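The contralateral motion preference tested above boils down to comparing an ROI's response to motion in contralateral versus ipsilateral hemispace. A minimal sketch of such a laterality index, computed on simulated beta estimates (the means, spread, and trial count are illustrative assumptions, not values from the study), could look like:

```python
import numpy as np

def laterality_index(contra, ipsi):
    """(contra - ipsi) / (contra + ipsi): positive = contralateral preference."""
    contra, ipsi = float(np.mean(contra)), float(np.mean(ipsi))
    return (contra - ipsi) / (contra + ipsi)

# Simulated mean beta estimates for a left-hemisphere echo-motion ROI:
# stimuli in right (contralateral) space evoke larger responses.
rng = np.random.default_rng(1)
betas_right_space = rng.normal(1.2, 0.2, size=12)  # contralateral stimuli
betas_left_space = rng.normal(0.8, 0.2, size=12)   # ipsilateral stimuli

li = laterality_index(betas_right_space, betas_left_space)
print(f"laterality index: {li:.2f}")
```

An index reliably above zero across participants would be the ROI-level signature of contralateral preference; an index near zero, as for the source-motion ROIs here, indicates no spatial bias.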


2012 ◽  
Vol 25 (0) ◽  
pp. 140
Author(s):  
Lore Thaler ◽  
Jennifer Milne ◽  
Stephen R. Arnott ◽  
Melvyn A. Goodale

People can echolocate their distal environment by making mouth-clicks and listening to the click-echoes. In previous work using functional magnetic resonance imaging (fMRI), we showed that the processing of echolocation motion increases activity in posterior/inferior temporal cortex (Thaler et al., 2011). In the current study we investigated whether brain areas that are sensitive to echolocation motion in blind echolocation experts correspond to visual motion area MT+. To this end we used fMRI to measure the brain activity of two early blind echolocation experts while they listened to recordings of echolocation and auditory source sounds that could be either moving or stationary, and that could be located either to the left or to the right of the listener. A whole-brain analysis revealed that echo motion and source motion activated different brain areas in posterior/inferior temporal cortex. Furthermore, the relative spatial arrangement of echo and source motion areas appeared to match the relative spatial arrangement of area MT+ and source motion areas that has been reported for sighted people (Saenz et al., 2008). We also found that brain areas sensitive to echolocation motion showed a larger response to echo motion presented in contralateral space, a response pattern typical of visual motion processing in area MT+. Taken together, the data are consistent with the idea that the brain areas that process echolocation motion in blind echolocation experts correspond to area MT+.


2020 ◽  
Vol 117 (50) ◽  
pp. 32165-32168
Author(s):  
Arvid Guterstam ◽  
Michael S. A. Graziano

Recent evidence suggests a link between visual motion processing and social cognition. When person A watches person B, the brain of A apparently generates a fictitious, subthreshold motion signal streaming from B to the object of B's attention. These previous studies, being correlative, were unable to establish any functional role for the false motion signals. Here, we directly tested whether subthreshold motion processing plays a role in judging the attention of others. We asked: if we contaminate people's visual input with a subthreshold motion signal streaming from an agent to an object, can we manipulate people's judgments about that agent's attention? Participants viewed a display including faces, objects, and a subthreshold motion hidden in the background. Participants' judgments of the attentional state of the faces were significantly altered by the hidden motion signal. Faces from which subthreshold motion was streaming toward an object were judged as paying more attention to the object. Control experiments showed the effect was specific to the agent-to-object motion direction and to judging attention, not action or spatial orientation. These results suggest that when the brain models other minds, it uses a subthreshold motion signal, streaming from an individual to an object, to help represent attentional state. This type of social-cognitive model, tapping perceptual mechanisms that evolved to process physical events in the real world, may help to explain the extraordinary cultural persistence of beliefs in mind processes having physical manifestation. These findings, therefore, may have larger implications for human psychology and cultural belief.


2003 ◽  
Vol 14 (4) ◽  
pp. 357-361 ◽  
Author(s):  
Jean Vroomen ◽  
Beatrice de Gelder

In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were larger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.


NeuroImage ◽  
2021 ◽  
Vol 230 ◽  
pp. 117816 ◽  
Author(s):  
Stefania Benetti ◽  
Joshua Zonca ◽  
Ambra Ferrari ◽  
Mohamed Rezk ◽  
Giuseppe Rabini ◽  
...  

Author(s):  
Daniela Perani ◽  
Paola Scifo ◽  
Guido M. Cicchini ◽  
Pasquale Della Rosa ◽  
Chiara Banfi ◽  
...  

Abstract
Motion perception deficits in dyslexia show large inter-individual variability, partly reflecting genetic factors that influence the development of brain architecture. In previous work, we demonstrated that dyslexic carriers of a mutation of the DCDC2 gene have a very strong impairment in motion perception. In the present study, we investigated structural white matter alterations associated with poor motion perception in a cohort of twenty dyslexics comprising a subgroup carrying the DCDC2 gene deletion (DCDC2d+) and a subgroup without the risk variant (DCDC2d–). We observed significant deficits in motion contrast sensitivity and in motion direction discrimination accuracy at high contrast, stronger in the DCDC2d+ group. Both motion perception impairments correlated significantly with fractional anisotropy (FA) in posterior ventral and dorsal tracts, including early visual pathways along the optic radiation and in proximity to the occipital cortex, MT, and the VWFA. However, the DCDC2d+ group showed stronger correlations between FA and motion perception impairments than the DCDC2d– group in early visual white matter bundles, including the optic radiations, and in ventral pathways located in the left inferior temporal cortex. Our results suggest that the DCDC2d+ group experiences higher vulnerability in visual motion processing even at early stages of visual analysis, which might represent a specific feature associated with the genotype and might provide further neurobiological support for the visual-motion deficit account of dyslexia in a specific subpopulation.
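The brain-behavior relationship described above amounts to correlating per-participant fractional anisotropy with a motion-perception impairment score within each subgroup. A minimal sketch on simulated data (the sample size, FA range, and effect size are illustrative assumptions, not values from the study):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 10  # hypothetical participants in one subgroup

# Simulated fractional anisotropy (FA) and motion-perception impairment
# scores, with lower FA coupled to greater impairment plus noise.
fa = rng.uniform(0.3, 0.6, size=n)
impairment = -4.0 * fa + rng.normal(0, 0.2, size=n)

# Pearson correlation: a significant negative r would link reduced white
# matter integrity in a tract to poorer motion perception.
r, p = pearsonr(fa, impairment)
print(f"r = {r:.2f}, p = {p:.3f}")
```

Comparing the strength of such correlations between subgroups (here, DCDC2d+ versus DCDC2d–) additionally requires a test for the difference between two correlation coefficients, e.g. via Fisher's z-transform.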


2020 ◽  
Author(s):  
A. Gurtubay-Antolin ◽  
C. Battal ◽  
C. Maffei ◽  
M. Rezk ◽  
S Mattioni ◽  
...  

ABSTRACT
In humans, the occipital middle-temporal region (hMT+/V5) specializes in the processing of visual motion, while the planum temporale (hPT) specializes in auditory motion processing. It has been hypothesized that these regions might communicate directly to achieve fast and optimal exchange of multisensory motion information. In this study, we investigated for the first time in humans the existence of direct white matter connections between visual and auditory motion-selective regions, using a combined functional- and diffusion-MRI approach. We found reliable evidence supporting the existence of direct white matter connections between individually and functionally defined hMT+/V5 and hPT. We show that projections between hMT+/V5 and hPT do not overlap with large white matter bundles such as the inferior longitudinal fasciculus (ILF) or the inferior fronto-occipital fasciculus (IFOF). Moreover, we did not find evidence for reciprocal projections between the fusiform face area and hPT, supporting the functional specificity of hMT+/V5–hPT connections. Finally, evidence supporting the existence of hMT+/V5–hPT connections was corroborated in a large sample of participants (n = 114) from the Human Connectome Project. Altogether, this study provides the first evidence supporting the existence of direct occipito-temporal projections between hMT+/V5 and hPT, which may support the exchange of motion information between functionally specialized auditory and visual regions and which we propose to name the middle (or motion) occipito-temporal track (MOTT).


Author(s):  
Frank Edughom Ekpar

This paper establishes the foundational principles and practice for a unified theory of arbitrary information management by disclosing systems, devices, and methods for the management of substrates, including biological substrates. In this context, a substrate is any aspect of any entity that is capable of responding to or emitting stimuli, irrespective of whether the stimuli actually emanate from any aspect of the entity. Management of substrates can be achieved through the management of stimuli that modulate, moderate, or influence any aspect of the substrate, as well as through the management of any stimuli emanating from the substrate. The results enable a wide range of novel applications in a variety of fields, with far-reaching implications. For example, the functional organization of many regions of the brain, including the superior temporal cortex, which is believed to play a critical role in the hierarchical processing of human visual and auditory stimuli, is poorly understood. It is not known precisely which layer within which region of the brain is responsible for which aspect of visual or auditory processing. Simultaneous non-invasive acquisition of bio-signals representing contributions from multiple layers of neuronal populations within the brain could provide new insights, resolve many of these outstanding issues, and yield a deeper understanding of the underlying physiological processes.

