Resolving multisensory and attentional influences across cortical depth in sensory cortices

2019 ◽  
Author(s):  
Remi Gau ◽  
Pierre-Louis Bazin ◽  
Robert Trampel ◽  
Robert Turner ◽  
Uta Noppeney

Abstract: In our environment, our senses are bombarded with a myriad of signals, only a subset of which is relevant for our goals. Using sub-millimeter-resolution fMRI at 7T, we resolved BOLD-response and activation patterns across cortical depth in early sensory cortices to auditory, visual and audiovisual stimuli under auditory or visual attention. In visual cortices, auditory stimulation induced widespread inhibition irrespective of attention, whereas auditory relative to visual attention suppressed mainly central visual field representations. In auditory cortices, visual stimulation suppressed activations, but amplified responses to concurrent auditory stimuli, in a patchy topography. Critically, multisensory interactions in auditory cortices were stronger in deeper laminae, while attentional influences were greatest at the surface. These distinct depth-dependent profiles suggest that multisensory and attentional mechanisms regulate sensory processing via partly distinct circuitries. Our findings are crucial for understanding how the brain regulates information flow across senses to interact with our complex multisensory world.
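
For readers less familiar with laminar fMRI, the depth-resolved profiles described above amount to binning gray-matter voxels by their normalized cortical depth and averaging responses within each bin. The sketch below illustrates that general idea only; it is not the authors' pipeline, and the voxel counts, depths, responses and bin number are synthetic assumptions.

```python
import numpy as np

# Synthetic example: one response estimate per gray-matter voxel and that
# voxel's normalized cortical depth (0 = white-matter boundary, 1 = pial
# surface). All numbers are invented for illustration.
rng = np.random.default_rng(0)
n_voxels = 5000
depth = rng.uniform(0.0, 1.0, n_voxels)                        # normalized depth
response = 1.0 + 0.5 * depth + rng.normal(0.0, 0.3, n_voxels)  # toy BOLD estimates

# Average responses within depth bins to obtain a laminar profile
# running from deep (near white matter) to superficial (near the pia).
n_bins = 6
edges = np.linspace(0.0, 1.0, n_bins + 1)
bin_idx = np.clip(np.digitize(depth, edges) - 1, 0, n_bins - 1)
profile = np.array([response[bin_idx == b].mean() for b in range(n_bins)])

print("laminar profile (deep -> superficial):", np.round(profile, 2))
```

Comparing the deep versus superficial bins of such a profile is what allows depth-dependent effects, such as multisensory interactions in deeper laminae versus attentional effects near the surface, to be distinguished.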

eLife ◽  
2020 ◽  
Vol 9 ◽  
Author(s):  
Remi Gau ◽  
Pierre-Louis Bazin ◽  
Robert Trampel ◽  
Robert Turner ◽  
Uta Noppeney



Author(s):  
Bruno and

Synaesthesia is a curious anomaly of multisensory perception. When presented with stimulation in one sensory channel, in addition to the percept usually associated with that channel (the inducer), a true synaesthete experiences a second percept in another perceptual modality (the concurrent). Although synaesthesia is not pathological, true synaesthetes are relatively rare and their synaesthetic associations tend to be quite idiosyncratic. For this reason, studying synaesthesia is difficult, but exciting new experimental results are beginning to clarify what makes the brain of synaesthetes special and the mechanisms that may produce the condition. Even more importantly, the related phenomenon of ‘natural’ crossmodal associations is experienced by everyone, providing another useful domain for studying multisensory interactions, with important implications for understanding how spontaneously evoked associations shape our preferences for products, as well as for choosing appropriate names, labels, and packaging in marketing applications.


2010 ◽  
Vol 22 (7) ◽  
pp. 1583-1596 ◽  
Author(s):  
Jean Vroomen ◽  
Jeroen J. Stekelenburg

The neural activity of speech sound processing (the N1 component of the auditory ERP) can be suppressed if a speech sound is accompanied by concordant lip movements. Here we demonstrate that this audiovisual interaction is neither speech-specific nor linked to humanlike actions, but can be observed with artificial stimuli if their timing is made predictable. In Experiment 1, a pure tone synchronized with a deformation of a rectangle induced a smaller auditory N1 than auditory-only presentations if the temporal occurrence of this audiovisual event was made predictable by two moving disks that touched the rectangle. Local autoregressive average source estimation indicated that this audiovisual interaction may be related to integrative processing in auditory areas. When the moving disks did not precede the audiovisual stimulus, leaving its onset unpredictable, there was no N1 reduction. In Experiment 2, the predictability of the leading visual signal was manipulated by introducing a temporal asynchrony between the audiovisual event and the collision of the moving disks. Audiovisual events occurred at the moment the disks collided with the rectangle, before it (too “early”), or after it (too “late”). When asynchronies varied from trial to trial, rendering the moving disks unreliable temporal predictors of the audiovisual event, the N1 reduction was abolished. These results demonstrate that the N1 suppression is induced by visual information that both precedes and reliably predicts audiovisual onset, without a necessary link to human action-related neural mechanisms.
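
As background, the N1 effect described above is read out from trial-averaged ERPs: epochs are averaged per condition and the mean voltage around 100 ms after sound onset is compared between audiovisual and auditory-only trials. Below is a minimal sketch with synthetic data; the sampling rate, latency window, trial counts and amplitudes are invented for illustration and are not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 500                              # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.4, 1.0 / fs)    # epoch: -100 ms to +400 ms around sound onset

def synth_trials(n_trials, n1_amp):
    """Synthetic epochs with an N1-like negative deflection peaking near 100 ms."""
    n1 = n1_amp * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))
    return n1 + rng.normal(0.0, 2.0, (n_trials, t.size))

auditory_only = synth_trials(100, n1_amp=-5.0)   # larger (more negative) N1
audiovisual   = synth_trials(100, n1_amp=-3.5)   # suppressed N1

# ERP = average across trials; N1 amplitude = mean voltage in an 80-120 ms window.
win = (t >= 0.08) & (t <= 0.12)
n1_a  = auditory_only.mean(axis=0)[win].mean()
n1_av = audiovisual.mean(axis=0)[win].mean()

print(f"N1 auditory-only: {n1_a:.2f} µV, audiovisual: {n1_av:.2f} µV, "
      f"reduction in negativity (AV minus A-only): {n1_av - n1_a:.2f} µV")
```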


2016 ◽  
Vol 28 (1) ◽  
pp. 111-124 ◽  
Author(s):  
Sabrina Walter ◽  
Christian Keitel ◽  
Matthias M. Müller

Visual attention can be focused concurrently on two stimuli at noncontiguous locations while intermediate stimuli remain ignored. Nevertheless, behavioral performance in multifocal attention tasks falters when attended stimuli fall within one visual hemifield as opposed to when they are distributed across left and right hemifields. This “different-hemifield advantage” has been ascribed to largely independent processing capacities of each cerebral hemisphere in early visual cortices. Here, we investigated how this advantage influences the sustained division of spatial attention. We presented six isoeccentric light-emitting diodes (LEDs) in the lower visual field, each flickering at a different frequency. Participants attended to two LEDs that were spatially separated by an intermediate LED and responded to synchronous events at to-be-attended LEDs. Task-relevant pairs of LEDs were either located in the same hemifield (“within-hemifield” conditions) or separated by the vertical meridian (“across-hemifield” conditions). Flicker-driven brain oscillations, steady-state visual evoked potentials (SSVEPs), indexed the allocation of attention to individual LEDs. Both behavioral performance and SSVEPs indicated enhanced processing of attended LED pairs during “across-hemifield” relative to “within-hemifield” conditions. Moreover, SSVEPs demonstrated effective filtering of intermediate stimuli in the “across-hemifield” conditions only. Thus, despite identical physical distances between LEDs of attended pairs, the spatial profiles of gain effects differed profoundly between “across-hemifield” and “within-hemifield” conditions. These findings corroborate that early cortical visual processing stages rely on hemisphere-specific processing capacities and highlight their limiting role in the concurrent allocation of visual attention to multiple locations.
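
The frequency-tagging logic behind SSVEPs is that each LED flickers at its own rate, so the EEG amplitude at that rate indexes processing of that particular LED. The following sketch illustrates the readout on synthetic data; the flicker frequencies, sampling rate, trial length and attention gains are illustrative assumptions, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 256                                    # EEG sampling rate in Hz (assumed)
t = np.arange(0.0, 3.0, 1.0 / fs)           # one 3-s trial (assumed length)
tags = [8.0, 10.0, 12.0, 14.0, 16.0, 18.0]  # one flicker frequency per LED (illustrative)
attended = {10.0, 14.0}                     # the two to-be-attended LEDs in this toy trial

# Synthetic EEG: every tagged frequency is present, but attended LEDs
# drive larger (gain-modulated) responses; add broadband noise.
eeg = sum((2.0 if f in attended else 1.0) * np.sin(2 * np.pi * f * t) for f in tags)
eeg = eeg + rng.normal(0.0, 1.0, t.size)

# SSVEP readout: FFT magnitude at each tagging frequency.
spectrum = np.abs(np.fft.rfft(eeg)) / (t.size / 2)   # scale to sine amplitude
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
for f in tags:
    amp = spectrum[np.argmin(np.abs(freqs - f))]
    marker = "  <- attended" if f in attended else ""
    print(f"{f:4.1f} Hz: amplitude {amp:.2f}{marker}")
```

With a 3-s trial the spectral resolution is 1/3 Hz, so each tagging frequency falls on an exact FFT bin; the attended frequencies show roughly double the amplitude of the unattended ones, which is the kind of gain effect the study quantifies.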


2010 ◽  
Vol 31 (10) ◽  
pp. 1772-1782 ◽  
Author(s):  
Tommi Raij ◽  
Jyrki Ahveninen ◽  
Fa-Hsuan Lin ◽  
Thomas Witzel ◽  
Iiro P. Jääskeläinen ◽  
...  

2021 ◽  
Author(s):  
Shachar Sherman ◽  
Koichi Kawakami ◽  
Herwig Baier

The brain is assembled during development by both innate and experience-dependent mechanisms [1-7], but the relative contribution of these factors is poorly understood. Axons of retinal ganglion cells (RGCs) connect the eye to the brain, forming a bottleneck for the transmission of visual information to central visual areas. RGCs secrete molecules from their axons that control proliferation, differentiation and migration of downstream components [7-9]. Spontaneously generated waves of retinal activity, but also intense visual stimulation, can entrain responses of RGCs [10] and central neurons [11-16]. Here we asked how the cellular composition of central targets is altered in a vertebrate brain that is depleted of retinal input throughout development. For this, we first established a molecular catalog [17] and gene expression atlas [18] of neuronal subpopulations in the retinorecipient areas of larval zebrafish. We then searched for changes in lakritz (atoh7-) mutants, in which RGCs do not form [19]. Although individual forebrain-expressed genes are dysregulated in lakritz mutants, the complete set of 77 putative neuronal cell types in thalamus, pretectum and tectum is present. While neurogenesis and differentiation trajectories are overall unaltered, a greater proportion of cells remain in an uncommitted progenitor stage in the mutant. Optogenetic stimulation of a pretectal area [20,21] evokes a visual behavior in blind mutants that is indistinguishable from wild type. Our analysis shows that, in this vertebrate visual system, neurons are produced more slowly but are specified and wired up in a proper configuration in the absence of any retinal signals.


2007 ◽  
Vol 1132 ◽  
pp. 158-165 ◽  
Author(s):  
D. Tomasi ◽  
L. Chang ◽  
E.C. Caparelli ◽  
T. Ernst

2011 ◽  
Vol 106 (4) ◽  
pp. 1862-1874 ◽  
Author(s):  
Jan Churan ◽  
Daniel Guitton ◽  
Christopher C. Pack

Our perception of the positions of objects in our surroundings is surprisingly unaffected by movements of the eyes, head, and body. This suggests that the brain has a mechanism for maintaining perceptual stability, based either on the spatial relationships among visible objects or internal copies of its own motor commands. Strong evidence for the latter mechanism comes from the remapping of visual receptive fields that occurs around the time of a saccade. Remapping occurs when a single neuron responds to visual stimuli placed presaccadically in the spatial location that will be occupied by its receptive field after the completion of a saccade. Although evidence for remapping has been found in many brain areas, relatively little is known about how it interacts with sensory context. This interaction is important for understanding perceptual stability more generally, as the brain may rely on extraretinal signals or visual signals to different degrees in different contexts. Here, we have studied the interaction between visual stimulation and remapping by recording from single neurons in the superior colliculus of the macaque monkey, using several different visual stimulus conditions. We find that remapping responses are highly sensitive to low-level visual signals, with the overall luminance of the visual background exerting a particularly powerful influence. Specifically, although remapping was fairly common in complete darkness, such responses were usually decreased or abolished in the presence of modest background illumination. Thus the brain might make use of a strategy that emphasizes visual landmarks over extraretinal signals whenever the former are available.

