A cortical circuit for audio-visual predictions

2020
Author(s):  
Aleena R. Garner ◽  
Georg B. Keller

Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex is thought to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Mishra et al., 2007; Zangenehpour and Zatorre, 2010). However, it is not well understood how processing in visual cortex is altered by an auditory stimulus that predicts a visual stimulus, or what mechanisms mediate such experience-dependent, audio-visual associations in sensory cortex. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to an experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be stored in long-range cortical connections and that, with learning, these cross-modal connections function to suppress responses to predictable input.
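The key quantity in this study is a learning-dependent drop in V1 responsiveness to the paired visual stimulus. As a rough illustration of how such suppression can be quantified from trial-averaged responses, here is a minimal sketch; the index definition, variable names, and simulated values are assumptions for illustration, not the authors' published analysis pipeline.

```python
import numpy as np

def suppression_index(pre, post):
    """Contrast index of response change after learning.

    pre, post : arrays of trial-averaged responses (e.g. dF/F) of one
    neuron to the paired visual stimulus, before and after audio-visual
    pairing. Returns a value in [-1, 1]; negative means suppression.
    """
    r_pre, r_post = np.mean(pre), np.mean(post)
    return (r_post - r_pre) / (r_post + r_pre + 1e-12)

# Hypothetical example: responses shrink after pairing.
rng = np.random.default_rng(0)
pre = rng.normal(1.0, 0.2, size=40)    # dF/F on 40 pre-learning trials
post = rng.normal(0.6, 0.2, size=40)   # dF/F on 40 post-learning trials
print(suppression_index(pre, post))    # negative, i.e. suppressed
```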



2019
Vol 121 (6)
pp. 2202-2214
Author(s):  
John P. McClure ◽  
Pierre-Olivier Polack

Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. Whereas multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds have been shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet the net effect of sound modulation at the V1 population level remained unclear. In the present study, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) and multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue’s orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3: visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than visual stimuli presented alone.

NEW & NOTEWORTHY: The primary visual cortex (V1) receives direct inputs from the primary auditory cortex. Yet the impact of sounds on visual processing in V1 remains controversial. We show that the modulation of V1 visual responses by pure tones depends on the orientation selectivity, direction selectivity, and response amplitudes of V1 neurons. Hence, audiovisual stimuli recruit a population of V1 neurons better tuned to the orientation and direction of the visual stimulus than unimodal visual stimuli do.
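The analysis above hinges on splitting neurons by their orientation and direction selectivity. A common way to compute such indices from a drifting-grating tuning curve is the vector-sum (one minus circular variance) measure sketched below; this is a generic formulation offered for illustration, not necessarily the exact metric used in the study.

```python
import numpy as np

def osi_dsi(rates, directions_deg):
    """Global orientation and direction selectivity indices.

    rates : mean responses to drifting gratings, one per motion direction.
    directions_deg : the directions in degrees (e.g. 0, 30, ..., 330).
    Both indices are vector-sum measures in [0, 1]; the orientation index
    doubles the angles so that opposite directions count together.
    """
    theta = np.deg2rad(directions_deg)
    r = np.asarray(rates, dtype=float)
    dsi = np.abs(np.sum(r * np.exp(1j * theta))) / np.sum(r)
    osi = np.abs(np.sum(r * np.exp(2j * theta))) / np.sum(r)
    return osi, dsi

# Hypothetical tuning curve peaked at 90 degrees:
dirs = np.arange(0, 360, 30)
rates = 1.0 + 4.0 * np.exp(-0.5 * ((dirs - 90) / 30.0) ** 2)
print(osi_dsi(rates, dirs))
```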


2018
Author(s):  
Thomas Deneux ◽  
Alexandre Kempf ◽  
Brice Bathellier

Detecting rapid coincident changes across sensory modalities is essential for recognizing sudden threats and events. Using two-photon calcium imaging of identified cell types in awake mice, we show that auditory cortex (AC) neurons projecting to primary visual cortex (V1) preferentially encode the abrupt onsets of sounds. In V1, a sub-population of layer 1 interneurons gates this cross-modal information, suppressing it specifically when visual input is absent. However, when loud auditory onsets coincide with visual stimuli, visual responses are strongly boosted in V1. Thus, a dynamic asymmetric circuit across AC and V1 specifically identifies visual events starting simultaneously with sudden sounds, potentially catalyzing the localization of new sound sources in the visual field.


2019
Author(s):  
Stefania Ferraro ◽  
Markus J. Van Ackeren ◽  
Roberto Mai ◽  
Laura Tassi ◽  
Francesco Cardinale ◽  
...  

Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the signal extracted from early visual (calcarine and pericalcarine) and auditory (Heschl’s gyrus and planum temporale) regions during a simple audio-visual oddball task. We provide evidence that both cross-modal responses (visual responses in auditory cortex, or the reverse) and multisensory processing (alteration of the unimodal responses during bimodal stimulation) can be observed in intracranial event-related potentials (iERPs) and in power modulations of oscillatory activity at different temporal scales within the first 150 ms after stimulus onset. The temporal profiles of the iERPs are compatible with the hypothesis that multisensory integration (MSI) occurs via direct pathways linking early visual and auditory regions. Our data indicate, moreover, that MSI mainly relies on modulations of the low-frequency bands (foremost the theta band in the auditory cortex and the alpha band in the visual cortex), suggesting the involvement of feedback pathways between the two sensory regions. Remarkably, we also observed high-gamma power modulations by sounds in the early visual cortex, suggesting the presence of neuronal populations involved in auditory processing in the calcarine and pericalcarine region in humans.
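Band-limited power modulations like those reported here (theta, alpha, high-gamma) are typically extracted by band-pass filtering and taking the analytic-signal envelope. The sketch below shows one standard way to do this with SciPy on a synthetic trace; the sampling rate, band edges, and filter order are illustrative assumptions, not the study's parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power_envelope(x, fs, lo, hi, order=4):
    """Power envelope of x in the [lo, hi] Hz band via the Hilbert transform."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, x))) ** 2

fs = 1000.0                                # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)  # toy SEEG trace

bands = {"theta": (4, 8), "alpha": (8, 12), "high-gamma": (70, 150)}
power = {name: band_power_envelope(x, fs, lo, hi).mean()
         for name, (lo, hi) in bands.items()}
print(power)   # theta should dominate for this toy 6 Hz signal
```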


2004
Vol 16 (2)
pp. 204-218
Author(s):  
Antony B. Morland ◽  
Sandra Lê ◽  
Erin Carroll ◽  
Michael B. Hoffmann ◽  
Alidz Pambakian

Some patients who are rendered perimetrically blind in one hemifield by cortical lesions nevertheless exhibit residual visual capacities within their field defects. The neural mechanism that mediates these residual visual responses has remained the topic of considerable debate. One explanation posits the subcortical visual pathways that bypass the primary visual cortex and innervate the extrastriate visual areas as the substrate underlying the residual vision. The other explanation is that small islands of the primary visual cortex remain intact and provide the signals for residual vision. We performed behavioral and functional magnetic resonance imaging (fMRI) experiments to investigate the validity of these two explanations. Our behavioral experiments indicated that of the seven hemianopes tested, two had the ability to discriminate the direction of a drifting grating. This residual visual response was shown with fMRI to be the result of spared islands of calcarine cortical activity in one of the hemianopes, whereas only lateral occipital activity was documented in the other patient. These results indicate that the neural correlates underlying residual vision can vary between patients. Moreover, our study emphasizes the necessity of ruling out islands of preserved function in the primary visual cortex before assigning residual visual capacities to the properties of visual pathways that bypass it.


2017
Vol 114 (22)
pp. E4501-E4510
Author(s):  
Job van den Hurk ◽  
Marc Van Baelen ◽  
Hans P. Op de Beeck

To what extent does functional brain organization rely on sensory input? Here, we show that for the penultimate visual-processing region, ventral-temporal cortex (VTC), visual experience is not the origin of its fundamental organizational property, category selectivity. In the fMRI study reported here, we presented 14 congenitally blind participants with face-, body-, scene-, and object-related natural sounds and presented 20 healthy controls with both auditory and visual stimuli from these categories. Using macroanatomical alignment, response mapping, and surface-based multivoxel pattern analysis, we demonstrated that VTC in blind individuals shows robust discriminatory responses elicited by the four categories and that these patterns of activity in blind subjects could successfully predict the visual categories in sighted controls. These findings were confirmed in a subset of blind participants born without eyes and thus deprived of all light perception since conception. The sounds could also be decoded in primary visual and primary auditory cortex, but these regions did not sustain generalization across modalities. Surprisingly, although not as strong as visual responses, selectivity for auditory stimulation in visual cortex was stronger in blind individuals than in controls; the opposite was observed in primary auditory cortex. Overall, we demonstrated a striking similarity in the cortical response layout of VTC in blind individuals and sighted controls, showing that the overall category-selective map in extrastriate cortex develops independently of visual experience.
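The cross-modal decoding logic, training a classifier on response patterns evoked in one modality or group and testing it on patterns evoked in the other, can be illustrated with a toy multivoxel pattern analysis. Everything below (synthetic voxel patterns, shared category templates, classifier choice) is an assumption made for illustration, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels, n_categories = 80, 200, 4   # faces/bodies/scenes/objects

# Toy category "templates" shared across modalities, plus noise: this
# shared structure is exactly what the cross-decoding test probes.
templates = rng.normal(0, 1, (n_categories, n_voxels))
y = rng.integers(0, n_categories, n_trials)
X_auditory_blind = templates[y] + rng.normal(0, 2, (n_trials, n_voxels))
X_visual_sighted = templates[y] + rng.normal(0, 2, (n_trials, n_voxels))

clf = LinearSVC(dual=False).fit(X_auditory_blind, y)   # train on one modality
print("cross-modal accuracy:", clf.score(X_visual_sighted, y))  # test on other
```

Above-chance accuracy (chance is 0.25 here) would indicate a category code that generalizes across modality, the signature reported for VTC but not for the primary sensory cortices.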


2019
Author(s):  
Dechen Liu ◽  
Juan Deng ◽  
Zhewei Zhang ◽  
Zhi-Yu Zhang ◽  
Yan-Gang Sun ◽  
...  

The orbitofrontal cortex (OFC) encodes expected outcomes and plays a critical role in flexible, outcome-guided behavior. The OFC projects to primary visual cortex (V1), yet the function of this top-down projection is unclear. We find that optogenetic activation of the OFC projection to V1 reduces the amplitude of V1 visual responses via the recruitment of local somatostatin-expressing (SST) interneurons. Using mice performing a Go/No-Go visual task, we show that the OFC projection to V1 mediates the outcome-expectancy modulation of V1 responses to the reward-irrelevant No-Go stimulus. Furthermore, V1-projecting OFC neurons reduce their firing during expectation of reward. In addition, chronic optogenetic inactivation of the OFC projection to V1 impairs, whereas chronic activation of SST interneurons in V1 improves, learning of the Go/No-Go visual task, without affecting immediate performance. Thus, the OFC top-down projection to V1 is crucial for driving visual associative learning by modulating the response gain of V1 neurons to the non-relevant stimulus.
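One compact way to picture the reported effect, OFC input recruiting SST interneurons that reduce V1 response amplitude, is a divisive gain model. The functional form and parameter values below are toy assumptions, not a model fit from the paper.

```python
import numpy as np

def v1_response(visual_drive, ofc_input, w_sst=1.5):
    """Toy divisive gain model: OFC input recruits SST interneurons,
    which divisively scale down the visually evoked response."""
    sst = np.maximum(0.0, ofc_input)          # rectified SST activation
    return visual_drive / (1.0 + w_sst * sst)

drive = np.array([0.5, 1.0, 2.0])             # arbitrary visual drive levels
print(v1_response(drive, ofc_input=0.0))      # OFC silent: full response
print(v1_response(drive, ofc_input=1.0))      # OFC active: reduced gain
```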


2017
Author(s):  
Aman B. Saleem ◽  
E. Mika Diamanti ◽  
Julien Fournier ◽  
Kenneth D. Harris ◽  
Matteo Carandini

A major role of vision is to guide navigation, and navigation is strongly driven by vision [1-4]. Indeed, the brain’s visual and navigational systems are known to interact [5, 6], and signals related to position in the environment have been suggested to appear as early as in visual cortex [6, 7]. To establish the nature of these signals we recorded in primary visual cortex (V1) and in the CA1 region of the hippocampus while mice traversed a corridor in virtual reality. The corridor contained identical visual landmarks in two positions, so that a purely visual neuron would respond similarly in those positions. Most V1 neurons, however, responded solely or more strongly to the landmarks in one position. This modulation of visual responses by spatial location was not explained by factors such as running speed. To assess whether the modulation is related to navigational signals and to the animal’s subjective estimate of position, we trained the mice to lick for a water reward upon reaching a reward zone in the corridor. Neuronal populations in both CA1 and V1 encoded the animal’s position along the corridor, and the errors in their representations were correlated. Moreover, both representations reflected the animal’s subjective estimate of position, inferred from the animal’s licks, better than its actual position. Indeed, when animals licked in a given location – whether correct or incorrect – neural populations in both V1 and CA1 placed the animal in the reward zone. We conclude that visual responses in V1 are tightly controlled by navigational signals, which are coherent with those encoded in hippocampus, and reflect the animal’s subjective position in the environment. The presence of such navigational signals as early as in a primary sensory area suggests that these signals permeate sensory processing in the cortex.
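Population position decoding of the kind used to compare V1 and CA1 representations can be sketched with simulated place-like tuning and a simple linear decoder. The cell count, tuning widths, and decoder choice below are illustrative assumptions rather than the study's actual method.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_cells, n_bins, n_traversals = 50, 100, 30   # toy population and corridor

# Each cell gets a Gaussian place-like tuning curve along the corridor.
centers = rng.uniform(0, n_bins, n_cells)
pos = np.tile(np.arange(n_bins), n_traversals)             # true positions
tuning = np.exp(-0.5 * ((pos[:, None] - centers) / 8.0) ** 2)
activity = tuning + rng.normal(0, 0.3, tuning.shape)       # noisy responses

# Fit a linear decoder on held-out traversals and measure its error;
# comparing such errors across areas is the logic of the V1/CA1 analysis.
Xtr, Xte, ytr, yte = train_test_split(activity, pos, random_state=0)
dec = Ridge(alpha=1.0).fit(Xtr, ytr)
print("mean decoding error (bins):", np.abs(dec.predict(Xte) - yte).mean())
```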


2020
Author(s):  
Stewart Heitmann ◽  
G. Bard Ermentrout

The majority of neurons in primary visual cortex respond selectively to bars of light that have a specific orientation and move in a specific direction. The spatial and temporal responses of such neurons are non-separable. How neurons accomplish this computational feat without resorting to explicit time delays is unknown. We propose a novel neural mechanism whereby visual cortex computes non-separable responses by generating endogenous traveling waves of neural activity that resonate with the space-time signature of the visual stimulus. The spatiotemporal characteristics of the response are defined by the local topology of excitatory and inhibitory lateral connections in the cortex. We simulated the interaction between endogenous traveling waves and the visual stimulus using spatially distributed populations of excitatory and inhibitory neurons with Wilson-Cowan dynamics and inhibitory-surround coupling. Our model reliably detected visual gratings that moved with a given speed and direction, provided that we incorporated neural competition to suppress false motion signals in the opposite direction. The findings suggest that endogenous traveling waves in visual cortex can impart direction selectivity on neural responses without explicit time delays. They also suggest a functional role for motion opponency in eliminating false motion signals.

Author summary: It is well established that the so-called ‘simple cells’ of the primary visual cortex respond preferentially to oriented bars of light that move across the visual field with a particular speed and direction. The spatiotemporal responses of such neurons are said to be non-separable because they cannot be constructed from independent spatial and temporal neural mechanisms. Contemporary theories of how neurons compute non-separable responses typically rely on finely tuned transmission delays between signals from disparate regions of the visual field. However, the existence of such delays is controversial. We propose an alternative neural mechanism for computing non-separable responses that does not require transmission delays. It instead relies on the predisposition of the cortical tissue to spontaneously generate spatiotemporal waves of neural activity that travel with a particular speed and direction. We propose that this endogenous wave activity resonates with the visual stimulus to elicit direction-selective neural responses to visual motion. We demonstrate the principle in computer models and show that competition between opposing neurons robustly enhances their ability to discriminate between visual gratings that move in opposite directions.
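For concreteness, here is a minimal sketch of the model ingredients named above: a spatially distributed sheet of excitatory and inhibitory populations with Wilson-Cowan rate dynamics and a broader inhibitory surround. All coupling strengths and time constants are illustrative assumptions; producing bona fide traveling waves that resonate with a stimulus requires the parameter tuning described in the paper.

```python
import numpy as np

# 1-D Wilson-Cowan sheet on a ring with inhibitory-surround coupling.
n = 128                                    # spatial nodes
x = np.arange(n)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, n - d)                   # ring (periodic) distance
W_e = np.exp(-0.5 * (d / 3.0) ** 2)        # narrow excitatory footprint
W_i = np.exp(-0.5 * (d / 9.0) ** 2)        # broader inhibitory surround
W_e /= W_e.sum(axis=1, keepdims=True)
W_i /= W_i.sum(axis=1, keepdims=True)

f = lambda u: 1.0 / (1.0 + np.exp(-u))     # sigmoidal firing-rate function
rng = np.random.default_rng(3)
E = 0.1 + 0.01 * rng.standard_normal(n)    # excitatory rates (perturbed)
I = 0.1 * np.ones(n)                       # inhibitory rates
dt, tau_e, tau_i = 0.1, 1.0, 2.0           # Euler step and time constants

for step in range(5000):                   # forward-Euler integration
    drive_e = 16.0 * W_e @ E - 12.0 * W_i @ I - 2.0
    drive_i = 15.0 * W_e @ E - 3.0 * W_i @ I - 4.0
    E += dt / tau_e * (-E + f(drive_e))
    I += dt / tau_i * (-I + f(drive_i))

print("spatial profile of E (first 10 nodes):", E[:10].round(2))
```

Recording E over time instead of only at the end would reveal whether the chosen couplings put the sheet in a wave-generating regime; the paper's direction-selective responses arise when such waves match the stimulus' space-time signature.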


eLife
2019
Vol 8
Author(s):  
Thomas Deneux ◽  
Evan R Harrell ◽  
Alexandre Kempf ◽  
Sebastian Ceballo ◽  
Anton Filipchuk ◽  
...  

Detecting rapid, coincident changes across sensory modalities is essential for recognition of sudden threats or events. Using two-photon calcium imaging of identified cell types in awake, head-fixed mice, we show that, among the basic features of a sound envelope, loud sound onsets are a dominant feature coded by the auditory cortex (AC) neurons projecting to primary visual cortex (V1). In V1, a small number of layer 1 interneurons gates this cross-modal information flow in a context-dependent manner. In dark conditions, auditory cortex inputs lead to suppression of the V1 population. However, when sound input coincides with a visual stimulus, visual responses are boosted in V1, most strongly after loud sound onsets. Thus, a dynamic, asymmetric circuit connecting AC and V1 contributes to the encoding of visual events that are coincident with sounds.

