Color and orientation are jointly coded and spatially organized in primate primary visual cortex

Science ◽  
2019 ◽  
Vol 364 (6447) ◽  
pp. 1275-1279 ◽  
Author(s):  
Anupam K. Garg ◽  
Peichao Li ◽  
Mohammad S. Rashid ◽  
Edward M. Callaway

Previous studies support the textbook model that shape and color are extracted by distinct neurons in primate primary visual cortex (V1). However, rigorous testing of this model requires sampling a larger stimulus space than previously possible. We used stable GCaMP6f expression and two-photon calcium imaging to probe a very large spatial and chromatic visual stimulus space and map functional microarchitecture of thousands of neurons with single-cell resolution. Notable proportions of V1 neurons strongly preferred equiluminant color over achromatic stimuli and were also orientation selective, indicating that orientation and color in V1 are mutually processed by overlapping circuits. Single neurons could precisely and unambiguously code for both color and orientation. Further analyses revealed systematic spatial relationships between color tuning, orientation selectivity, and cytochrome oxidase histology.
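
To make the tuning measures above concrete, here is a minimal sketch of how an orientation selectivity index and a color-versus-luminance preference index can be computed from trial-averaged calcium responses. The data, function names, and index definitions are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def orientation_selectivity_index(responses, orientations_deg):
    """Global OSI (1 - circular variance) from trial-averaged responses.

    responses: non-negative mean dF/F per orientation.
    orientations_deg: stimulus orientations in degrees (spanning 0-180).
    """
    theta = np.deg2rad(orientations_deg)
    # Orientation is periodic over 180 deg, so double the angle.
    vector = np.sum(responses * np.exp(2j * theta))
    return np.abs(vector) / np.sum(responses)

def color_preference_index(resp_color, resp_achromatic):
    """Contrast between peak responses to equiluminant color vs. achromatic
    stimuli: +1 = responds only to color, -1 = only to achromatic."""
    return (resp_color - resp_achromatic) / (resp_color + resp_achromatic)

# Hypothetical neuron: orientation tuned (peak at 90 deg) and color preferring.
oris = np.arange(0, 180, 22.5)
resp = 0.2 + np.exp(-((oris - 90.0) ** 2) / (2 * 20.0 ** 2))
print("OSI:", round(orientation_selectivity_index(resp, oris), 2))
print("CPI:", round(color_preference_index(resp_color=0.8, resp_achromatic=0.3), 2))
```

A neuron with both a high OSI and a strongly positive color preference index would be an example of the joint coding of orientation and color described in the abstract.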

2019 ◽  
Vol 121 (6) ◽  
pp. 2202-2214 ◽  
Author(s):  
John P. McClure ◽  
Pierre-Olivier Polack

Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. Whereas multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds were shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet, the net effect of sound modulation at the V1 population level remained unclear. In the present study, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and the direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) and multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue's orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or the direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3: visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than visual stimuli presented alone.

NEW & NOTEWORTHY: The primary visual cortex (V1) receives direct inputs from the primary auditory cortex. Yet, the impact of sounds on visual processing in V1 remains controversial. We show that the modulation of V1 visual responses by pure tones depends on the orientation selectivity, direction selectivity, and response amplitudes of V1 neurons. Hence, audiovisual stimuli recruit a population of V1 neurons better tuned to the orientation and direction of the visual stimulus than unimodal visual stimuli do.
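
A brief sketch of the kind of indices such an analysis rests on: a direction selectivity index and a signed sound-modulation index comparing audiovisual with visual-only responses. The example cells and values are hypothetical, chosen only to illustrate the inverse-effectiveness pattern described in the abstract.

```python
def direction_selectivity_index(resp_pref, resp_null):
    """DSI from mean responses to the preferred vs. opposite drift direction."""
    return (resp_pref - resp_null) / (resp_pref + resp_null)

def sound_modulation_index(resp_audiovisual, resp_visual_only):
    """Signed modulation of the visual response by a concurrent tone:
    positive = potentiation, negative = suppression."""
    return (resp_audiovisual - resp_visual_only) / (resp_audiovisual + resp_visual_only)

# Hypothetical cells illustrating the pattern described in the abstract:
# a well-tuned but weakly active cell is potentiated (inverse effectiveness),
# while an untuned cell is suppressed.
tuned_weak_cell = {"dsi": direction_selectivity_index(0.12, 0.02),
                   "smi": sound_modulation_index(0.25, 0.10)}
untuned_cell = {"dsi": direction_selectivity_index(0.42, 0.40),
                "smi": sound_modulation_index(0.30, 0.50)}
print("tuned, weakly active:", tuned_weak_cell)
print("untuned:             ", untuned_cell)
```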


2018 ◽  
Author(s):  
Thomas Deneux ◽  
Alexandre Kempf ◽  
Brice Bathellier

Detecting rapid coincident changes across sensory modalities is essential to recognize sudden threats and events. Using two-photon calcium imaging in identified cell types in awake mice, we show that auditory cortex (AC) neurons projecting to primary visual cortex (V1) preferentially encode the abrupt onsets of sounds. In V1, a sub-population of layer 1 interneurons gates this selective cross-modal information through a suppression that is specific to the absence of visual inputs. However, when large auditory onsets coincide with visual stimuli, visual responses in V1 are strongly boosted. Thus, a dynamic, asymmetric circuit spanning AC and V1 specifically identifies visual events that start simultaneously with sudden sounds, potentially catalyzing the localization of new sound sources in the visual field.
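
The asymmetric interaction described here can be caricatured with a toy rate model: sound-onset input is suppressed when it arrives without a visual stimulus but boosts the response when the two coincide. This is a deliberately simplified illustration with made-up parameters, not the circuit model used in the study.

```python
import numpy as np

def v1_response_toy(visual_drive, sound_onset_drive, gate_strength=1.0, boost=0.8):
    """Toy rate model of the asymmetric audiovisual interaction.

    When a sound onset arrives without visual input, the cross-modal signal is
    gated (suppressed); when it coincides with a visual stimulus, the visual
    response is boosted. All parameters are illustrative assumptions.
    """
    coincident = (visual_drive > 0) & (sound_onset_drive > 0)
    sound_alone = (visual_drive == 0) & (sound_onset_drive > 0)
    response = (visual_drive
                + boost * sound_onset_drive * coincident
                - gate_strength * sound_onset_drive * sound_alone)
    return np.clip(response, 0, None)

# Three conditions: visual alone, sound alone (gated to zero), coincident (boosted).
print(v1_response_toy(np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])))
```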


2018 ◽  
Author(s):  
Jordan P. Hamm ◽  
Yuriy Shymkiv ◽  
Shuting Han ◽  
Weijian Yang ◽  
Rafael Yuste

Cortical processing of sensory events is significantly influenced by context. For instance, a repetitive or redundant visual stimulus elicits attenuated cortical responses, but if the same stimulus is unexpected or "deviant", responses are augmented. This contextual modulation of sensory processing is likely a fundamental function of neural circuits, yet an understanding of how it is computed is still missing. Using holographic two-photon calcium imaging in awake animals, here we identify three distinct, spatially intermixed ensembles of neurons in mouse primary visual cortex that differentially respond to the same stimulus under separate contexts, including a subnetwork that selectively responds to deviant events. These non-overlapping ensembles are distributed across layers 2-5, though deviance detection is more common in superficial layers. Contextual preferences likely arise locally, since they are not present in bottom-up inputs from the thalamus or top-down inputs from prefrontal cortex. The functional parcellation of cortical circuits into independent ensembles that encode stimulus context provides a circuit basis for the cortical perception of novel versus redundant stimuli, a process that is disrupted in many psychiatric disorders.

One Sentence Summary: Visual cortex represents deviant and redundant stimuli with separate subnetworks.
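
A deviance-detection index of the kind implied here contrasts responses to the same stimulus in deviant versus redundant contexts. The sketch below uses hypothetical trial-averaged responses; the index definition is a common convention, not necessarily the one used by the authors.

```python
def deviance_detection_index(resp_deviant, resp_redundant):
    """Contrast of responses to the same stimulus presented as a rare
    'deviant' vs. as a frequently repeated 'redundant' stimulus:
    +1 = responds only when deviant, -1 = only when redundant."""
    return (resp_deviant - resp_redundant) / (resp_deviant + resp_redundant)

# Hypothetical trial-averaged dF/F values for three example cells.
cells = {"adapting cell": (0.20, 0.60),
         "context-insensitive cell": (0.40, 0.42),
         "deviance detector": (0.70, 0.15)}
for name, (dev, red) in cells.items():
    print(f"{name:>24s}: DDI = {deviance_detection_index(dev, red):+.2f}")
```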


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Soumya Chatterjee ◽  
Kenichi Ohki ◽  
R. Clay Reid

The clustering of neurons with similar response properties is a conspicuous feature of neocortex. In primary visual cortex (V1), maps of several properties, such as orientation preference, are well described, but the functional architecture of color, central to visual perception in trichromatic primates, is not. Here we used two-photon calcium imaging in macaques to examine the fine structure of chromatic representation and found that neurons responsive to spatially uniform chromatic stimuli form unambiguous clusters that coincide with blobs. Further, these responsive groups have marked substructure, segregating into smaller ensembles or micromaps with distinct chromatic signatures that appear columnar in upper layer 2/3. Spatially structured chromatic stimuli revealed maps built on the same micromap framework but with larger subdomains that extend well beyond blobs. We conclude that V1 has an architecture for color representation that switches between blobs and a combined blob/interblob system based on the spatial content of the visual scene.
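
One simple way to summarize a neuron's chromatic signature is a circular-mean preferred hue computed from its responses to a set of equiluminant hues; neurons with similar preferred hues would be candidate members of the same micromap. The sketch below is an illustrative simplification with hypothetical data, not the authors' clustering pipeline.

```python
import numpy as np

def preferred_hue(responses, hue_angles_deg):
    """Circular-mean preferred hue from responses to equiluminant hues.

    responses: non-negative mean dF/F per hue.
    hue_angles_deg: hue angles in a color space such as DKL or HSV (0-360).
    Returns the preferred hue angle in degrees.
    """
    theta = np.deg2rad(hue_angles_deg)
    vector = np.sum(responses * np.exp(1j * theta))
    return np.rad2deg(np.angle(vector)) % 360

# Hypothetical neurons: two with similar signatures (candidate members of the
# same micromap) and one with a distinct hue preference.
hues = np.arange(0, 360, 45)
cells = {
    "cell A": np.exp(np.cos(np.deg2rad(hues - 90))),
    "cell B": np.exp(np.cos(np.deg2rad(hues - 110))),
    "cell C": np.exp(np.cos(np.deg2rad(hues - 270))),
}
for name, r in cells.items():
    print(f"{name}: preferred hue ~ {preferred_hue(r, hues):.0f} deg")
```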


2009 ◽  
Vol 65 ◽  
pp. S172 ◽  
Author(s):  
Yoshiya Mori ◽  
Koji Ikezoe ◽  
Junichi Furutaka ◽  
Kazuo Kitamura ◽  
Hiroshi Tamura ◽  
...  

2020 ◽  
Vol 12 (1) ◽  
Author(s):  
Leah B. Townsend ◽  
Kelly A. Jones ◽  
Christopher R. Dorsett ◽  
Benjamin D. Philpot ◽  
Spencer L. Smith

Background: Sensory processing deficits are common in individuals with neurodevelopmental disorders. One hypothesis is that deficits may be more detectable in downstream, "higher" sensory areas. A mouse model of Angelman syndrome (AS), which lacks expression of the maternally inherited Ube3a allele, has deficits in synaptic function and experience-dependent plasticity in the primary visual cortex. Thus, we hypothesized that AS model mice have deficits in visually driven neuronal responsiveness in downstream higher visual areas (HVAs).
Methods: Here, we used intrinsic signal optical imaging and two-photon calcium imaging to map visually evoked neuronal activity in the primary visual cortex and HVAs in response to an array of stimuli.
Results: We found a highly specific deficit in HVAs. Drifting gratings that changed speed caused a strong response in HVAs in wildtype mice, but this was not observed in littermate AS model mice. Further investigation with two-photon calcium imaging revealed the effect to be largely driven by aberrant responses of inhibitory interneurons, suggesting a cellular basis for higher level, stimulus-selective cortical dysfunction in AS.
Conclusion: Assaying downstream, or "higher" circuitry may provide a more sensitive measure for circuit dysfunction in mouse models of neurodevelopmental disorders.
Trial registration: Not applicable.


2018 ◽  
Author(s):  
Stef Garasto ◽  
Anil A. Bharath ◽  
Simon R. Schultz

Deciphering the neural code, that is, interpreting the responses of sensory neurons from the perspective of a downstream population, is an important step towards understanding how the brain processes sensory stimulation. While previous work has focused on classification algorithms to identify the most likely stimulus label in a predefined set of categories, fewer studies have approached a full stimulus reconstruction task. Outstanding questions revolve around the type of algorithm that is most suited to decoding (i.e., full reconstruction, in the context of this study), especially in the presence of strong encoding non-linearities, and the possible role of pairwise correlations. Here we present the first pixel-by-pixel reconstruction of a complex natural stimulus from two-photon calcium imaging responses of mouse primary visual cortex (V1). We decoded the activity of approximately 100 layer 2/3 neurons using an optimal linear estimator and an artificial neural network, and investigated how much accuracy is lost when pairwise neural correlations are ignored. We found that a simple linear estimator is sufficient to extract relevant stimulus features from the neural responses and that it was not significantly outperformed by a non-linear decoding algorithm. The contribution of pairwise correlations to reconstruction accuracy was also limited. These results suggest that, within the spatial and temporal limits of the recording technique, V1 neurons display linear readout properties, with low information content in the joint distribution of their activity.
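
An optimal linear estimator of the sort described here can be written as ridge regression from a response matrix to flattened stimulus pixels. The sketch below uses synthetic data with made-up receptive fields and noise levels; the dimensions and regularization are illustrative assumptions, not the study's settings.

```python
import numpy as np

def fit_linear_decoder(R, S, ridge=1.0):
    """Fit weights W mapping neural responses to stimulus pixels by ridge regression.

    R: (n_trials, n_neurons) trial responses (e.g., mean dF/F per frame).
    S: (n_trials, n_pixels)  flattened stimulus frames.
    Returns W: (n_neurons + 1, n_pixels), including a bias row.
    """
    X = np.hstack([R, np.ones((R.shape[0], 1))])        # add bias term
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ S)

def decode(R, W):
    """Reconstruct stimulus frames pixel by pixel from held-out responses."""
    X = np.hstack([R, np.ones((R.shape[0], 1))])
    return X @ W

# Synthetic example: 100 neurons with linear receptive fields over 16x16 pixels.
rng = np.random.default_rng(0)
n_trials, n_neurons, n_pix = 500, 100, 16 * 16
S = rng.standard_normal((n_trials, n_pix))                      # stand-in stimuli
RF = rng.standard_normal((n_pix, n_neurons)) / np.sqrt(n_pix)   # made-up receptive fields
R = S @ RF + 0.5 * rng.standard_normal((n_trials, n_neurons))   # noisy responses

W = fit_linear_decoder(R[:400], S[:400])
S_hat = decode(R[400:], W)
corr = np.corrcoef(S_hat.ravel(), S[400:].ravel())[0, 1]
print(f"held-out pixelwise correlation: {corr:.2f}")
```

The question of pairwise correlations could be probed in this framework by fitting the same decoder to trial-shuffled responses and comparing the held-out reconstruction accuracy.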


2020 ◽  
Author(s):  
Rune N. Rasmussen ◽  
Akihiro Matsumoto ◽  
Simon Arvin ◽  
Keisuke Yonehara

Locomotion creates various patterns of optic flow on the retina, which provide the observer with information about their movement relative to the environment. However, it is unclear how these optic flow patterns are encoded by the cortex. Here we use two-photon calcium imaging in awake mice to systematically map monocular and binocular responses to horizontal motion in four areas of the visual cortex. We find that neurons selective for translational or rotational optic flow are abundant in higher visual areas, whereas neurons suppressed by binocular motion are more common in the primary visual cortex. Disruption of retinal direction selectivity in Frmd7 mutant mice reduces the number of translation-selective neurons in the primary visual cortex, and of translation-selective, rotation-selective, and binocular direction-selective neurons in the rostrolateral and anterior visual cortex, blurring the functional distinction between primary and higher visual areas. Thus, optic flow representations in specific areas of the visual cortex rely on binocular integration of motion information from the retina.
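
Classifying neurons as translation- or rotation-selective reduces to comparing responses across binocular combinations of horizontal motion. The sketch below assumes a simplified mapping of eye-direction combinations onto translation-like and rotation-like flow and a made-up selectivity threshold; it is an illustration, not the study's classification procedure.

```python
# Binocular combinations of horizontal motion, indexed as
# (left-eye direction, right-eye direction) relative to the nose.
# Which combinations correspond to translation vs. rotation here is a
# simplification used only for illustration.
TRANSLATION_LIKE = [("nasal", "nasal"), ("temporal", "temporal")]
ROTATION_LIKE = [("nasal", "temporal"), ("temporal", "nasal")]

def classify_optic_flow_preference(responses, threshold=0.33):
    """Label a neuron by its preferred binocular motion combination.

    responses: dict mapping each (left, right) combination to a mean dF/F value.
    Returns ('translation-selective' | 'rotation-selective' | 'unclassified', contrast),
    where contrast is a normalized difference between the two categories and
    `threshold` is an arbitrary illustrative criterion.
    """
    trans = sum(responses[c] for c in TRANSLATION_LIKE)
    rot = sum(responses[c] for c in ROTATION_LIKE)
    contrast = (trans - rot) / (trans + rot)
    if contrast > threshold:
        return "translation-selective", contrast
    if contrast < -threshold:
        return "rotation-selective", contrast
    return "unclassified", contrast

# Hypothetical neuron responding mostly to mirror-symmetric (translation-like) motion.
example = {("nasal", "nasal"): 0.75, ("temporal", "temporal"): 0.60,
           ("nasal", "temporal"): 0.15, ("temporal", "nasal"): 0.10}
print(classify_optic_flow_preference(example))
```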

