Inferotemporal Cortex
Recently Published Documents


TOTAL DOCUMENTS: 270 (FIVE YEARS: 15)

H-INDEX: 52 (FIVE YEARS: 3)

2021 ◽ Vol 12 (1) ◽ Author(s): Olivia Rose, James Johnson, Binxu Wang, Carlos R. Ponce

Abstract: Early theories of efficient coding suggested the visual system could compress the world by learning to represent features where information was concentrated, such as contours. This view was validated by the discovery that neurons in posterior visual cortex respond to edges and curvature. Still, it remains unclear what other information-rich features are encoded by neurons in more anterior cortical regions (e.g., inferotemporal cortex). Here, we use a generative deep neural network to synthesize images guided by neuronal responses from across the visuocortical hierarchy, using floating microelectrode arrays in areas V1, V4 and inferotemporal cortex of two macaque monkeys. We hypothesize these images (“prototypes”) represent such predicted information-rich features. Prototypes vary across areas, show moderate complexity, and resemble salient visual attributes and semantic content of natural images, as indicated by the animals’ gaze behavior. This suggests the code for object recognition represents compressed features of behavioral relevance, an underexplored aspect of efficient coding.
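The sketch below illustrates, in broad strokes, the closed-loop idea of synthesizing a "prototype" image by evolving the latent codes of a generative network so that the rendered image maximizes a neuron's response. The generator, the neuronal read-out, and all parameter values are hypothetical placeholders for illustration, not the authors' actual models or optimizer.

```python
# Minimal sketch of response-guided image synthesis ("prototype" evolution).
# generate_image() and neuronal_response() are toy stand-ins (assumptions),
# and the optimizer is a simple evolutionary loop, not the published method.
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 4096  # dimensionality of the generator's latent code (assumed)

def generate_image(code):
    """Placeholder for a pretrained generative network: latent code -> image."""
    return np.tanh(code).reshape(64, 64)  # toy 64x64 "image"

def neuronal_response(image):
    """Placeholder for the firing rate a neuron would give to `image`."""
    template = np.ones((64, 64)) / 64.0  # toy tuning template
    return float((image * template).sum())

def evolve_prototype(n_generations=100, pop_size=40, n_parents=10, sigma=0.5):
    """Evolve latent codes so the synthesized image maximizes the response."""
    population = rng.standard_normal((pop_size, LATENT_DIM))
    for _ in range(n_generations):
        fitness = np.array([neuronal_response(generate_image(c)) for c in population])
        parents = population[np.argsort(fitness)[-n_parents:]]  # keep the best codes
        # Recombine pairs of parents and add Gaussian mutation for the next generation.
        idx = rng.integers(0, n_parents, size=(pop_size, 2))
        population = 0.5 * (parents[idx[:, 0]] + parents[idx[:, 1]])
        population += sigma * rng.standard_normal(population.shape)
    scores = [neuronal_response(generate_image(c)) for c in population]
    return generate_image(population[int(np.argmax(scores))])  # the neuron's "prototype"

prototype = evolve_prototype()
```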


2020 ◽ Vol 117 (51) ◽ pp. 32667-32678 ◽ Author(s): Michael J. Arcaro, Theodora Mautz, Vladimir K. Berezovskii, Margaret S. Livingstone

Primate brains typically have regions within the ventral visual stream that are selectively responsive to faces. In macaques, these face patches are located in similar parts of inferotemporal cortex across individuals, although correspondence with particular anatomical features has not been reported previously. Here, using high-resolution functional and anatomical imaging, we show that small “bumps,” or buried gyri, along the lower bank of the superior temporal sulcus are predictive of the location of face-selective regions. Recordings from implanted multielectrode arrays verified that these bumps contain face-selective neurons. These bumps were also present in monkeys that were raised without seeing faces and that lack face patches, indicating that these anatomical landmarks are predictive of, but not sufficient for, the presence of face selectivity. These bumps are found across primate species that span taxonomic lines, indicating common evolutionary developmental mechanisms. The bumps emerge during fetal development in macaques, indicating that they arise from general developmental mechanisms that produce the regular cortical folding of the entire brain.


Nature ◽ 2020 ◽ Vol 583 (7814) ◽ pp. 103-108 ◽ Author(s): Pinglei Bao, Liang She, Mason McGill, Doris Y. Tsao

2019 ◽ Vol 122 (6) ◽ pp. 2522-2540 ◽ Author(s): Noam Roth, Nicole C. Rust

Searching for a specific visual object requires our brain to compare the items in view with a remembered representation of the sought target to determine whether a target match is present. This comparison is thought to be implemented, in part, via the combination of top-down modulations reflecting target identity with feed-forward visual representations. However, it remains unclear whether top-down signals are integrated at a single locus within the ventral visual pathway (e.g., V4) or at multiple stages [e.g., both V4 and inferotemporal cortex (IT)]. To investigate, we recorded neural responses in V4 and IT as rhesus monkeys performed a task that required them to identify when a target object appeared across variation in position, size, and background context. We found nonvisual, task-specific signals in both V4 and IT. To evaluate whether V4 was the only locus for the integration of top-down signals, we tested several feed-forward accounts of processing from V4 to IT, including a model in which IT preferentially sampled from the best V4 units and a model that allowed for nonlinear IT computation. IT task-specific modulation was not accounted for by any of these feed-forward descriptions, suggesting that during object search, top-down signals are integrated directly within IT.

NEW & NOTEWORTHY To find specific objects, the brain must integrate top-down, target-specific signals with visual information about objects in view. However, the exact route of this integration in the ventral visual pathway is unclear. In the first study to systematically compare V4 and inferotemporal cortex (IT) during an invariant object search task, we demonstrate that top-down signals found in IT cannot be described as being inherited from V4 but rather must be integrated directly within IT itself.
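As one illustration of the kind of feed-forward account tested above, the sketch below fits a cross-validated linear map from a V4 population to an IT population and reports how much IT response variance it explains; the data are synthetic placeholders, and the variable names, shapes, and fitting choices are assumptions for illustration, not the authors' analysis code.

```python
# Minimal sketch of a linear feed-forward V4 -> IT account, evaluated with
# cross-validation. All data below are synthetic placeholders (assumptions).
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_v4, n_it = 500, 60, 40
v4 = rng.standard_normal((n_trials, n_v4))  # V4 population responses (placeholder)
it = rng.standard_normal((n_trials, n_it))  # IT population responses (placeholder)

def feedforward_r2(v4, it, n_folds=5):
    """Cross-validated fraction of IT variance explained by a linear map from V4."""
    folds = np.array_split(rng.permutation(len(v4)), n_folds)
    r2 = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        X = np.column_stack([v4[train], np.ones(len(train))])  # add intercept column
        W, *_ = np.linalg.lstsq(X, it[train], rcond=None)      # fit V4 -> IT weights
        pred = np.column_stack([v4[test], np.ones(len(test))]) @ W
        resid = ((it[test] - pred) ** 2).sum()
        total = ((it[test] - it[train].mean(axis=0)) ** 2).sum()
        r2.append(1.0 - resid / total)
    return float(np.mean(r2))

print(f"cross-validated R^2 of the linear V4 -> IT model: {feedforward_r2(v4, it):.3f}")
```

If IT task-specific modulation exceeded what any such feed-forward read-out of V4 could produce, that residual would point to top-down signals arriving in IT directly, which is the logic of the comparison described in the abstract.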


2019 ◽ Vol 116 (47) ◽ pp. 23797-23805 ◽ Author(s): Heiko Stemmann, Winrich A. Freiwald

From incoming sensory information, our brains make selections according to current behavioral goals. This process, selective attention, is controlled by parietal and frontal areas. Here, we show that another brain area, posterior inferotemporal cortex (PITd), also exhibits the defining properties of attentional control. We discovered this area with functional magnetic resonance imaging (fMRI) during an attentive motion discrimination task. Single-cell recordings from PITd revealed strong attentional modulation across 3 attention tasks, yet no tuning to task-relevant stimulus features such as motion direction or color. Instead, PITd neurons closely tracked the subject’s attention state and predicted upcoming errors of attentional selection. Furthermore, artificial electrical PITd stimulation controlled the location of attentional selection without altering feature discrimination. These are the defining properties of a feature-blind priority map encoding the locus of attention. Together, these results suggest that area PITd, located strategically to gather information about object properties, serves as an attentional priority map.
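A minimal sketch of the sort of analysis implied by "closely tracked the subject's attention state": a cross-validated decoder of the attended location from trial-by-trial PITd firing rates. The data, preprocessing, and choice of decoder below are placeholders and assumptions for illustration, not the study's actual methods.

```python
# Sketch: decode the locus of attention (0 = left, 1 = right) from synthetic
# PITd firing rates with a cross-validated logistic-regression decoder.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_neurons = 400, 50
attended_side = rng.integers(0, 2, n_trials)                    # attended location per trial
rates = rng.poisson(5.0, size=(n_trials, n_neurons)).astype(float)
rates[:, :10] += 2.0 * attended_side[:, None]                   # toy attentional modulation

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, rates, attended_side, cv=5).mean()
print(f"cross-validated decoding accuracy of attention locus: {accuracy:.2f}")
```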


2019 ◽ Vol 19 (10) ◽ pp. 249 ◽ Author(s): Pinglei Bao, Liang She, Doris Y. Tsao

2019 ◽ Vol 19 (10) ◽ pp. 91c ◽ Author(s): Vahid Mehrpour, Yalda Mohsenzadeh, Andrew Jaegle, Travis Meyer, Aude Oliva, ...
