Neural Substrates of Visual Perception and Working Memory: Two Sides of the Same Coin or Two Different Coins?

2021 · Vol 15
Author(s): Megan Roussy, Diego Mendoza-Halliday, Julio C. Martinez-Trujillo

Visual perception occurs when a set of physical signals emanating from the environment enters the visual system and the brain interprets those signals as a percept. Visual working memory occurs when the brain produces and maintains a mental representation of a percept while the physical signals corresponding to that percept are no longer available. Early studies in humans and non-human primates demonstrated that lesions of the prefrontal cortex impair performance during visual working memory tasks but not during perceptual tasks. These studies attributed to the prefrontal cortex a fundamental role in working memory and a lesser role in visual perception. Indeed, single-cell recording studies have found that neurons in the lateral prefrontal cortex of macaques encode working memory representations via persistent firing, validating the results of the lesion studies. However, other studies have reported that neurons in some areas of the parietal and temporal lobes, classically associated with visual perception, similarly encode working memory representations via persistent firing. This prompted a line of enquiry into the roles of the prefrontal and other associative cortices in working memory and perception. Here, we review evidence from single-neuron studies in macaque monkeys that examined working memory representations across different areas of the visual hierarchy, and we link it to studies examining the role of the same areas in visual perception. We conclude that neurons in early visual areas of both the ventral (V1-V2-V4) and dorsal (V1-V3-MT) pathways of macaques mainly encode perceptual signals. In contrast, areas downstream from V4 and MT contain subpopulations of neurons that encode perceptual signals, working memory signals, or both. Differences in cortical architecture (neuronal types, layer composition, and synaptic density and distribution) may underlie this differential encoding of perceptual and working memory signals between early visual areas and higher association areas.
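To make the distinction concrete, the sketch below simulates the kind of single-neuron analysis described above: a simulated neuron is labeled as carrying perceptual and/or working memory signals depending on whether its firing rate is tuned to the sample stimulus during the stimulus epoch, the memory delay, or both. This is a hypothetical toy (Gaussian firing-rate noise, idealized cosine tuning, a one-way ANOVA tuning test), not a reconstruction of any specific study reviewed here.

```python
# Hypothetical toy (not from either paper): classify simulated neurons as
# carrying perceptual and/or working-memory signals by testing whether their
# firing rates are tuned to the sample stimulus during the stimulus epoch,
# the memory delay, or both.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
n_trials = 200
stimuli = np.array([0, 45, 90, 135])          # sample orientations (deg), assumed
sample = rng.choice(stimuli, n_trials)        # sample shown on each trial

def simulate_neuron(stim_gain, delay_gain):
    """Firing rates in two epochs; gains control tuning strength per epoch."""
    tuning = np.cos(np.deg2rad(2 * sample))   # idealized orientation tuning
    stim_rate  = 10 + stim_gain  * 5 * tuning + rng.normal(0, 2, n_trials)
    delay_rate = 10 + delay_gain * 5 * tuning + rng.normal(0, 2, n_trials)
    return stim_rate, delay_rate

def is_tuned(rates, alpha=0.01):
    """One-way ANOVA: are rates significantly different across sample stimuli?"""
    groups = [rates[sample == s] for s in stimuli]
    return f_oneway(*groups).pvalue < alpha

for name, gains in {"perceptual": (1, 0), "memory": (0, 1), "mixed": (1, 1)}.items():
    stim_rate, delay_rate = simulate_neuron(*gains)
    print(name, "| stim-tuned:", is_tuned(stim_rate),
          "| delay-tuned:", is_tuned(delay_rate))
```

In this framing, a "memory" neuron is one whose delay-period rate remains selective for the remembered stimulus even though the physical stimulus is no longer present, which is the persistent-firing signature discussed in the abstract.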

2017 · Vol 114 (43) · pp. E9115-E9124
Author(s): Stephanie Ding, Christopher J. Cueva, Misha Tsodyks, Ning Qian

When a stimulus is presented, its encoding is known to progress from low- to high-level features. How these features are decoded to produce perception is less clear, and most models assume that decoding follows the same low- to high-level hierarchy of encoding. There are also theories arguing for global precedence, reversed hierarchy, or bidirectional processing, but they are descriptive without quantitative comparison with human perception. Moreover, observers often inspect different parts of a scene sequentially to form overall perception, suggesting that perceptual decoding requires working memory, yet few models consider how working-memory properties may affect decoding hierarchy. We probed decoding hierarchy by comparing absolute judgments of single orientations and relative/ordinal judgments between two sequentially presented orientations. We found that lower-level, absolute judgments failed to account for higher-level, relative/ordinal judgments. However, when ordinal judgment was used to retrospectively decode memory representations of absolute orientations, striking aspects of absolute judgments, including the correlation and forward/backward aftereffects between two reported orientations in a trial, were explained. We propose that the brain prioritizes decoding of higher-level features because they are more behaviorally relevant, and more invariant and categorical, and thus easier to specify and maintain in noisy working memory, and that more reliable higher-level decoding constrains less reliable lower-level decoding.
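The retrospective-decoding idea can be illustrated with a small simulation, given below as a minimal sketch rather than the authors' quantitative model: two orientations are stored with Gaussian memory noise, the higher-level ordinal judgment (is the second orientation more clockwise than the first?) is decoded first, and the absolute orientations are then re-decoded conditional on that committed ordinal decision. The noise level, stimulus range, and rejection-based conditional decoder are assumptions chosen for clarity, and the circularity of orientation is ignored.

```python
# Toy sketch (hypothetical, not the authors' quantitative model): retrospective
# decoding in which an ordinal judgment about two remembered orientations
# constrains the subsequent decoding of their absolute values.
import numpy as np

rng = np.random.default_rng(1)
n_trials, sigma = 5000, 6.0                        # memory noise (deg), assumed value

theta1 = rng.uniform(0, 180, n_trials)             # first orientation
theta2 = theta1 + rng.uniform(-15, 15, n_trials)   # second, nearby orientation
m1 = theta1 + rng.normal(0, sigma, n_trials)       # noisy memory representations
m2 = theta2 + rng.normal(0, sigma, n_trials)
ordinal = np.sign(m2 - m1)                         # higher-level judgment decoded first

def conditional_decode(m1, m2, ordinal, n_samples=500):
    """Estimate each orientation as the mean of posterior samples that
    respect the already-committed ordinal judgment (rejection sampling)."""
    s1 = m1[:, None] + rng.normal(0, sigma, (len(m1), n_samples))
    s2 = m2[:, None] + rng.normal(0, sigma, (len(m2), n_samples))
    keep = np.sign(s2 - s1) == ordinal[:, None]    # discard inconsistent samples
    keep = np.where(keep.any(axis=1, keepdims=True), keep, True)  # fallback: keep all
    est1 = np.nanmean(np.where(keep, s1, np.nan), axis=1)
    est2 = np.nanmean(np.where(keep, s2, np.nan), axis=1)
    return est1, est2

est1, est2 = conditional_decode(m1, m2, ordinal)
err1, err2 = est1 - theta1, est2 - theta2
print("correlation between report errors:", np.corrcoef(err1, err2)[0, 1])
print("fraction of 2nd-report errors directed away from the 1st orientation:",
      np.mean(np.sign(err2) == np.sign(theta2 - theta1)))
```

In this toy, conditioning on the ordinal decision couples the two absolute estimates and biases them in the direction of that decision, a qualitative analog of the correlated reports and order-dependent aftereffects that the abstract attributes to retrospective decoding from noisy working memory.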

