Visual perception of different wood surfaces: an event-related potentials study

2021 ◽  
Vol 78 (2) ◽  
Author(s):  
Qian Wan ◽  
Xiaohe Li ◽  
Yachi Zhang ◽  
Shasha Song ◽  
Qing Ke


2021 ◽  
Author(s):  
Sharif I. Kronemer ◽  
Mark Aksen ◽  
Julia Ding ◽  
Jun Hwan Ryu ◽  
Qilong Xin ◽  
...  

Abstract
Consciousness is not explained by a single mechanism; rather, it involves multiple specialized neural systems overlapping in space and time. We hypothesize that synergistic, large-scale subcortical and cortical attention and signal processing networks encode conscious experiences. To identify brain activity in conscious perception without overt report, we classified visual stimuli as perceived or not using eye measurements. Report-independent event-related potentials and functional magnetic resonance imaging (fMRI) signals both occurred at early times after stimuli. Direct recordings revealed a novel thalamic awareness potential linked to conscious visual perception based on report. fMRI showed that thalamic and cortical detection, arousal, attentional salience, task-positive, and default mode networks were involved independent of overt report. These findings identify a specific sequence of neural mechanisms in human conscious visual perception.
One-Sentence Summary: Human conscious visual perception engages large-scale subcortical and cortical networks even without overt report.
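
The abstract describes classifying trials as perceived or not perceived from eye measurements alone, without relying on overt report. As a minimal illustrative sketch of that general idea (the feature names, array shapes, and choice of a logistic-regression classifier are assumptions for demonstration, not the authors' pipeline), one could cross-validate a classifier on report-labeled trials before applying it to no-report data:

```python
# Illustrative sketch only: classify perceived vs. not-perceived trials from
# eye-derived features. Feature set, data shapes, and classifier are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-trial eye features, e.g. pupil dilation, blink rate, microsaccade rate
X = rng.normal(size=(200, 3))        # 200 trials x 3 eye-derived features (placeholder data)
y = rng.integers(0, 2, size=200)     # 1 = reported perceived, 0 = reported not perceived

# Train on report-based labels; cross-validated accuracy estimates how well
# the eye measurements alone predict perception.
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```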


2021 ◽  
Author(s):  
Roman Vakhrushev ◽  
Felicia Cheng ◽  
Annekathrin Schacht ◽  
Arezoo Pooresmaeili

Stimuli associated with high reward modulate perception, and such value-driven effects have been shown to originate from the modulation of the earliest stages of sensory processing in the brain. In natural environments, objects comprise multiple features (imagine a rolling soccer ball, with its black and white patches and the swishing sound made during its motion), where each feature may signal different associations with previously encountered rewards. How perception of such an object is affected by the value associations of its constituent parts is unknown. The present study compares intra- and cross-modal value-driven effects on behavioral and electrophysiological correlates of visual perception. Human participants first learned the reward associations of visual and auditory cues. Subsequently, they performed a visual orientation discrimination task in the presence of previously rewarded visual or auditory cues (intra- and cross-modal cues, respectively) that were presented concurrently with the target stimulus. During the conditioning phase, when reward associations were learned and reward cues were the target of the task, reward value of both modalities enhanced the electrophysiological correlates of sensory processing in visual cortex. During the post-conditioning phase, when reward delivery was halted and previously rewarded stimuli were task-irrelevant, cross-modal value enhanced behavioral measures of visual sensitivity, whereas intra-modal value led to a trend toward suppression. A similar pattern of modulations was found in the simultaneously recorded event-related potentials (ERPs) at posterior electrodes. We found an early (90-120 ms) suppression of ERPs evoked by high-value, intra-modal stimuli. Cross-modal cues led to a later value-driven modulation, with an enhancement of response positivity for high- compared to low-value stimuli starting in the N1 window (180-250 ms) and extending to the P3 (300-600 ms) responses at the posterior electrodes. These results indicate that visual cortex is modulated by the reward value of visual as well as auditory cues. Previously rewarded, task-irrelevant cues from the same or a different sensory modality have different effects on visual perception, as intra-modal high-value cues may interfere with target processing, whereas cross-modal high-value cues boost perception of the target.
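
The value-driven ERP effects above are reported as mean amplitudes in fixed time windows (90-120 ms, N1 at 180-250 ms, P3 at 300-600 ms) over posterior electrodes. The sketch below illustrates that kind of window averaging in general terms; the sampling rate, epoch layout, and electrode indices are assumptions, not the study's actual analysis code:

```python
# Illustrative sketch of mean-amplitude extraction in ERP time windows.
# Data layout, sampling rate, and electrode picks are assumptions.
import numpy as np

sfreq = 500.0                            # assumed sampling rate (Hz)
t0 = -0.2                                # assumed epoch start relative to stimulus onset (s)
epochs = np.random.randn(100, 64, 500)   # trials x channels x samples (placeholder data)
posterior = [54, 55, 60, 61]             # hypothetical posterior electrode indices

def mean_amplitude(epochs, tmin, tmax, picks):
    """Average voltage over a time window and electrode subset, per trial."""
    start = int(round((tmin - t0) * sfreq))
    stop = int(round((tmax - t0) * sfreq))
    return epochs[:, picks, start:stop].mean(axis=(1, 2))

# Windows reported in the abstract: early (90-120 ms), N1 (180-250 ms), P3 (300-600 ms)
early = mean_amplitude(epochs, 0.090, 0.120, posterior)
n1 = mean_amplitude(epochs, 0.180, 0.250, posterior)
p3 = mean_amplitude(epochs, 0.300, 0.600, posterior)
print(early.shape, n1.mean(), p3.mean())
```

Per-trial window means like these can then be compared between high- and low-value conditions to quantify the reported modulations.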


F1000Research ◽  
2020 ◽  
Vol 9 ◽  
pp. 1010
Author(s):  
Pavel N. Ermakov ◽  
Elena V. Vorobyeva ◽  
Ekaterina M. Kovsh ◽  
Alexander S. Stoletniy ◽  
Magomed M. Dalgatov ◽  
...  

Background: The aim of this paper is to investigate associations between polymorphisms of the BDNF, COMT, and HTR2A genes and features of visual perception; in particular, how carriers of different genotypes of these genes emotionally evaluate visual scenes with distinct second-order features (images modulated by contrast), and how this process is reflected in event-related brain potentials (ERPs). Methods: The study involved students who underwent PCR-based genotyping for BDNF, COMT, and HTR2A. Participants were asked to emotionally assess specific stimuli – visual scenes generated from contrast modulations – while EEG was recorded with a 128-electrode system. The average frequency of responses and the ERPs for the different emotional evaluations (negative, neutral, and positive) were analyzed. Results: The BDNF Val/Val genotype was associated with an increased P2 amplitude over occipital regions compared to the Val/Met genotype, regardless of emotional evaluation. The COMT Met/Met genotype was associated with increased N170 negativity over occipital regions during the evaluation task. The HTR2A A/A genotype was associated with an increased P1 amplitude when a positive or negative assessment was chosen, and with a decreased later positive peak when a neutral evaluation was chosen. Conclusions: The results suggest that emotional evaluation and recognition of visual scenes with distinct second-order features, as well as different strategies for processing visual information, are reflected in the amplitude and latency of different ERP components and are associated with different genotypes of the BDNF, COMT, and HTR2A genes. These associations may act as a genetic basis for individual differences in the mechanisms of visual perception.
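
The reported genotype effects are differences in component amplitudes between carrier groups (for example, occipital P2 for BDNF Val/Val versus Val/Met). A minimal sketch of such a between-group comparison is shown below; the amplitude values and the use of an independent-samples t-test are assumptions for illustration, not the paper's exact statistics:

```python
# Illustrative sketch of a between-genotype comparison of an ERP component amplitude.
# All values are placeholders; the test choice is an assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
p2_val_val = rng.normal(4.0, 1.5, size=25)   # hypothetical mean occipital P2 amplitudes (µV), Val/Val carriers
p2_val_met = rng.normal(3.2, 1.5, size=25)   # hypothetical mean occipital P2 amplitudes (µV), Val/Met carriers

t, p = stats.ttest_ind(p2_val_val, p2_val_met)
print(f"t = {t:.2f}, p = {p:.3f}")
```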


2007 ◽  
Author(s):  
Lars T. Boenke ◽  
Frank W. Ohl ◽  
Andrey R. Nikolaev ◽  
Thomas Lachmann ◽  
Cees van Leeuwen

1999 ◽  
Vol 22 (3) ◽  
pp. 341-365 ◽  
Author(s):  
Zenon Pylyshyn

Although the study of visual perception has made more progress in the past 40 years than any other area of cognitive science, there remain major disagreements as to how closely vision is tied to cognition. This target article sets out some of the arguments for both sides (arguments from computer vision, neuroscience, psychophysics, perceptual learning, and other areas of vision science) and defends the position that an important part of visual perception, corresponding to what some people have called early vision, is prohibited from accessing relevant expectations, knowledge, and utilities in determining the function it computes – in other words, it is cognitively impenetrable. That part of vision is complex and involves top-down interactions that are internal to the early vision system. Its function is to provide a structured representation of the 3-D surfaces of objects sufficient to serve as an index into memory, with somewhat different outputs being made available to other systems such as those dealing with motor control. The paper also addresses certain conceptual and methodological issues raised by this claim, such as whether signal detection theory and event-related potentials can be used to assess cognitive penetration of vision.

A distinction is made among several stages in visual processing, including, in addition to the inflexible early-vision stage, a pre-perceptual attention-allocation stage and a post-perceptual evaluation, selection, and inference stage, which accesses long-term memory. These two stages provide the primary ways in which cognition can affect the outcome of visual perception. The paper discusses arguments from computer vision and psychology showing that vision is “intelligent” and involves elements of “problem solving.” The cases of apparently intelligent interpretation sometimes cited in support of this claim do not show cognitive penetration; rather, they show that certain natural constraints on interpretation, concerned primarily with optical and geometrical properties of the world, have been compiled into the visual system. The paper also examines a number of examples where instructions and “hints” are alleged to affect what is seen. In each case it is concluded that the evidence is more readily assimilated to the view that when cognitive effects are found, they have a locus outside early vision, in such processes as the allocation of focal attention and the identification of the stimulus.
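
The abstract raises the question of whether signal detection theory (SDT) can separate genuine changes in perceptual sensitivity from post-perceptual decision biases. As a purely illustrative sketch of the standard SDT quantities involved (the trial counts below are made up for demonstration and do not come from the target article), sensitivity d' and criterion c can be computed from hit and false-alarm rates:

```python
# Illustrative sketch of signal detection theory measures; counts are hypothetical.
from scipy.stats import norm

hits, misses = 42, 8                  # hypothetical signal-present trials
false_alarms, correct_rej = 6, 44     # hypothetical signal-absent trials

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)               # sensitivity
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))    # response bias
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```

On this view, a cognitive manipulation that changes only the criterion, leaving d' intact, would suggest a locus outside early vision.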

