Binocular integration of perceptually suppressed visual information in amblyopia

Author(s):  
Amy Chow ◽  
Andrew E. Silva ◽  
Katelyn Tsang ◽  
Gabriel Ng ◽  
Cindy Ho ◽  
...  

Abnormal visual experience during an early critical period of visual cortex development can lead to a neurodevelopmental disorder of vision called amblyopia. A key feature of amblyopia is interocular suppression, whereby information from the amblyopic eye is blocked from conscious awareness when both eyes are open. Suppression of the amblyopic eye is thought to occur at an early stage of visual processing and to be absolute. Using a binocular rivalry paradigm, we demonstrate that suppressed visual information from the amblyopic eye remains available for binocular integration and can influence overall perception of stimuli. This finding reveals that suppressed visual information continues to be represented within the brain even when it is blocked from conscious awareness by chronic pathological suppression. These results have direct implications for the clinical management of amblyopia.

2021 ◽  
Author(s):  
Kimberly Reinhold ◽  
Arbora Resulaj ◽  
Massimo Scanziani

The behavioral state of a mammal impacts how the brain responds to visual stimuli as early as in the dorsolateral geniculate nucleus of the thalamus (dLGN), the primary relay of visual information to the cortex. A clear example of this is the markedly stronger response of dLGN neurons to higher temporal frequencies of the visual stimulus in alert as compared to quiescent animals. The dLGN receives strong feedback from the visual cortex, yet whether this feedback contributes to these state-dependent responses to visual stimuli is poorly understood. Here we show that in mice, silencing cortico-thalamic feedback abolishes state-dependent differences in the response of dLGN neurons to visual stimuli. This holds true for dLGN responses to both temporal and spatial features of the visual stimulus. These results reveal that the state-dependent shift of the response to visual stimuli in an early stage of visual processing depends on cortico-thalamic feedback.


2011 ◽  
Vol 23 (12) ◽  
pp. 3734-3745 ◽  
Author(s):  
Jacob Jolij ◽  
H. Steven Scholte ◽  
Simon van Gaal ◽  
Timothy L. Hodgson ◽  
Victor A. F. Lamme

Humans largely guide their behavior by their visual representation of the world. Recent studies have shown that visual information can trigger behavior within 150 msec, suggesting that visually guided responses to external events, in fact, precede conscious awareness of those events. However, is such a view correct? Using a texture discrimination task, we show that the brain relies on long-latency visual processing in order to guide perceptual decisions. Decreasing stimulus saliency leads to selective changes in long-latency visually evoked potential components reflecting scene segmentation. These latency changes are accompanied by almost equal changes in simple RTs and points of subjective simultaneity. Furthermore, we find a strong correlation between individual RTs and the latencies of scene-segmentation-related components in the visually evoked potentials, showing that the processes underlying these late brain potentials are critical in triggering a response. However, using the same texture stimuli in an antisaccade task, we found that reflexive, erroneous prosaccades (but not antisaccades) can be triggered by earlier visual processes. In other words: the brain can act quickly, but decides late. Differences between our study and earlier findings suggesting that action precedes conscious awareness can be explained by assuming that task demands determine whether a fast and unconscious, or a slower and conscious, representation is used to initiate a visually guided response.


Author(s):  
Martin V. Butz ◽  
Esther F. Kutter

While bottom-up visual processing is important, the brain integrates this information with top-down, generative expectations from very early on in the visual processing hierarchy. Indeed, our brain should not be viewed as a classification system, but rather as a generative system, which perceives something by integrating sensory evidence with the available, learned, predictive knowledge about that thing. The involved generative models continuously produce expectations over time, across space, and from abstracted encodings to more concrete encodings. Bayesian information processing is key to understanding how information integration must work computationally, at least in approximation, in the brain as well. Bayesian networks in the form of graphical models allow the modularization of information and the factorization of interactions, which can strongly improve the efficiency of generative models. The resulting generative models essentially produce state estimations in the form of probability densities, which are well suited to integrating multiple sources of information, including top-down and bottom-up ones. A hierarchical neural visual processing architecture illustrates this point even further. Finally, some well-known visual illusions are shown and the resulting perceptions are explained by means of generative, information-integrating perceptual processes, which in all cases combine top-down prior knowledge and expectations about objects and environments with the available, bottom-up visual information.
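To make the Bayesian integration idea concrete, the following is a minimal, purely illustrative Python sketch (not taken from the chapter) that fuses a top-down Gaussian prior with bottom-up sensory evidence on a discretized state space; all values and names are assumptions.

```python
import numpy as np

# Illustrative sketch: fuse a top-down prior with bottom-up evidence via Bayes' rule
# on a discretized one-dimensional state space. All numbers are assumptions.

x = np.linspace(-5, 5, 1001)          # hypothesized values of some stimulus property

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

prior = gaussian(x, 0.0, 2.0)         # top-down expectation (broad, centered on 0)
likelihood = gaussian(x, 1.5, 1.0)    # bottom-up sensory evidence (narrower, centered on 1.5)

posterior = prior * likelihood        # unnormalized Bayes' rule
posterior /= np.trapz(posterior, x)   # renormalize into a probability density

map_estimate = x[np.argmax(posterior)]
print(f"MAP estimate of the perceived value: {map_estimate:.2f}")
```

The posterior peaks between the prior and the likelihood, weighted by their relative precisions, which illustrates how top-down expectations can pull a percept toward learned prior knowledge.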


2020 ◽  
Author(s):  
Wen Wen ◽  
Yue Wang ◽  
Sheng He ◽  
Hong Liu ◽  
Chen Zhao ◽  
...  

Abnormal visual experience during the critical period leads to reorganization of neural architecture in the primate visual cortex. However, the developmental plasticity of human subcortical visual pathways remains elusive. Using high-resolution fMRI and pathway-selective visual stimuli, we investigated layer-dependent response properties and connectivity of subcortical visual pathways in adult humans with amblyopia. Stimuli presented to the amblyopic eye produced a selective response loss in the parvocellular layers of the lateral geniculate nucleus, along with reduced connectivity to V1. The amblyopic eye's response to an isoluminant chromatic stimulus was significantly reduced in the superficial layers of the superior colliculus, while the fellow eye's response robustly increased in the deeper layers, associated with increased cortical feedback. Therefore, amblyopia led to a selective reduction of parvocellular feedforward signals in the geniculostriate pathway, together with both loss and enhancement of parvocellular feedback signals in the retinotectal pathway. These findings may inform the development of new tools for treating amblyopia and tracking its prognosis.


Author(s):  
Martin V. Butz ◽  
Esther F. Kutter

This chapter addresses primary visual perception, detailing how visual information comes about and, as a consequence, which visual properties provide particularly useful information about the environment. The brain extracts this information systematically, and also separates redundant and complementary visual information aspects to improve the effectiveness of visual processing. Computationally, image smoothing, edge detectors, and motion detectors must be at work. These need to be applied in a convolutional manner over the fixated area, computations that are well suited to the cortical columnar structures of the brain. On the next level, the extracted information needs to be integrated in order to segment and detect object structures. The brain solves this highly challenging problem by incorporating top-down expectations and by integrating complementary visual information aspects, such as light reflections, texture information, line convergence information, shadows, and depth information. In conclusion, the need to integrate top-down visual expectations to form complete and stable perceptions is made explicit.
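As a concrete illustration of the convolutional operations mentioned above, here is a minimal Python sketch (a toy example under stated assumptions, not the chapter's model) that smooths a synthetic image and then applies Sobel-like edge kernels via 2-D convolution.

```python
import numpy as np
from scipy.signal import convolve2d

# Illustrative sketch: smoothing and edge detection as 2-D convolutions over a toy
# image. The kernels and the image are assumptions, not the chapter's model.

image = np.zeros((32, 32))
image[8:24, 8:24] = 1.0                               # bright square on a dark background

smooth = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float) / 16.0    # small Gaussian-like smoothing kernel

sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)         # horizontal-gradient (vertical-edge) kernel
sobel_y = sobel_x.T                                   # vertical-gradient (horizontal-edge) kernel

blurred = convolve2d(image, smooth, mode="same", boundary="symm")
gx = convolve2d(blurred, sobel_x, mode="same", boundary="symm")
gy = convolve2d(blurred, sobel_y, mode="same", boundary="symm")
edges = np.hypot(gx, gy)                              # gradient magnitude peaks at the square's borders

print("max edge response:", edges.max())
print("response in the square's interior:", edges[16, 16])
```

In this toy setting the edge map is strong along the square's borders and near zero in its uniform interior, mirroring how convolutional edge detectors discard redundant regions while keeping informative structure.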


Author(s):  
Randolph Blake

Binocular rivalry epitomizes the essence of a perceptual illusion in that it involves a compelling dissociation of retinal stimulation and visual experience: dissimilar monocular stimuli appear and disappear reciprocally and unpredictably over time, even though the retinal images of both stimuli remain unchanged. Thus binocular rivalry is instigated when dissimilar visual stimuli are imaged on corresponding areas of the two eyes. These dissimilarities can arise from differences in form (both simple and complex), color, or direction of motion. This beguiling phenomenon affords the psychologist a potent means for probing visual processing outside of awareness and the neurophysiologist a strategy for studying neural dynamics. Related concepts, including bistable perception, interocular suppression, and neural dynamics, are explored.


2019 ◽  
Vol 44 (12) ◽  
pp. 1386-1392
Author(s):  
Hongmei Shi ◽  
Yanming Wang ◽  
Xuemei Liu ◽  
Lin Xia ◽  
Yao Chen ◽  
...  

2005 ◽  
Vol 17 (8) ◽  
pp. 1341-1352 ◽  
Author(s):  
Joseph B. Hopfinger ◽  
Anthony J. Ries

Recent studies have generated debate regarding whether reflexive attention mechanisms are triggered in a purely automatic stimulus-driven manner. Behavioral studies have found that a nonpredictive “cue” stimulus will speed manual responses to subsequent targets at the same location, but only if that cue is congruent with actively maintained top-down settings for target detection. When a cue is incongruent with top-down settings, response times are unaffected, and this has been taken as evidence that reflexive attention mechanisms were never engaged in those conditions. However, manual response times may mask effects on earlier stages of processing. Here, we used event-related potentials to investigate the interaction of bottom-up sensory-driven mechanisms and top-down control settings at multiple stages of processing in the brain. Our results dissociate sensory-driven mechanisms that automatically bias early stages of visual processing from later mechanisms that are contingent on top-down control. An early enhancement of target processing in the extrastriate visual cortex (i.e., the P1 component) was triggered by the appearance of a unique bright cue, regardless of top-down settings. The enhancement of visual processing was prolonged, however, when the cue was congruent with top-down settings. Later processing in posterior temporal-parietal regions (i.e., the ipsilateral invalid negativity) was triggered automatically when the cue consisted of the abrupt appearance of a single new object. However, in cases where more than a single object appeared during the cue display, this stage of processing was contingent on top-down control. These findings provide evidence that visual information processing is biased at multiple levels in the brain, and the results distinguish automatically triggered sensory-driven mechanisms from those that are contingent on top-down control settings.


2020 ◽  
Author(s):  
Sanjeev Nara ◽  
Mikel Lizarazu ◽  
Craig G Richter ◽  
Diana C Dima ◽  
Mathieu Bourguignon ◽  
...  

Predictive processing has been proposed as a fundamental cognitive mechanism to account for how the brain interacts with the external environment via its sensory modalities. The brain processes external information about the content (i.e., "what") and timing (i.e., "when") of environmental stimuli to update an internal generative model of the world around it. However, the interaction between "what" and "when" has received very little attention in the context of vision. In this magnetoencephalography (MEG) study, we investigate how the processing of feature-specific information (i.e., "what") is affected by temporal predictability (i.e., "when"). In line with previous findings, we observed a suppression of evoked neural responses in the visual cortex for predictable stimuli. Interestingly, we observed that temporal uncertainty enhances this expectation suppression effect. This suggests that in temporally uncertain scenarios the neurocognitive system relies more on internal representations and invests fewer resources in integrating bottom-up information. Indeed, temporal decoding analysis indicated that visual features are encoded for a shorter time period by the neural system when temporal uncertainty is higher. In other words, visual information is maintained for less time for a stimulus whose onset is unpredictable than for one whose onset is predictable. These findings highlight the greater reliance of the visual system on internal expectations when the temporal dynamics of the external environment are less predictable.
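For readers unfamiliar with temporal decoding, the following Python sketch shows the general idea of time-resolved classification on epoched data; the synthetic data, classifier, and threshold are illustrative assumptions and not the study's actual MEG pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Illustrative sketch of time-resolved ("temporal") decoding on epoched,
# MEG-like data of shape (trials, sensors, time points). The synthetic data
# and classifier choice are assumptions, not the study's analysis pipeline.

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 200, 30, 100
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)                 # two visual feature classes

# Inject a decodable class difference into a limited time window only.
X[y == 1, :5, 40:70] += 0.8

accuracy = np.empty(n_times)
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

# The width of the above-chance window indexes how long the feature stays decodable.
print("time points decoded above 60% accuracy:", int(np.sum(accuracy > 0.6)))
```

In such an analysis, a narrower window of above-chance decoding for temporally unpredictable stimuli would correspond to the shorter maintenance of visual features reported above.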


2018 ◽  
Vol 29 (4) ◽  
pp. 496-503 ◽  
Author(s):  
Erika H. Siegel ◽  
Jolie B. Wormwood ◽  
Karen S. Quigley ◽  
Lisa Feldman Barrett

Affective realism, the phenomenon whereby affect is integrated into an individual’s experience of the world, is a normal consequence of how the brain processes sensory information from the external world in the context of sensations from the body. In the present investigation, we provided compelling empirical evidence that affective realism involves changes in visual perception (i.e., affect changes how participants see neutral stimuli). In two studies, we used an interocular suppression technique, continuous flash suppression, to present affective images outside of participants’ conscious awareness. We demonstrated that seen neutral faces are perceived as more smiling when paired with unseen affectively positive stimuli. Study 2 also demonstrated that seen neutral faces are perceived as more scowling when paired with unseen affectively negative stimuli. These findings have implications for real-world situations and challenge beliefs that affect is a distinct psychological phenomenon that can be separated from cognition and perception.

