How predictive remapping in LIP (but not FEF) might explain the illusion of perceptual stability

2018 ◽  
Vol 18 (10) ◽  
pp. 1368
Author(s):  
James Bisley


2011 ◽
Vol 366 (1564) ◽  
pp. 596-610 ◽  
Author(s):  
Benjamin W. Tatler ◽  
Michael F. Land

One of the paradoxes of vision is that the world as it appears to us and the image on the retina at any moment are not much like each other. The visual world seems to be extensive and continuous across time, yet the manner in which we sample the visual environment is neither extensive nor continuous. How does the brain reconcile these differences? Here, we consider existing evidence from both static and dynamic viewing paradigms together with the logical requirements of any representational scheme that would be able to support active behaviour. While static scene viewing paradigms favour extensive, but perhaps abstracted, memory representations, dynamic settings suggest sparser and task-selective representations. We suggest that in dynamic settings, where movement within extended environments is required to complete a task, visual input and egocentric and allocentric representations work together to allow efficient behaviour. The egocentric model serves as a coding scheme in which actions can be planned, but also offers a potential means of providing the perceptual stability that we experience.
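The coordinate-frame claim above has a concrete geometric core. As a minimal sketch, assuming 2-D positions and using illustrative names that are not from the paper, an egocentric (observer-relative) code and an allocentric (world-fixed) code are related by a rotation through the observer's heading plus a translation by the observer's position, so a stable world location implies a lawfully changing egocentric one across self-motion:

```python
import numpy as np

def ego_to_allo(obj_ego, observer_pos, heading):
    """Map an egocentric (observer-relative) 2-D position into allocentric
    (world) coordinates: rotate by the observer's heading, then translate
    by the observer's position. Illustrative names, not from the paper."""
    c, s = np.cos(heading), np.sin(heading)
    rotation = np.array([[c, -s], [s, c]])
    return observer_pos + rotation @ obj_ego

def allo_to_ego(obj_allo, observer_pos, heading):
    """Inverse mapping: a fixed world location yields a new egocentric
    position after every movement, which is how an egocentric code can
    stay consistent with a stable world across self-motion."""
    c, s = np.cos(-heading), np.sin(-heading)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ (obj_allo - observer_pos)

# A landmark 2 m ahead and 1 m to the left of an observer at the origin:
landmark_world = ego_to_allo(np.array([2.0, 1.0]), np.zeros(2), 0.0)
# After walking 1 m forward and turning 90 degrees counterclockwise, the
# landmark's egocentric position changes; its world position does not.
print(allo_to_ego(landmark_world, np.array([1.0, 0.0]), np.pi / 2))
```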


2011 ◽  
Vol 106 (4) ◽  
pp. 1862-1874 ◽  
Author(s):  
Jan Churan ◽  
Daniel Guitton ◽  
Christopher C. Pack

Our perception of the positions of objects in our surroundings is surprisingly unaffected by movements of the eyes, head, and body. This suggests that the brain has a mechanism for maintaining perceptual stability, based either on the spatial relationships among visible objects or on internal copies of its own motor commands. Strong evidence for the latter mechanism comes from the remapping of visual receptive fields that occurs around the time of a saccade. Remapping occurs when a single neuron responds to visual stimuli placed presaccadically in the spatial location that will be occupied by its receptive field after the completion of a saccade. Although evidence for remapping has been found in many brain areas, relatively little is known about how it interacts with sensory context. This interaction is important for understanding perceptual stability more generally, as the brain may rely on extraretinal signals or visual signals to different degrees in different contexts. Here, we have studied the interaction between visual stimulation and remapping by recording from single neurons in the superior colliculus of the macaque monkey, using several different visual stimulus conditions. We find that remapping responses are highly sensitive to low-level visual signals, with the overall luminance of the visual background exerting a particularly powerful influence. Specifically, although remapping was fairly common in complete darkness, such responses were usually decreased or abolished in the presence of modest background illumination. Thus, the brain may adopt a strategy that emphasizes visual landmarks over extraretinal signals whenever the former are available.
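The geometry behind remapping is simple enough to state as a toy sketch (Python, illustrative names only; this is not the authors' analysis code): a neuron's "future field" is its receptive field shifted by the impending saccade vector, so a probe flashed at that location before the saccade will fall inside the receptive field once the eye lands.

```python
import numpy as np

def future_field(rf_center, saccade_vector):
    """Presaccadic retinal location that the receptive field will occupy
    after the eye moves: the RF center shifted by the saccade vector."""
    return rf_center + saccade_vector

def retinal_after_saccade(stimulus, saccade_vector):
    """Where a world-fixed stimulus lands on the retina after a saccade:
    the eye moves by the saccade vector, so the image shifts opposite."""
    return stimulus - saccade_vector

rf = np.array([5.0, 0.0])          # RF center, degrees of visual angle
saccade = np.array([10.0, 0.0])    # impending 10-degree rightward saccade
probe = future_field(rf, saccade)  # flash a probe here, presaccadically
# After the saccade, the probe location falls exactly in the RF, so a
# remapping neuron responds to the probe before the eye has moved:
assert np.allclose(retinal_after_saccade(probe, saccade), rf)
```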


2018 ◽  
Author(s):  
Tao He ◽  
Matthias Fritsche ◽  
Floris P. de Lange

Visual stability is thought to be mediated by predictive remapping of the relevant object information from its current, pre-saccadic location to its future, post-saccadic location on the retina. However, it is heavily debated whether and what feature information is predictively remapped during the pre-saccadic interval. Using an orientation adaptation paradigm, we investigated whether predictive remapping occurs for stimulus features and whether adaptation itself is remapped. We found strong evidence for predictive remapping of a stimulus presented shortly before saccade onset, but no remapping of adaptation. Furthermore, we establish that predictive remapping also occurs for stimuli that are not saccade targets, pointing toward a 'forward remapping' process operating across the whole visual field. Together, our findings suggest that predictive feature remapping of object information plays an important role in mediating visual stability.


2010 ◽  
Vol 10 (7) ◽  
pp. 518-518
Author(s):  
F. Ostendorf ◽  
J. Kilias ◽  
C. Ploner

2016 ◽  
Vol 4 (2) ◽  
pp. 187-206 ◽  
Author(s):  
Trevor B. Penney ◽  
Xiaoqin Cheng ◽  
Yan Ling Leow ◽  
Audrey Wei Ying Bay ◽  
Esther Wu ◽  
...  

A transient suppression of visual perception during saccades ensures perceptual stability. In two experiments, we examined whether saccades affect time perception of visual and auditory stimuli in the seconds range. Specifically, participants completed a duration reproduction task in which they memorized the duration of a 6 s timing signal during the training phase and later reproduced that duration during the test phase. Four experimental conditions differed in saccade requirements and in the presence or absence of a secondary discrimination task during the test phase. For both visual and auditory timing signals, participants reproduced longer durations when the secondary discrimination task required saccades (i.e., overt attention shifts) during reproduction than when it merely required fixation at screen center. Moreover, greater total saccade duration in a trial resulted in greater time distortion. However, in the visual modality, requiring participants to covertly shift attention (i.e., without a saccade) to complete the discrimination task increased reproduced duration as much as making a saccade did, whereas in the auditory modality making a saccade increased reproduced duration more than making a covert attention shift. In addition, in both the visual and auditory experiments we examined microsaccades in the conditions that did not require full saccades; greater total microsaccade duration in a trial resulted in greater time distortion in both modalities. Taken together, the experiments suggest that saccades and microsaccades affect seconds-range visual and auditory interval timing via attention and saccadic suppression mechanisms.


Author(s):  
Nicholas J. Wade ◽  
Benjamin W. Tatler

2013 ◽  
Vol 38 (9) ◽  
pp. 3378-3383 ◽  
Author(s):  
Katharina Schmack ◽  
Maria Sekutowicz ◽  
Hannes Rössler ◽  
Eva J. Brandl ◽  
Daniel J. Müller ◽  
...  

Author(s):  
Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system: it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow results from active self-motion through the world or is instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Here, we review our current understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities.
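The active-versus-external distinction described above is commonly formalized as efference-copy cancellation. The following is a minimal sketch under that textbook assumption, not a model taken from this review (all names are illustrative): the forward-model prediction of the sensory consequences of the motor command is subtracted from the total afference, and the residual estimates externally generated motion.

```python
def exafference(afference, motor_command, forward_gain=1.0):
    """Estimate externally generated motion by subtracting the predicted
    sensory consequence of the subject's own motor command (reafference)
    from the total vestibular afference. `forward_gain` stands in for a
    learned forward model mapping motor commands to expected head
    velocity; all names here are illustrative."""
    predicted_reafference = forward_gain * motor_command
    return afference - predicted_reafference

active_turn = 30.0    # deg/s head velocity produced by an active turn
external_push = 5.0   # deg/s imposed by the world (e.g., a moving platform)
sensed = active_turn + external_push     # what the semicircular canals report
print(exafference(sensed, active_turn))  # ~5.0: the externally generated part
```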

