The influence of dopamine-related genes on perceptual stability

2013, Vol 38 (9), pp. 3378–3383
Author(s): Katharina Schmack, Maria Sekutowicz, Hannes Rössler, Eva J. Brandl, Daniel J. Müller, et al.
2011, Vol 366 (1564), pp. 596–610
Author(s): Benjamin W. Tatler, Michael F. Land

One of the paradoxes of vision is that the world as it appears to us and the image on the retina at any moment are not much like each other. The visual world seems to be extensive and continuous across time. However, the manner in which we sample the visual environment is neither extensive nor continuous. How does the brain reconcile these differences? Here, we consider existing evidence from both static and dynamic viewing paradigms together with the logical requirements of any representational scheme that would be able to support active behaviour. While static scene viewing paradigms favour extensive, but perhaps abstracted, memory representations, dynamic settings suggest sparser and task-selective representation. We suggest that in dynamic settings where movement within extended environments is required to complete a task, visual input, egocentric representations, and allocentric representations work together to allow efficient behaviour. The egocentric model serves as a coding scheme in which actions can be planned, but also offers a potential means of providing the perceptual stability that we experience.
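The egocentric/allocentric distinction drawn above is, at its core, a coordinate-frame transform: the same landmark can be expressed in world-fixed (allocentric) coordinates or relative to the observer's position and heading (egocentric). A minimal 2-D sketch of that transform, with function names and the example values chosen purely for illustration (they are not from the study):

```python
import math

def world_to_egocentric(landmark_xy, observer_xy, heading_rad):
    """Express an allocentric (world-fixed) landmark position in
    egocentric coordinates: translate by the observer's position,
    then rotate the offset into the observer's reference frame."""
    dx = landmark_xy[0] - observer_xy[0]
    dy = landmark_xy[1] - observer_xy[1]
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    ex = cos_h * dx + sin_h * dy   # component along the heading (ahead/behind)
    ey = -sin_h * dx + cos_h * dy  # component orthogonal to it (left/right)
    return ex, ey

# A landmark one unit north of an observer who faces north (heading = pi/2)
# lies straight ahead in egocentric coordinates.
print(world_to_egocentric((0.0, 1.0), (0.0, 0.0), math.pi / 2))
```

Because the egocentric representation is anchored to the body, it is the natural frame for action planning; the allocentric map stays stable while the egocentric one is updated with each movement.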


2011, Vol 106 (4), pp. 1862–1874
Author(s): Jan Churan, Daniel Guitton, Christopher C. Pack

Our perception of the positions of objects in our surroundings is surprisingly unaffected by movements of the eyes, head, and body. This suggests that the brain has a mechanism for maintaining perceptual stability, based either on the spatial relationships among visible objects or on internal copies of its own motor commands. Strong evidence for the latter mechanism comes from the remapping of visual receptive fields that occurs around the time of a saccade. Remapping occurs when a single neuron responds to visual stimuli placed presaccadically in the spatial location that will be occupied by its receptive field after the completion of a saccade. Although evidence for remapping has been found in many brain areas, relatively little is known about how it interacts with sensory context. This interaction is important for understanding perceptual stability more generally, as the brain may rely on extraretinal signals or visual signals to different degrees in different contexts. Here, we have studied the interaction between visual stimulation and remapping by recording from single neurons in the superior colliculus of the macaque monkey, using several different visual stimulus conditions. We find that remapping responses are highly sensitive to low-level visual signals, with the overall luminance of the visual background exerting a particularly powerful influence. Specifically, although remapping was fairly common in complete darkness, such responses were usually decreased or abolished in the presence of modest background illumination. Thus the brain might make use of a strategy that emphasizes visual landmarks over extraretinal signals whenever the former are available.
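The remapping described above has a simple geometric core: just before a saccade, a neuron becomes responsive at the screen location its receptive field will occupy once the eye lands, i.e., the current field shifted by the impending saccade vector. A toy sketch of that geometry (function name and coordinates are illustrative assumptions, not the recording setup):

```python
def future_receptive_field(current_rf, saccade_vector):
    """Predictive remapping in its simplest geometric form: the
    presaccadically responsive screen location is the current
    receptive-field centre displaced by the saccade vector.
    Coordinates are in degrees relative to the current fixation."""
    return (current_rf[0] + saccade_vector[0],
            current_rf[1] + saccade_vector[1])

# RF centred 5 deg right of fixation; a 10-deg leftward saccade means the
# post-saccadic field will cover 5 deg left of the current fixation point.
print(future_receptive_field((5.0, 0.0), (-10.0, 0.0)))  # (-5.0, 0.0)
```

A stimulus flashed at that predicted location before the eye moves can drive the neuron, which is the signature response the study probes under different background-luminance conditions.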


2010, Vol 10 (7), p. 518
Author(s): F. Ostendorf, J. Kilias, C. Ploner

2016, Vol 4 (2), pp. 187–206
Author(s): Trevor B. Penney, Xiaoqin Cheng, Yan Ling Leow, Audrey Wei Ying Bay, Esther Wu, et al.

A transient suppression of visual perception during saccades ensures perceptual stability. In two experiments, we examined whether saccades affect time perception of visual and auditory stimuli in the seconds range. Specifically, participants completed a duration reproduction task in which they memorized the duration of a 6 s timing signal during the training phase and later reproduced that duration during the test phase. Four experimental conditions differed in saccade requirements and the presence or absence of a secondary discrimination task during the test phase. For both visual and auditory timing signals, participants reproduced longer durations when the secondary discrimination task required saccades to be made (i.e., overt attention shift) during reproduction as compared to when the discrimination task merely required fixation at screen center. Moreover, greater total saccade duration in a trial resulted in greater time distortion. However, in the visual modality, requiring participants to covertly shift attention (i.e., no saccade) to complete the discrimination task increased reproduced duration as much as making a saccade, whereas in the auditory modality making a saccade increased reproduced duration more than making a covert attention shift. In addition, we examined microsaccades in the conditions that did not require full saccades for both the visual and auditory experiments. Greater total microsaccade duration in a trial resulted in greater time distortion in both modalities. Taken together, the experiments suggest that saccades and microsaccades affect seconds-range visual and auditory interval timing via attention and saccadic suppression mechanisms.


Author(s): Nicholas J. Wade, Benjamin W. Tatler

Author(s): Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world, and of our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or whether it was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.
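The integration of vestibular and nonvestibular cues mentioned above is commonly modeled as inverse-variance (Bayesian) cue combination: each cue is weighted by its reliability, and the fused estimate is more reliable than any single cue. The sketch below is a generic illustration of that standard model under simple independence assumptions, not the review's own computation; the numbers are made up:

```python
def fuse_cues(estimates, variances):
    """Combine independent self-motion cues (e.g. vestibular, visual,
    proprioceptive) by inverse-variance weighting: the lower a cue's
    variance, the larger its weight in the fused estimate."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_var = 1.0 / total  # fused estimate beats every individual cue
    return fused, fused_var

# Vestibular cue: 10 deg/s (variance 1.0); visual cue: 12 deg/s (variance 4.0).
# The fused estimate lands closer to the more reliable vestibular cue.
print(fuse_cues([10.0, 12.0], [1.0, 4.0]))  # (10.4, 0.8)
```

Distinguishing active from externally generated motion then amounts to adding a motor-based prediction as one more source of evidence about the expected sensory inflow.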


i-Perception, 2020, Vol 11 (3), pp. 204166952093732
Author(s): Masahiko Terao, Shin’ya Nishida

Many studies have investigated various effects of smooth pursuit on visual motion processing, especially the effects related to the additional retinal shifts produced by eye movement. In this article, we show that the perception of apparent motion during smooth pursuit is determined by the interelement proximity in retinal coordinates and also by the proximity in objective world coordinates. In Experiment 1, we investigated the perceived direction of the two-frame apparent motion of a square-wave grating with various displacement sizes under fixation and pursuit viewing conditions. The retinal and objective displacements between the two frames agreed with each other under the fixation condition. However, the displacements differed by 180 degrees in terms of phase shift under the pursuit condition. The proportions of the reported motion direction between the two viewing conditions did not coincide when they were plotted as a function of either the retinal displacement or of the objective displacement; however, they did coincide when plotted as a function of a mixture of the two. The result from Experiment 2 showed that the perceived jump size of the apparent motion was also dependent on both retinal and objective displacements. Our findings suggest that the detection of apparent motion during smooth pursuit takes into account both retinal proximity and objective proximity. This mechanism may assist with the selection of a motion path that is more likely to occur in the real world and, therefore, be useful for ensuring perceptual stability during smooth pursuit.
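The central claim above, that apparent motion is driven by a mixture of retinal and objective displacement rather than either alone, can be sketched as a weighted sum. This is a toy illustration of the idea only: the mixing weight is a free parameter here, not a value estimated in the study, and the function names are invented:

```python
def effective_displacement(retinal_shift, objective_shift, w_retinal=0.5):
    """Toy mixture model: the displacement driving apparent motion is a
    weighted combination of the shift on the retina and the shift in
    objective world coordinates (positive = rightward, in arbitrary units)."""
    return w_retinal * retinal_shift + (1.0 - w_retinal) * objective_shift

def perceived_direction(retinal_shift, objective_shift, w_retinal=0.5):
    """Report the sign of the effective displacement as a direction."""
    d = effective_displacement(retinal_shift, objective_shift, w_retinal)
    return "rightward" if d > 0 else "leftward"

# During pursuit the retinal and objective shifts of the grating can
# disagree; the mixture decides which direction wins perceptually.
print(perceived_direction(retinal_shift=-0.3, objective_shift=0.7))  # rightward
```

Plotting response proportions against this mixed displacement, rather than against either coordinate alone, is what made the fixation and pursuit data coincide in Experiment 1.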


PLoS ONE, 2016, Vol 11 (8), pp. e0160772
Author(s): Veith A. Weilnhammer, Philipp Sterzer, Guido Hesselmann

2009, Vol 101 (1), pp. 141–149
Author(s): Mohsen Jamali, Soroush G. Sadeghi, Kathleen E. Cullen

The distinction between sensory inputs that are a consequence of our own actions and those that result from changes in the external world is essential for perceptual stability and accurate motor control. In this study, we investigated whether linear translations are encoded similarly during active and passive translations by the otolith system. Vestibular nerve afferents innervating the saccule or utricle were recorded in alert macaques. Single unit responses were compared during passive whole body, passive head-on-body, and active head-on-body translations (vertical, fore-aft, or lateral) to assess the relative influence of neck proprioceptive and efference copy-related signals on translational coding. The response dynamics of utricular and saccular afferents were comparable and similarly encoded head translation during passive whole body versus head-on-body translations. Furthermore, when monkeys produced active head-on-body translations with comparable dynamics, the responses of both regular and irregular afferents remained comparable to those recorded during passive movements. Our findings refute the proposal that neck proprioceptive and/or efference copy inputs, conveyed by the efferent system, modulate the responses of otolith afferents during active movements. We conclude that the vestibular periphery provides faithful information about linear movements of the head in space coordinates, regardless of whether they are self- or externally generated.

