Editorial: Human motion perception, eye movements, and orientation in visual space

2000 ◽ Vol 59 (2) ◽ pp. 85-88 ◽ Author(s): Rudolf Groner, Marina T. Groner, Kazuo Koga
2000 ◽ Vol 59 (2) ◽ pp. 108-114 ◽ Author(s): Kazuo Koga

Evidence is presented that eye movements strongly modulate the perceived motion of an object in an induced-motion situation. It was investigated whether pursuit eye movements affect motion perception, particularly the perception of target velocity, under the following stimulus conditions: (1) laterally moving objects on a computer display, (2) recurrent simple target motion, and (3) a unilaterally scrolling grid. The observers' eye movements were recorded while their velocity-perception responses were registered and analyzed in synchronization with the eye movement data. In most cases, when pursuit eye movements were synchronized with the movement of the target, the target was judged to be slow or motionless. An explanation of the results is presented that is based on two sources of motion information: (1) a displacement detector operating in retinal coordinates, and (2) a proprioceptive sensing unit associated with the eye movements. The veridicality of the judgments of object velocity was determined by the complexity of the processes that integrate the signals from the two channels.
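The two-channel account above can be summarized in a minimal sketch. The function and the `eye_gain` weight below are illustrative assumptions, not the authors' implementation: perceived velocity is taken as the sum of a retinal displacement signal and a proprioceptive estimate of eye velocity.

```python
# Minimal two-channel sketch (an assumption, not the authors' model):
# perceived velocity = retinal slip + weighted extraretinal eye-velocity signal.

def perceived_velocity(target_v, eye_v, eye_gain=1.0):
    """Combine retinal slip with an extraretinal eye-velocity signal.

    Velocities in deg/s; eye_gain weights the proprioceptive channel
    (1.0 = veridical, < 1.0 = underestimated eye movement).
    """
    retinal_slip = target_v - eye_v    # image motion on the retina
    extraretinal = eye_gain * eye_v    # sensed eye movement
    return retinal_slip + extraretinal

# During accurate pursuit the retinal slip is near zero, so an underweighted
# eye-velocity channel makes a 10 deg/s target look slow:
print(perceived_velocity(10.0, 10.0, eye_gain=0.6))  # 6.0
print(perceived_velocity(10.0, 0.0))                 # fixation: veridical 10.0
```

The example reproduces the reported pattern qualitatively: when pursuit matches the target, the percept depends almost entirely on the extraretinal channel, so any underestimation there makes the target appear slow or motionless.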


2015 ◽ Vol 35 (22) ◽ pp. 8515-8530 ◽ Author(s): Trishna Mukherjee, Matthew Battifarano, Claudio Simoncini, Leslie C. Osborne

2007 ◽ Vol 98 (3) ◽ pp. 1355-1363 ◽ Author(s): Miriam Spering, Karl R. Gegenfurtner

The analysis of visual motion serves many different functions, ranging from object motion perception to the control of self-motion. The perception of visual motion and the oculomotor tracking of a moving object are known to be closely related and are assumed to be controlled by shared brain areas. We compared perceived velocity and the velocity of smooth pursuit eye movements in human observers in a paradigm that required the segmentation of target object motion from context motion. In each trial, a pursuit target and a visual context were simultaneously and independently perturbed to briefly increase or decrease in speed. Observers had to accurately track the target and estimate target speed during the perturbation interval. Here we show that the same motion signals are processed in fundamentally different ways for perception and steady-state smooth pursuit eye movements. For the computation of perceived velocity, motion of the context was subtracted from target motion (motion contrast), whereas pursuit velocity was determined by the motion average (motion assimilation). We conclude that the human motion system uses these computations to optimally accomplish different functions: image segmentation for object motion perception and velocity estimation for the control of smooth pursuit eye movements.


2019 ◽ Vol 5 (1) ◽ pp. 247-268 ◽ Author(s): Peter Thier, Akshay Markanday

The cerebellar cortex is a crystal-like structure consisting of an almost endless repetition of a canonical microcircuit that applies the same computational principle to different inputs. The output of this transformation is broadcast to extracerebellar structures by way of the deep cerebellar nuclei. Visually guided eye movements are supported by different parts of the cerebellum. This review primarily discusses the role of the oculomotor part of the vermal cerebellum [the oculomotor vermis (OMV)] in the control of visually guided saccades and smooth-pursuit eye movements. Both types of eye movements require the mapping of retinal information onto motor vectors, a transformation that is optimized by the OMV, considering information on past performance. Unlike the role of the OMV in the guidance of eye movements, the contribution of the adjoining vermal cortex to visual motion perception is nonmotor and involves a cerebellar influence on information processing in the cerebral cortex.
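Optimizing the retinal-to-motor mapping "considering information on past performance" can be illustrated with a McLaughlin-style saccadic gain-adaptation loop. This is a behavioral-level sketch under assumed parameters (the back-step size and learning rate are illustrative), not a claim about the actual OMV circuitry:

```python
# Error-driven recalibration sketch: an intrasaccadic back-step creates a
# consistent post-saccadic retinal error, and the motor gain is nudged on
# each trial to reduce it. All parameter values are illustrative assumptions.

target = 10.0     # target eccentricity (deg)
backstep = 2.0    # intrasaccadic step of the target back toward fixation (deg)
gain = 1.0        # motor gain mapping retinal eccentricity to saccade size
lr = 0.3          # learning rate of the adaptive update

for _ in range(30):
    landing = gain * target                  # saccade amplitude this trial
    error = landing - (target - backstep)    # post-saccadic retinal error
    gain -= lr * error / target              # update gain from past performance

print(round(gain, 3))  # 0.8 -- converges to (target - backstep) / target
```

The loop settles at the gain that zeroes the post-saccadic error, which is the sense in which past performance optimizes the sensorimotor transformation.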


2021 ◽ Vol 11 (1) ◽ Author(s): Sichao Yang, Johannes Bill, Jan Drugowitsch, Samuel J. Gershman

Motion relations in visual scenes carry an abundance of behaviorally relevant information, but little is known about how humans identify the structure underlying a scene's motion in the first place. We studied the computations governing human motion structure identification in two psychophysics experiments and found that perception of motion relations showed hallmarks of Bayesian structural inference. At the heart of our research lies a tractable task design that enabled us to reveal the signatures of probabilistic reasoning about latent structure. We found that a choice model based on the task's Bayesian ideal observer accurately matched many facets of human structural inference, including task performance, perceptual error patterns, single-trial responses, participant-specific differences, and subjective decision confidence, especially when motion scenes were ambiguous and when object motion was hierarchically nested within other moving reference frames. Our work can guide future neuroscience experiments to reveal the neural mechanisms underlying higher-level visual motion perception.
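The core of a Bayesian ideal observer for structure identification is a posterior over candidate structures. The structure labels, log-likelihoods, and uniform prior below are illustrative assumptions, not values from the study; the point is only the form of the computation, in which the posterior doubles as decision confidence:

```python
import math

# Hedged sketch of Bayesian structural inference over candidate motion
# structures; all numbers below are illustrative, not from the study.

def structure_posterior(log_likelihoods, log_prior):
    """Posterior over structures via Bayes' rule (log-sum-exp for stability)."""
    logs = [ll + lp for ll, lp in zip(log_likelihoods, log_prior)]
    m = max(logs)
    unnorm = [math.exp(l - m) for l in logs]
    z = sum(unnorm)
    return [u / z for u in unnorm]

structures = ["independent", "global", "hierarchical"]  # hypothetical candidates
log_lik = [-4.0, -2.0, -1.5]            # how well each structure explains the scene
prior = [math.log(1.0 / 3.0)] * 3       # uniform prior over structures

post = structure_posterior(log_lik, prior)
choice = structures[max(range(len(post)), key=post.__getitem__)]
print(choice, round(max(post), 2))  # hierarchical 0.59
```

With similar likelihoods for two structures the posterior mass spreads out, so the same quantity that drives the choice also predicts lower confidence on ambiguous scenes, mirroring the pattern the abstract describes.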


2005 ◽ Vol 167 (4) ◽ pp. 504-525 ◽ Author(s): Igor Riečanský, Alexander Thiele, Claudia Distler, Klaus-Peter Hoffmann
