The perception of object motion during smooth pursuit eye movements: Adjacency is not a factor contributing to the Filehne illusion

1988 ◽  
Vol 28 (4) ◽  
pp. 497-502 ◽  
Author(s):  
Bernd De Graaf ◽  
Alexander H. Wertheim

2005 ◽  
Vol 93 (4) ◽  
pp. 2279-2293 ◽  
Author(s):  
Julian M. Wallace ◽  
Leland S. Stone ◽  
Guillaume S. Masson

Pursuing an object with smooth eye movements requires an accurate estimate of its two-dimensional (2D) trajectory. This 2D motion computation requires that different local motion measurements be extracted and combined to recover the global object-motion direction and speed. Several combination rules have been proposed, such as vector averaging (VA), intersection of constraints (IOC), or 2D feature tracking (2DFT). To examine this computation, we investigated the time course of smooth pursuit eye movements driven by simple objects of different shapes. For type II diamonds (where the direction of true object motion is dramatically different from the vector average of the one-dimensional edge motions, i.e., VA ≠ IOC = 2DFT), ocular tracking is initiated in the vector-average direction. Over a period of less than 300 ms, the eye-tracking direction converges on the true object motion. The reduction of the tracking error starts before the closing of the oculomotor loop. For type I diamonds (where the direction of true object motion is identical to the vector-average direction, i.e., VA = IOC = 2DFT), there is no such bias. We quantified this effect by calculating the direction error between responses to types I and II and measuring its maximum value and time constant. At low contrast and high speeds, the initial bias in tracking direction is larger and takes longer to converge onto the actual object-motion direction. This effect is attenuated by the introduction of more 2D information, to the extent that it was totally abolished with a texture-filled type II diamond. These results suggest a flexible 2D computation for motion integration, which combines all available one-dimensional (edge) and 2D (feature) motion information to refine the estimate of object-motion direction over time.
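The VA/IOC distinction in this abstract can be made concrete with a small numerical sketch. Each edge of a translating diamond constrains the object velocity v through its normal n and normal speed s via n · v = s; the IOC solution is the velocity satisfying all constraints, while VA simply averages the 1D component-motion vectors s·n. The edge angles and the `ioc_velocity` / `vector_average` names below are illustrative assumptions, not values or code from the study; in a "type II" configuration both normals fall on the same side of the true motion, so the two estimates diverge.

```python
import numpy as np

def ioc_velocity(normals, speeds):
    # Intersection of constraints: solve n_i . v = s_i for v
    # (least squares, so it also handles more than two edges).
    n = np.asarray(normals, dtype=float)
    s = np.asarray(speeds, dtype=float)
    v, *_ = np.linalg.lstsq(n, s, rcond=None)
    return v

def vector_average(normals, speeds):
    # Vector average of the 1D component-motion vectors s_i * n_i.
    n = np.asarray(normals, dtype=float)
    s = np.asarray(speeds, dtype=float)
    return (s[:, None] * n).mean(axis=0)

# Illustrative "type II" geometry: both edge normals (at 20 and 70 deg)
# lie on the same side of the true, rightward object motion.
theta1, theta2 = np.deg2rad(20.0), np.deg2rad(70.0)
normals = [[np.cos(theta1), np.sin(theta1)],
           [np.cos(theta2), np.sin(theta2)]]
true_v = np.array([1.0, 0.0])
speeds = [np.dot(n, true_v) for n in normals]

v_ioc = ioc_velocity(normals, speeds)   # recovers the true direction
v_va = vector_average(normals, speeds)  # biased between the two normals
```

Here `v_ioc` comes back as the true rightward velocity, while `v_va` points upward of it, mirroring the initial pursuit bias the study reports for type II stimuli.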


2007 ◽  
Vol 98 (3) ◽  
pp. 1355-1363 ◽  
Author(s):  
Miriam Spering ◽  
Karl R. Gegenfurtner

The analysis of visual motion serves many different functions, ranging from object-motion perception to the control of self-motion. The perception of visual motion and the oculomotor tracking of a moving object are known to be closely related and are assumed to be controlled by shared brain areas. We compared perceived velocity and the velocity of smooth pursuit eye movements in human observers in a paradigm that required the segmentation of target-object motion from context motion. In each trial, a pursuit target and a visual context were perturbed simultaneously and independently to briefly increase or decrease in speed. Observers had to accurately track the target and estimate target speed during the perturbation interval. Here we show that the same motion signals are processed in fundamentally different ways for perception and steady-state smooth pursuit eye movements. For the computation of perceived velocity, motion of the context was subtracted from target motion (motion contrast), whereas pursuit velocity was determined by the motion average (motion assimilation). We conclude that the human motion system uses these computations to optimally accomplish different functions: image segmentation for object-motion perception and velocity estimation for the control of smooth pursuit eye movements.
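The contrast-versus-assimilation dissociation described above can be sketched as two opposing weightings of the same two signals. The function names and the weight `w` below are illustrative assumptions for exposition only; the study does not report a single fixed weight, and this is not the authors' model.

```python
def perceived_target_speed(target_speed, context_speed, w=0.3):
    # Motion contrast (perception): context motion is subtracted
    # from target motion. w is an illustrative free parameter.
    return target_speed - w * context_speed

def pursuit_speed(target_speed, context_speed, w=0.3):
    # Motion assimilation (steady-state pursuit): eye velocity
    # follows a weighted average of target and context motion.
    return (1.0 - w) * target_speed + w * context_speed

# A brief increase in context speed (2 -> 5 deg/s) then pushes the two
# systems in opposite directions for the same 10 deg/s target:
slower_percept = perceived_target_speed(10.0, 5.0) < perceived_target_speed(10.0, 2.0)
faster_pursuit = pursuit_speed(10.0, 5.0) > pursuit_speed(10.0, 2.0)
```

With these toy weightings, speeding up the context lowers the perceived target speed but raises steady-state pursuit velocity, which is the signature dissociation the abstract reports.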


2011 ◽  
Vol 70 ◽  
pp. 352-352 ◽  
Author(s):  
K Strand Brodd ◽  
K Rosander ◽  
H Grönqvist ◽  
G Holmström ◽  
B Strömberg ◽  
...  

1983 ◽  
Vol 79 (2-3) ◽  
pp. 190-192 ◽  
Author(s):  
G. Tedeschi ◽  
P. R. M. Bittencourt ◽  
A. T. Smith ◽  
A. Richens

1975 ◽  
Vol 44 (2) ◽  
pp. 111-115 ◽  
Author(s):  
Philip S. Holzman ◽  
Deborah L. Levy ◽  
Eberhard H. Uhlenhuth ◽  
Leonard R. Proctor ◽  
Daniel X. Freedman
