Asymmetric Mislocalization of a Visual Flash Ahead of and behind a Moving Object

Perception ◽  
10.1068/p5415 ◽  
2005 ◽  
Vol 34 (6) ◽  
pp. 687-698 ◽  
Author(s):  
Katsumi Watanabe

When subjects localize a flash relative to another stationary stimulus, the flash appears displaced in the direction of nearby motion signals (position capture; Whitney and Cavanagh, 2000 Nature Neuroscience 3 954–959). Our previous study suggested that position capture is larger for a flash presented ahead of a moving stimulus than for a flash behind it (Watanabe et al, 2003 Perception 32 545–559). In the present study, I investigated the spatial asymmetry of position capture. Experiment 1 demonstrated that asymmetric position capture occurs primarily in a moving-object-centered coordinate frame. Experiment 2 provided evidence that the asymmetric position capture operates after the individuation of single visual objects. Finally, experiment 3 demonstrated that, when attention was reduced with a dual-task procedure, the asymmetric position capture increased. These results suggest that the spatial asymmetry of position capture occurs without attention, but that the spatial bias can be reduced by attention. Therefore, the mechanism underlying the asymmetric spatial bias may be different from attentive tracking (Cavanagh, 1992 Science 257 1563–1565) and from mislocalization during smooth pursuit (Brenner et al, 2001 Vision Research 41 2253–2259).

2006 ◽  
Vol 273 (1600) ◽  
pp. 2507-2512 ◽  
Author(s):  
Barrie W Roulston ◽  
Matt W Self ◽  
Semir Zeki

The mechanism of positional localization has recently been debated owing to interest in the flash-lag effect, which occurs when a briefly flashed stationary stimulus is perceived to lag behind a spatially aligned moving stimulus. Here we report positional mislocalization observed at motion offsets as well as at onsets. In the ‘flash-lead’ effect, a moving object is perceived to be behind a spatially concurrent stationary flash before the two disappear. With ‘reverse-repmo’, subjects mislocalize the final position of a moving bar in the direction opposite to the trajectory of motion. Finally, we demonstrate that simultaneous onset and offset effects lead to a perceived compression of visual space. By characterizing illusory effects observed at motion offsets as well as at onsets, we provide evidence that the perceived position of a moving object is the result of an averaging process over a short time period, weighted towards the most recent positions. Our account explains a variety of motion illusions, including the compression of moving shapes when viewed through apertures.
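The recency-weighted averaging account in this abstract can be sketched numerically. The window length, sampling, and exponential decay below are illustrative assumptions, not values from the paper:

```python
def perceived_position(positions, decay=0.5):
    """Recency-weighted average of recently sampled positions.

    positions: sampled positions of the moving object, oldest first.
    decay: exponential down-weighting per step back in time (assumed value).
    """
    n = len(positions)
    weights = [decay ** (n - 1 - i) for i in range(n)]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, positions)) / total

# A bar moving rightward, sampled at 0..4 deg before it vanishes:
trajectory = [0.0, 1.0, 2.0, 3.0, 4.0]
p = perceived_position(trajectory)
# The weighted average lies behind the true final position (4.0 deg) but
# ahead of the unweighted mean (2.0 deg) -- the final position is
# mislocalized opposite to the motion direction, the "reverse-repmo" pattern.
```

Because the weights favor recent samples without ever reaching the last one exactly, the same computation yields a backward shift at motion offset and a compression when onset and offset effects combine.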


1965 ◽  
Vol 21 (1) ◽  
pp. 43-51 ◽  
Author(s):  
Grant D. Miller ◽  
Duane A. Anderson ◽  
Ernst Simonson

The relationship between stimulus velocity and the critical-flicker-fusion frequency (CFF) of an intermittent visual stimulus was investigated by modulating the sweep-speed and intensity of an oscilloscope beam. When Ss fixated upon a stationary point, CFF showed an approximately linear increase as a function of velocity. Velocity did not, however, influence CFF when S fixated on the moving stimulus. The multiple correlation (.68) between CFF determinations obtained with a stationary stimulus vs those obtained with several different velocities implies that the same mechanisms which determined CFF under the former condition were also operative in the latter. The trend of the bivariate correlations between the average CFF values for isolated pairs of experimental conditions suggests that an additional factor, possibly spatial acuity, may have become progressively dominant as velocities exceeded 1.08°/sec.


2007 ◽  
Vol 97 (2) ◽  
pp. 1353-1367 ◽  
Author(s):  
Miriam Spering ◽  
Karl R. Gegenfurtner

Segregating a moving object from its visual context is particularly relevant for the control of smooth-pursuit eye movements. We examined the interaction between a moving object and a stationary or moving visual context to determine the role of the context motion signal in driving pursuit. Eye movements were recorded from human observers to a medium-contrast Gaussian dot that moved horizontally at constant velocity. A peripheral context consisted of two vertically oriented sinusoidal gratings, one above and one below the stimulus trajectory, that were either stationary or drifted at different velocities in the same direction as the target or in the opposite direction. We found that a stationary context impaired pursuit acceleration and velocity and prolonged pursuit latency. A drifting context enhanced pursuit performance, irrespective of its motion direction. This effect was modulated by context contrast and orientation. When the context was briefly perturbed to move faster or slower, eye velocity changed accordingly, but only when the context was drifting along with the target. Perturbing the context in the direction orthogonal to target motion evoked a deviation of the eye opposite to the perturbation direction. We therefore provide evidence for the use of absolute and relative motion cues, or motion assimilation and motion contrast, in the control of smooth-pursuit eye movements.


2021 ◽  
Author(s):  
Audrey Morrow ◽  
Jason Samaha

Theories of perception based on discrete sampling posit that visual consciousness is reconstructed from snapshot-like perceptual moments, as opposed to being updated continuously. According to a model proposed by Schneider (2018), discrete sampling can explain both the flash-lag and the Fröhlich illusion, whereby a lag in the conscious updating of a moving stimulus alters its perceived spatial location relative to a stationary stimulus. The alpha-band frequency, which is associated with phasic modulation of stimulus detection and the temporal resolution of perception, has been proposed to reflect the duration of perceptual moments. The goal of this study was to determine whether a single oscillator (e.g., alpha) underlies the duration of perceptual moments, which would predict that the points of subjective equality (PSE) in the flash-lag and Fröhlich illusions are positively correlated across individuals. Although our displays induced robust flash-lag and Fröhlich effects, virtually zero correlation was seen between the PSEs in the two illusions, indicating that the illusion magnitudes are unrelated across observers. These findings suggest that, if discrete sampling theory is true, these illusory percepts either rely on different oscillatory frequencies or not on oscillations at all. Alternatively, discrete sampling may not be the mechanism underlying these two motion illusions, or our methods were ill-suited to test the theory.
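The individual-differences test described here reduces to correlating the two PSEs across observers. A minimal sketch with the standard Pearson coefficient follows; the per-observer PSE values are invented purely for illustration:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-observer PSEs (deg) for the two illusions:
flash_lag_pse = [0.8, 1.1, 0.6, 1.4, 0.9]
frohlich_pse  = [1.0, 0.7, 1.2, 0.8, 1.1]
r = pearson_r(flash_lag_pse, frohlich_pse)
# A shared-oscillator account predicts a clearly positive r across
# observers; the study instead found r near zero.
```

A single shared sampling rate would make observers with a long perceptual moment show large PSEs in both illusions at once, which is exactly what the near-zero correlation fails to show.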


2002 ◽  
Vol 87 (4) ◽  
pp. 1772-1780 ◽  
Author(s):  
Sophie de Brouwer ◽  
Marcus Missal ◽  
Graham Barnes ◽  
Philippe Lefèvre

During visual tracking of a moving stimulus, primates orient their visual axis by combining two very different types of eye movements, smooth pursuit and saccades. The purpose of this paper was to investigate quantitatively the catch-up saccades occurring during sustained pursuit. We used a ramp-step-ramp paradigm to evoke catch-up saccades during sustained pursuit. In general, catch-up saccades followed unexpected steps in the position and velocity of the target. We observed catch-up saccades in the same direction as the smooth eye movement (forward saccades) as well as in the opposite direction (reverse saccades). We compared the main sequences of forward saccades, reverse saccades, and control saccades made to stationary targets. All three were significantly different from each other and fully compatible with the hypothesis that the smooth pursuit component is added to the saccadic component during catch-up saccades. A multiple linear regression analysis was performed on the saccadic component to find the parameters determining the amplitude of catch-up saccades. We found that both position error and retinal slip are taken into account in catch-up saccade programming to predict the future trajectory of the moving target. We also demonstrated that the saccadic system needs a minimum period of approximately 90 ms to take changes in target trajectory into account. Finally, we reported a saturation (above 15°/s) in the contribution of retinal slip to the amplitude of catch-up saccades.
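The regression result can be illustrated with a toy model of catch-up saccade programming. The 15°/s saturation figure comes from the abstract, but the exact functional form, the look-ahead interval, and the unit slip weighting below are assumptions for illustration, not the paper's fitted coefficients:

```python
def catchup_saccade_amplitude(position_error, retinal_slip,
                              prediction_interval=0.1,
                              slip_saturation=15.0):
    """Toy model: saccade amplitude compensates the current position error
    plus the extra error predicted to accrue from retinal slip.

    position_error: target-minus-eye position (deg).
    retinal_slip: target-minus-eye velocity (deg/s); its contribution
        saturates above ~15 deg/s, as reported.
    prediction_interval: assumed look-ahead time (s), not from the paper.
    """
    clipped = max(-slip_saturation, min(slip_saturation, retinal_slip))
    return position_error + clipped * prediction_interval

# Forward catch-up saccade: target 2 deg ahead, drifting away at 10 deg/s:
amp = catchup_saccade_amplitude(2.0, 10.0)       # 2.0 + 10 * 0.1 = 3.0 deg
# Above saturation, the slip term stops growing:
amp_fast = catchup_saccade_amplitude(2.0, 40.0)  # 2.0 + 15 * 0.1 = 3.5 deg
```

A negative result for a positive error would correspond to a reverse saccade: if pursuit has overtaken the target, the slip term can outweigh the position term and pull the programmed amplitude backwards.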


Author(s):  
Agnes Wong

Smooth pursuit consists of conjugate eye movements that allow both eyes to smoothly track a slowly moving object so that its image is kept on the foveae. For example, smooth pursuit eye movements are used when you track a child on a swing. Only animals with foveae make smooth pursuit eye movements. Rabbits, for instance, do not have foveae, and their eyes cannot track a small moving target. However, if a rabbit is placed inside a rotating drum painted on the inside with stripes so that the rabbit sees the entire visual field rotating en bloc, it will track the stripes optokinetically. Humans have both smooth pursuit and optokinetic eye movements, but pursuit predominates. When you track a small, moving object against a detailed stationary background, such as a bird flying against a background of leaves, the optokinetic system will try to hold your gaze on the stationary background, but it is overridden by pursuit. Pursuit works well at speeds up to about 70°/sec, but top athletes may generate pursuit as fast as 130°/sec. Pursuit responds slowly to unexpected changes—it takes about 100 msec to track a target that starts to move suddenly, and this is why we need the faster-acting vestibulo-ocular reflex (VOR) to stabilize our eyes when our heads move. However, pursuit can detect patterns of motion and respond to predictable target motion in much less than 100 msec. Pursuit cannot be generated voluntarily without a suitable target. If you try to pursue an imaginary target moving across your visual field, you will make a series of saccades instead of pursuit. However, the target that evokes pursuit does not have to be visual; it may be auditory (e.g., a moving, beeping pager), proprioceptive (e.g., tracking your outstretched finger in the dark), tactile (e.g., an ant crawling on your arm in the dark), or cognitive (e.g., tracking stroboscopic motion, in which a series of lights flashes in sequence even though no actual motion occurs).


Perception ◽  
1993 ◽  
Vol 22 (9) ◽  
pp. 1111-1119 ◽  
Author(s):  
Nicholas J Wade ◽  
Michael T Swanston

Visual motion of a physically stationary stimulus can be induced by the movement of adjacent stimuli. The frequencies of motion reports and the angular separations required to induce motion were determined for a number of stimulus configurations. A stationary stimulus was fixated in the centre of the display and the point at which induced motion was initially reported was measured. In the first experiment either one or two stationary stimuli were presented in the centre of a display and either one or two similar stimuli moved horizontally towards them. The percentage of trials on which motion was induced varied with the display configuration, being greatest with two moving stimuli and one stationary stimulus. The angular separations at which motion was reported were about 2 deg for all conditions. In the second experiment the binocular interaction of such induced motion was examined. A single static fixation stimulus was presented binocularly and a range of monocular or dichoptic conditions was examined: a single moving stimulus to one eye, two moving stimuli to one eye, or two moving stimuli dichoptically. Induced motion was reported on about 90% of the trials for the monocular and dichoptic conditions with two moving stimuli. Motion was first induced at similar angular separations by two moving stimuli, whether presented monocularly or dichoptically. Binocular interaction was further examined with a display that induced motion in the stimulus presented to one eye but not in that presented to the other: this resulted in the apparent motion in depth of the binocularly fixated stimulus.


1999 ◽  
Vol 36 (2) ◽  
pp. 158-163 ◽  
Author(s):  
Norbert Kathmann ◽  
Andrea Hochrein ◽  
Ruth Uwer
