The Effect of Temporal Phase on the Perception of Apparent Motion

Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 68-68
Author(s):  
H S Hock ◽  
K Kogan ◽  
N Lodes

In classical apparent motion, a spot of light is presented in alternation such that the waveforms describing the varying luminance at each of two locations are 180° out of phase. However, when the luminance variation at each location is approximately sinusoidal, and the perceiver's task is to discriminate motion direction, the optimum temporal phase is 90° (van Santen and Sperling, 1984, Journal of the Optical Society of America A 1 451 – 473). The results reported in this study suggest that the optimality of the 90° temporal phase may be specific to the direction-discrimination task. Our experiments were based on a new procedure for measuring classical apparent-motion thresholds (Hock, Kogan, and Espinoza, 1996, paper presented at ARVO). Two horizontally displaced dots are presented simultaneously against a darker background. The luminance (L1) of one dot is always greater than that of the other (L2), and the luminance values for the dots are exchanged on successive frames. Whether motion or stationarity is perceived depends on the background-relative luminance contrast (BRLC): the luminance difference (L1 − L2) divided by the difference between the dots' average luminance [(L1 + L2)/2] and the background luminance. We found in the current study that motion thresholds depend on the temporal phase of the luminance variation at each location (rather than on temporal asynchrony); the greater the phase difference (from 41° to 180°), the smaller the BRLC required for motion perception. At suprathreshold BRLC values, the perceived speed of apparent motion decreases with increased differences in temporal phase. The results are discussed in terms of Reichardt-type motion-detection models.
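
As a worked illustration of the BRLC definition quoted above, a minimal sketch follows; the function name and the example luminance values are illustrative assumptions, not values taken from the study.

```python
# Minimal sketch of the background-relative luminance contrast (BRLC) as
# defined in the abstract; the example luminance values are illustrative.

def brlc(L1: float, L2: float, L_background: float) -> float:
    """BRLC = (L1 - L2) / ((L1 + L2)/2 - L_background)."""
    mean_luminance = (L1 + L2) / 2.0
    return (L1 - L2) / (mean_luminance - L_background)

# Example: dots of 60 and 40 cd/m^2 on a 10 cd/m^2 background
print(brlc(60.0, 40.0, 10.0))  # (60 - 40) / (50 - 10) = 0.5
```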

Author(s):  
Filippo Ghin ◽  
Louise O’Hare ◽  
Andrea Pavan

There is evidence that high-frequency transcranial random noise stimulation (hf-tRNS) is effective in improving behavioural performance in several visual tasks. However, so far there has been limited research into the spatial and temporal characteristics of hf-tRNS-induced facilitatory effects. In the present study, electroencephalography (EEG) was used to investigate the spatial and temporal dynamics of cortical activity modulated by offline hf-tRNS during a motion direction discrimination task. We used EEG to measure the amplitude of motion-related VEPs over the parieto-occipital cortex, as well as oscillatory power spectral density (PSD) at rest. A time–frequency decomposition analysis was also performed to investigate the shift in event-related spectral perturbation (ERSP) in response to the motion stimuli between the pre- and post-stimulation periods. The results showed that accuracy on the motion direction discrimination task was not modulated by offline hf-tRNS. Although the motion task elicited motion-dependent VEP components (P1, N2, and P2), none of them showed any significant change between pre- and post-stimulation. We also found a time-dependent increase of the PSD in the alpha and beta bands regardless of the stimulation protocol. Finally, the time–frequency analysis showed a modulation of ERSP power in the hf-tRNS condition for gamma activity when compared to the pre-stimulation period and Sham stimulation. Overall, these results show that offline hf-tRNS may induce moderate aftereffects in brain oscillatory activity.
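
As background for the ERSP measure mentioned above, the sketch below shows one common way to compute a wavelet-based time–frequency decomposition and express power as a dB change from a pre-stimulus baseline. It is a generic illustration with assumed parameters (sampling rate, frequencies, number of cycles), not the authors' analysis pipeline.

```python
# Generic ERSP-style time-frequency decomposition with Morlet wavelets
# (NumPy only); all parameters are assumptions, not the study's settings.
import numpy as np

def morlet_power(signal, sfreq, freqs, n_cycles=7):
    """Return wavelet power with shape (n_freqs, n_times)."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)              # temporal width of the wavelet
        t = np.arange(-5 * sigma_t, 5 * sigma_t, 1 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit-energy normalisation
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

def ersp_db(power, times, baseline=(-0.5, 0.0)):
    """Express power as dB change relative to a pre-stimulus baseline window."""
    base = power[:, (times >= baseline[0]) & (times < baseline[1])].mean(axis=1, keepdims=True)
    return 10 * np.log10(power / base)

# Example: a noisy 40 Hz (gamma-band) burst starting at stimulus onset (t = 0)
sfreq = 500.0
times = np.arange(-0.5, 1.0, 1 / sfreq)
sig = np.random.randn(times.size) + 2 * np.sin(2 * np.pi * 40 * times) * (times > 0)
tf = ersp_db(morlet_power(sig, sfreq, freqs=np.arange(8, 60, 2)), times)
```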


2004 ◽  
Vol 16 (1) ◽  
pp. 1-38 ◽  
Author(s):  
Rajesh P. N. Rao

A large number of human psychophysical results have been successfully explained in recent years using Bayesian models. However, the neural implementation of such models remains largely unclear. In this article, we show that a network architecture commonly used to model the cerebral cortex can implement Bayesian inference for an arbitrary hidden Markov model. We illustrate the approach using an orientation discrimination task and a visual motion detection task. In the case of orientation discrimination, we show that the model network can infer the posterior distribution over orientations and correctly estimate stimulus orientation in the presence of significant noise. In the case of motion detection, we show that the resulting model network exhibits direction selectivity and correctly computes the posterior probabilities over motion direction and position. When used to solve the well-known random dots motion discrimination task, the model generates responses that mimic the activities of evidence-accumulating neurons in cortical areas LIP and FEF. The framework we introduce posits a new interpretation of cortical activities in terms of log posterior probabilities of stimuli occurring in the natural world.
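
To make the kind of computation described above concrete, here is a minimal sketch of recursive Bayesian filtering for a discrete hidden Markov model in log space; the two-state leftward/rightward setup, the transition matrix, and the likelihood values are illustrative assumptions, not the paper's network model.

```python
# Minimal sketch of hidden Markov model filtering in log space, the kind of
# posterior computation the article maps onto recurrent cortical activity.
# The two-state setup and all numerical values are illustrative assumptions.
import numpy as np
from scipy.special import logsumexp

def log_posterior_update(log_post, log_T, log_lik):
    """One time step: predict via the transition matrix, then weight by the likelihood."""
    # log_post: (n_states,) log posterior over hidden states at t-1
    # log_T[i, j] = log P(s_t = j | s_{t-1} = i)
    # log_lik:  (n_states,) log P(observation_t | s_t)
    log_pred = logsumexp(log_post[:, None] + log_T, axis=0)  # prediction step
    log_post = log_pred + log_lik                            # Bayes update
    return log_post - logsumexp(log_post)                    # renormalise

# Example: two motion directions with sticky dynamics and noisy evidence
log_T = np.log(np.array([[0.95, 0.05],
                         [0.05, 0.95]]))
log_post = np.log(np.array([0.5, 0.5]))                      # flat prior
for log_lik in np.log([[0.6, 0.4], [0.7, 0.3], [0.55, 0.45]]):
    log_post = log_posterior_update(log_post, log_T, np.asarray(log_lik))
print(np.exp(log_post))                                      # posterior over directions
```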


2017 ◽  
Author(s):  
T. Scott Murdison ◽  
Dominic Standage ◽  
Philippe Lefèvre ◽  
Gunnar Blohm

Recent psychophysical and modeling studies have revealed that sensorimotor reference frame transformations (RFTs) add variability to motor output by decreasing the fidelity of sensory signals. How RFT stochasticity affects the sensory input underlying perceptual decisions, if at all, is unknown. To investigate this, we asked participants to perform a simple two-alternative motion direction discrimination task under varying conditions of head roll and/or stimulus rotation while responding with either a saccade or a button press, allowing us to attribute behavioral effects to eye-, head- and shoulder-centered reference frames. We observed a rotation-induced increase in reaction time and decrease in accuracy, indicating a degradation of motion evidence commensurate with a decrease in motion strength. Inter-participant differences in performance were best explained by a continuum of eye-head-shoulder representations of accumulated decision evidence, with eye- and shoulder-centered preferences during saccades and button presses, respectively. We argue that perceptual decision making and stochastic RFTs are inseparable, consistent with electrophysiological recordings in neural areas thought to be encoding sensorimotor signals for perceptual decisions. Furthermore, transformational stochasticity appears to be a generalized phenomenon, applicable throughout the perceptual and motor systems. We show for the first time that, by simply rolling one's head, perceptual decision making is impaired in a way that is captured by stochastic RFTs.

Significance statement: When exploring our environment, we typically maintain upright head orientations, often despite increased energy expenditure. One possible explanation for this apparently suboptimal behavior might come from the finding that sensorimotor transformations, required for generating geometrically correct behavior, add signal-dependent variability (stochasticity) to perception and action. Here, we explore the functional interaction of stochastic transformations and perceptual decisions by rolling the head and/or stimulus during a motion direction discrimination task. We find that, during visuomotor rotations, perceptual decisions are significantly impaired in both speed and accuracy in a way that is captured by stochastic transformations. Thus, our findings suggest that keeping one's head aligned with gravity is in fact ideal for making perceptual judgments about our environment.
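
To illustrate the general idea of a stochastic reference frame transformation (not the authors' fitted model), the toy simulation below makes left/right judgments about a near-vertical motion direction after rotating the retinal signal back through the head-roll angle, with transformation noise that grows with the size of that rotation; the noise model and every parameter are assumptions for demonstration only.

```python
# Toy simulation of a signal-dependent (stochastic) reference frame
# transformation: rotating retinal motion back through the head-roll angle
# adds noise proportional to the rotation, degrading left/right judgments.
# The noise model and all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def rot(deg):
    a = np.deg2rad(deg)
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

def judge_right(dir_world_deg, head_roll_deg, base_noise=0.05, noise_per_deg=0.004):
    """One left/right judgment made in shoulder coordinates."""
    v_world = np.array([np.cos(np.deg2rad(dir_world_deg)),
                        np.sin(np.deg2rad(dir_world_deg))])
    v_eye = rot(-head_roll_deg) @ v_world                    # retinal motion under head roll
    sigma = base_noise + noise_per_deg * abs(head_roll_deg)  # noise grows with the rotation
    v_shoulder = rot(head_roll_deg) @ v_eye + rng.normal(0.0, sigma, 2)
    return v_shoulder[0] > 0                                 # report "rightward" if x > 0

def accuracy(head_roll_deg, n=5000, dir_world_deg=85.0):     # 85 deg: nearly vertical, slightly rightward
    return np.mean([judge_right(dir_world_deg, head_roll_deg) for _ in range(n)])

print(accuracy(0.0), accuracy(30.0))   # accuracy drops when the head is rolled
```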


i-Perception ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 204166952110046
Author(s):  
Scinob Kuroki ◽  
Shin’ya Nishida

Motion detection is a fundamental sensory function for multiple modalities, including touch, but the mechanisms underlying tactile motion detection are not well understood. While previous findings supported the existence of high-level feature tracking, it remains unclear whether there also exists low-level motion sensing that directly detects a local spatio-temporal correlation in the skin-stimulation pattern. To elucidate this mechanism, we presented, on braille displays, tactile random-dot kinematograms similar to those widely used in visual motion research, which enabled us to manipulate feature trackability and various parameters of local motion independently. We found that human observers are able to detect the direction of difficult-to-track tactile motion presented to the fingers and palms. In addition, direction-discrimination performance was better when the stimuli were presented along the fingers than when presented across the fingers. These results indicate that low-level motion sensing, in addition to high-level tracking, contributes to tactile motion perception.
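
To show what detecting "a local spatio-temporal correlation" can mean at a low level, here is a minimal correlation-type (Reichardt-style) detector applied to a toy 1-D dot pattern; the stimulus and parameters are assumptions for illustration and do not reproduce the braille stimuli used in the study.

```python
# Minimal correlation-type (Reichardt-style) motion detector on a 1-D binary
# "dot" array; the stimulus and parameters are toy assumptions, not the
# braille-display stimuli used in the study.
import numpy as np

def reichardt_output(frames, dx=1):
    """Correlate each position at time t with its neighbour dx positions away
    at time t+1; positive output signals rightward motion, negative leftward."""
    f = np.asarray(frames, dtype=float)            # shape: (n_frames, n_positions)
    right = f[:-1, :-dx] * f[1:, dx:]              # x at t followed by x+dx at t+1
    left  = f[:-1, dx:]  * f[1:, :-dx]             # x at t followed by x-dx at t+1
    return right.sum() - left.sum()                # opponent stage

# A dot pattern drifting one position to the right per frame
base = np.array([1, 0, 0, 1, 0, 1, 0, 0])
frames = [np.roll(base, t) for t in range(4)]
print(reichardt_output(frames))                    # > 0, i.e. rightward
```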


2003 ◽  
Vol 96 (3_suppl) ◽  
pp. 1265-1280E ◽  
Author(s):  
Elisabetta Gyulai

The present study explored the effects of luminance, color, and distance on the direction of apparent motion for repeated stimuli of alternating colors. Two experiments were performed to specify what determines apparent motion direction when the stroboscopic alternative is between proximity and similarity in color. In Exp. 1 the influence of luminance on apparent motion was studied systematically with seven different background luminance values. The analysis indicated that apparent motion direction depends on the discriminability among the disks, in relation also to the background luminance values. A formula to predict apparent motion direction from luminance values was proposed. In Exp. 2 the influence of hue on apparent motion was studied, using stimuli with equiluminant disks on different backgrounds. The analysis indicated that stroboscopic unification was based mainly on proximity and that discriminability between the disks was rather poor, especially as apparent motion occurred even on an equiluminant background, as already pointed out by Ramachandran and Gregory in 1978. A different influence on apparent motion of the red-blue and red-green disks, as opposed to that of the blue-green ones, was also noted.

