stimulus motion
Recently Published Documents

TOTAL DOCUMENTS: 98 (five years: 6)
H-INDEX: 30 (five years: 1)

2021 ◽  
Author(s):  
Gi-Yeul Bae ◽  
Steven J Luck

Computational models of motion perception suggest that read-out of the motion signal can yield a perceived direction opposite to the true stimulus motion direction. However, this possibility is not apparent in standard 2AFC motion discrimination tasks (e.g., leftward vs. rightward). By allowing the motion direction to vary over 360° in typical random-dot kinematogram (RDK) displays, and by asking observers to estimate the exact direction of motion, we were able to detect the presence of opposite-direction motion perception in RDKs. This opposite-direction motion perception was replicable across multiple display types and feedback conditions, and participants had greater confidence in their opposite-direction responses than in true guess responses. When we fed the RDKs into a computational model of motion processing, we found that the model estimated substantial motion activity in the direction opposite to the coherent stimulus direction, even though no such motion was objectively present in the stimuli, suggesting that opposite-direction motion perception may be a consequence of the properties of motion-selective neurons in visual cortex. Together, these results demonstrate that the perception of opposite-direction motion in RDKs is consistent with the known properties of the visual system.
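For readers unfamiliar with the stimulus class, the sketch below generates a low-coherence random-dot kinematogram of the kind described above. It is a minimal Python illustration, not the authors' stimulus code; the dot count, coherence, speed, and field size are assumed values chosen for readability.

```python
# Minimal RDK sketch (illustration only, not the authors' stimulus code).
import numpy as np

def rdk_frames(n_dots=200, n_frames=60, coherence=0.1,
               direction_deg=0.0, speed=2.0, field=400, seed=0):
    """Return dot positions (n_frames, n_dots, 2) for a simple RDK.

    A `coherence` fraction of dots steps in `direction_deg` on every frame;
    the remaining dots step in independent random directions.
    Dots wrap around the square field.
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, field, size=(n_dots, 2))
    coherent = rng.random(n_dots) < coherence
    theta = np.deg2rad(direction_deg)
    frames = np.empty((n_frames, n_dots, 2))
    for t in range(n_frames):
        # random directions for noise dots, a fixed direction for signal dots
        angles = rng.uniform(0, 2 * np.pi, size=n_dots)
        angles[coherent] = theta
        pos = (pos + speed * np.column_stack([np.cos(angles),
                                              np.sin(angles)])) % field
        frames[t] = pos
    return frames

frames = rdk_frames(coherence=0.1, direction_deg=90)
print(frames.shape)  # (60, 200, 2)
```

Varying `coherence` and `direction_deg` reproduces the basic manipulation of signal strength and true motion direction used in such displays.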


2020 ◽  
Author(s):  
Molly Anne Bowdring ◽  
Michael Sayette ◽  
Jeffrey M. Girard ◽  
William C. Woods

Physical attractiveness plays a central role in psychosocial experiences. One of the top research priorities has been to identify factors affecting perceptions of physical attractiveness (PPA). Recent work suggests PPA derives from different sources (e.g., target, perceiver, stimulus type). Although smiles in particular are believed to enhance PPA, support has been surprisingly limited. This study comprehensively examines the effect of smiles on PPA and, more broadly, evaluates the roles of target, perceiver, and stimulus type in PPA variation. Perceivers (n = 181) rated both static images and 5-sec videos of targets displaying smiling and neutral expressions. Smiling images were rated as more attractive than neutral-expression images (regardless of stimulus motion format). Interestingly, perceptions of physical attractiveness depended more on the perceiver than on either the target or the format in which the target was presented. Results clarify the effect of smiles on PPA and highlight the significant role of the perceiver.


2020 ◽  
Author(s):  
Giulia Sedda ◽  
David J. Ostry ◽  
Vittorio Sanguineti ◽  
Silvio P. Sabatini

Proper interpretation of visual information requires capturing the structural regularities in the visual signal, and this frequently occurs in conjunction with movement. Perceptual interpretation is complicated both by transient perceptual changes that accompany motor activity and, as found in audition and somatosensation, by more persistent changes that accompany the learning of new movements. Here we asked whether motor learning also results in sustained changes to visual perception. We designed a reaching task in which participants directly controlled the visual information they received, which we term self-operated stimuli. Specifically, they trained to make movements in a number of directions. Directional information was provided by the motion of an intrinsically ambiguous moving stimulus that was directly tied to the motion of the hand. We find that movement training improves perception of coherent stimulus motion and that changes in movement are correlated with the perceptual change. No perceptual changes are observed in passive observers, even when they are provided with an explicit strategy to solve perceptual grouping. Comparison of empirical perceptual data with simulations based on a Bayesian generative model of motion perception suggests that movement training promotes fine-tuning of the internal representation of stimulus geometry. These results emphasize the role of sensorimotor interaction in determining the persistent properties in space and time that define a percept.
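As a rough illustration of how a Bayesian generative account of motion perception can be read out, the sketch below computes a MAP estimate of motion direction by combining a von Mises likelihood around a noisy measurement with a von Mises prior. This is only a schematic stand-in, not the authors' model; the likelihood width, prior width, and prior mean are assumptions.

```python
# Hedged sketch of a Bayesian read-out of motion direction (not the authors' model).
import numpy as np

def map_direction(measured_deg, kappa_like=2.0,
                  prior_mean_deg=0.0, kappa_prior=1.0):
    """Grid-based MAP estimate of direction from one noisy measurement.

    Both the likelihood and the prior are von Mises densities over 0-360 deg.
    """
    grid = np.deg2rad(np.arange(0, 360))
    like = np.exp(kappa_like * np.cos(grid - np.deg2rad(measured_deg)))
    prior = np.exp(kappa_prior * np.cos(grid - np.deg2rad(prior_mean_deg)))
    post = like * prior
    post /= post.sum()
    return np.rad2deg(grid[np.argmax(post)])

# A broad measurement at 40 deg is pulled toward the prior centred on 0 deg.
print(map_direction(40.0))
```

In this framing, training could be thought of as sharpening or re-centring the prior, which in turn shifts the MAP estimate; that is an interpretive assumption, not a claim about the study's implementation.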


2020 ◽  
Vol 8 (2) ◽  
pp. 119-136
Author(s):  
Stefan Weber ◽  
David Weibel ◽  
Fred W. Mast

The velocity of moving stimuli has been linked to their experienced duration. This effect has been extended to instances of self-motion, where one's own movement affects the subjective length of time. However, the experimental evidence for this extension is scarce, and the effect of self-motion has not been investigated using a reproduction paradigm. We therefore designed a virtual reality scenario that controls for attention and eliminates the confound between velocity and acceleration. The scenario consisted of a virtual road along which participants (n = 26) moved in a car for six different durations and at six different velocities. We measured the subjective duration of the movement with reproduction and direct numerical estimation, and we also assessed levels of presence in the virtual world. Our results show that higher velocity was associated with longer subjective time for both forms of measurement, although the effect deviated from linearity. Presence was not associated with subjective time and did not improve performance on the task. We interpreted the effect of velocity as corroborating previous work using stimulus motion, which showed the same positive association between the velocity of movement and subjective time. The absence of an effect of presence was explained by subjective time not depending on characteristics of the virtual environment. We suggest applying our findings to the design of virtual experiences intended to induce time loss.
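The velocity effect reported above can be summarised with an ordinary least-squares model of reproduced duration on presented duration and velocity. The sketch below uses fabricated data purely to show the shape of such an analysis; none of the numbers come from the study.

```python
# Illustrative regression: reproduced ~ 1 + duration + velocity.
# All data below are simulated for demonstration only.
import numpy as np

rng = np.random.default_rng(1)
duration = rng.choice([2, 4, 6, 8, 10, 12], size=200).astype(float)   # seconds
velocity = rng.choice([10, 20, 30, 40, 50, 60], size=200).astype(float)
# fabricate reproductions with a small positive velocity effect plus noise
reproduced = 0.9 * duration + 0.02 * velocity + rng.normal(0, 0.5, 200)

X = np.column_stack([np.ones_like(duration), duration, velocity])
beta, *_ = np.linalg.lstsq(X, reproduced, rcond=None)
print(dict(zip(["intercept", "duration", "velocity"], beta.round(3))))
```

A positive coefficient on `velocity` corresponds to the reported finding that faster self-motion lengthens subjective duration; deviations from linearity would call for a nonlinear term, which is omitted here for brevity.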


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Martin Giesel ◽  
Alexandra Yakovleva ◽  
Marina Bloj ◽  
Alex R. Wade ◽  
Anthony M. Norcia ◽  
...  

When we track an object moving in depth, our eyes rotate in opposite directions. This type of "disjunctive" eye movement is called horizontal vergence. The sensory control signals for vergence arise from multiple visual cues, two of which, changing binocular disparity (CD) and inter-ocular velocity differences (IOVD), are specifically binocular. While it is well known that the CD cue triggers horizontal vergence eye movements, the role of the IOVD cue has only recently been explored. To better understand the relative contributions of the CD and IOVD cues in driving horizontal vergence, we recorded vergence eye movements from ten observers in response to four types of stimuli that isolated or combined the two cues to motion-in-depth, using stimulus conditions and CD/IOVD stimuli typical of behavioural motion-in-depth experiments. An analysis of the slopes of the vergence traces and the consistency between the directions of vergence and stimulus movements showed that, under our conditions, IOVD cues provided very little input to vergence mechanisms. The eye movements that did occur during the presentation of IOVD stimuli were likely not a response to stimulus motion but rather a phoria initiated by the absence of a disparity signal.
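The slope-and-consistency analysis described above can be sketched as a per-trial linear fit to the vergence trace, with the sign of the fitted slope compared against the stimulus direction. The sampling rate, response window, and toy trace below are assumptions for illustration, not the authors' parameters.

```python
# Hedged sketch of a vergence slope analysis (parameters are assumptions).
import numpy as np

def vergence_slope(trace, t, window=(0.2, 1.0)):
    """Least-squares slope (deg/s) of a vergence trace within a time window."""
    sel = (t >= window[0]) & (t <= window[1])
    slope, _ = np.polyfit(t[sel], trace[sel], 1)
    return slope

def direction_consistent(slope, stimulus_direction):
    """True if the sign of the vergence change matches the stimulus direction
    (+1 for approaching, -1 for receding)."""
    return np.sign(slope) == np.sign(stimulus_direction)

# toy trial: 1 s sampled at 500 Hz, converging response of about 0.5 deg/s
t = np.arange(0, 1.0, 0.002)
trace = 0.5 * t + np.random.default_rng(2).normal(0, 0.01, t.size)
s = vergence_slope(trace, t)
print(round(s, 2), direction_consistent(s, +1))
```

Averaging the consistency flag over trials gives the kind of direction-consistency measure the abstract refers to; how the authors defined their window and baseline is not specified here.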


Perception ◽  
2019 ◽  
Vol 48 (5) ◽  
pp. 386-401 ◽  
Author(s):  
Shinji Nakamura

When an observer views a uniformly moving visual stimulus, he or she typically perceives an illusory motion of the body in the opposite direction (vection). In this study, the effects of the visual inducer's perceived rigidity were examined using a horizontal sine-wave-like line stimulus that moved horizontally. Lowering the sine-wave amplitude resulted in the perception of less rigid visual stimulus motion, although the stimulus was always set to move completely rigidly. The psychophysical experiment revealed that visual self-motion perception was weaker in the lower-amplitude condition, where the visual stimulus was perceived as less rigid. Follow-up experiments showed that the effects of the amplitude manipulation were unrelated to modulation of the perceived speed. Furthermore, small gaps inserted into the sine waves effectively increased perceived rigidity and resulted in strong self-motion perception even in the lower-amplitude condition. The current investigation, together with previous studies, clearly demonstrates that perceived features, in addition to physical ones, play a key role in visual self-motion perception. Visual stimuli perceived as more rigid provide a more reliable frame of reference for observers' spatial orientation, thereby determining their self-motion perception.
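A rough sketch of the stimulus manipulation: a sine-wave line that translates rigidly from frame to frame while its amplitude is varied. The spatial frequency, amplitude, and speed values below are illustrative assumptions, not the values used in the study.

```python
# Hedged sketch of a rigidly translating sine-wave line stimulus.
import numpy as np

def sine_line(x, frame, amplitude=20.0, wavelength=100.0, speed=4.0):
    """Vertical displacement of the line at horizontal positions x on a frame.

    The whole waveform shifts rigidly by `speed` pixels per frame; lowering
    `amplitude` flattens the wave without changing its rigid motion.
    """
    phase = 2 * np.pi * (x - speed * frame) / wavelength
    return amplitude * np.sin(phase)

x = np.arange(0, 800)
low  = sine_line(x, frame=10, amplitude=5.0)    # reportedly seen as less rigid
high = sine_line(x, frame=10, amplitude=40.0)   # reportedly seen as more rigid
print(low[:3].round(2), high[:3].round(2))
```

Note that both versions translate identically; only the amplitude differs, which is the point of the manipulation described above.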


2018 ◽  
Author(s):  
William S. Tuten ◽  
Robert F. Cooper ◽  
Pavan Tiruveedhula ◽  
Alfredo Dubra ◽  
Austin Roorda ◽  
...  

Psychophysical inferences about the neural mechanisms supporting spatial vision can be undermined by uncertainties introduced by optical aberrations and fixational eye movements, particularly in the fovea, where the neuronal grain of the visual system is fine. We examined the effect of these pre-neural factors on photopic spatial summation in the human fovea using a custom adaptive optics scanning light ophthalmoscope that provided control over optical aberrations and retinal stimulus motion. Consistent with previous results, Ricco's area of complete summation encompassed multiple photoreceptors when measured with ordinary amounts of ocular aberrations and retinal stimulus motion. When both factors were minimized experimentally, summation areas were essentially unchanged, suggesting that foveal spatial summation is limited by post-receptoral neural pooling. We compared our behavioral data to predictions generated with a physiologically inspired front-end model of the visual system and were able to capture the shape of the summation curves obtained with and without pre-retinal factors using a single post-receptoral summing filter of fixed spatial extent. Given our data and modeling, neurons in the magnocellular visual pathway, such as parasol ganglion cells, provide a candidate neural correlate of Ricco's area in the central fovea.
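Ricco's area is conventionally located as the breakpoint of a two-limbed fit to log threshold versus log stimulus area, with a slope of -1 (complete summation) below the breakpoint. The sketch below fits such a function to fabricated data; the data, the second-limb slope, and the use of a generic curve fit are illustrative assumptions, not the authors' front-end model.

```python
# Hedged sketch: locate Ricco's area as the breakpoint of a two-limbed
# summation curve fitted in log-log coordinates. Data are fabricated.
import numpy as np
from scipy.optimize import curve_fit

def two_limb(log_area, log_ricco, thresh_at_break, slope2):
    """Slope of -1 below the breakpoint (Ricco's area), `slope2` above it."""
    below = thresh_at_break - 1.0 * (log_area - log_ricco)
    above = thresh_at_break + slope2 * (log_area - log_ricco)
    return np.where(log_area < log_ricco, below, above)

# fabricated example: log10 stimulus area (deg^2) and log10 threshold
log_area = np.linspace(-3.0, -0.5, 12)
log_thresh = two_limb(log_area, -2.0, 0.5, -0.3) \
             + np.random.default_rng(3).normal(0, 0.03, log_area.size)

params, _ = curve_fit(two_limb, log_area, log_thresh, p0=[-2.0, 0.5, -0.5])
print("estimated log10 Ricco's area:", round(params[0], 2))
```

The breakpoint parameter plays the role of Ricco's area; in the study this quantity was interpreted against a physiologically inspired model rather than a descriptive fit like the one shown here.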


2017 ◽  
Vol 236 (1) ◽  
pp. 243-252 ◽  
Author(s):  
Yoshitaka Fujii ◽  
Takeharu Seno ◽  
Robert S. Allison