perceived velocity
Recently Published Documents

TOTAL DOCUMENTS: 52 (five years: 5)
H-INDEX: 15 (five years: 1)

2021
Author(s): Jane Yook, Lysha Lee, Simone Vossel, Ralph Weidner, Hinze Hogendoorn

In the flash-lag effect (FLE), a flash presented in spatiotemporal alignment with a moving object is often misperceived as lagging behind it. One proposed explanation for the illusion is based on predictive extrapolation of motion trajectories. In this interpretation, observers require an estimate of the object's velocity to anticipate its future positions, implying that the FLE depends on a neural representation of perceived velocity. By contrast, alternative models of the FLE based on differential latencies or temporal averaging should not rely on such a representation of velocity. Here, we test the extrapolation account by investigating whether the FLE is sensitive to illusory changes in perceived speed when physical speed is held constant. This was tested using rotational wedge stimuli with variable noise texture (Experiment 1) and luminance contrast (Experiment 2). We show that, for both manipulations, differences in perceived speed corresponded to differences in the FLE: dynamic versus static noise and low versus high contrast stimuli led to increases in both perceived speed and FLE magnitude. These effects were consistent across different textures and were not due to low-level factors. Our results support the idea that the FLE depends on a neural representation of velocity, consistent with mechanisms of motion extrapolation: the faster the perceived speed, the larger the extrapolation and the stronger the flash-lag.
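To make the extrapolation account's prediction concrete, the following minimal sketch (an illustration, not the authors' model) treats the flash-lag as the distance the object is extrapolated over a fixed neural delay at its perceived velocity; the 80 ms delay and the function name are assumptions.

```python
# Minimal sketch of the motion-extrapolation account of the FLE.
# Assumption: position is extrapolated over a fixed neural delay using the
# *perceived* (not physical) velocity, so the lag scales with perceived speed.

def predicted_flash_lag(perceived_speed_deg_s: float, neural_delay_s: float = 0.08) -> float:
    """Predicted spatial lag (deg) of the flash relative to the moving object."""
    return perceived_speed_deg_s * neural_delay_s

# An illusory speed-up (e.g., dynamic noise or low contrast) from 10 to 12 deg/s
# should enlarge the predicted FLE by the same 20%, with physical speed unchanged.
for v in (10.0, 12.0):
    print(f"perceived speed {v:4.1f} deg/s -> predicted lag {predicted_flash_lag(v):.2f} deg")
```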


2020
Vol. 23 (1)
Author(s): Katrine Okholm Kryger, Séan Mitchell, Diwei Zhou, Steph Forrester

Football boots are marketed with a focus on specific performance features; for example, power boots are marketed for optimal shooting performance. However, little evidence exists on the impact of boot design on shooting performance. This study assessed the effect of upper padding on shooting velocity and accuracy using a test setup with established test–retest reliability. Nine university-level football players performed a shooting protocol to (1) maximise velocity and (2) maximise accuracy, in football boots with and without upper padding (Poron Memory foam). The protocol was completed twice; the non-padded boot results were used for test–retest validation, while the non-padded versus padded results were used to investigate the effect of padding. Velocity was assessed through actual ball velocity, percentage of maximum velocity and perceived velocity. Accuracy was assessed through radial offset, vertical offset, horizontal offset, success (goal/no goal), zonal offset and perceived accuracy. No significant differences between boots were observed in the velocity measures for either velocity- or accuracy-focused shots. Significant differences between boots were observed in vertical offset for both accuracy-focused (without padding, mean ± standard deviation: −0.02 ± 1.05 m; with padding: 0.28 ± 0.87 m; P = 0.029) and velocity-focused (without padding: 0.04 ± 1.33 m; with padding: 0.38 ± 0.86 m; P = 0.042) shots, resulting in more missed shots above the goal for the padded boot (without padding 41–43% missed, with padding 56–72% missed). These findings suggest that the addition of upper padding negatively affects shooting accuracy while not affecting shooting velocity.
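As an illustration of the kind of within-player comparison reported above, here is a hedged sketch of a paired test on vertical offset for the padded versus non-padded boot. The per-player values are randomly generated placeholders matching the reported group means and standard deviations, and the choice of a paired t-test is an assumption; the abstract does not state which test was used.

```python
# Sketch of a paired comparison of vertical offset (padded vs. non-padded boot).
# The per-player values below are hypothetical; only the group means/SDs and
# P-values above come from the study (nine players, P = 0.029 for accuracy shots).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_players = 9
offset_no_pad = rng.normal(loc=-0.02, scale=1.05, size=n_players)  # m, hypothetical
offset_padded = rng.normal(loc=0.28, scale=0.87, size=n_players)   # m, hypothetical

# Paired test across players (an assumed analysis choice for a within-subject design).
res = stats.ttest_rel(offset_padded, offset_no_pad)
print(f"paired t = {res.statistic:.2f}, p = {res.pvalue:.3f}")
```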


2020
Vol. 33 (2), pp. 189-212
Author(s): Jianying Bai, Xin He, Yi Jiang, Tao Zhang, Min Bao

As a prominent illusion, the motion aftereffect (MAE) has traditionally been considered a visual phenomenon. Recent neuroimaging work has revealed increased activity in MT+ and decreased activity in vestibular regions during the MAE, supporting the notion of visual–vestibular interaction in the MAE. Since the head had to remain stationary in fMRI experiments, vestibular self-motion signals were absent in those studies, so more direct evidence is still lacking on whether and how vestibular signals modulate the MAE. By developing a virtual reality approach, the present study demonstrates for the first time that horizontal head rotation affects the perceived velocity of the MAE. We found that the MAE was predominantly perceived as moving faster when its direction was opposite to the direction of head rotation than when the two directions were the same. The magnitude of this effect was positively correlated with the velocity of head rotation. Similar patterns were not observed for real motion stimuli. Our findings support a ‘cross-modal bias’ hypothesis: after long-term exposure to a multisensory environment, the brain develops a strong association between signals from the visual and vestibular pathways. Consequently, vestibular input can spontaneously evoke weak visual signals biased in the associated direction in multisensory brain areas, substantially modulating the illusory visual motion represented in those areas. This hypothesis can also be used to explain other multisensory integration phenomena.
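A schematic reading of the cross-modal bias hypothesis is sketched below. It is an illustration consistent with the described effect, not the authors' quantitative model, and the bias_gain parameter is an assumption.

```python
# Schematic sketch of the 'cross-modal bias' idea: vestibular self-motion is assumed
# to evoke a weak visual-motion signal opposite to the head rotation, which adds to
# an MAE drifting opposite to the head and subtracts from an MAE drifting with it.

def perceived_mae_speed(mae_speed: float, mae_direction: int,
                        head_velocity: float, bias_gain: float = 0.1) -> float:
    """mae_direction is +1/-1; head_velocity carries its sign in the same convention.
    bias_gain is an assumed free parameter, not a fitted value."""
    vestibular_bias = -bias_gain * head_velocity   # bias opposite to head rotation
    return mae_speed + mae_direction * vestibular_bias

# An MAE drifting leftward (-1) during rightward head rotation (+20 deg/s) is perceived
# as faster than during leftward rotation (-20 deg/s); the difference grows with speed.
print(perceived_mae_speed(3.0, -1, +20.0))  # opposite directions -> 5.0 (faster)
print(perceived_mae_speed(3.0, -1, -20.0))  # same direction      -> 1.0 (slower)
```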


2019
Vol. 122 (4), pp. 1555-1565
Author(s): Alessandro Moscatelli, Cecile R. Scotto, Marc O. Ernst

In vision, the perceived velocity of a moving stimulus differs depending on whether we pursue it with the eyes or not: a stimulus moving across the retina while the eyes are stationary is perceived as faster than a stimulus of the same physical speed that the observer pursues with the eyes, even though its retinal motion is then zero. This effect is known as the Aubert–Fleischl phenomenon. Here, we describe an analogous phenomenon in touch. We asked participants to estimate the speed of a moving stimulus either from tactile motion alone (i.e., motion across the skin), while keeping the hand stationary in the world, or from kinesthesia alone, by tracking the stimulus with a guided arm movement such that tactile motion on the finger was zero (i.e., only finger motion, with no movement across the skin). Participants overestimated the velocity of the stimulus determined from tactile motion compared with kinesthesia, in analogy with the visual Aubert–Fleischl phenomenon. In two follow-up experiments, we manipulated the stimulus noise by changing the texture of the touched surface. As in the visual phenomenon, this significantly affected the strength of the illusion. This study supports the hypothesis of shared computations for motion processing between vision and touch.

NEW & NOTEWORTHY In vision, the perceived velocity of a moving stimulus differs depending on whether we pursue it with the eyes or not, an effect known as the Aubert–Fleischl phenomenon. We describe an analogous phenomenon in touch. We asked participants to estimate the speed of a moving stimulus either from tactile motion or by pursuing it with the hand. Participants overestimated the stimulus velocity measured from tactile motion compared with kinesthesia, in analogy with the visual Aubert–Fleischl phenomenon.
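The tactile Aubert–Fleischl bias can be summarized as a simple gain ratio between the two estimation conditions; the sketch below uses hypothetical speed estimates, since the abstract reports only the direction of the effect.

```python
# Summarizing the tactile Aubert-Fleischl effect as a gain ratio.
# A ratio > 1 means the surface felt faster when it moved across the stationary
# finger (tactile motion) than when the finger tracked it (kinesthesia only).

def aubert_fleischl_ratio(speed_tactile: float, speed_kinesthetic: float) -> float:
    return speed_tactile / speed_kinesthetic

# Hypothetical perceived-speed estimates (cm/s) for the same physical stimulus speed.
print(f"ratio = {aubert_fleischl_ratio(12.0, 9.0):.2f}")  # > 1: tactile overestimation
```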


Author(s): Takaaki Yasui, Fumihiro Akatsuka, Yoshihiko Nomura, Tokuhiro Sugiura

In recent years, methods of motor learning using haptic devices that deliver motion-related stimuli to learners have been studied. To design control systems for haptic devices that present stimuli learners can perceive through proprioception, we need to understand the characteristics of human position and velocity sensation. In this study, we therefore examined velocity JNDs (just noticeable differences) to characterize human velocity-change perception, focusing in particular on the effect of the acceleration applied during a velocity change. In the experiment, subjects' hands were accelerated at a constant acceleration of 1, 8, 16, or 32 deg/s2 from an initial velocity of 10 deg/s. Subjects reported whether or not they perceived a velocity change, and we measured the velocity JNDs. We found that while the acceleration increased by a factor of 32, the velocity JND decreased only by about half, from 8.1 to 4.2 deg/s. From this result, we conclude that the magnitude of acceleration is not a determinative factor in velocity-change perception but a supplementary one.
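A quick back-of-the-envelope check of how weak this dependence is: fitting a power law to the two reported endpoints (a simplification that ignores the intermediate 8 and 16 deg/s2 conditions) gives an exponent of roughly -0.19.

```python
# Power-law fit JND ~ a**k through the two reported endpoints:
# a 32-fold increase in acceleration (1 -> 32 deg/s^2) roughly halved the JND
# (8.1 -> 4.2 deg/s), i.e., a shallow exponent of about -0.19.
import math

a_low, a_high = 1.0, 32.0       # deg/s^2
jnd_low, jnd_high = 8.1, 4.2    # deg/s

k = math.log(jnd_high / jnd_low) / math.log(a_high / a_low)
print(f"power-law exponent k ~ {k:.2f}")  # ~ -0.19: acceleration is a weak factor
```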


2017
Author(s): W. Owen Brimijoin

The minimum audible movement angle increases as a function of source azimuth. If listeners do not perceptually compensate for this change in acuity, then sounds rotating around the head should appear to move faster at the front than at the side. We examined whether judgments of relative amounts of acoustic motion depend on signal center angle and found that the azimuth of two signals strongly affects their point of subjective similarity for motion. Signal motion centered at 90° had to be roughly twice as large as motion centered at 0° to be judged as equivalent. This distortion of acoustic space around the listener suggests that the perceived velocity of moving sound sources changes as a function of azimuth around the head. The “equivalent arc ratio,” a mathematical framework based on these results, successfully provides quantitative explanations for previously documented discrepancies in spatial localization, motion perception, and head-to-world coordinate transformations.
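The abstract gives only the endpoints of the effect (a ratio of about 1 at 0° and about 2 at 90°), not the functional form of the equivalent arc ratio, so the interpolation below is purely an assumed illustration chosen to match those endpoints.

```python
# Hypothetical interpolation of the azimuth-dependent scaling described above.
# The 1 + |sin(azimuth)| form is an assumption matching the reported endpoints
# (ratio ~1 at 0 deg, ~2 at 90 deg); it is not the paper's "equivalent arc ratio".
import math

def equivalent_arc_ratio(azimuth_deg: float) -> float:
    """How much larger an arc at this azimuth must be to be judged equivalent
    to the same arc centered at 0 deg (illustrative only)."""
    return 1.0 + abs(math.sin(math.radians(azimuth_deg)))

for az in (0, 45, 90):
    print(f"azimuth {az:3d} deg -> equivalent arc ratio {equivalent_arc_ratio(az):.2f}")
```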


2015
Vol. 114 (1), pp. 264-273
Author(s): Ryan M. Peters, Brandon G. Rasman, J. Timothy Inglis, Jean-Sébastien Blouin

Galvanic vestibular stimulation (GVS) evokes a perception of rotation; however, very few quantitative data exist on the matter. We performed psychophysical experiments on the virtual rotations experienced when binaural bipolar electrical stimulation is applied over the mastoids. We also performed analogous real whole-body yaw rotation experiments, allowing us to compare the frequency response of vestibular perception with (real) and without (virtual) natural mechanical stimulation of the semicircular canals. To estimate the gain of vestibular perception, we measured direction discrimination thresholds for virtual and real rotations. Real direction discrimination thresholds decreased at higher frequencies, confirming multiple previous studies. Conversely, virtual direction discrimination thresholds increased at higher frequencies, implying low-pass filtering in the virtual perception process, occurring potentially anywhere between afferent transduction and cortical responses. To estimate the phase of vestibular perception, participants manually tracked their perceived position during sinusoidal virtual and real kinetic stimulation. For real rotations, perceived velocity was approximately in phase with actual velocity across all frequencies. Perceived virtual velocity was in phase with the GVS waveform at low frequencies (0.05 and 0.1 Hz). As the frequency was increased to 1 Hz, the phase of perceived velocity advanced relative to the GVS waveform. Therefore, at low frequencies GVS is interpreted as an angular velocity signal, and at higher frequencies it is increasingly interpreted as an angular position signal. These estimated gain and phase spectra for vestibular perception are a first step toward generating well-controlled virtual vestibular percepts, an endeavor that may reveal the usefulness of GVS in the areas of clinical assessment, neuroprosthetics, and virtual reality.
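One way to picture the reported phase behaviour is a first-order lead model in which perceived velocity reflects the GVS waveform plus a derivative term; this is an assumed illustration (including the 0.5 s time constant), not the authors' model.

```python
# Illustrative first-order lead model: v_perceived ~ gvs + tau * d(gvs)/dt.
# Its phase relative to the GVS waveform is atan(2*pi*f*tau): near 0 deg at low
# frequencies (GVS read as a velocity signal) and approaching 90 deg at higher
# frequencies (GVS read increasingly as a position signal).
import math

tau = 0.5  # s, time constant assumed purely for illustration

for f_hz in (0.05, 0.1, 0.5, 1.0):
    phase_deg = math.degrees(math.atan(2 * math.pi * f_hz * tau))
    print(f"{f_hz:4.2f} Hz -> phase lead of perceived velocity ~ {phase_deg:5.1f} deg")
```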


2014
Vol. 67 (3), pp. 455-473
Author(s): Alexis D. J. Makin, Rebecca Lawson, Marco Bertamini, Jayne Pickering
