A Roving Dual-Presentation Simultaneity-Judgment Task to Estimate the Point of Subjective Simultaneity

2016 ◽  
Vol 7 ◽  
Author(s):  
Kielan Yarrow ◽  
Sian E. Martin ◽  
Steven Di Costa ◽  
Joshua A. Solomon ◽  
Derek H. Arnold
2015 ◽  
Vol 28 (3-4) ◽  
pp. 351-370 ◽  
Author(s):  
Hao Tam Ho ◽  
Emily Orchard-Mills ◽  
...  

Following prolonged exposure to audiovisual asynchrony, an observer’s point of subjective simultaneity (PSS) shifts in the direction of the leading modality. It has been debated whether other sensory pairings, such as vision and touch, lead to a similar temporal recalibration, and if so, whether the internal timing mechanism underlying visuotactile lag adaptation is centralised or distributed. To address these questions, we adapted observers to vision- and tactile-leading visuotactile asynchrony on either their left or right hand side in different blocks. In one test condition, participants performed a simultaneity judgment on the adapted side (unilateral), and in another they performed a simultaneity judgment on the non-adapted side (contralateral). In a third condition, participants adapted concurrently to equal and opposite asynchronies on each side and were tested randomly on either hand (bilateral opposed). Results from the first two conditions show that observers recalibrate to visuotactile asynchronies, and that the recalibration transfers to the non-adapted side. These findings suggest a centralised recalibration mechanism not linked to the adapted side and predict no recalibration in the bilateral opposed condition, assuming the adaptation effects were equal on each side. This was confirmed in the group of participants that adapted to vision- and tactile-leading asynchrony on the right and left hand side, respectively. However, the other group (vision-leading on the left and tactile-leading on the right) did show a recalibration effect, suggesting a distributed mechanism. We discuss these findings in terms of a hybrid model that assumes the co-existence of centralised and distributed timing mechanisms.
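
The abstract reports PSS shifts estimated from simultaneity judgments but does not spell out the fitting procedure. Below is a minimal sketch of one common approach, a Gaussian fit to the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs); all data points and starting values are hypothetical illustrations, not the study's numbers.

```python
# Sketch: estimating the PSS from simultaneity-judgment (SJ) data by fitting
# a Gaussian to the proportion of "simultaneous" responses at each SOA.
# All data and parameter values are hypothetical illustrations.
import numpy as np
from scipy.optimize import curve_fit

def sj_gaussian(soa, pss, sigma, amp):
    """Proportion of 'simultaneous' responses as a Gaussian centred on the PSS."""
    return amp * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

# Hypothetical SOAs (ms; negative = touch leads, positive = vision leads)
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], float)
p_simultaneous = np.array([0.05, 0.20, 0.60, 0.85, 0.90, 0.80, 0.55, 0.15, 0.05])

params, _ = curve_fit(sj_gaussian, soas, p_simultaneous, p0=[0.0, 100.0, 0.9])
pss, sigma, amp = params
print(f"PSS = {pss:.1f} ms, width (sigma) = {sigma:.1f} ms")
# A recalibration effect would appear as a shift in the fitted PSS between
# baseline and post-adaptation data (towards the adapted leading modality).
```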


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0261129
Author(s):  
Yasuhiro Takeshima

Audio-visual integration relies on temporal synchrony between visual and auditory inputs. However, visual and auditory signals differ in their travel and transmission speeds, so audio-visual synchrony perception must operate flexibly. The processing speed of visual stimuli affects the perception of audio-visual synchrony. The present study examined how the visual field in which visual stimuli are presented affects the processing of audio-visual temporal synchrony. The point of subjective simultaneity, the temporal binding window, and the rapid recalibration effect were measured using temporal order judgment, simultaneity judgment, and stream/bounce perception, because different temporal processing mechanisms have been proposed for these three paradigms. The results indicate that, in the temporal order judgment task used here, auditory stimuli had to be presented earlier relative to visual stimuli in the central visual field than in the peripheral visual field for subjective simultaneity to be perceived. Meanwhile, in the simultaneity judgment task, the subjective simultaneity bandwidth was broader in the central visual field than in the peripheral visual field. In the stream/bounce perception task, neither the point of subjective simultaneity nor the temporal binding window differed between the two visual fields. Moreover, rapid recalibration occurred in both visual fields during the simultaneity judgment task. However, in the temporal order judgment and stream/bounce tasks, rapid recalibration occurred only in the central visual field. These results suggest that differences in visual processing speed across the visual field modulate the temporal processing of audio-visual stimuli. Furthermore, the three tasks, temporal order judgment, simultaneity judgment, and stream/bounce perception, each tap distinct functional characteristics of audio-visual synchrony perception. Future studies should examine whether compensation for differences in the temporal resolution of the visual field within later cortical visual pathways accounts for the visual field differences observed in audio-visual temporal synchrony.
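
For the simultaneity-judgment results, the temporal binding window can be summarised as the width of the fitted simultaneity curve. A minimal sketch, assuming a Gaussian simultaneity curve and hypothetical width parameters (not the study's fitted values), is shown below.

```python
# Sketch: deriving a temporal binding window (TBW) from a fitted simultaneity
# curve, taken here as the full width at half maximum of a Gaussian centred
# on the PSS. The sigma values are hypothetical, not the study's data.
import numpy as np

def tbw_from_sigma(sigma_ms):
    """Full width at half maximum of a Gaussian: 2 * sqrt(2 * ln 2) * sigma."""
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma_ms

central_sigma, peripheral_sigma = 120.0, 90.0   # hypothetical fitted widths (ms)
print(f"central TBW    ~ {tbw_from_sigma(central_sigma):.0f} ms")
print(f"peripheral TBW ~ {tbw_from_sigma(peripheral_sigma):.0f} ms")
# A broader window in the central field, as reported for the SJ task, would
# show up as a larger FWHM for the central-field fit.
```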


i-Perception ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 204166952110329
Author(s):  
Aditi Jublie ◽  
Devpriya Kumar

Earlier work on self-face processing has reported a bias in the processing of the self-face, resulting in faster responses to the self-face than to other familiar and unfamiliar faces (termed the self-face advantage, or SFA). Even though most studies agree that the SFA arises from an attentional bias, there is little agreement regarding the stage at which it occurs. While a large number of studies show the self-face influencing processing late, at the disengagement stage, early event-related potential components show differential activity for the self-face, suggesting that the SFA occurs early. We address this contradiction using a cueless temporal order judgment task that allows us to investigate early perceptual processing while controlling for bias due to top-down expectation. A greater shift in the point of subjective simultaneity for the self-face would indicate a greater processing advantage at an early perceptual stage. Across two experiments, we show an early perceptual advantage for the self-face compared with both a friend’s face and an unfamiliar face (Experiment 1). This advantage is present even when the effect of criterion shift is minimized (Experiment 2). Interestingly, the magnitude of the advantage is similar for the self-friend and self-unfamiliar pairs. The evidence from the two experiments suggests early capture of attention as a likely reason for the SFA, which is present for the self-face but not for other familiar faces.
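
As context for how a PSS shift is read off a temporal order judgment task, here is a minimal sketch that fits a cumulative Gaussian to hypothetical "face first" proportions; the fitting choice, SOAs, and response proportions are assumptions for illustration, not values from the paper.

```python
# Sketch: estimating the PSS from a temporal-order-judgment (TOJ) task by
# fitting a cumulative Gaussian to the proportion of "face first" responses.
# The PSS is the SOA at which that proportion equals 0.5. Data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, sigma):
    """Probability of reporting the face as appearing first."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical SOAs (ms; positive = face shown before the comparison stimulus)
soas = np.array([-120, -80, -40, 0, 40, 80, 120], float)
p_face_first_self  = np.array([0.05, 0.15, 0.35, 0.65, 0.85, 0.95, 0.99])
p_face_first_other = np.array([0.02, 0.08, 0.22, 0.50, 0.78, 0.92, 0.98])

(pss_self, _), _  = curve_fit(cum_gauss, soas, p_face_first_self,  p0=[0.0, 50.0])
(pss_other, _), _ = curve_fit(cum_gauss, soas, p_face_first_other, p0=[0.0, 50.0])
print(f"PSS self: {pss_self:.1f} ms, PSS other: {pss_other:.1f} ms")
# A more negative PSS for the self-face means it needs less of a head start to
# be judged as first, i.e., an early perceptual processing advantage.
```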


Perception ◽  
1993 ◽  
Vol 22 (8) ◽  
pp. 963-970 ◽  
Author(s):  
Piotr Jaśkowski

Point of subjective simultaneity and simple reaction time were compared for stimuli with different rise times. It was found that these measures behave differently. To explain this result, it is suggested that in temporal-order judgment the subject takes into account not only the stimulus onset but also other events connected with stimulus presentation.


Perception ◽  
1991 ◽  
Vol 20 (6) ◽  
pp. 715-726 ◽  
Author(s):  
Piotr Jaśkowski

Temporal-order judgment was investigated for a pair of visual stimuli with different durations in order to check whether offset asynchrony can disturb the perception of the order/simultaneity of the onsets. In experiment 1 the point of subjective simultaneity was estimated by the method of adjustment. The difference in duration of the two stimuli in the pair was either 0 or 50 ms. It was found that the subject shifts the onset of the shorter stimulus towards the offset of the longer one to obtain a satisfying impression of simultaneity, even though the subject was asked to ignore events concerning the stimulus offset. In experiments 2 and 3 the method of constant stimuli was applied. Both experiments indicate that subjects, in spite of the instruction, take the offset asynchrony into account in their judgments.
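
One way to see why offset asynchrony would pull onset-defined simultaneity around is to assume observers align a weighted mix of onset and offset rather than onsets alone. The sketch below works through that toy model; the weight and the stimulus durations are hypothetical illustrations, not parameters or values from the experiments.

```python
# Sketch: if observers align a weighted average of onset and offset times rather
# than onsets alone, a duration difference predicts an onset-defined PSS shift.
# The weight w and the durations are hypothetical, not taken from the study.

def perceived_time(onset_ms, duration_ms, w):
    """Weighted combination of onset and offset; w = 0 means pure onset matching."""
    return (1 - w) * onset_ms + w * (onset_ms + duration_ms)

w = 0.3                        # hypothetical weight given to the offset
short_dur, long_dur = 50, 100  # hypothetical durations differing by 50 ms

# Perceived times coincide when:
#   (1 - w) * t_short + w * (t_short + short_dur) = w * long_dur
#   => t_short = w * (long_dur - short_dur)
onset_shift = w * (long_dur - short_dur)
assert abs(perceived_time(onset_shift, short_dur, w)
           - perceived_time(0, long_dur, w)) < 1e-9
print(f"Predicted shift: shorter stimulus starts {onset_shift:.0f} ms later")
# i.e., the shorter stimulus is shifted towards the offset of the longer one,
# matching the direction of the effect reported in the abstract.
```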


2020 ◽  
Vol 41 (4) ◽  
pp. 686-688
Author(s):  
Satoshi Okazaki ◽  
Makoto Ichikawa ◽  
Minoru Tsuzaki

2012 ◽  
Vol 25 (0) ◽  
pp. 153
Author(s):  
Alex K. Malone ◽  
Nai-Yuan N. Chang ◽  
Timothy E. Hullar

Falls are one of the leading causes of disability in the elderly. Previous research has shown that falls may be related to changes in the temporal integration of multisensory stimuli. This study compared the temporal integration and processing of a vestibular and an auditory stimulus in younger and older subjects. The vestibular stimulus consisted of a continuous sinusoidal rotational velocity delivered using a rotational chair, and the auditory stimulus consisted of 5 ms of white noise presented dichotically through headphones (both at 0.5 Hz). Simultaneity was defined as perceiving the chair at its furthest rightward or leftward trajectory at the same moment as the auditory stimulus was perceived in the contralateral ear. The temporal offset of the auditory stimulus was adjusted using a method of constant stimuli so that the auditory stimulus either led or lagged true simultaneity. Fifteen younger (ages 21–27) and 12 older (ages 63–89) healthy subjects were tested using a two-alternative forced-choice task to determine at what times they perceived the two stimuli as simultaneous. Younger subjects had a mean temporal binding window (TBW) of 334 ± 37 ms (mean ± SEM) and a mean point of subjective simultaneity of 83 ± 15 ms. Older subjects had a mean TBW of 556 ± 36 ms and a mean point of subjective simultaneity of 158 ± 27 ms. Both differences were significant, indicating that older subjects integrate vestibular and auditory stimuli over a wider temporal range than younger subjects. These findings were consistent upon retesting and were not due to differences in vestibular perception thresholds.
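
For illustration, the reported group difference in PSS can be checked against the summary statistics with a two-sample t-test reconstructed from the means, SEMs, and group sizes given in the abstract; whether the original analysis used this exact test is an assumption.

```python
# Sketch: a Welch two-sample t-test on the reported PSS means, reconstructed
# from the summary statistics in the abstract (mean ± SEM, group sizes).
import numpy as np
from scipy.stats import ttest_ind_from_stats

n_young, n_older = 15, 12
mean_young, sem_young = 83.0, 15.0     # PSS, ms (reported mean ± SEM)
mean_older, sem_older = 158.0, 27.0    # PSS, ms (reported mean ± SEM)

# SEM = SD / sqrt(n), so SD = SEM * sqrt(n)
sd_young = sem_young * np.sqrt(n_young)
sd_older = sem_older * np.sqrt(n_older)

t_stat, p_val = ttest_ind_from_stats(mean_young, sd_young, n_young,
                                     mean_older, sd_older, n_older,
                                     equal_var=False)   # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")
```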


2015 ◽  
Vol 282 (1804) ◽  
pp. 20143083 ◽  
Author(s):  
Erik Van der Burg ◽  
Patrick T. Goodbourn

The brain is adaptive. The speed of propagation through air, and of low-level sensory processing, differs markedly between auditory and visual stimuli; yet the brain can adapt to compensate for the resulting cross-modal delays. Studies investigating temporal recalibration to audiovisual speech have used prolonged adaptation procedures, suggesting that adaptation is sluggish. Here, we show that adaptation to asynchronous audiovisual speech occurs rapidly. Participants viewed a brief clip of an actor pronouncing a single syllable. The voice was either advanced or delayed relative to the corresponding lip movements, and participants were asked to make a synchrony judgement. Although we did not use an explicit adaptation procedure, we demonstrate rapid recalibration based on a single audiovisual event. We find that the point of subjective simultaneity on each trial is highly contingent upon the modality order of the preceding trial. We find compelling evidence that rapid recalibration generalizes across different stimuli, and different actors. Finally, we demonstrate that rapid recalibration occurs even when auditory and visual events clearly belong to different actors. These results suggest that rapid temporal recalibration to audiovisual speech is primarily mediated by basic temporal factors, rather than higher-order factors such as perceived simultaneity and source identity.
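
A trial-history analysis of this kind can be sketched by splitting trials according to the preceding trial's modality order and fitting a synchrony curve to each subset. The simulated data, column meanings, and Gaussian model below are hypothetical illustrations, not the authors' pipeline.

```python
# Sketch: quantifying rapid recalibration as the PSS difference between trials
# preceded by audio-leading versus video-leading trials. All data are simulated.
import numpy as np
from scipy.optimize import curve_fit

def sj_gaussian(soa, pss, sigma):
    """Probability of a 'synchronous' response, peaking at the PSS."""
    return np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

def pss_by_history(soa, sync_resp, prev_audio_led):
    """Fit the synchrony curve separately by previous-trial modality order."""
    out = {}
    for label, mask in [("prev audio-led", prev_audio_led),
                        ("prev video-led", ~prev_audio_led)]:
        soas = np.unique(soa[mask])
        p_sync = np.array([sync_resp[mask & (soa == s)].mean() for s in soas])
        (pss, _), _ = curve_fit(sj_gaussian, soas, p_sync, p0=[0.0, 100.0])
        out[label] = pss
    return out

# Hypothetical trials: SOA (ms), binary synchrony response, and a flag for
# whether the previous trial was audio-leading.
rng = np.random.default_rng(1)
soa = rng.choice([-300, -150, 0, 150, 300], size=1000).astype(float)
prev_audio_led = rng.random(1000) < 0.5
true_pss = np.where(prev_audio_led, -30.0, 30.0)   # simulated history effect
p = np.exp(-(soa - true_pss) ** 2 / (2 * 120.0 ** 2))
sync_resp = (rng.random(1000) < p).astype(float)

estimates = pss_by_history(soa, sync_resp, prev_audio_led)
print(estimates)   # the PSS difference between subsets is the recalibration effect
```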


2020 ◽  
Author(s):  
Aaron Nidiffer ◽  
Ramnarayan Ramachandran ◽  
Mark Wallace

Our perceptual system is adept at exploiting sensory regularities to better extract information about our environment. One clear example of this is how the sensory and multisensory systems can use consistency to group sensory features into a perceptual object and to segregate objects from each other and from background noise. Leveraging tenets of object-based attention and multisensory binding, we asked whether this ability scales with the strength of that consistency. We presented participants with amplitude-modulated (AM) auditory and visual streams and asked them to detect embedded orthogonal, near-threshold frequency modulation (FM) events. We modulated the correlation of the streams by varying the phase of the visual AM. In line with a previous report, we first observed that peak performance was shifted from 0°. After accounting for this, we found that, across participants, discriminability of the FM event improved linearly with correlation. Additionally, we sought to answer a question left open by our previous report concerning the possible explanation for the phase shift. We found that the phase shift correlated with differences between auditory and visual response times, but not with the point of subjective simultaneity, suggesting that differences in sensory processing times may account for the observed phase shift. These results suggest that our perceptual system can bind multisensory features across a spectrum of temporal correlations, a process necessary for multisensory binding in complex environments where unrelated signals may have small errant correlations.
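
The manipulation of stream correlation via the visual AM phase can be illustrated with two sinusoidal envelopes; the modulation rate, duration, and sample rate below are hypothetical, chosen only to show that the correlation follows the cosine of the phase offset.

```python
# Sketch: correlation between two sinusoidal AM envelopes as a function of the
# phase offset of the visual stream. All parameter values are hypothetical.
import numpy as np

rate_hz, dur_s, fs = 7.0, 2.0, 1000.0   # hypothetical AM rate, duration, sample rate
t = np.arange(0, dur_s, 1.0 / fs)
auditory_am = 0.5 * (1 + np.sin(2 * np.pi * rate_hz * t))

for phase_deg in (0, 45, 90, 135, 180):
    visual_am = 0.5 * (1 + np.sin(2 * np.pi * rate_hz * t + np.deg2rad(phase_deg)))
    r = np.corrcoef(auditory_am, visual_am)[0, 1]
    print(f"phase {phase_deg:3d} deg -> correlation r = {r:+.2f}")
# For pure sinusoids, r ~ cos(phase): in-phase streams are maximally correlated,
# anti-phase streams maximally anti-correlated, with intermediate values between.
```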

