simultaneity judgment
Recently Published Documents

TOTAL DOCUMENTS: 17 (FIVE YEARS: 4)
H-INDEX: 6 (FIVE YEARS: 0)
PLoS ONE, 2021, Vol 16 (12), pp. e0261129
Author(s): Yasuhiro Takeshima

Audio-visual integration relies on temporal synchrony between visual and auditory inputs. However, visual and auditory signals differ in their travel and neural transmission speeds, so audio-visual synchrony perception must operate flexibly. The processing speed of visual stimuli affects the perception of audio-visual synchrony. The present study examined how the visual field in which visual stimuli are presented affects the processing of audio-visual temporal synchrony. The point of subjective simultaneity, the temporal binding window, and the rapid recalibration effect were measured using temporal order judgment, simultaneity judgment, and stream/bounce perception, because different mechanisms of temporal processing have been suggested for these three paradigms. The results indicate that, in the temporal order judgment task, auditory stimuli had to be presented earlier relative to visual stimuli in the central visual field than in the peripheral visual field in order for subjective simultaneity to be perceived. Meanwhile, the subjective simultaneity bandwidth was broader in the central visual field than in the peripheral visual field during the simultaneity judgment task. In the stream/bounce perception task, neither the point of subjective simultaneity nor the temporal binding window differed between the two visual fields. Moreover, rapid recalibration occurred in both visual fields during the simultaneity judgment task. However, during the temporal order judgment and stream/bounce perception tasks, rapid recalibration occurred only in the central visual field. These results suggest that differences in visual processing speed across the visual field modulate the temporal processing of audio-visual stimuli. Furthermore, the three tasks, temporal order judgment, simultaneity judgment, and stream/bounce perception, each have distinct functional characteristics for audio-visual synchrony perception. Future studies should examine whether compensation for differences in temporal resolution across the visual field in later cortical visual pathways accounts for these visual field differences in audio-visual temporal synchrony.


2021, pp. 216770262110315
Author(s): Han-yu Zhou, Xi-long Cui, Bin-rang Yang, Li-juan Shi, Xue-rong Luo, ...

Impaired audiovisual temporal integration, manifested as an abnormally widened temporal-binding window (TBW) for integrating sensory information, is found in both autism spectrum disorder (ASD) and schizophrenia (SCZ) and contributes to aberrant perceptual experiences and impaired social communication. We conducted two experiments using age-comparable samples of participants with early-onset SCZ and participants with ASD. Three paradigms were used: a unisensory temporal-order-judgment (TOJ) task, an audiovisual simultaneity-judgment (SJ) task, and an eye-tracking task. Results showed generalized deficits in temporal processing in SCZ, ranging from unisensory to multisensory modalities and from nonspeech to speech stimuli. In contrast, the widened TBW in ASD mainly affected the processing of speech stimuli. Applying the eye-tracking task with ecologically valid linguistic stimuli, we found that both participants with SCZ and participants with ASD exhibited reduced sensitivity in detecting audiovisual speech asynchrony. This impaired audiovisual speech integration correlated with negative symptoms. Although multisensory temporal integration is impaired in both ASD and SCZ, the impairment in ASD is largely specific to speech-related processing, whereas SCZ is associated with generalized deficits.


PLoS ONE, 2021, Vol 16 (7), pp. e0253130
Author(s): Nina Heins, Jennifer Pomp, Daniel S. Kluger, Stefan Vinbrüx, Ima Trempler, ...

Auditory and visual percepts are integrated even when they are not perfectly temporally aligned, especially when the visual signal precedes the auditory signal. This window of temporal integration for asynchronous audiovisual stimuli is relatively well examined for speech, while other natural action-induced sounds have been widely neglected. Here, we studied the detection of audiovisual asynchrony in three whole-body actions with natural action-induced sounds: hurdling, tap dancing, and drumming. In Study 1, we examined whether audiovisual asynchrony detection, assessed by a simultaneity judgment task, differs as a function of sound production intentionality. Based on previous findings, we expected auditory and visual signals to be integrated over a wider temporal window for actions creating sounds intentionally (tap dancing) than for actions creating sounds incidentally (hurdling). While the percentages of perceived synchrony differed in the expected way, we identified two further factors, high event density and low rhythmicity, that also induced higher synchrony ratings. We therefore systematically varied event density and rhythmicity in Study 2, this time using drumming stimuli to exert full control over these variables, with the same simultaneity judgment task. Results suggest that high event density biases observers to integrate rather than segregate auditory and visual signals, even at relatively large asynchronies. Rhythmicity had a similar, albeit weaker, effect when event density was low. Our findings demonstrate that shorter asynchronies and visual-first asynchronies lead to higher synchrony ratings for whole-body actions, pointing to clear parallels with audiovisual integration in speech perception. Overconfidence in what is naturally expected, namely the synchrony of sound and sight, was stronger for intentional (vs. incidental) sound production and for movements with high (vs. low) rhythmicity, presumably because both encourage predictive processes. In contrast, high event density appears to increase synchrony judgments simply because it makes the detection of audiovisual asynchrony more difficult. More studies using real-life audiovisual stimuli with varying event densities and rhythmicities are needed to fully uncover the general mechanisms of audiovisual integration.


2020, Vol 41 (4), pp. 686-688
Author(s): Satoshi Okazaki, Makoto Ichikawa, Minoru Tsuzaki

2018, Vol 61 (3), pp. 789-796
Author(s): Shunsuke Tamura, Kazuhito Ito, Nobuyuki Hirose, Shuji Mori

Purpose: The purpose of this study was to investigate the psychophysical boundary used for categorization of voiced–voiceless stop consonants in native Japanese speakers.
Method: Twelve native Japanese speakers participated in the experiment. The stimuli were synthetic stop consonant–vowel stimuli varying in voice onset time (VOT), with manipulation of the amplitude of the initial noise portion and the first formant (F1) frequency of the periodic portion. There were 3 tasks: speech identification as either /d/ or /t/, detection of the noise portion, and simultaneity judgment of the onsets of the noise and periodic portions.
Results: The VOT boundaries of /d/–/t/ were close to the shortest VOT values that allowed detection of the noise portion, but not to those for perceived nonsimultaneity of the noise and periodic portions. The slopes of the noise detection functions along VOT were as sharp as those of the voiced–voiceless identification functions. In addition, the effects of manipulating the amplitude of the noise portion and the F1 frequency of the periodic portion on detection of the noise portion were similar to their effects on voiced–voiceless identification.
Conclusion: The psychophysical boundary for perception of the initial noise portion masked by the following periodic portion may be used for voiced–voiceless categorization by Japanese speakers.


2017
Author(s): Sofia Isaksson, Susanna Salomäki, Jarno Tuominen, Valtteri Arstila, Christine M. Falter, ...

Background: Individuals with ASD have abnormal motor and perceptual functions that do not currently form diagnostic criteria of ASD but may nevertheless affect everyday behavior. Temporal processing appears to be one such non-diagnostic yet impaired domain, although the lack of systematic studies testing different aspects of timing in the same sample of participants prevents a conclusive assessment of whether there is a generalized temporal deficit in ASD associated with diagnostic symptoms.
Methods: 17 children diagnosed with ASD and 18 typically developing, age- and IQ-matched controls carried out a set of motor and perceptual timing tasks: free tapping, simultaneity judgment, auditory duration discrimination, and verbal time estimation. Parents of participants filled in a questionnaire assessing the sense and management of time.
Results: Children with ASD showed faster and more variable free tapping than controls. Auditory duration discrimination thresholds were higher in the ASD group than in controls in a sub-second version of the task, while there were no group differences in supra-second interval discrimination. Children with ASD showed more variable simultaneity judgment thresholds, and they received lower parental scores for their sense and management of time. No group differences were observed in the minute-range verbal time estimation task. Different timing functions were correlated in the ASD group but not among controls, and several timing measures correlated with ASD symptoms.
Conclusions: Children with ASD show a generalized temporal deficit spanning a range of temporal processing tasks, including motor timing, perceptual timing, and temporal perspective.


2017, Vol 29 (5), pp. 805-815
Author(s): Sara Agosta, Denise Magnago, Sarah Tyler, Emily Grossman, Emanuela Galante, ...

The visual system is extremely efficient at detecting events across time even at very fast presentation rates; however, discriminating the identity of those events is much slower and requires attention over time, a mechanism with a much coarser resolution [Cavanagh, P., Battelli, L., & Holcombe, A. O. Dynamic attention. In A. C. Nobre & S. Kastner (Eds.), The Oxford handbook of attention (pp. 652–675). Oxford: Oxford University Press, 2013]. Patients with right parietal lesions, including the TPJ, are severely impaired at discriminating events across time in both visual fields [Battelli, L., Cavanagh, P., & Thornton, I. M. Perception of biological motion in parietal patients. Neuropsychologia, 41, 1808–1816, 2003]. One way to test this ability is a simultaneity judgment task, in which participants indicate whether two events occurred simultaneously or not. We psychophysically varied the flicker rate of four flickering disks; on most trials, one disk (in either the left or right visual field) flickered out of phase relative to the others. We asked participants to report whether the two disks presented on the left or on the right were simultaneous. We tested a total of 23 right and left parietal lesion patients in Experiment 1, and only right parietal patients showed impairment in both visual fields while their low-level visual functions were normal. Importantly, to causally link the right TPJ to relative timing processing, we ran a TMS experiment on healthy participants. Participants underwent three stimulation sessions and performed the same simultaneity judgment task before and after 20 min of low-frequency inhibitory TMS over the right TPJ, the left TPJ, or an early visual area as a control. rTMS over the right TPJ caused a bilateral impairment in the simultaneity judgment task, whereas rTMS over the left TPJ or the early visual area did not affect performance. Altogether, our results directly link the right TPJ to the processing of relative time.


PLoS ONE, 2016, Vol 11 (8), pp. e0161698
Author(s): Jean-Paul Noel, Matthew De Niear, Erik Van der Burg, Mark T. Wallace

2016, Vol 7
Author(s): Kielan Yarrow, Sian E. Martin, Steven Di Costa, Joshua A. Solomon, Derek H. Arnold
