Point of Subjective Simultaneity
Recently Published Documents

Total documents: 19 (five years: 5)
H-index: 6 (five years: 0)

2021 ◽  
Author(s):  
Elyse G Letts ◽  
Aysha Basharat ◽  
Michael Barnett-Cowan

Previous studies demonstrate that semantics, the higher-level meaning of multimodal stimuli, can impact multisensory integration. Valence, an affective response to images, has not yet been tested in non-priming response time (RT) or temporal order judgement (TOJ) tasks. This study investigated the effects of both the semantic congruency and the valence of non-speech audiovisual stimuli on multisensory integration via RT and TOJ tasks, assessing processing speed (RT), the point of subjective simultaneity (PSS), and the time window within which multisensory stimuli are likely to be perceived as simultaneous (temporal binding window; TBW). Forty participants (mean age: 26.25 years; 17 female) were recruited from Prolific Academic, yielding 37 complete datasets. Both congruence and valence had significant main effects on RT (congruent and high-valence stimuli decreased RT), as well as an interaction effect (the congruent/high-valence condition was significantly faster than all others). For the TOJ task, images high in valence required the visual stimulus to be presented significantly earlier than the auditory stimulus for the two to be perceived as simultaneous. Further, a significant interaction of congruence and valence on the PSS revealed that the PSS for the congruent/high-valence condition was significantly earlier than for all other conditions. A subsequent analysis showed a positive correlation between TBW width (b-values) and RT (as the TBW widens, RT increases) for the categories whose PSS differed most from 0 (congruent/high and incongruent/low). This study provides new evidence supporting previous research on semantic congruency and presents a novel incorporation of valence into behavioural responses.
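
Several abstracts on this page estimate a PSS and a TBW from TOJ data. As a rough illustration of the standard approach (a minimal sketch, not the authors' analysis; the SOAs and response proportions below are invented), a cumulative Gaussian is fit to the proportion of "visual first" responses: its 50% point gives the PSS, and its slope parameter indexes the TBW width.

```python
# Minimal sketch of PSS/TBW estimation from a TOJ task (illustrative data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, sigma):
    """P('visual first') as a function of SOA; PSS is the 50% point,
    sigma sets the slope and is a common proxy for TBW width."""
    return norm.cdf(soa, loc=pss, scale=sigma)

soas = np.array([-200, -100, -50, 0, 50, 100, 200])   # ms; audio leads when < 0
p_visual_first = np.array([0.05, 0.15, 0.30, 0.55, 0.80, 0.92, 0.98])

(pss, sigma), _ = curve_fit(cum_gauss, soas, p_visual_first, p0=[0.0, 80.0])
print(f"PSS = {pss:.1f} ms, TBW width ~ {2 * sigma:.1f} ms")
```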


i-Perception ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 204166952110329
Author(s):  
Aditi Jublie ◽  
Devpriya Kumar

Earlier work on self-face processing has reported a bias that results in faster responses to one's own face than to other familiar and unfamiliar faces (termed the self-face advantage, or SFA). Although most studies agree that the SFA arises from an attentional bias, there is little agreement about the stage at which it occurs. While many studies show the self-face influencing processing late, at the disengagement stage, early event-related potential components show differential activity for the self-face, suggesting that the SFA arises early. We address this contradiction using a cueless temporal order judgement task that allows us to investigate early perceptual processing while controlling for bias due to top-down expectation. A greater shift in the point of subjective simultaneity for the self-face would indicate a greater processing advantage at an early perceptual stage. In two experiments, we show an early perceptual advantage for the self-face compared to both a friend's face and an unfamiliar face (Experiment 1). This advantage persists even when the effect of criterion shift is minimized (Experiment 2). Interestingly, the magnitude of the advantage is similar for the self-friend and self-unfamiliar pairs. Together, the two experiments suggest early capture of attention as the likely source of the SFA, which is present for the self-face but not for other familiar faces.


2021 ◽  
Author(s):  
Kyuto Uno ◽  
Kazuhiko Yokosawa

Audiovisual temporal recalibration refers to a shift in the point of subjective simultaneity (PSS) between audio and visual signals triggered by prolonged exposure to asynchronies between those signals. Previous research indicated that the spatial proximity of audiovisual signals can determine which pairs of signals are temporally recalibrated when multiple events compete for recalibration. Here we show that temporal recalibration is also modulated by an observer's assumption that the audiovisual signals originate from the same unitary event (the "unity assumption"). Participants were shown alternating face photos and voices of a male and a female speaker. These stimuli were presented equally spaced in time, and the voices were presented monaurally through headphones, so that no spatiotemporal grouping was implied. There were two conditions for the stimulus sequence in the adaptation phase: one in which a face photo always preceded its corresponding voice within each audiovisual pairing (i.e., multiple repetitions of the sequence: female face – female voice – male face – male voice), and another in which each voice always preceded its corresponding face photo. We found a shift in the PSS between these audiovisual signals towards the temporal order of the same-gender face–voice pairings. The results show that the unity assumption between face photos and voices affects temporal recalibration, suggesting that the brain selectively recalibrates the asynchronies of audiovisual signals that are taken to originate from the same unitary event in a cluttered environment.


2021 ◽  
pp. 1-18
Author(s):  
Ramiro Joly-Mascheroni ◽  
Sonia Abad-Hernando ◽  
Bettina Forster ◽  
Beatriz Calvo-Merino

The concept of embodiment has been used in multiple scenarios, but in cognitive neuroscience it normally refers to the role of one's own body in the cognition of everyday situations and the processes involved in that perception. Multisensory research is gradually embracing the concept of embodiment, but its focus has mostly been on audiovisual integration. In two experiments, we evaluated how the likelihood that a perceived stimulus is embodied modulates visuotactile interaction in a simultaneity judgement task. Experiment 1 compared the perception of two visual stimuli with and without biological attributes (hands and geometrical shapes) moving towards each other, while tactile stimuli were delivered to the palm of the participant's hand. Participants judged whether the meeting point of the two periodically moving visual stimuli was synchronous with the tactile stimulation on their own hand. Results showed that in the hand condition, the point of subjective simultaneity (PSS) was significantly more distant from real synchrony (60 ms after the stimulus onset asynchrony, SOA) than in the geometrical-shape condition (45 ms after the SOA). In Experiment 2, we further explored the impact of biological attributes by comparing performance with two biological visual stimuli (hands and ears) that also differ in their motor and visuotactile properties. Results showed that the PSS was equally distant from real synchrony in the hands and ears conditions. Overall, the findings suggest that embodied visual biological stimuli may modulate visuotactile multisensory interaction in simultaneity judgements.
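
For orientation, here is a minimal sketch of how a PSS is conventionally extracted from a simultaneity judgement task like this one (an assumed textbook analysis, not the authors' pipeline; the numbers are illustrative): unlike a TOJ, the proportion of "simultaneous" responses peaks at the PSS, so a Gaussian profile is fit and its centre taken as the estimate.

```python
# Minimal sketch of PSS estimation from a simultaneity judgement task.
import numpy as np
from scipy.optimize import curve_fit

def sj_gauss(soa, amp, pss, sigma):
    """Proportion of 'simultaneous' responses, peaking at the PSS."""
    return amp * np.exp(-0.5 * ((soa - pss) / sigma) ** 2)

soas = np.array([-240, -120, -60, 0, 60, 120, 240])  # ms; tactile leads when < 0
p_simultaneous = np.array([0.10, 0.35, 0.70, 0.90, 0.95, 0.60, 0.15])

(amp, pss, sigma), _ = curve_fit(sj_gauss, soas, p_simultaneous,
                                 p0=[1.0, 30.0, 100.0])
print(f"PSS = {pss:.1f} ms after the visual event")
```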


2020 ◽  
Author(s):  
Aaron Nidiffer ◽  
Ramnarayan Ramachandran ◽  
Mark Wallace

Our perceptual system is adept at exploiting sensory regularities to better extract information about the environment. One clear example is how the sensory and multisensory systems can use consistency to group sensory features into a perceptual object and to segregate objects from each other and from background noise. Leveraging tenets of object-based attention and multisensory binding, we asked whether this ability scales with the strength of that consistency. We presented participants with amplitude-modulated (AM) auditory and visual streams and asked them to detect embedded orthogonal, near-threshold frequency-modulation (FM) events. We manipulated the correlation of the streams by varying the phase of the visual AM. In line with a previous report, we first observed that peak performance was shifted away from 0° phase. After accounting for this shift, we found that discriminability of the FM event improved linearly with correlation across participants. Additionally, we sought to answer a question left open by our previous report: what explains the phase shift? We found that the phase shift correlated with auditory–visual response time differences, but not with the point of subjective simultaneity, suggesting that differences in sensory processing times may account for the observed shift. These results suggest that our perceptual system can bind multisensory features across a spectrum of temporal correlations, a process necessary for multisensory binding in complex environments where unrelated signals may have small errant correlations.
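
The key manipulation, varying the correlation of the two streams via the visual AM phase, is easy to see in a toy computation (a sketch with assumed rates and durations, not the study's stimulus code): for matched sinusoidal envelopes, the envelope correlation falls from +1 at a 0° phase offset through 0 at 90° to −1 at 180°.

```python
# Envelope correlation between auditory and phase-shifted visual AM streams.
import numpy as np

fs, dur, am_rate = 1000, 2.0, 4.0            # Hz sample rate, s, Hz AM rate (assumed)
t = np.arange(0, dur, 1 / fs)
auditory_am = np.sin(2 * np.pi * am_rate * t)

for phase_deg in (0, 45, 90, 135, 180):
    visual_am = np.sin(2 * np.pi * am_rate * t + np.deg2rad(phase_deg))
    r = np.corrcoef(auditory_am, visual_am)[0, 1]
    print(f"visual AM phase {phase_deg:3d} deg -> envelope correlation r = {r:+.2f}")
```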


2018 ◽  
Vol 72 (3) ◽  
pp. 589-598 ◽  
Author(s):  
Merryn D Constable ◽  
Timothy N Welsh ◽  
Greg Huffman ◽  
Jay Pratt

A multitude of studies demonstrate that self-relevant stimuli influence attention. Self-owned objects are a special class of self-relevant stimuli. If a self-owned object can indeed be characterised as a self-relevant stimulus then, consistent with theoretical predictions, a behavioural effect of ownership on attention should be present. To test this prediction, a task known to be a particularly sensitive measure of the prioritisation of visual information was selected: the temporal order judgement. Participants completed temporal order judgements with pictures of their own and the experimenter's mugs presented on either side of a central fixation cross. There was a variable onset delay between the two pictures, ranging from 0 ms to 105 ms, and participants indicated which mug appeared first. The results indicated a reliable shift in the point of subjective simultaneity (PSS) in favour of the participant's own mug. No such shift was observed in two groups of participants who were exposed to a mug but did not keep it. A further experiment indicated that the source of the bias in the PSS was more consistent with a criterion shift or top-down attentional prioritisation than with a perceptual bias. These findings suggest that ownership, beyond mere touch, mere choice, or familiarity, leads to prioritised processing and responses, but the mechanism underlying the effect is not likely to be perceptual in nature.


2018 ◽  
Author(s):  
Carolin Sachgau ◽  
William Chung ◽  
Michael Barnett-Cowan

The central nervous system must determine which sensory events occur at the same time. Actively moving the head produces large changes in the relationship between the observer and the environment, in sensorimotor processing, and in spatiotemporal perception. Numerous studies have shown that head movement onset must precede the onset of other sensory events in order to be perceived as simultaneous with them, indicating that head movement perception is slow. Active head movement perception has been shown to be slower than passive head movement perception and dependent on head movement velocity: participants who move their head faster than others require the head to move even earlier relative to comparison stimuli for the two to be perceived as simultaneous. These results suggest that head movement perception is slower (i.e., suppressed) when the head moves faster. The present study used a within-subjects design to measure the point of subjective simultaneity (PSS) between active head movements at different speeds and a comparison sound stimulus. Our results clearly show that (i) head movement perception is faster when the head moves faster within subjects, and (ii) active head movement onset must still precede the onset of other sensory events (average PSS: −123 to −52 ms) in order to be perceived as simultaneous, even at the fastest speeds (average peak velocity: 76°/s to 257°/s). We conclude that head movement perception is slow, but that this delay is reduced with increased speed. While we do not provide evidence against sensory suppression, which would require an active versus passive head movement comparison, our results do rule out velocity-based suppression.


2017 ◽  
Vol 29 (9) ◽  
pp. 1566-1582 ◽  
Author(s):  
Laetitia Grabot ◽  
Anne Kösem ◽  
Leila Azizi ◽  
Virginie van Wassenhove

Perceiving the temporal order of sensory events typically depends on participants' attentional state, and thus likely on endogenous fluctuations of brain activity. Using magnetoencephalography, we sought to determine whether spontaneous brain oscillations could disambiguate the perceived order of auditory and visual events presented in close temporal proximity, that is, at the individual's perceptual order threshold (point of subjective simultaneity [PSS]). Two neural responses were found to index an individual's temporal order perception when contrasting brain activity as a function of perceived order (i.e., perceiving the sound first vs. perceiving the visual event first) given the same physical audiovisual sequence. First, average differences in prestimulus auditory alpha power indexed the correct ordering of audiovisual events irrespective of which sensory modality came first: relatively low alpha power indicated correctly perceiving the auditory or the visual event first, as a function of the actual sequence order. Additionally, the relative changes in the amplitude of the auditory (but not visual) evoked responses were correlated with participants' correct performance. Crucially, the sign of the magnitude difference in prestimulus alpha power and evoked responses between perceived audiovisual orders correlated with an individual's PSS. Taken together, our results suggest that spontaneous oscillatory activity cannot disambiguate subjective temporal order without prior knowledge of the individual's bias toward perceiving one or the other sensory modality first. Under high perceptual uncertainty, the magnitude of prestimulus alpha (de)synchronization indicates the amount of compensation needed to overcome an individual's prior in the serial ordering and temporal sequencing of information.
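
The central contrast here, prestimulus alpha power grouped by perceived order, can be sketched in a few lines (a hypothetical pipeline on synthetic data, not the authors' MEG analysis; epoch counts, sampling rate, and labels are all stand-ins).

```python
# Sketch: per-trial prestimulus alpha power, contrasted by perceived order.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 600                                    # Hz, assumed MEG sampling rate
trials = rng.standard_normal((200, fs))     # 200 one-second prestimulus epochs

# Per-trial power spectra, then mean power in the 8-12 Hz alpha band.
freqs, psd = welch(trials, fs=fs, nperseg=fs // 2, axis=-1)
alpha = psd[:, (freqs >= 8) & (freqs <= 12)].mean(axis=1)

perceived_audio_first = rng.random(200) < 0.5   # stand-in behavioural labels
print("mean alpha, audio-first vs visual-first percepts:",
      alpha[perceived_audio_first].mean(), alpha[~perceived_audio_first].mean())
```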


2016 ◽  
Vol 7 ◽  
Author(s):  
Kielan Yarrow ◽  
Sian E. Martin ◽  
Steven Di Costa ◽  
Joshua A. Solomon ◽  
Derek H. Arnold

2015 ◽  
Vol 282 (1804) ◽  
pp. 20143083 ◽  
Author(s):  
Erik Van der Burg ◽  
Patrick T. Goodbourn

The brain is adaptive. The speed of propagation through air and the speed of low-level sensory processing differ markedly between auditory and visual stimuli, yet the brain can adapt to compensate for the resulting cross-modal delays. Studies investigating temporal recalibration to audiovisual speech have used prolonged adaptation procedures, suggesting that adaptation is sluggish. Here, we show that adaptation to asynchronous audiovisual speech occurs rapidly. Participants viewed a brief clip of an actor pronouncing a single syllable. The voice was either advanced or delayed relative to the corresponding lip movements, and participants were asked to make a synchrony judgement. Although we did not use an explicit adaptation procedure, we demonstrate rapid recalibration based on a single audiovisual event: the point of subjective simultaneity on each trial is highly contingent upon the modality order of the preceding trial. We find compelling evidence that rapid recalibration generalizes across different stimuli and different actors. Finally, we demonstrate that rapid recalibration occurs even when the auditory and visual events clearly belong to different actors. These results suggest that rapid temporal recalibration to audiovisual speech is primarily mediated by basic temporal factors rather than higher-order factors such as perceived simultaneity and source identity.
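
The trial-by-trial logic is straightforward to sketch (synthetic data and an assumed recalibration size, not the study's dataset): sort trials by the modality order of the preceding trial, then fit a separate psychometric function, and hence a separate PSS, to each subset.

```python
# Sketch of a serial-dependence (rapid recalibration) analysis on synthetic TOJ data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
soa = rng.choice([-200, -100, -50, 0, 50, 100, 200], size=n)  # ms; audio leads when < 0
prev_audio_led = np.roll(soa < 0, 1)              # modality order on the preceding trial
true_pss = np.where(prev_audio_led, -20.0, 20.0)  # assumed rapid-recalibration effect
resp_visual_first = rng.random(n) < norm.cdf(soa, loc=true_pss, scale=80.0)

def fit_pss(mask):
    """Fit a cumulative Gaussian to one subset of trials and return its PSS."""
    soas = np.unique(soa)
    p = np.array([resp_visual_first[mask & (soa == s)].mean() for s in soas])
    popt, _ = curve_fit(lambda x, pss, sigma: norm.cdf(x, pss, sigma),
                        soas, p, p0=[0.0, 80.0])
    return popt[0]

print(f"PSS after an audio-led trial:  {fit_pss(prev_audio_led):+.1f} ms")
print(f"PSS after a visual-led trial: {fit_pss(~prev_audio_led):+.1f} ms")
```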

