Multisensory binding is driven by the strength of stimulus correlation

2020 ◽  
Author(s):  
Aaron Nidiffer ◽  
Ramnarayan Ramachandran ◽  
Mark Wallace

Our perceptual system is adept at exploiting sensory regularities to better extract information about our environment. One clear example is how the sensory and multisensory systems can use consistency to group sensory features into a perceptual object and to segregate objects from each other and from background noise. Leveraging tenets of object-based attention and multisensory binding, we asked whether this ability scales with the strength of that consistency. We presented participants with amplitude-modulated (AM) auditory and visual streams and asked them to detect embedded orthogonal, near-threshold frequency-modulation (FM) events. We varied the correlation of the streams by manipulating the phase of the visual AM. In line with a previous report, we first observed peak performance that was shifted away from 0°. After accounting for this shift, we found that, across participants, discriminability of the FM event improved linearly with correlation. Additionally, we sought to answer a question left open by our previous report: what explains the phase shift? We found that the phase shift correlated with differences between auditory and visual response times, but not with the point of subjective simultaneity, suggesting that differences in sensory processing times may account for the observed shift. These results suggest that our perceptual system can bind multisensory features across a spectrum of temporal correlations, a capacity necessary for multisensory binding in complex environments where unrelated signals may have small errant correlations.
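
For intuition about the manipulation described above, the sketch below (Python; the AM rate, stream duration, and sample rate are assumed values for illustration, not taken from the study) shows how shifting the phase of a visual AM envelope changes its correlation with an auditory AM envelope: over whole modulation cycles, the Pearson correlation falls off roughly as the cosine of the phase offset.

```python
import numpy as np

fs = 1000.0    # sample rate in Hz (assumed)
dur = 2.0      # stream duration in s (assumed)
am_freq = 6.0  # AM rate in Hz (assumed for illustration)
t = np.arange(0, dur, 1.0 / fs)

# Auditory AM envelope (normalised to 0-1)
aud_env = 0.5 * (1.0 + np.sin(2.0 * np.pi * am_freq * t))

for phase_deg in (0, 45, 90, 135, 180):
    vis_env = 0.5 * (1.0 + np.sin(2.0 * np.pi * am_freq * t + np.deg2rad(phase_deg)))
    r = np.corrcoef(aud_env, vis_env)[0, 1]
    # Over whole modulation cycles, r is approximately cos(phase_deg)
    print(f"visual AM phase {phase_deg:3d} deg -> envelope correlation r = {r:+.2f}")
```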

Perception ◽  
1993 ◽  
Vol 22 (8) ◽  
pp. 963-970 ◽  
Author(s):  
Piotr Jaśkowski

Point of subjective simultaneity and simple reaction time were compared for stimuli with different rise times. It was found that these measures behave differently. To explain this result, it is suggested that, in temporal-order judgment, the subject takes into account not only the stimulus onset but also other events connected with stimulus presentation.


Perception ◽  
1991 ◽  
Vol 20 (6) ◽  
pp. 715-726 ◽  
Author(s):  
Piotr Jaśkowski

Temporal-order judgment was investigated for a pair of visual stimuli with different durations in order to check whether offset asynchrony can disturb the perception of the order or simultaneity of onsets. In experiment 1 the point of subjective simultaneity was estimated by the method of adjustment. The difference in duration between the two stimuli in the pair was either 0 or 50 ms. It was found that subjects shift the onset of the shorter stimulus towards the offset of the longer one to obtain a satisfying impression of simultaneity, even though they were asked to ignore events concerning the stimulus offset. In experiments 2 and 3 the method of constant stimuli was applied. Both experiments indicate that subjects, in spite of instructions, take the offset asynchrony into account in their judgments.
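
As a rough illustration of how a point of subjective simultaneity is commonly extracted from method-of-constant-stimuli temporal-order judgments (a generic sketch, not the analysis reported above; the SOAs and response proportions below are invented), one can fit a cumulative Gaussian and read off its 50% point:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Invented SOAs (ms; negative = second stimulus leads) and proportions of
# "first stimulus perceived first" responses
soa_ms = np.array([-120, -80, -40, 0, 40, 80, 120])
p_first = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.97])

def cum_gauss(soa, pss, sigma):
    """Cumulative Gaussian psychometric function; the PSS is its 50% point."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(cum_gauss, soa_ms, p_first, p0=(0.0, 50.0))
print(f"PSS = {pss:.1f} ms, JND (sigma) = {sigma:.1f} ms")
```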


2012 ◽  
Vol 25 (0) ◽  
pp. 153
Author(s):  
Alex K. Malone ◽  
Nai-Yuan N. Chang ◽  
Timothy E. Hullar

Falls are one of the leading causes of disability in the elderly. Previous research has shown that falls may be related to changes in the temporal integration of multisensory stimuli. This study compared the temporal integration and processing of a vestibular and an auditory stimulus in younger and older subjects. The vestibular stimulus consisted of a continuous sinusoidal rotational velocity delivered using a rotational chair, and the auditory stimulus consisted of 5 ms of white noise presented dichotically through headphones (both at 0.5 Hz). Simultaneity was defined as perceiving the chair at its furthest rightward or leftward trajectory at the same moment as the auditory stimulus was perceived in the contralateral ear. The temporal offset of the auditory stimulus was adjusted using a method of constant stimuli so that the auditory stimulus either led or lagged true simultaneity. Fifteen younger (ages 21–27) and 12 older (ages 63–89) healthy subjects were tested using a two-alternative forced-choice task to determine at what offsets they perceived the two stimuli as simultaneous. Younger subjects had a mean temporal binding window (TBW) of 334 ± 37 ms (mean ± SEM) and a mean point of subjective simultaneity (PSS) of 83 ± 15 ms. Older subjects had a mean TBW of 556 ± 36 ms and a mean PSS of 158 ± 27 ms. Both differences were significant, indicating that older subjects have a wider temporal range over which they integrate vestibular and auditory stimuli than younger subjects. These findings were consistent upon retesting and were not due to differences in vestibular perception thresholds.
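
One common way to obtain a PSS and a temporal binding window from such forced-choice simultaneity judgments, sketched below with invented data and a full-width-at-half-maximum definition of the TBW (the study's exact estimator is not specified here), is to fit a Gaussian to the proportion of "simultaneous" responses across offsets:

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented lead/lag offsets (ms) and proportions of "simultaneous" responses
offset_ms = np.array([-500, -350, -200, -50, 100, 250, 400, 550])
p_simult = np.array([0.10, 0.30, 0.65, 0.90, 0.85, 0.55, 0.25, 0.08])

def gauss(x, peak, pss, sigma):
    # Peak location = PSS; the width parameter gives the temporal binding window
    return peak * np.exp(-0.5 * ((x - pss) / sigma) ** 2)

(peak, pss, sigma), _ = curve_fit(gauss, offset_ms, p_simult, p0=(1.0, 50.0, 200.0))
tbw_fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma  # full width at half maximum
print(f"PSS = {pss:.0f} ms, TBW (FWHM) = {tbw_fwhm:.0f} ms")
```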


2015 ◽  
Vol 282 (1804) ◽  
pp. 20143083 ◽  
Author(s):  
Erik Van der Burg ◽  
Patrick T. Goodbourn

The brain is adaptive. The speed of propagation through air, and of low-level sensory processing, differs markedly between auditory and visual stimuli; yet the brain can adapt to compensate for the resulting cross-modal delays. Studies investigating temporal recalibration to audiovisual speech have used prolonged adaptation procedures, suggesting that adaptation is sluggish. Here, we show that adaptation to asynchronous audiovisual speech occurs rapidly. Participants viewed a brief clip of an actor pronouncing a single syllable. The voice was either advanced or delayed relative to the corresponding lip movements, and participants were asked to make a synchrony judgement. Although we did not use an explicit adaptation procedure, we demonstrate rapid recalibration based on a single audiovisual event. We find that the point of subjective simultaneity on each trial is highly contingent upon the modality order of the preceding trial. We find compelling evidence that rapid recalibration generalizes across different stimuli and different actors. Finally, we demonstrate that rapid recalibration occurs even when the auditory and visual events clearly belong to different actors. These results suggest that rapid temporal recalibration to audiovisual speech is primarily mediated by basic temporal factors, rather than by higher-order factors such as perceived simultaneity and source identity.
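
The trial-wise logic of such a rapid-recalibration analysis can be sketched as follows (simulated data; the ±20 ms recalibration effect, the SOA set, and the centroid-based PSS estimate are illustrative assumptions, not values from the study): trials are split by the modality order of the preceding trial, and a PSS is estimated separately for each subset.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 400
# Stimulus-onset asynchronies (ms); negative = visual leads, positive = auditory leads
soa = rng.choice([-300, -150, -50, 0, 50, 150, 300], size=n_trials)

# Was the *previous* trial auditory-leading? (the first trial has no predecessor)
prev_auditory_lead = np.r_[False, soa[:-1] > 0]

# Simulate a small, assumed recalibration effect: the centre of the synchrony
# curve follows the previous trial's modality order by +/- 20 ms
centre = np.where(prev_auditory_lead, 20.0, -20.0)
p_sync = np.exp(-0.5 * ((soa - centre) / 120.0) ** 2)
judged_sync = rng.random(n_trials) < p_sync

for label, mask in (("previous trial auditory-leading", prev_auditory_lead),
                    ("previous trial visual-leading", ~prev_auditory_lead)):
    sel = mask & judged_sync
    pss = soa[sel].mean()  # crude PSS estimate: centroid of SOAs judged synchronous
    print(f"{label}: PSS ~ {pss:+.0f} ms")
```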


2021 ◽  
Vol 15 ◽  
Author(s):  
Motohiro Kimura

When a visual object changes its position according to certain sequential regularities, the visual system rapidly and automatically forms a prediction about the future position of the object based on those regularities. Such prediction can drastically alter visual perception. A phenomenon called representational momentum (RM: a predictive displacement of the perceived final position of a visual object along its recent regular pattern) has provided extensive evidence for this predictive modulation of visual perception. The purpose of the present study was to identify neural effects that could explain individual differences in the strength of the predictive modulation of visual perception as measured by RM. For this purpose, in two experiments using a conventional RM paradigm in which a bar was presented discretely in a regular rotational sequence (in steps of 18° in Experiment 1 and 20° in Experiment 2), visual evoked potentials (VEPs) in response to the regularly rotated bar were measured, and correlations between the magnitudes of RM and the VEPs were examined. The results showed that the magnitudes of RM and of the central P2 were negatively correlated, consistently in both experiments; participants who showed a smaller central P2 tended to exhibit greater RM. Together with a previous proposal that the central P2 reflects delayed reactivation of lower visual areas around the striate and prestriate cortices via reentrant feedback projections from higher areas, the present results suggest that greater suppression of this delayed reactivation (as indicated by a smaller central P2) may underlie stronger predictive modulation of visual perception (as indicated by greater RM).


2015 ◽  
Vol 28 (3-4) ◽  
pp. 351-370 ◽  
Author(s):  
Hao Tam Ho ◽  
Emily Orchard-Mills ◽  
...  

Following prolonged exposure to audiovisual asynchrony, an observer’s point of subjective simultaneity (PSS) shifts in the direction of the leading modality. It has been debated whether other sensory pairings, such as vision and touch, lead to similar temporal recalibration, and if so, whether the internal timing mechanism underlying visuotactile lag adaptation is centralised or distributed. To address these questions, we adapted observers to vision-leading and tactile-leading visuotactile asynchrony on either their left or right side in different blocks. In one test condition, participants performed a simultaneity judgment on the adapted side (unilateral), and in another they performed a simultaneity judgment on the non-adapted side (contralateral). In a third condition, participants adapted concurrently to equal and opposite asynchronies on each side and were tested randomly on either hand (bilateral opposed). Results from the first two conditions show that observers recalibrate to visuotactile asynchronies, and that the recalibration transfers to the non-adapted side. These findings suggest a centralised recalibration mechanism not tied to the adapted side, and they predict no recalibration in the bilateral opposed condition, assuming the adaptation effects were equal on each side. This prediction was confirmed in the group of participants that adapted to vision-leading and tactile-leading asynchrony on the right and left sides, respectively. However, the other group (vision-leading on the left and tactile-leading on the right) did show a recalibration effect, suggesting a distributed mechanism. We discuss these findings in terms of a hybrid model that assumes the co-existence of centralised and distributed timing mechanisms.


2012 ◽  
Vol 25 (0) ◽  
pp. 14-15
Author(s):  
Alberta Ipser ◽  
Diana Paunoiu ◽  
Elliot D. Freeman

It has often been claimed that there is mutual dependence between the perceived synchrony of auditory and visual sources and the extent to which they perceptually integrate (the ‘unity assumption’: Vroomen and Keetels, 2010; Welch and Warren, 1980). However, subjective audiovisual synchrony can vary widely between subjects (Stone, 2001) and between paradigms (van Eijk, 2008). Do such individual differences in subjective synchrony correlate positively with individual differences in the optimal timing for integration, as expected under the unity assumption? In separate experiments we measured the optimal audiovisual asynchrony for the McGurk illusion (McGurk and MacDonald, 1976) and for the stream-bounce illusion (Sekuler et al., 1997). We concurrently elicited either temporal-order judgements (TOJ) or simultaneity judgements (SJ), in counterbalanced sessions, from which we derived the point of subjective simultaneity (PSS). For both experiments, the asynchrony for maximum illusion showed a significant positive correlation with the PSS derived from SJ, consistent with the unity assumption. But surprisingly, the analogous correlation with the PSS derived from TOJ was significantly negative. The temporal mechanisms underlying this pairing of tasks seem neither unitary nor fully independent, but apparently antagonistic. A tentative temporal renormalisation mechanism explains these paradoxical results as follows: (1) subjective timing in our different tasks can depend on independent mechanisms subject to their own neural delays; (2) inter-modal synchronisation is achieved by first discounting the mean neural delay within each modality; and (3) apparent antagonism between estimates of subjective timing emerges as the mean is attracted towards deviants in the unimodal temporal distribution.


2016 ◽  
Vol 7 ◽  
Author(s):  
Kielan Yarrow ◽  
Sian E. Martin ◽  
Steven Di Costa ◽  
Joshua A. Solomon ◽  
Derek H. Arnold
