Embodiment and Multisensory Perception of Synchronicity: Biological Features Modulate Visual and Tactile Multisensory Interaction in Simultaneity Judgements

2021 ◽  
pp. 1-18
Author(s):  
Ramiro Joly-Mascheroni ◽  
Sonia Abad-Hernando ◽  
Bettina Forster ◽  
Beatriz Calvo-Merino

Abstract. The concept of embodiment has been used in multiple scenarios, but in cognitive neuroscience it normally refers to understanding the role of one’s own body in the cognition of everyday situations and in the processes involved in that perception. Multisensory research is gradually embracing the concept of embodiment, but the focus has mostly been on audiovisual integration. In two experiments, we evaluated how the likelihood that a perceived stimulus is embodied modulates visuotactile interaction in a Simultaneity Judgement task. Experiment 1 compared the perception of two visual stimuli with and without biological attributes (hands and geometrical shapes) moving towards each other, while tactile stimuli were delivered to the palm of the participants’ hand. Participants judged whether the meeting point of the two periodically moving visual stimuli was synchronous with the tactile stimulation on their own hands. Results showed that in the hand condition the Point of Subjective Simultaneity (PSS) was significantly more distant from real synchrony (a stimulus onset asynchrony, SOA, of 60 ms) than in the geometrical-shape condition (an SOA of 45 ms). In Experiment 2, we further explored the impact of biological attributes by comparing performance on two visual biological stimuli (hands and ears) that also vary in their motor and visuotactile properties. Results showed that the PSS was equally distant from real synchrony in the hands and ears conditions. Overall, the findings suggest that embodied visual biological stimuli may modulate visual and tactile multisensory interaction in simultaneity judgements.
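
For readers unfamiliar with the PSS measure, the sketch below (hypothetical numbers, not the authors’ data or code) shows the standard estimation approach for a simultaneity-judgement task: fit a Gaussian to the proportion of “simultaneous” responses across SOAs and take the fitted peak location as the PSS.

```python
# Minimal sketch of PSS estimation from simultaneity-judgement data.
# All SOAs and response proportions below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, pss, sigma):
    """Proportion of 'simultaneous' responses as a function of SOA (ms)."""
    return amplitude * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

soas = np.array([-200, -150, -100, -50, 0, 50, 100, 150, 200])  # ms
p_simultaneous = np.array([0.10, 0.25, 0.55, 0.80, 0.90, 0.95, 0.75, 0.40, 0.15])

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amplitude, pss, sigma = params
print(f"PSS = {pss:.1f} ms, window SD = {sigma:.1f} ms")
```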

Perception ◽  
2016 ◽  
Vol 46 (2) ◽  
pp. 205-218 ◽  
Author(s):  
Yanna Ren ◽  
Weiping Yang ◽  
Kohei Nakahashi ◽  
Satoshi Takahashi ◽  
Jinglong Wu

Although neuronal studies have shown that audiovisual integration is regulated by temporal factors, little is known about how such factors affect audiovisual integration in older adults. To clarify how the stimulus onset asynchrony (SOA) between auditory and visual stimuli modulates age-related audiovisual integration, 20 younger adults (21–24 years) and 20 older adults (61–80 years) performed an auditory or visual stimulus discrimination experiment. The results showed that in younger adults audiovisual integration changed from an enhancement (AV, A ± 50 V) to a depression (A ± 150 V). In older adults, the pattern of alteration across expanding SOAs was similar to that of younger adults; however, older adults showed significantly delayed onsets of the time window of integration and delayed peak latencies in all conditions, which further demonstrated that audiovisual integration was delayed more severely as SOA expanded, especially in the peak latency of the visual-preceding-auditory conditions. Our study suggests that audiovisual facilitative integration occurs only within a certain SOA range (e.g., −50 to 50 ms) in both younger and older adults. Moreover, our results confirm that responses in older adults were slowed and provide empirical evidence that integration ability is much more sensitive to the temporal alignment of audiovisual stimuli in older adults.
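
One common way to quantify such a “time window of integration” and its peak latency is the additive model, in which the integration effect is the audiovisual response minus the sum of the unimodal responses. The sketch below runs on simulated event-related responses; the threshold, latencies, and amplitudes are all assumptions for illustration, not the authors’ pipeline.

```python
# Additive-model sketch: integration = AV - (A + V), on simulated responses.
import numpy as np

times = np.linspace(-100, 500, 301)          # ms relative to stimulus onset
rng = np.random.default_rng(42)

def erp(peak_ms, amp):
    """Toy evoked response: one Gaussian component plus noise."""
    return amp * np.exp(-(times - peak_ms) ** 2 / (2 * 40 ** 2)) \
        + rng.normal(0, 0.05, times.size)

erp_a, erp_v, erp_av = erp(180, 1.0), erp(200, 1.2), erp(170, 2.6)
diff = erp_av - (erp_a + erp_v)              # integration effect over time

onset = times[np.argmax(np.abs(diff) > 0.2)]  # first supra-threshold sample
peak_latency = times[np.argmax(np.abs(diff))]
print(f"integration onset ~ {onset:.0f} ms, peak latency ~ {peak_latency:.0f} ms")
```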


2021 ◽  
Author(s):  
Niall Gavin ◽  
David McGovern ◽  
Rebecca Hirst

The sound-induced flash illusion (SIFI) occurs when a rapidly presented visual stimulus is accompanied by two auditory stimuli, creating the illusory percept of two visual stimuli. While much research has focused on how the temporal proximity of the audiovisual stimuli affects susceptibility to the illusion, comparatively little has investigated the impact of spatial manipulations. Here, we assessed whether manipulating the eccentricity of the visual flash stimuli altered the properties of the temporal binding window associated with the SIFI. Twenty participants reported whether they perceived one or two flashes presented concurrently with one or two beeps. Visual stimuli were presented at one of four retinal eccentricities (2.5, 5, 7.5 or 10 degrees below fixation), and the audiovisual stimuli were separated by one of eight stimulus-onset asynchronies. In keeping with previous findings, increasing the stimulus-onset asynchrony between the auditory and visual stimuli led to a marked decrease in susceptibility to the illusion, allowing us to estimate the width and amplitude of the temporal binding window. However, varying the eccentricity of the visual stimulus had no effect on either the width or the peak amplitude of the temporal binding window, with a similar pattern of results observed for both the “fission” and “fusion” variants of the illusion. Thus, spatial manipulations of the audiovisual stimuli used to elicit the SIFI appear to have a weaker effect on the integration of sensory signals than temporal manipulations, a finding with implications for neuroanatomical models of multisensory integration.
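
The width and amplitude of the temporal binding window referred to above are conventionally obtained by fitting a Gaussian, with a baseline term, to illusion susceptibility across SOAs; the sketch below illustrates this on made-up numbers and is not the study’s analysis code.

```python
# Temporal-binding-window sketch: Gaussian fit of illusion rate over SOA.
import numpy as np
from scipy.optimize import curve_fit

def binding_window(soa, amplitude, centre, width, baseline):
    """Illusion susceptibility as a baseline-shifted Gaussian of SOA (ms)."""
    return baseline + amplitude * np.exp(-(soa - centre) ** 2 / (2 * width ** 2))

soas = np.array([-300, -200, -100, -50, 50, 100, 200, 300])          # ms
p_illusion = np.array([0.15, 0.30, 0.60, 0.75, 0.70, 0.55, 0.25, 0.10])

(amp, centre, width, base), _ = curve_fit(
    binding_window, soas, p_illusion, p0=[0.6, 0.0, 100.0, 0.1])
print(f"peak amplitude = {base + amp:.2f}, window SD = {width:.0f} ms")
```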


2015 ◽  
Vol 29 (4) ◽  
pp. 135-146 ◽  
Author(s):  
Miroslaw Wyczesany ◽  
Szczepan J. Grzybowski ◽  
Jan Kaiser

Abstract. In this study, the neural basis of emotional reactivity was investigated. Reactivity was operationalized as the impact of emotional pictures on the self-reported ongoing affective state, and it was used to divide the subjects into high- and low-responder groups. Independent sources of brain activity were identified, localized with the DIPFIT method, and clustered across subjects to analyse the visual evoked potentials to affective pictures. Four of the identified clusters revealed effects of reactivity. The earliest two started about 120 ms after stimulus onset and were located in the occipital lobe and the right temporoparietal junction. Another two, with a latency of 200 ms, were found in the orbitofrontal and the right dorsolateral cortices. Additionally, differences in pre-stimulus alpha level over the visual cortex were observed between the groups. Attentional modulation of perceptual processes is proposed as an early source of emotional reactivity, forming an automatic mechanism of affective control. The role of top-down processes in affective appraisal and, finally, in the experience of ongoing emotional states is also discussed.
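
The independent-component step described above is a form of blind source separation. The sketch below, on simulated signals, illustrates the decomposition idea using scikit-learn’s FastICA rather than the EEGLAB/DIPFIT toolchain the authors used, so every variable and value is hypothetical.

```python
# Blind source separation sketch: unmix simulated "EEG" into components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
sources = np.vstack([np.sin(2 * np.pi * 10 * t),           # alpha-like source
                     np.sign(np.sin(2 * np.pi * 3 * t))])  # slow square source
mixing = rng.normal(0, 1, (32, 2))                         # 32 "electrodes"
eeg = mixing @ sources + rng.normal(0, 0.1, (32, 500))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(eeg.T)                       # unmixed time courses
print("recovered component shape:", recovered.shape)
```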


Author(s):  
Adnan Abdulhamid Saati

This research aims to examine how the presentation mode of visual stimuli and their associated sign-language explanation (visual stimuli without sign-language explanation; visual stimuli followed by sign-language explanation; visual stimuli presented simultaneously with sign-language explanation) in educational computer programs affects the academic achievement of some English words among high-school students (deaf group) in the integration programs at Ain Jaloot Secondary School and Dumah Al Jandal Secondary School. The study population comprised the students of the two schools’ integration programs; the sample consisted of 36 deaf students who were randomly assigned to three pilot groups of 12 students each. A prior assessment was administered using an electronic achievement test prepared with the Quiz Creator application, whose reliability and validity were confirmed by checking the equivalence of the three groups. The three pilot groups then used an educational computer program: the first group studied visual stimuli without sign-language explanation, the second group studied visual stimuli followed by sign-language explanation, and the third group studied visual stimuli presented simultaneously with sign-language explanation. The results showed statistically significant differences (at the 0.05 level) between the mean scores of the three groups, in favor of the second group, who studied the visual stimuli followed by a sign-language explanation.


Perception ◽  
10.1068/p5844 ◽  
2007 ◽  
Vol 36 (10) ◽  
pp. 1455-1464 ◽  
Author(s):  
Vanessa Harrar ◽  
Laurence R Harris

Gestalt rules that describe how visual stimuli are grouped also apply to sounds, but it is unknown whether they also apply to tactile or uniquely multimodal stimuli. To investigate these rules, we used lights, touches, and a combination of lights and touches, arranged in a classic Ternus configuration. Three stimuli (A, B, C) were arranged in a row across three fingers. A and B were presented for 50 ms and, after a delay, B and C were presented for 50 ms. Subjects were asked whether they perceived AB moving to BC (group motion) or A moving to C (element motion). For all three types of stimuli, A-to-C motion dominated at short delays, while AB-to-BC motion dominated at longer delays. The critical delay, at which perception changed from element to group motion, differed significantly between the visual Ternus (3 lights, 162 ms) and the tactile Ternus (3 touches, 195 ms). The critical delay for the multimodal Ternus (3 light–touch pairs, 161 ms) did not differ from either the visual or the tactile Ternus effect. In a second experiment, subjects were exposed to 2.5 min of visual group motion (stimulus onset asynchrony = 300 ms). The exposure shifted the critical delay of the visual Ternus, produced a trend in the same direction for the multimodal Ternus, but did not shift the tactile Ternus. These results suggest separate but similar grouping rules for visual, tactile, and multimodal stimuli.
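
A minimal sketch of how such a critical delay can be estimated, using invented report proportions: locate the delay at which group-motion and element-motion reports are equally likely.

```python
# Critical-delay sketch: 50% crossing of group-motion reports over delay.
import numpy as np

delays = np.array([50, 100, 150, 200, 250, 300])           # inter-stimulus delay, ms
p_group = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])   # group-motion reports

# Linear interpolation of the 50% point; a psychometric-function fit
# (e.g., logistic) would be the more formal alternative.
critical_delay = np.interp(0.5, p_group, delays)
print(f"critical delay ~ {critical_delay:.0f} ms")
```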


Perception ◽  
1993 ◽  
Vol 22 (8) ◽  
pp. 963-970 ◽  
Author(s):  
Piotr Jaśkowski

The point of subjective simultaneity and simple reaction time were compared for stimuli with different rise times, and the two measures were found to behave differently. To explain this result, it is suggested that in temporal-order judgment the subject takes into account not only the stimulus onset but also other events connected with stimulus presentation.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Nienke B. Debats ◽  
Herbert Heuer ◽  
Christoph Kayser

Abstract. To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. Here we asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, paving the way to studying multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information about the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint. The visual stimuli were briefly shown, one synchronously with the hand reaching the movement endpoint and the other delayed. In Experiment 1, judgments of the hand-movement endpoint revealed integration and recalibration biases oriented towards the position of the synchronous stimulus and away from the delayed one. In Experiment 2, we contrasted two alternative accounts: that only the temporally more proximal visual stimulus enters integration, akin to a winner-takes-all process, or that the influences of both stimuli superpose. The proprioceptive biases revealed that integration—and likely also recalibration—is shaped by the superposed contributions of multiple stimuli rather than by only the most powerful individual one.
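
A hedged sketch of one way the two accounts could be separated, assuming a simple linear model: regress the judged endpoint on the positions of the hand and of both visual stimuli; a reliably non-zero weight on the delayed stimulus favours superposition over winner-takes-all. All data below are simulated.

```python
# Superposition vs winner-takes-all sketch: least-squares weights on
# simulated endpoint judgments (arbitrary units).
import numpy as np

rng = np.random.default_rng(7)
n = 200
hand = rng.normal(0, 1, n)                  # proprioceptive endpoint
vis_sync = hand + rng.normal(2, 1, n)       # synchronous visual stimulus
vis_delay = hand + rng.normal(-2, 1, n)     # delayed visual stimulus
judged = 0.5 * hand + 0.35 * vis_sync + 0.15 * vis_delay + rng.normal(0, 0.2, n)

X = np.column_stack([hand, vis_sync, vis_delay])
weights, *_ = np.linalg.lstsq(X, judged, rcond=None)
print("weights (hand, synchronous, delayed):", np.round(weights, 2))
# Winner-takes-all would predict a near-zero weight for the delayed stimulus.
```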


2019 ◽  
Vol 116 (32) ◽  
pp. 16056-16061 ◽  
Author(s):  
Elie Rassi ◽  
Andreas Wutz ◽  
Nadia Müller-Voggel ◽  
Nathan Weisz

Ongoing fluctuations in neural excitability and in network-wide activity patterns before stimulus onset have been proposed to underlie variability in near-threshold stimulus detection paradigms—that is, whether or not an object is perceived. Here, we investigated the impact of prestimulus neural fluctuations on the content of perception—that is, whether one or another object is perceived. We recorded neural activity with magnetoencephalography (MEG) before and while participants briefly viewed an ambiguous image, the Rubin face/vase illusion, and asked them to report their perceived interpretation on each trial. Using multivariate pattern analysis, we showed robust decoding of the perceptual report during the poststimulus period. Applying source localization to the classifier weights suggested early recruitment of primary visual cortex (V1) and recruitment of the category-sensitive fusiform face area (FFA) at ∼160 ms. These poststimulus effects were accompanied by stronger oscillatory power in the gamma frequency band for face vs. vase reports. In prestimulus intervals, we found no differences in oscillatory power between face and vase reports in V1 or in FFA, indicating similar levels of neural excitability. Despite this, we found stronger connectivity between V1 and FFA before face reports for low-frequency oscillations. Specifically, the strength of prestimulus feedback connectivity (i.e., Granger causality) from FFA to V1 predicted not only the category of the upcoming percept but also the strength of the poststimulus neural activity associated with it. Our work shows that prestimulus network states can help shape future processing in category-sensitive brain regions and in this way bias the content of visual experiences.
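
As a rough illustration of the multivariate pattern analysis step, the sketch below cross-validates a linear classifier on simulated “sensor patterns” to decode a binary percept label; the trial counts, sensor count, and injected effect are all hypothetical and stand in for the actual MEG data.

```python
# MVPA sketch: cross-validated linear decoding of a binary percept label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_sensors = 200, 102
labels = rng.integers(0, 2, n_trials)                 # 0 = vase, 1 = face
patterns = rng.normal(0, 1, (n_trials, n_sensors))
patterns[labels == 1, :10] += 0.5                     # inject a weak signal

accuracy = cross_val_score(LogisticRegression(max_iter=1000),
                           patterns, labels, cv=5).mean()
print(f"decoding accuracy ~ {accuracy:.2f}")
```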


2019 ◽  
Vol 121 (6) ◽  
pp. 2202-2214 ◽  
Author(s):  
John P. McClure ◽  
Pierre-Olivier Polack

Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. Whereas multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds have been shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet, the net effect of sound modulation at the V1 population level remained unclear. In the present study, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) and multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue’s orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3: visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than visual stimuli presented alone.

NEW & NOTEWORTHY The primary visual cortex (V1) receives direct inputs from the primary auditory cortex. Yet, the impact of sounds on visual processing in V1 remains controversial. We show that the modulation of V1 visual responses by pure tones depends on the orientation selectivity, direction selectivity, and response amplitudes of V1 neurons. Hence, audiovisual stimuli recruit a population of V1 neurons better tuned to the orientation and direction of the visual stimulus than unimodal visual stimuli do.
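
As a worked illustration of the tuning measure involved, the sketch below computes an orientation selectivity index, OSI = (Rpref − Rorth)/(Rpref + Rorth), on invented response values and contrasts unimodal and audiovisual conditions; it is not the study’s analysis code.

```python
# OSI sketch: sharper tuning under audiovisual stimulation, on toy values.
import numpy as np

orientations = np.arange(0, 180, 30)                      # degrees
resp_visual = np.array([0.2, 0.5, 1.0, 0.6, 0.3, 0.2])    # hypothetical dF/F
resp_audiovisual = np.array([0.1, 0.45, 1.2, 0.55, 0.25, 0.1])

def osi(resp):
    """Orientation selectivity index from preferred vs orthogonal response."""
    pref = resp.argmax()
    orth = (pref + len(resp) // 2) % len(resp)            # orthogonal orientation
    return (resp[pref] - resp[orth]) / (resp[pref] + resp[orth])

print(f"OSI visual = {osi(resp_visual):.2f}, "
      f"audiovisual = {osi(resp_audiovisual):.2f}")
# Potentiation at the preferred orientation, suppression off-preferred:
enhancement = (resp_audiovisual - resp_visual) / resp_visual
print("relative AV enhancement by orientation (deg):",
      dict(zip(orientations.tolist(), np.round(enhancement, 2).tolist())))
```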


2020 ◽  
pp. 1-23
Author(s):  
Makoto Wada ◽  
Hanako Ikeda ◽  
Shinichiro Kumagaya

Abstract. Visual distractors interfere with tactile temporal order judgment (TOJ) at moderately short stimulus onset asynchronies (SOAs) in typically developing participants. Presenting a rubber hand in a forward direction relative to the participant’s hand enhances this effect, while presenting it in an inverted direction weakens it. Individuals with autism spectrum disorder (ASD) show atypical multisensory processing; however, the effects of such interference on multisensory processing in ASD remain unclear. In this study, we examined the effects of visual interference on tactile TOJ in individuals with ASD. Two successive tactile stimuli were delivered to the index and ring fingers of the participant’s right hand, which was placed inside an opaque box. A rubber hand was placed on the box in a forward or inverted direction. Concurrently, visual stimuli provided by light-emitting diodes on the fingers of the rubber hand were delivered in a congruent or incongruent order. Participants were required to judge the temporal order of the tactile stimuli regardless of the visual distractors. In the absence of a visual stimulus, participants with ASD tended, compared with typically developing (TD) controls, to judge simultaneous stimuli as the ring finger having been stimulated first, and congruent visual stimuli eliminated this bias. When incongruent visual stimuli were delivered, judgment was notably reversed in participants with ASD, regardless of the direction of the rubber hand. These findings demonstrate considerable effects of visual interference on tactile TOJ in individuals with ASD.
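
A minimal sketch of a conventional TOJ analysis under the standard cumulative-Gaussian assumption: the proportion of “ring finger first” responses is fitted across SOAs, and a shift of the 50% point (PSS) under incongruent distractors would index the interference. The numbers below are hypothetical.

```python
# TOJ psychometric sketch: cumulative-Gaussian fit for PSS and JND.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(soa, pss, jnd):
    """P('ring finger first') as a cumulative Gaussian of SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=jnd)

soas = np.array([-120, -60, -30, 0, 30, 60, 120])   # ms; + = ring finger led
p_ring_first = np.array([0.05, 0.20, 0.35, 0.60, 0.80, 0.92, 0.98])

(pss, jnd), _ = curve_fit(psychometric, soas, p_ring_first, p0=[0.0, 50.0])
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```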

