Multisensory Integration in the Virtual Hand Illusion with Active Movement

2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Woong Choi ◽  
Liang Li ◽  
Satoru Satoh ◽  
Kozaburo Hachimura

Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be induced over a virtual body in a multisensory virtual reality environment. The Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigated the effects of multisensory integration in the Virtual Hand Illusion with active movement. We constructed a virtual xylophone-playing system that interactively provides synchronous visual, tactile, and auditory stimulation, and conducted two experiments with different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality.

2019 ◽  
Vol 31 (4) ◽  
pp. 592-606 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M. Jenkinson ◽  
Aikaterini Fotopoulou

Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size–weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile multisensory integration by increasing the precision of top–down signals against bottom–up sensory input.
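The Bayesian account invoked in this abstract treats illusion strength as a matter of precision-weighted cue fusion: each sensory estimate is weighted by its reliability, so attenuating somatosensory precision pulls the combined estimate toward vision. A minimal illustrative sketch of that computation follows; the values are hypothetical and this is not the authors' actual model.

```python
# Minimal sketch of precision-weighted (Bayesian) cue fusion, the kind of
# computation the Bayesian account of the RHI appeals to.
# All numbers are illustrative, not taken from the study.

def fuse(mu_v, var_v, mu_s, var_s):
    """Combine a visual and a somatosensory position estimate.

    Each cue is a Gaussian with mean mu and variance var; the fused
    estimate weights each cue by its precision (1/variance).
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_s)   # visual weight
    w_s = 1 - w_v                                  # somatosensory weight
    mu = w_v * mu_v + w_s * mu_s                   # fused position
    var = 1 / (1 / var_v + 1 / var_s)              # fused variance shrinks
    return mu, var

# Rubber hand seen at 0 cm; real hand felt at 15 cm.
# Attenuating somatosensory precision (raising var_s) pulls the fused
# estimate toward the seen hand, i.e. a stronger illusion.
print(fuse(0.0, 1.0, 15.0, 1.0))   # equal precision -> midpoint (7.5 cm)
print(fuse(0.0, 1.0, 15.0, 4.0))   # attenuated touch -> 3.0 cm, near vision
```

On this toy reading, a neuromodulator that changes synaptic gain (such as oxytocin, per the abstract's hypothesis) would act by shifting `var_s` or `var_v`, not by changing the fusion rule itself.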


2019 ◽  
Vol 44 (3) ◽  
pp. 177-184 ◽  
Author(s):  
Merel Prikken ◽  
Anouk van der Weiden ◽  
Heleen Baalbergen ◽  
Manon H.J. Hillegers ◽  
René S. Kahn ◽  
...  



2018 ◽  
Author(s):  
Maria Laura Filippetti ◽  
Louise P. Kirsch ◽  
Laura Crucianelli ◽  
Aikaterini Fotopoulou

Our sense of body ownership relies on integrating different sensations according to their temporal and spatial congruency. Nevertheless, there is ongoing controversy about the role of affective congruency during multisensory integration, i.e. whether the stimuli perceived through the different sensory channels are congruent or incongruent in terms of their affective quality. In the present study, we applied a widely used multisensory integration paradigm, the Rubber Hand Illusion, to investigate the role of affective, top-down aspects of sensory congruency between visual and tactile modalities in the sense of body ownership. In Experiment 1 (N = 36), we touched participants with either soft or rough fabrics on their unseen hand, while they watched a rubber hand being touched synchronously with the same fabric or with a ‘hidden’ fabric of ‘uncertain roughness’. In Experiment 2 (N = 50), we used the same paradigm as in Experiment 1, but replaced the ‘uncertainty’ condition with an ‘incongruent’ one, in which participants saw the rubber hand being touched with a fabric of incongruent roughness and hence opposite valence. We found that certainty (Experiment 1) and congruency (Experiment 2) between the felt and vicariously perceived tactile affectivity led to higher subjective embodiment compared to uncertainty and incongruency, respectively, irrespective of any valence effect. Our results suggest that congruency in the affective top-down aspects of sensory stimulation is important to the multisensory integration process leading to embodiment, over and above temporal and spatial properties.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Noriaki Kanayama ◽  
Masayuki Hara ◽  
Kenta Kimura

Virtual reality (VR) enables the fast, free, and highly controllable setting of experimental body images. Illusions pertaining to the body, such as the rubber hand illusion (RHI), can be easily conducted in VR settings, and some phenomena, such as full-body illusions, are only realized in virtual environments. However, the multisensory-integration process in VR is not yet fully understood. Thus, it remains to be clarified whether specific phenomena that occur under VR settings manifest in real life as well. One useful investigative approach is measuring brain activity during a psychological experiment. Electroencephalography (EEG) oscillatory activities provide insight into the human multisensory integration process. Nevertheless, EEG data can be vulnerable to VR noise, which causes measurement and analytical difficulties for EEG data recorded in VR environments. Here, we achieve an experimental RHI setting using a head-mounted display that provides a VR visual space and VR dummy hand along with EEG measurements. We compared EEG data collected in both real and VR environments and observed the gamma and theta band oscillatory activities. Ultimately, we observed statistically significant differences between congruent (RHI) and incongruent (not RHI) conditions in the real environment, which is consistent with previous studies. Differences in the VR condition were observed only in the late theta-band oscillation, suggesting that the VR setting itself altered the perceptual and sensory integration mechanisms. Thus, we must model this difference between real and VR settings whenever we use VR to investigate our bodily self-perception.
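The theta/gamma comparison this abstract describes boils down to estimating spectral power within conventional EEG frequency bands. The toy sketch below illustrates the idea with a plain discrete Fourier transform on a synthetic signal; it is not the authors' EEG pipeline, and the signal parameters and band edges are illustrative assumptions.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` (sampled at fs Hz) within [f_lo, f_hi] Hz,
    computed from a plain discrete Fourier transform (positive
    frequencies only, DC excluded)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n                  # frequency of bin k
        if f_lo <= f <= f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += (abs(X) / n) ** 2
    return power

# Toy "EEG": a 6 Hz (theta) plus a weaker 40 Hz (gamma) component,
# one second sampled at 256 Hz.
fs = 256
sig = [math.sin(2 * math.pi * 6 * t / fs)
       + 0.3 * math.sin(2 * math.pi * 40 * t / fs)
       for t in range(fs)]

theta = band_power(sig, fs, 4, 8)    # theta band: 4-8 Hz
gamma = band_power(sig, fs, 30, 80)  # gamma band: 30-80 Hz
print(theta > gamma)                 # the 6 Hz component dominates -> True
```

In practice such analyses use windowed, averaged spectra (e.g. Welch's method) and artifact rejection, which matters in a noisy VR recording environment; this sketch only shows the band-power concept itself.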


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Michel Akselrod ◽  
Bogdan Vigaru ◽  
Julio Duenas ◽  
Roberto Martuzzi ◽  
James Sulzer ◽  
...  

When performing willed actions, we have the unified and coherent experience of owning and controlling our body. Body ownership is believed to emerge from the integration of coherent multisensory signals, while agency is believed to emerge from the coherence between predicted and perceived outcomes of actions. As a consequence, body ownership and agency can both be modulated by multisensory conflicts. The contribution of active movement generation to ownership and agency has not been parametrically explored. Here, we investigated the contribution of interaction force between the agent and the environment to the sense of hand ownership (SO) and the sense of hand agency (SA). By combining robotics and virtual reality, we manipulated the sensorimotor and visual information during immersive scenarios to induce and quantify altered states of SO and SA. First, we demonstrated that SO and SA could be successfully manipulated by our experimental paradigms. Second, we showed that interaction force strongly contributes to SA, but to a lesser extent to SO. Finally, we showed that SO and SA interact beyond their common multisensory basis. Our results, based on two independent studies, provide a direct link between sensorimotor interactions and subjective body experience and demonstrate a new dissociation between SO and SA.


2017 ◽  
Vol 30 (7-8) ◽  
pp. 615-637 ◽  
Author(s):  
Olga Perepelkina ◽  
Maria Boboleva ◽  
Galina Arina ◽  
Valentina Nikolaeva

The aim of the study was to investigate how emotion information processing factors, such as alexithymia and emotional intelligence, modulate body ownership and influence multisensory integration during the ‘rubber hand illusion’ (RHI) task. It was previously shown that alexithymia correlates with the RHI, and we hypothesized that emotional intelligence should also act as a top-down factor in body ownership, although this had not been examined in previous experiments. We extended the study of Grynberg and Pollatos [Front. Hum. Neurosci. 9 (2015) 357] with an additional measure of emotional intelligence, and propose an explanation for the interrelation of emotion and body ownership processing. Eighty subjects took part in the RHI experiment and completed the Toronto Alexithymia Scale and the Mayer–Salovey–Caruso Emotional Intelligence Test (MSCEIT). Only the MSCEIT was a significant predictor of the subjective measure of the RHI. There were no significant correlations between alexithymia scores and either the test statements of the RHI or the proprioceptive drift, thus we did not replicate the results of Grynberg and Pollatos. However, alexithymia correlated with the control statements of subjective reports of the illusion, which might be explained as a disruption of the ability to discriminate and describe bodily experience. Therefore, (1) alexithymia seems to be connected with difficulties in conscious or verbal processing of body-related information, and (2) higher emotional intelligence might improve multisensory integration of body-related signals and reflect better predictive models of self-processing.


2022 ◽  
Vol 19 (1) ◽  
pp. 1-19
Author(s):  
Anca Salagean ◽  
Jacob Hadnett-Hunter ◽  
Daniel J. Finnegan ◽  
Alexandra A. De Sousa ◽  
Michael J. Proulx

Ultrasonic mid-air haptic technologies, which provide haptic feedback through airwaves produced using ultrasound, could be employed to investigate the sense of body ownership and immersion in virtual reality (VR) by inducing the virtual hand illusion (VHI). Ultrasonic mid-air haptic perception has solely been investigated for glabrous (hairless) skin, which has higher tactile sensitivity than hairy skin. In contrast, the VHI paradigm typically targets hairy skin without comparisons to glabrous skin. The aim of this article was to investigate illusory body ownership, the applicability of ultrasonic mid-air haptics, and perceived immersion in VR using the VHI. Fifty participants viewed a virtual hand being stroked by a feather synchronously and asynchronously with the ultrasonic stimulation applied to the glabrous skin on the palmar surface and the hairy skin on the dorsal surface of their hands. Questionnaire responses revealed that synchronous stimulation induced a stronger VHI than asynchronous stimulation. In synchronous conditions, the VHI was stronger for palmar stimulation than dorsal stimulation. The ultrasonic stimulation was also perceived as more intense on the palmar surface compared to the dorsal surface. Perceived immersion was not related to illusory body ownership per se but was enhanced by the provision of synchronous stimulation.


2019 ◽  
Author(s):  
Marte Roel Lesur ◽  
Marieke Lieve Weijs ◽  
Colin Simon ◽  
Oliver Alan Kannape ◽  
Bigna Lenggenhager

The loss of body ownership, the feeling that your body and its limbs no longer belong to you, presents a severe clinical condition that has proven difficult to study directly. We here propose a novel paradigm using mixed reality to interfere with natural embodiment using temporally conflicting sensory signals from the own hand. In Experiment 1 we investigated how such a mismatch affects phenomenological and physiological aspects of embodiment, and identified its most important dimensions using a principal component analysis. The results suggest that such a mismatch induces a strong reduction in embodiment accompanied by an increase in feelings of disownership and deafference, which was, however, not reflected in physiological changes. In Experiment 2 we refined the paradigm to measure perceptual thresholds for temporal mismatches and compared how different multimodal, mismatching information alters the sense of embodiment. The results showed that while visual delay decreased embodiment both while actively moving and during passive touch, the effect was stronger for the former. Our results extend previous findings as they demonstrate that a sense of disembodiment can be induced through controlled multimodal mismatches about one’s own body, and more so during active movement as compared to passive touch. Based on the ecologically more valid protocol we propose here, we argue that such a sense of disembodiment may fundamentally differ from disownership sensations as discussed in the rubber hand illusion literature, and emphasize its clinical relevance. This might importantly advance the current debate on the relative contribution of different modalities to our sense of body and its plasticity.


2013 ◽  
Vol 10 (85) ◽  
pp. 20130300 ◽  
Author(s):  
Joan Llobera ◽  
M. V. Sanchez-Vives ◽  
Mel Slater

In the rubber hand illusion, tactile stimulation seen on a rubber hand, that is synchronous with tactile stimulation felt on the hidden real hand, can lead to an illusion of ownership over the rubber hand. This illusion has been shown to produce a temperature decrease in the hidden hand, suggesting that such illusory ownership produces disownership of the real hand. Here, we apply immersive virtual reality (VR) to experimentally investigate this with respect to sensitivity to temperature change. Forty participants experienced immersion in a VR with a virtual body (VB) seen from a first-person perspective. For half the participants, the VB was consistent in posture and movement with their own body, and in the other half, there was inconsistency. Temperature sensitivity on the palm of the hand was measured before and during the virtual experience. The results show that temperature sensitivity decreased in the consistent compared with the inconsistent condition. Moreover, the change in sensitivity was significantly correlated with the subjective illusion of virtual arm ownership but modulated by the illusion of ownership over the full VB. This suggests that a full body ownership illusion results in a unification of the virtual and real bodies into one overall entity—with proprioception and tactile sensations on the real body integrated with the visual presence of the VB. The results are interpreted in the framework of a ‘body matrix’ recently introduced into the literature.

