Experience-Dependent Modulation of Rubber Hand Illusion in Badminton Players

Author(s):  
Masanori Sakamoto ◽  
Hirotoshi Ifuku

Prolonged use of a racket induces plastic changes in the brain's representation of the arm in badminton players. However, it is not known whether this arm representation can be altered through short-term visuotactile integration. The neural representation of the body is easily altered when multiple sensory signals are integrated in the brain, and one of the most popular experimental paradigms for investigating this phenomenon is the "rubber hand illusion." This study was designed to investigate the effect of prolonged racket use on the modulation of arm representation during the rubber hand illusion in badminton players. When badminton players held the racket, their years of badminton experience were negatively correlated with the magnitude of the rubber hand illusion. This finding suggests that the tool embodiment acquired through prolonged use of a badminton racket is less easily disturbed while the racket is being held.
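A minimal sketch of how the reported negative correlation between playing experience and illusion strength might be computed; the variable names and sample values below are illustrative assumptions, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical values: years of badminton experience and a per-participant
# RHI magnitude score (e.g., proprioceptive drift in cm) while holding the racket.
years_experience = np.array([3, 5, 7, 9, 11, 14])
rhi_magnitude = np.array([2.8, 2.5, 2.1, 1.6, 1.4, 0.9])

# Pearson correlation; a negative r would mirror the relationship reported above.
r, p = stats.pearsonr(years_experience, rhi_magnitude)
print(f"r = {r:.2f}, p = {p:.3f}")
```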

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Masanori Sakamoto ◽  
Hirotoshi Ifuku

Abstract The neural representation of the body is easily altered by the integration of multiple sensory signals in the brain. The "rubber hand illusion" (RHI) is one of the most popular experimental paradigms to investigate this phenomenon. During this illusion, a feeling of ownership of the rubber hand is created. Some studies have shown that somatosensory processing in the brain is attenuated when RHI occurs. However, it is unknown where attenuation of somatosensory processing occurs. Here, we show that somatosensory processing is attenuated in the primary somatosensory cortex. We found that the earliest response of somatosensory evoked potentials, which is thought to originate from the primary somatosensory cortex, was attenuated during RHI. Furthermore, this attenuation was observed before the occurrence of the illusion. Our results suggest that attenuation of sensory processing in the primary somatosensory cortex is one of the factors influencing the occurrence of the RHI.
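A minimal sketch of the kind of comparison this abstract describes: averaging somatosensory evoked potentials and contrasting an early-latency component between illusion and control conditions. The 20-25 ms window, sampling rate, and array names are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np

# Hypothetical epoched EEG from one somatosensory channel:
# shape (n_trials, n_timepoints), sampled at 1000 Hz, stimulus at t = 0.
fs = 1000
times = np.arange(-50, 200) / fs  # -50 ms to +199 ms, in seconds
rhi_epochs = np.random.randn(100, times.size)      # illusion condition (placeholder data)
control_epochs = np.random.randn(100, times.size)  # control condition (placeholder data)

# Average across trials to obtain the evoked potential per condition.
rhi_evoked = rhi_epochs.mean(axis=0)
control_evoked = control_epochs.mean(axis=0)

# Mean amplitude in an assumed early window (20-25 ms post-stimulus);
# attenuation during the RHI would appear as a smaller value here.
win = (times >= 0.020) & (times <= 0.025)
print("early SEP (RHI):    ", rhi_evoked[win].mean())
print("early SEP (control):", control_evoked[win].mean())
```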


2021 ◽  
Author(s):  
Masanori Sakamoto ◽  
Hirotoshi Ifuku

Abstract The neural representation of the body is easily altered by the integration of multiple sensory signals in the brain. The "rubber hand illusion" (RHI) is one of the most popular experimental paradigms to investigate this phenomenon. During this illusion, a sense of ownership of the rubber hand is created. Some studies have shown that somatosensory processing in the brain is attenuated when RHI occurs. However, it is unknown where attenuation of somatosensory inputs occurs. Here, we show that somatosensory input from the hand is attenuated at the level of the primary somatosensory cortex. We found that the early response of the somatosensory evoked potential, which is thought to originate from the primary somatosensory cortex, was attenuated during RHI. Furthermore, this attenuation was observed before the occurrence of the illusion. Our results suggest that attenuation of somatosensory inputs from the hand to the brain is one of the factors influencing the occurrence of the RHI.


Psihologija ◽  
2022 ◽  
pp. 2-2
Author(s):  
Aitao Lu ◽  
Xuebin Wang ◽  
Xiuxiu Hong ◽  
Tianhua Song ◽  
Meifang Zhang ◽  
...  

Many studies have reported that bottom-up multisensory integration of visual, tactile, and proprioceptive information can distort our sense of body ownership, producing the rubber hand illusion (RHI). There is less evidence about when and how body ownership is distorted in the brain during the RHI. To examine whether this illusion effect occurs preattentively at an early stage of processing, we monitored the visual mismatch negativity (vMMN) component (an index of automatic deviant detection) and the N2 (an index of conflict monitoring). Participants first performed an RHI elicitation task in a synchronous or asynchronous setting and then completed a passive visual oddball task in which the deviant stimuli were unrelated to the explicit task. A significant interaction between Deviancy (deviant hand vs. standard hand) and Group (synchronous vs. asynchronous) was found. The asynchronous group showed clear mismatch effects in both the vMMN and the N2, while the synchronous group showed such an effect only in the N2. The results indicate that, after elicitation of the RHI, bottom-up integration can be detected at an early stage of sensory processing, before top-down processing, providing evidence for the priority of bottom-up processes after the generation of the RHI and shedding light on how body ownership is unconsciously distorted in the brain.
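A minimal sketch of how a mismatch effect such as the vMMN is typically quantified: a deviant-minus-standard difference wave averaged over a posterior time window. The window, sampling rate, and array shapes below are illustrative assumptions rather than the authors' actual analysis.

```python
import numpy as np

# Hypothetical ERP epochs from the passive visual oddball task:
# shape (n_trials, n_timepoints), sampled at 500 Hz, stimulus onset at t = 0.
fs = 500
times_ms = np.arange(-100, 500, 1000 / fs)       # -100 ms to ~498 ms in 2 ms steps
deviant_epochs = np.random.randn(80, times_ms.size)    # deviant-hand trials (placeholder data)
standard_epochs = np.random.randn(320, times_ms.size)  # standard-hand trials (placeholder data)

# Difference wave: deviant minus standard, averaged across trials.
diff_wave = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Mean amplitude in an assumed vMMN window (roughly 150-250 ms post-stimulus);
# a more negative value for the asynchronous group would reflect the mismatch effect.
win = (times_ms >= 150) & (times_ms <= 250)
print("vMMN mean amplitude:", diff_wave[win].mean())
```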


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Zakaria Djebbara ◽  
Lars Brorson Fich ◽  
Klaus Gramann

Abstract Action is a medium for collecting sensory information about the environment, which in turn is shaped by architectural affordances. Affordances characterize the fit between the physical structure of the body and capacities for movement and interaction with the environment, thus relying on sensorimotor processes associated with exploring the surroundings. Central to sensorimotor brain dynamics, the attentional mechanisms directing the gating function of sensory signals share neuronal resources with the motor-related processes necessary for inferring the external causes of sensory signals. Such a predictive coding approach suggests that sensorimotor dynamics are sensitive to architectural affordances that support or suppress specific kinds of actions for an individual. However, how architectural affordances relate to the attentional mechanisms underlying the gating of sensory signals remains unknown. Here we demonstrate that event-related desynchronization of alpha-band oscillations in parieto-occipital and medio-temporal regions covaries with architectural affordances. Source-level time–frequency analysis of data recorded in a motor-priming Mobile Brain/Body Imaging experiment revealed strong event-related desynchronization of the alpha band originating from the posterior cingulate complex, the parahippocampal region, and the occipital cortex. Our results first contribute to the understanding of how the brain resolves architectural affordances relevant to behaviour. Second, they indicate that alpha-band activity originating from the occipital cortex and parahippocampal region covaries with architectural affordances before participants interact with the environment, whereas during the interaction, the posterior cingulate cortex and motor areas dynamically reflect the affordable behaviour. We conclude that sensorimotor dynamics reflect behaviour-relevant features of the designed environment.
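A minimal sketch of the event-related desynchronization (ERD) measure referenced above: percentage power change in the alpha band relative to a pre-event baseline. The sampling rate, band edges, and epoch layout are assumptions used for illustration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # assumed sampling rate (Hz)
times = np.arange(-1.0, 2.0, 1 / fs)      # epoch from -1 s to +2 s around the event
epochs = np.random.randn(60, times.size)  # placeholder single-channel EEG epochs

# Band-pass filter in an assumed alpha band (8-12 Hz) and square to approximate power.
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
alpha_power = filtfilt(b, a, epochs, axis=1) ** 2
mean_power = alpha_power.mean(axis=0)     # average power across trials

# Classic ERD%: power change relative to the pre-event baseline (-1 to 0 s here);
# negative values indicate desynchronization.
baseline = mean_power[times < 0].mean()
erd_percent = (mean_power - baseline) / baseline * 100
print("minimum alpha ERD (%):", erd_percent[times >= 0].min())
```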


2021 ◽  
pp. 133-151 ◽  
Author(s):  
Noriaki Kanayama ◽  
Kentaro Hiromitsu

Is the body reducible to its neural representation in the brain? Neuroimaging, neurophysiological, and lesion studies provide some evidence that the brain contributes to the functioning of the body. The well-known dyadic taxonomy of the body schema and the body image (hereafter BSBI) is based primarily on evidence from brain-damaged patients. Although there is a growing consensus that the BSBI exists, there is little agreement on the dyadic taxonomy itself because it is not a concrete concept shared across research fields. This chapter investigates body representation in the cortex and nervous system in terms of sensory modality and psychological function, using two different approaches. The first is to review the neurological evidence and the cortical areas related to body representation, independently of the BSBI, and then to reconsider how the BSBI should be postulated in the brain. On this view, body representation may be constructed by the neural system as a whole, including the cortex and peripheral nerves. The second is to revisit the BSBI conception from the viewpoint of recent neuropsychology and to propose three types of body representation: body schema, body structural description, and body semantics. This triadic taxonomy is consistent with the cortical networks implicated in bodily disorders caused by brain lesions. Together, these two approaches allow the BSBI to be reconsidered more carefully and suggest that body representation is underpinned by networks in the brain.


2018 ◽  
Vol 31 (6) ◽  
pp. 537-555 ◽  
Author(s):  
Jennifer L. Campos ◽  
Graziella El-Khechen Richandi ◽  
Babak Taati ◽  
Behrang Keshavarz

Percepts about our body’s position in space and about body ownership are informed by multisensory feedback from visual, proprioceptive, and tactile inputs. The Rubber Hand Illusion (RHI) is a multisensory illusion that is induced when an observer sees a rubber hand being stroked while feeling their own spatially displaced, obstructed hand being stroked. When temporally synchronous, the visual–tactile interactions can create the illusion that the rubber hand belongs to the observer and that the observer’s real hand is shifted in position towards the rubber hand. Importantly, little is understood about whether these multisensory perceptions of the body change with older age. Thus, in this study we implemented a classic RHI protocol (synchronous versus asynchronous stroking) with healthy younger (18–35 years) and older (65+ years) adults and measured the magnitude of proprioceptive drift and the subjective experience of body ownership. As an adjunctive objective measure, skin temperature was recorded to evaluate whether decreases in skin temperature were associated with illusory percepts, as has been shown previously. The RHI was observed in both age groups, with increased drift and higher ratings of ownership following synchronous compared to asynchronous stroking. Importantly, no effects of age and no interactions between age and condition were observed for either of these outcome measures. No effects were observed for skin temperature. Overall, these results contribute to an emerging field of research investigating the conditions under which age-related differences in multisensory integration are observed, by providing insights into the role of visual, proprioceptive, and tactile inputs in bodily percepts.
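A minimal sketch of the proprioceptive drift measure used in classic RHI protocols: the judged position of the real hand after stroking minus the judgment before stroking, with positive values indicating a shift toward the rubber hand. The variable names and numbers are illustrative assumptions, not data from this study.

```python
import numpy as np

# Hypothetical judged hand positions (cm along the axis toward the rubber hand),
# one pre- and one post-stroking estimate per participant and condition.
pre_sync = np.array([0.5, 0.2, 0.8, 0.4])
post_sync = np.array([2.1, 1.8, 2.6, 1.9])
pre_async = np.array([0.4, 0.3, 0.6, 0.5])
post_async = np.array([0.7, 0.5, 0.9, 0.6])

# Proprioceptive drift: post minus pre; larger drift after synchronous stroking
# would mirror the pattern reported for both age groups above.
drift_sync = post_sync - pre_sync
drift_async = post_async - pre_async
print("mean drift (sync):  ", drift_sync.mean())
print("mean drift (async): ", drift_async.mean())
```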


2019 ◽  
Author(s):  
Klaudia Grechuta ◽  
Javier De La Torre ◽  
Belén Rubio Ballester ◽  
Paul F.M.J. Verschure

Abstract The unique ability to identify one’s own body and experience it as one’s own is fundamental to goal-oriented behavior and survival. However, the mechanisms underlying this so-called body ownership are not yet fully understood. The plasticity of body ownership has been studied using two experimental paradigms or their variations: the Rubber Hand Illusion (RHI), where the tactile stimuli are externally generated, and the moving RHI, which involves self-initiated movements. Grounded in these paradigms, evidence has demonstrated that body ownership is a product of bottom-up reception of self- and externally generated multisensory information and top-down comparison between the predicted and the actual sensory stimuli. Crucially, given the design of current paradigms, in which one of the manipulated cues always involves a proximal modality sensing the body or its surface (e.g., touch), the contribution of sensory signals that pertain to the environment remains elusive. Here we propose that, like any robust percept, body ownership depends on the integration and prediction of all available sensory stimuli, and therefore on the consistency of purely distal sensory signals pertaining to the environment. To test our hypothesis, we created an embodied goal-oriented task and manipulated the predictability of the surrounding environment by changing the congruency of purely distal multisensory cues while keeping bodily and action-driven signals entirely predictable. Our results empirically reveal that the way we represent our body is contingent upon all available sensory stimuli, including purely distal, action-independent signals that pertain to the environment.


2018 ◽  
Author(s):  
Piotr Litwin

Human body sense is surprisingly flexible: precisely administered multisensory stimulation may result in the illusion that an external object is part of one’s body. There seems to be a general consensus that there are certain top-down constraints on which objects may be incorporated: in particular, to-be-embodied objects should be structurally similar to a visual representation stored in an internal body model for a shift in one’s body image to occur. However, empirical evidence contradicts the body-model hypothesis: the sense of ownership may be spread over objects strikingly distinct in morphology and structure (e.g., robotic arms or empty space), and direct empirical support for the theory is currently lacking. As an alternative, based on the example of the rubber hand illusion (RHI), I propose a multisensory integration account of how the sense of ownership is induced. In this account, the perception of one’s own body is a regular type of multisensory perception, and multisensory integration processes are not only necessary but also sufficient for embodiment. In this paper, I propose how the RHI can be modeled using Maximum Likelihood Estimation and natural correlation rules. I also discuss how Bayesian Coupling Priors and idiosyncrasies in sensory processing render prior distributions interindividually variable, accounting for large interindividual differences in susceptibility to the RHI. Taken together, the proposed model accounts for the exceptional malleability of human body perception, fortifies existing bottom-up multisensory integration theories with top-down models of the relatedness of sensory cues, and generates testable and disambiguating predictions.
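A minimal sketch of the Maximum Likelihood Estimation scheme the abstract refers to, applied to hand position: under Gaussian noise, visual and proprioceptive estimates are combined with weights inversely proportional to their variances, which pulls the felt hand position toward the seen (rubber) hand when vision is the more reliable cue. The numbers below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Hypothetical single-cue estimates of hand position (cm along the lateral axis)
# and their noise standard deviations: vision of the rubber hand vs. proprioception.
mu_vision, sigma_vision = 15.0, 0.5   # seen (rubber) hand: precise
mu_proprio, sigma_proprio = 0.0, 2.0  # felt (real) hand: less precise

# Maximum-likelihood (inverse-variance-weighted) fusion of the two cues.
w_vision = (1 / sigma_vision**2) / (1 / sigma_vision**2 + 1 / sigma_proprio**2)
mu_combined = w_vision * mu_vision + (1 - w_vision) * mu_proprio
sigma_combined = np.sqrt(1 / (1 / sigma_vision**2 + 1 / sigma_proprio**2))

# The combined estimate is drawn toward the rubber hand, consistent with
# proprioceptive drift; the fused variance is lower than either single cue's.
print(f"combined position: {mu_combined:.2f} cm, sd: {sigma_combined:.2f} cm")
```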

