Seeing pain and pleasure on self and others: behavioral and psychophysiological reactivity in immersive virtual reality

2016 ◽  
Vol 116 (6) ◽  
pp. 2656-2662 ◽  
Author(s):  
M. Fusaro ◽  
G. Tieri ◽  
S. M. Aglioti

Studies have explored behavioral and neural responses to the observation of pain in others. However, much less is known about how taking a physical perspective influences reactivity to the observation of others' pain and pleasure. To explore this issue we devised a novel paradigm in which 24 healthy participants immersed in a virtual reality scenario observed a virtual needle penetrating (pain), a virtual caress (pleasure), or a virtual ball touching (neutral) the hand of an avatar seen from a first-person (1PP) or a third-person (3PP) perspective. Subjective ratings and physiological responses [skin conductance responses (SCR) and heart rate (HR)] were collected in each trial. All participants reported strong feelings of ownership of the virtual hand only in 1PP. Subjective measures also showed that pain and pleasure were experienced as more salient than the neutral condition. SCR analysis demonstrated higher reactivity in 1PP than in 3PP. Importantly, vicarious pain induced stronger responses than the other conditions in both perspectives. HR analysis revealed equally lower activity during pain and pleasure relative to neutral. SCR may reflect egocentric perspective, whereas HR may merely index general arousal. The results suggest that behavioral and physiological indexes of reactivity to seeing others' pain and pleasure were qualitatively similar in 1PP and 3PP. Our paradigm indicates that virtual reality can be used to study vicarious sensations of pain and pleasure without delivering any stimulus to participants' real bodies, and to explore behavioral and physiological reactivity when pain and pleasure are observed from ego- and allocentric perspectives.
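As a rough illustration of the trial-level SCR analysis this kind of study reports, the sketch below scores a skin conductance response as the peak-minus-baseline change in a fixed post-stimulus window. The window bounds, sampling rate, and helper name are illustrative assumptions, not the authors' actual parameters or pipeline.

```python
import numpy as np

def scr_amplitude(signal, fs, onset_s, base_s=1.0, resp_s=5.0):
    """Peak-to-baseline skin conductance change after a stimulus.

    signal: 1-D skin conductance trace (microsiemens)
    fs: sampling rate in Hz
    onset_s: stimulus onset time in seconds
    base_s / resp_s: assumed baseline and response window lengths (s)
    """
    onset = int(onset_s * fs)
    baseline = signal[max(0, onset - int(base_s * fs)):onset].mean()
    window = signal[onset:onset + int(resp_s * fs)]
    return float(window.max() - baseline)

# Toy trace: flat 5 uS baseline, then a 0.8 uS transient rise at t = 2 s
fs = 10
t = np.arange(0, 10, 1 / fs)
trace = 5.0 + np.where(t >= 2, 0.8 * np.exp(-(t - 2)), 0.0)
print(round(scr_amplitude(trace, fs, onset_s=2.0), 2))  # → 0.8
```

Per-condition reactivity would then be the mean of such amplitudes over trials, compared across perspective (1PP vs 3PP) and stimulus type.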

Perception ◽  
2018 ◽  
Vol 47 (5) ◽  
pp. 477-491 ◽  
Author(s):  
Barbara Caola ◽  
Martina Montalti ◽  
Alessandro Zanini ◽  
Antony Leadbetter ◽  
Matteo Martini

Classically, body ownership illusions are triggered by cross-modal synchronous stimulations and hampered by multisensory inconsistencies. Nonetheless, the boundaries of such illusions have been proven to be highly plastic. In this immersive virtual reality study, we explored whether it is possible to induce a sense of body ownership over a virtual body part during visuomotor inconsistencies, with or without the aid of concomitant visuo-tactile stimulations. From a first-person perspective, participants watched a virtual tube moving or an avatar’s arm moving, with or without concomitant synchronous visuo-tactile stimulations on their hand. Three different virtual arm/tube speeds were also investigated, while all participants kept their real arms still. The subjective reports show that synchronous visuo-tactile stimulations effectively counteract the effect of visuomotor inconsistencies, but at slow arm movements, a feeling of body ownership might be successfully induced even without concomitant multisensory correspondences. Possible therapeutic implications of these findings are discussed.


2019 ◽  
Vol 9 (5) ◽  
pp. 449-460 ◽  
Author(s):  
Calum Gordon ◽  
Alba Barbullushi ◽  
Stefano Tombolini ◽  
Federica Margiotta ◽  
Alessia Ciacci ◽  
...  

Aim: Evidence has revealed a relationship between pain and the observation of limb movement, but it is unknown whether different types of movements have diverse modulating effects. In this immersive virtual reality study, we explored the effect of viewing different virtual arm movements (arm vs wrist) on the heat pain thresholds of healthy participants. Patients & methods: 40 healthy participants underwent four conditions in virtual reality, while heat pain thresholds were measured. Visuo-tactile stimulation was used to attempt to modulate the feeling of virtual limb ownership while the participants kept their arms still. Results: Effects on pain threshold were present for type of stimulation but not type of movement. Conclusion: The type of observed movement does not appear to influence pain modulation, at least not during acute pain states.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Jörgen Rosén ◽  
Granit Kastrati ◽  
Aksel Reppling ◽  
Klas Bergkvist ◽  
Fredrik Åhs

Virtual reality lets the user be immersed in a 3-dimensional environment, which can enhance certain emotional responses to stimuli relative to experiencing them on a flat computer screen. We here tested whether displaying two different types of threats in immersive virtual reality enhanced threat-related autonomic responses measured by skin conductance responses (SCRs). We studied innate and learned threat responses because these types of threats have been shown to depend on different neural circuits in animals. Therefore, it is possible that immersive virtual reality may modulate one of these threats but not the other. Innate threat responses were provoked by the sudden appearance of characters at proximal egocentric distance, which was compared to the sudden appearance of distant characters (proximal threat). Learned threat responses were studied by conditioning two of the characters to an electric shock (conditioned threat) and contrasting SCRs to these characters with SCRs to two other characters that were never paired with shock. We found that displaying stimuli in immersive virtual reality enhanced proximal threat responses but not conditioned threat responses. These findings show that immersive virtual reality can enhance an innate type of threat response without affecting a learned threat response, suggesting that separate neural pathways serve these threat responses.


2018 ◽  
Vol 120 (3) ◽  
pp. 1107-1118 ◽  
Author(s):  
Rachele Pezzetta ◽  
Valentina Nicolardi ◽  
Emmanuele Tidoni ◽  
Salvatore Maria Aglioti

Detecting errors in one’s own actions, and in the actions of others, is a crucial ability for adaptable and flexible behavior. Studies show that specific EEG signatures underpin the monitoring of observed erroneous actions (error-related negativity, error positivity, mid-frontal theta oscillations). However, the majority of studies on action observation used sequences of trials where erroneous actions were less frequent than correct actions. Therefore, it was not possible to disentangle whether the activation of the performance monitoring system was due to an error, as a violation of the intended goal, or to a surprise/novelty effect associated with a rare and unexpected event. Combining EEG and immersive virtual reality (IVR-CAVE system), we recorded the neural signal of 25 young adults who observed, in first-person perspective, simple reach-to-grasp actions performed by an avatar aiming for a glass. Importantly, the proportion of erroneous actions was higher than that of correct actions. Results showed that the observation of erroneous actions elicits the typical electrocortical signatures of error monitoring, and therefore the violation of the action goal is still perceived as a salient event. The observation of correct actions elicited stronger alpha suppression. This confirmed the role of the alpha-frequency band in the general orienting response to novel and infrequent stimuli. Our data provide novel evidence that an observed goal error (the action slip) triggers the activity of the performance-monitoring system even when erroneous actions, typically relevant events, occur more often than correct actions and thus lose the salience conferred by rarity.
NEW & NOTEWORTHY Activation of the performance-monitoring system (PMS) is typically investigated when errors in a sequence are comparatively rare. However, whether the PMS is activated by errors per se or by their infrequency is not known. Combining EEG and virtual reality techniques, we found that observing frequent (70%) action errors performed by avatars elicits electrocortical error signatures, suggesting that deviation from the prediction of how learned actions should correctly deploy, rather than error frequency, is coded in the PMS.
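The error signatures mentioned above (e.g., error-related negativity) are conventionally quantified as the mean amplitude of the trial-averaged EEG in a post-event window, contrasted between error and correct observations. The sketch below illustrates that generic step on synthetic data; the window, sampling rate, and simulated deflection are assumptions for illustration, not the study's actual analysis parameters.

```python
import numpy as np

def mean_amplitude(epochs, fs, t0_s, win_s):
    """Mean amplitude of the trial-averaged EEG in a post-event window.

    epochs: (n_trials, n_samples) array time-locked to the action outcome
    fs: sampling rate in Hz; t0_s, win_s: window start and length (s)
    """
    erp = epochs.mean(axis=0)                 # average across trials
    start = int(t0_s * fs)
    return float(erp[start:start + int(win_s * fs)].mean())

# Toy data: "error" epochs carry a negative deflection 0.2-0.3 s post-onset
fs, n = 100, 50
rng = np.random.default_rng(0)
correct = rng.normal(0, 0.1, size=(30, n))
error = rng.normal(0, 0.1, size=(30, n))
error[:, 20:30] -= 2.0                        # simulated error-related negativity

ern_like = mean_amplitude(error, fs, 0.2, 0.1) - mean_amplitude(correct, fs, 0.2, 0.1)
print(ern_like < -1.0)  # error minus correct is clearly more negative
```

The same difference measure applies whether errors are rare or, as in this study, the majority of trials.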


Author(s):  
Amanda J. Haskins ◽  
Jeff Mentch ◽  
Thomas L. Botch ◽  
Caroline E. Robertson

Vision is an active process. Humans actively sample their sensory environment via saccades, head turns, and body movements. Yet, little is known about active visual processing in real-world environments. Here, we exploited recent advances in immersive virtual reality (VR) and in-headset eye-tracking to show that active viewing conditions impact how humans process complex, real-world scenes. Specifically, we used quantitative, model-based analyses to compare which visual features participants prioritize over others while encoding a novel environment in two experimental conditions: active and passive. In the active condition, participants used head-mounted VR displays to explore 360° scenes from a first-person perspective via self-directed motion (saccades and head turns). In the passive condition, 360° scenes were passively displayed to participants within the VR headset while they were head-restricted. Our results show that signatures of top-down attentional guidance increase in active viewing conditions: active viewers disproportionately allocate their attention to semantically relevant scene features, as compared with passive viewers. We also observed increased signatures of exploratory behavior in eye movements, such as quicker, more entropic fixations during active as compared with passive viewing conditions. These results have broad implications for studies of visual cognition, suggesting that active viewing influences every aspect of gaze behavior, from the way we move our eyes to what we choose to attend to, as we construct a sense of place in a real-world environment.
Significance Statement: Eye-tracking in immersive virtual reality offers an unprecedented opportunity to study human gaze behavior under naturalistic viewing conditions without sacrificing experimental control. Here, we advanced this new technique to show how humans deploy attention as they encode a diverse set of 360°, real-world scenes, actively explored from a first-person perspective using head turns and saccades. Our results build on classic studies in psychology, showing that active, as compared with passive, viewing conditions fundamentally alter perceptual processing. Specifically, active viewing conditions increase information-seeking behavior in humans, producing faster, more entropic fixations, which are disproportionately deployed to scene areas that are rich in semantic meaning. In addition, our results offer key benchmark measurements of gaze behavior in 360°, naturalistic environments.
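"More entropic fixations" can be quantified in several ways; one common choice is the Shannon entropy of fixation positions over a spatial grid, which is higher when gaze is spread evenly across the scene. The sketch below assumes normalized gaze coordinates and an 8×8 grid; it is a generic illustration, not the authors' model-based analysis.

```python
import numpy as np

def fixation_entropy(x, y, bins=8):
    """Shannon entropy (bits) of fixation positions over a spatial grid.

    x, y: fixation coordinates normalized to [0, 1]. Higher entropy means
    fixations are spread more evenly across the scene.
    """
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                      # drop empty cells (0 * log 0 = 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
# Clustered fixations (center-biased) vs spread-out exploratory fixations
clustered = rng.normal(0.5, 0.05, size=(2, 200)).clip(0, 1)
spread = rng.uniform(0, 1, size=(2, 200))
print(fixation_entropy(*clustered) < fixation_entropy(*spread))  # → True
```

Comparing this measure between active and passive viewers, per scene, would capture the exploratory spread the abstract describes.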


2020 ◽  
Vol 9 (2) ◽  
pp. 291 ◽  
Author(s):  
Marta Matamala-Gomez ◽  
Birgit Nierula ◽  
Tony Donegan ◽  
Mel Slater ◽  
Maria V. Sanchez-Vives

Changes in body representation may affect pain perception. The effect of a distorted body image, such as the telescoping effect in amputee patients, on pain perception is unclear. This study aimed to investigate whether distorting an embodied virtual arm in virtual reality (simulating the telescoping effect in amputees) modulated pain perception and anticipatory responses to pain in healthy participants. Twenty-seven right-handed participants were immersed in virtual reality, and the virtual arm was shown with three different levels of distortion while a virtual threatening stimulus either approached or contacted the virtual hand. We evaluated pain/discomfort ratings, ownership, and skin conductance responses (SCRs) after each condition. Viewing a distorted virtual arm enhanced the SCR to a threatening event relative to viewing a normal control arm, but when viewing a reddened-distorted virtual arm, the SCR to the threat was comparatively reduced. There was a positive relationship between the level of ownership over the distorted and reddened-distorted virtual arms and the level of pain/discomfort, but not for the normal control arm. Contact with the threatening stimulus significantly enhanced SCR and pain/discomfort, while reduced SCR and pain/discomfort were seen in the simulated-contact condition. These results provide further evidence of a bidirectional link between body image and pain perception.

