Modified sensory feedback enhances the sense of agency during continuous body movements in virtual reality

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Kei Aoyagi ◽  
Wen Wen ◽  
Qi An ◽  
Shunsuke Hamasaki ◽  
Hiroshi Yamakawa ◽  
...  

Abstract The sense of agency refers to the feeling of control over one’s own actions and, through them, over external events. This study examined the effect of modified visual feedback on the sense of agency over one’s body movements, using virtual reality with healthy individuals whose motor control was disturbed. Participants moved a virtual object with their right hand to trace a trajectory (Experiment 1) or follow a leading target (Experiment 2). Their motor control was disturbed by a delay in visual feedback (Experiment 1) or a 1-kg weight attached to their wrist (Experiment 2). In the offset conditions, the virtual object was presented at the median point between the desired position and the participant’s actual hand position. In both experiments, participants reported an improved sense of agency in the offset condition compared with the aligned condition, in which the visual feedback reflected their actual body movements, despite their motion being less precise in the offset condition. The results show that the sense of agency can be enhanced by modifying feedback according to the goal of the motor task, even when the visual feedback is discrepant from the actual body movements. The present study sheds light on the possibility of artificially enhancing body agency to improve voluntary motor control.
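As a rough illustration of the offset condition described above, the sketch below renders the virtual object at the midpoint between the participant’s tracked hand and the task’s desired position. The function name, coordinate convention, and per-frame usage are assumptions for illustration, not the authors’ implementation.

```python
import numpy as np

def offset_feedback(hand_pos: np.ndarray, desired_pos: np.ndarray,
                    offset: bool = True) -> np.ndarray:
    """Return the position at which to render the virtual object.

    Offset condition: the object is drawn at the median point between the
    task-defined desired position and the actual hand position.
    Aligned condition: the object simply mirrors the hand.
    """
    if offset:
        return 0.5 * (hand_pos + desired_pos)  # midpoint between goal and hand
    return hand_pos                            # veridical feedback


# Example: the hand lags 4 cm behind the desired point along x (metres).
hand = np.array([0.10, 0.25, 0.30])
target = np.array([0.14, 0.25, 0.30])
print(offset_feedback(hand, target))  # -> [0.12 0.25 0.3 ]
```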

2016 ◽  
Vol 25 (3) ◽  
pp. 222-233 ◽  
Author(s):  
Jakki O. Bailey ◽  
Jeremy N. Bailenson ◽  
Daniel Casasanto

Can an avatar’s body movements change a person’s perception of good and bad? We discuss virtual embodiment according to theories of embodied cognition (EC) and afferent and sensorimotor correspondences. We present an example study using virtual reality (VR) to test EC theory, examining the effect of altered virtual embodiment on perception. Participants controlled an avatar whose arm movements either matched their own or were the mirror opposite of their arm movements. We measured their associations of “good” and “bad” with the left and right (i.e., space-valence associations). This study demonstrates how VR can be used to examine the possible ways that systems of the body (e.g., visual, motor) may interact to influence cognition. The findings suggest that visual feedback alone is not enough to alter space-valence associations; multiple sensory experiences of media (i.e., sensorimotor feedback), not simply visual feedback, may be necessary to influence cognition.
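A minimal sketch of the mirror-opposite mapping used in this study: the tracked hand position is reflected across the body midline before it drives the avatar’s arm. The midline location and axis convention are assumptions for illustration.

```python
import numpy as np

def mirror_arm(hand_pos: np.ndarray, midline_x: float = 0.0) -> np.ndarray:
    """Reflect a tracked hand position across the body midline (x = midline_x),
    so a rightward movement of the real arm drives a leftward movement of the
    avatar's arm, as in the mirror-opposite condition."""
    mirrored = np.array(hand_pos, dtype=float)
    mirrored[0] = 2.0 * midline_x - mirrored[0]  # flip only the left-right axis
    return mirrored


# A hand 0.3 m to the right of the midline maps to 0.3 m on the left.
print(mirror_arm(np.array([0.3, 1.2, 0.4])))  # -> [-0.3  1.2  0.4]
```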


2009 ◽  
Vol 18 (1) ◽  
pp. 39-53 ◽  
Author(s):  
Anatole Lécuyer

This paper presents a survey of the main results obtained in the field of “pseudo-haptic feedback”: a technique meant to simulate haptic sensations in virtual environments using visual feedback and properties of human visuo-haptic perception. Pseudo-haptic feedback uses vision to distort haptic perception and verges on haptic illusions. It has been used to simulate various haptic properties such as the stiffness of a virtual spring, the texture of an image, or the mass of a virtual object. This paper describes several experiments in which these haptic properties were simulated, and assesses the definition and properties of pseudo-haptic feedback. It also describes several virtual reality applications in which pseudo-haptic feedback has been successfully implemented, such as a virtual environment for vocational training of milling machine operations or a medical simulator for training in regional anesthesia procedures.
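One common way to realize pseudo-haptic stiffness is to scale the displayed displacement of a virtual spring relative to the input-device displacement, so that the same hand movement produces less visual compression for a “stiffer” spring. The linear control/display mapping below is an illustrative assumption, not the specific implementation surveyed here.

```python
def pseudo_haptic_displacement(input_disp: float, simulated_stiffness: float,
                               reference_stiffness: float = 1.0) -> float:
    """Scale the on-screen compression of a virtual spring.

    The input device moves by `input_disp`; the displayed spring compresses
    less when the simulated stiffness is high, so the same movement is
    perceived as stiffer. The linear mapping is only an assumption.
    """
    ratio = reference_stiffness / simulated_stiffness  # control/display ratio
    return input_disp * ratio


# Pressing 2 cm on the device compresses a 'stiff' spring only 0.5 cm on screen.
print(pseudo_haptic_displacement(0.02, simulated_stiffness=4.0))  # -> 0.005
```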


2003 ◽  
Vol 12 (2) ◽  
pp. 140-155 ◽  
Author(s):  
Roy A. Ruddle ◽  
Justin C. D. Savage ◽  
Dylan M. Jones

Three experiments investigated the effect of implementing low-level aspects of motor control for a collaborative carrying task within a virtual environment (VE) interface, leaving participants free to devote their cognitive resources to the higher-level components of the task. In the task, participants collaborated with an autonomous virtual human in an immersive VE to carry an object along a predefined path. In Experiment 1, participants took up to three times longer to perform the task with a conventional VE interface, in which they had to explicitly coordinate their hand and body movements, than with an interface that controlled the low-level tasks of grasping and holding onto the virtual object. Experiments 2 and 3 extended the study to include the task of carrying an object along a path that contained obstacles to movement. By allowing participants' virtual arms to stretch slightly, the interface software was able to take over some aspects of obstacle avoidance (another low-level task), and this led to further significant reductions in the time that participants took to perform the carrying task. Improvements in performance also occurred when participants used a tethered viewpoint to control their movements, because they could see their immediate surroundings in the VEs. This latter finding demonstrates the superiority of a tethered view perspective over a conventional, human's-eye perspective for this type of task.
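The slight arm stretch used for low-level obstacle avoidance can be pictured as a clamp on how far the virtual hand may sit from the shoulder. The sketch below is a hedged approximation; the maximum stretch allowance and the snapping rule are hypothetical parameters, not values from the paper.

```python
import numpy as np

def stretched_hand(shoulder: np.ndarray, grasp_point: np.ndarray,
                   arm_length: float, max_stretch: float = 0.15) -> np.ndarray:
    """Keep the virtual hand attached to the carried object, letting the virtual
    arm stretch slightly beyond its physical length so the interface, not the
    user, absorbs small detours around obstacles."""
    offset = grasp_point - shoulder
    dist = np.linalg.norm(offset)
    limit = arm_length * (1.0 + max_stretch)   # stretched reach
    if dist <= limit:
        return grasp_point                     # reachable: snap hand to the object
    return shoulder + offset * (limit / dist)  # otherwise clamp at the stretched reach


# 0.8 m away with a 0.7 m arm still works thanks to the 15% stretch allowance.
print(stretched_hand(np.zeros(3), np.array([0.8, 0.0, 0.0]), arm_length=0.7))
```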


Electronics ◽  
2021 ◽  
Vol 10 (9) ◽  
pp. 1069
Author(s):  
Deyby Huamanchahua ◽  
Adriana Vargas-Martinez ◽  
Ricardo Ramirez-Mendoza

Exoskeletons are external structural mechanisms with joints and links that work in tandem with the user, increasing, reinforcing, or restoring human performance. Virtual reality can be used to produce environments in which the intensity of practice and feedback on performance can be manipulated to provide tailored motor training. Is it possible to combine both technologies and synchronize them to reach better performance? This paper presents the kinematic analysis for the position and orientation synchronization between an n-DoF upper-limb exoskeleton pose and a projected object in an immersive virtual reality environment using a VR headset. To achieve this goal, the exoskeletal mechanism is analyzed using Euler angles and the Pieper technique to obtain the equations that lead to its orientation, forward, and inverse kinematic models. This paper extends the authors' previous work by using an early-stage upper-limb exoskeleton prototype for the synchronization process.
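As a sketch of the forward-kinematics step behind such a synchronization, the code below composes standard Denavit–Hartenberg transforms to obtain the end-effector pose that would drive the projected VR object. The two-joint parameter set is a hypothetical stand-in for the exoskeleton's actual n-DoF chain, not the paper's model.

```python
import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params) -> np.ndarray:
    """Compose per-joint transforms into the 4x4 end-effector pose."""
    pose = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        pose = pose @ dh_transform(theta, d, a, alpha)
    return pose


# Hypothetical 2-DoF planar arm: two 0.3 m links, both joints at 30 degrees.
dh = [(0.0, 0.3, 0.0), (0.0, 0.3, 0.0)]  # (d, a, alpha) per joint
pose = forward_kinematics(np.radians([30.0, 30.0]), dh)
print(pose[:3, 3])  # end-effector position used to place the VR object
```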


2021 ◽  
pp. 1-18
Author(s):  
Sicong Liu ◽  
Jillian M. Clements ◽  
Elayna P. Kirsch ◽  
Hrishikesh M. Rao ◽  
David J. Zielinski ◽  
...  

Abstract The fusion of immersive virtual reality, kinematic movement tracking, and EEG offers a powerful test bed for naturalistic neuroscience research. Here, we combined these elements to investigate the neuro-behavioral mechanisms underlying precision visual–motor control as 20 participants completed a three-visit, visual–motor, coincidence-anticipation task, modeled after Olympic Trap Shooting and performed in immersive and interactive virtual reality. Analyses of the kinematic metrics demonstrated learning of more efficient movements with significantly faster hand reaction times (RTs), earlier trigger response times, and higher spatial precision, leading to an average of 13% improvement in shot scores across the visits. As revealed through spectral and time-locked analyses of the EEG beta band (13–30 Hz), power measured prior to target launch and visual-evoked potential amplitudes measured immediately after the target launch correlate with subsequent reactive kinematic performance in the shooting task. Moreover, both launch-locked and shot/feedback-locked visual-evoked potentials became earlier and more negative with practice, pointing to neural mechanisms that may contribute to the development of visual–motor proficiency. Collectively, these findings illustrate EEG and kinematic biomarkers of precision motor control and changes in the neurophysiological substrates that may underlie motor learning.
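A minimal sketch of the pre-launch beta-band (13–30 Hz) power measure described above, using Welch's power spectral density on a single-channel epoch. The epoch length, sampling rate, and channel selection are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

def beta_band_power(epoch: np.ndarray, fs: float,
                    band: tuple = (13.0, 30.0)) -> float:
    """Estimate beta-band power for one single-channel EEG epoch (1-D array)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])  # integrate the PSD over 13-30 Hz


# Hypothetical 1 s pre-launch epoch sampled at 500 Hz.
rng = np.random.default_rng(0)
epoch = rng.standard_normal(500)
print(beta_band_power(epoch, fs=500.0))
```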


10.5772/51139 ◽  
2012 ◽  
Author(s):  
Kenji Sato ◽  
Satoshi Fukumori ◽  
Kantaro Miyake ◽  
Daniel Obata ◽  
Akio Gofuku ◽  
...  

2018 ◽  
Vol 18 (2) ◽  
pp. 30-57
Author(s):  
Shamima Yasmin

This paper conducts an extensive survey of existing Virtual Reality (VR)-based rehabilitation approaches in the context of different types of impairments: mobility, cognitive, and visual. Some VR-based assistive technologies involve repetitions of body movements, some require persistent mental exercise, while others work as sensory substitution systems. A multi-modal VR-based environment can incorporate a number of senses (i.e., visual, auditory, or haptic) into the system and can be an immense source of motivation and engagement in comparison with traditional rehabilitation therapy. This survey categorizes virtual environments on the basis of the different available modalities. Each category is further subcategorized by type of impairment, while introducing available devices and interfaces. Before concluding, the paper also briefly discusses some issues with existing VR-based approaches that need to be optimized to exploit the full benefit of virtual environment-based rehabilitation systems.

