Elastic-Arm: Human-scale passive haptic feedback for augmenting interaction and perception in virtual environments

Author(s):  
Merwan Achibet ◽  
Adrien Girard ◽  
Anthony Talvas ◽  
Maud Marchal ◽  
Anatole Lécuyer
2016 ◽  
Vol 25 (1) ◽  
pp. 17-32 ◽  

Haptic feedback is known to improve 3D interaction in virtual environments but current haptic interfaces remain complex and tailored to desktop interaction. In this paper, we describe an alternative approach called “Elastic-Arm” for incorporating haptic feedback in immersive virtual environments in a simple and cost-effective way. The Elastic-Arm is based on a body-mounted elastic armature that links the user's hand to the body and generates a progressive egocentric force when extending the arm. A variety of designs can be proposed with multiple links attached to various locations on the body in order to simulate different haptic properties and sensations such as different levels of stiffness, weight lifting, and bimanual interaction. Our passive haptic approach can be combined with various 3D interaction techniques and we illustrate the possibilities offered by the Elastic-Arm through several use cases based on well-known techniques such as the Bubble technique, redirected touching, and pseudo-haptics. A user study was conducted which showed the effectiveness of our pseudo-haptic technique as well as the general appreciation of the Elastic-Arm. We believe that the Elastic-Arm could be used in various VR applications which call for mobile haptic feedback or human-scale haptic sensations.
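The progressive egocentric force of an elastic link can be sketched with a simple linear-elastic model. This is a hypothetical illustration only; the stiffness and rest-length values are invented, not taken from the paper:

```python
def elastic_force(extension_m: float,
                  stiffness_n_per_m: float = 200.0,
                  rest_length_m: float = 0.1) -> float:
    """Restoring force (N) of one elastic link as the arm extends.

    The force grows progressively with extension beyond the link's
    rest length; a slack link (extension below rest length) pulls nothing.
    """
    stretch = max(0.0, extension_m - rest_length_m)
    return stiffness_n_per_m * stretch

# Links of different stiffness attached to different body locations
# would simulate different levels of perceived stiffness.
soft = elastic_force(0.4, stiffness_n_per_m=100.0)   # 30.0 N
stiff = elastic_force(0.4, stiffness_n_per_m=400.0)  # 120.0 N
```

Because the armature is passive, the rendered force depends only on how far the user extends the arm, which is what makes the approach simple and mobile compared to grounded force-feedback devices.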


2015 ◽  
Vol 1 (1) ◽  
pp. 160-163 ◽  
Author(s):  
Carsten Neupert ◽  
Sebastian Matich ◽  
Peter P. Pott ◽  
Christian Hatzfeld ◽  
Roland Werthschützky

Abstract Pseudo-haptic feedback is a haptic illusion based on a mismatch between haptic and visual perception. It is well known from applications in virtual environments. In this work, we discuss the usability of the principle of pseudo-haptic feedback for teleoperation. Using pseudo-haptic feedback can ease the design of haptic medical teleoperation systems. Thereby, a user's grasping force at an isometric user interface is used to control the closing angle of an end effector of a surgical robot. To provide realistic haptic feedback, the coupling characteristic between grasping force and end-effector closing angle is changed depending on the acting end-effector interaction forces. With an experiment, we show the usability of pseudo-haptic feedback for discriminating compliances comparable to the mechanical characteristics of relaxed and contracted muscles. The results are based on data from 10 subjects and 300 trials.
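The force-to-angle coupling described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the gain and softening constants are invented:

```python
def closing_angle_deg(grasp_force_n: float, interaction_force_n: float,
                      base_gain_deg_per_n: float = 6.0,
                      softening_n: float = 2.0) -> float:
    """Map isometric grasp force to an end-effector closing angle.

    A larger interaction force at the end effector lowers the gain, so
    the user must squeeze harder for the same visible closure. With an
    isometric (non-moving) interface, this visual mismatch is perceived
    as a stiffer grasped object -- the pseudo-haptic illusion.
    """
    gain = base_gain_deg_per_n / (1.0 + interaction_force_n / softening_n)
    return min(90.0, gain * grasp_force_n)

free = closing_angle_deg(5.0, interaction_force_n=0.0)    # 30.0 degrees
loaded = closing_angle_deg(5.0, interaction_force_n=4.0)  # 10.0 degrees
```

The same squeeze produces less closure when the end effector meets resistance, so a compliant object "feels" softer than a stiff one despite the interface itself not moving.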


Author(s):  
Rasul Fesharakifard ◽  
Maryam Khalili ◽  
Laure Leroy ◽  
Alexis Paljic ◽  
Philippe Fuchs

A grasp exoskeleton actuated by a string-based platform is proposed to provide force feedback for a user's hand in human-scale virtual environments. The user of this interface has access to seven active degrees of freedom in interaction with virtual objects: three degrees of translation, three degrees of rotation, and one degree of grasping. The exoskeleton has a light, ergonomic structure and supports the grasp gesture for five fingers. The actuation of the exoskeleton is performed by eight strings that form the parallel arms of the platform. Each string is connected to a block comprising a motor, a rotary encoder, and a force sensor, with a novel design that provides the force and precision required for the interface. A hybrid control method based on the string tension measured by the force sensor is developed to resolve the typical problems of string-based interfaces. The blocks can be moved on a cubic frame around the virtual environment. Finally, the results of preliminary experiments with the interface are presented to show its practical characteristics. The interface is also mounted on an automotive model to demonstrate its industrial applicability.
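One typical problem of string-based interfaces is slack: strings can only pull, never push, so each motor must keep its string taut at all times. A tension-based control step might be sketched as below; the minimum-tension floor and proportional gain are assumptions for illustration, not the paper's hybrid controller:

```python
def tension_command(desired_n: float, measured_n: float,
                    min_tension_n: float = 1.0, kp: float = 0.5) -> float:
    """One step of a tension loop for a single string.

    The commanded tension is floored at a small positive value so the
    string never goes slack, then corrected proportionally using the
    tension measured by the string's force sensor.
    """
    target = max(desired_n, min_tension_n)  # strings can only pull
    return target + kp * (target - measured_n)

idle = tension_command(0.0, measured_n=1.0)    # holds the 1.0 N floor
active = tension_command(5.0, measured_n=4.0)  # 5.5 N: tracks 5 N, corrects lag
```

Combining eight such per-string loops with the platform's geometry is what lets the parallel string arrangement render a net force and torque on the hand.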


2000 ◽  
Vol 9 (5) ◽  
pp. 486-496 ◽  
Author(s):  
A. C. Boud ◽  
C. Baber ◽  
S. J. Steiner

This paper reports on an investigation into the proposed usability of virtual reality for a manufacturing application such as the assembly of a number of component parts into a final product. Before the assembly task itself is considered, the investigation explores the use of VR for the training of human assembly operators and compares the findings to conventionally adopted techniques for parts assembly. The investigation highlighted several limitations of using VR technology. Most significant was the lack of haptic feedback provided by current input devices for virtual environments. To address this, an instrumented object (IO) was employed that enabled the user to pick up and manipulate the IO as the representation of a component from a product to be assembled. The reported findings indicate that object manipulation times are superior when IOs are employed as the interaction device, and that IO devices could therefore be adopted in VEs to provide haptic feedback for diverse applications and, in particular, for assembly task planning.


Author(s):  
Shujie Deng ◽  
Julie A. Kirkby ◽  
Jian Chang ◽  
Jian Jun Zhang

The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games and we provide a brief discussion of why cognitive processes involved in learning and training are enhanced under immersive virtual environments. We initially outline studies that have used eye tracking and haptic feedback independently in serious games, and then review some innovative applications that have already combined eye tracking and haptic devices in order to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding in multimodal serious game production as well as exploring possible areas for new applications.


2006 ◽  
Vol 5 (2) ◽  
pp. 37-44 ◽  
Author(s):  
Paul Richard ◽  
Damien Chamaret ◽  
François-Xavier Inglese ◽  
Philippe Lucidarme ◽  
Jean-Louis Ferrier

This paper presents a human-scale virtual environment (VE) with haptic feedback, along with two experiments performed in the context of product design. The user interacts with a virtual mock-up using a large-scale bimanual string-based haptic interface called SPIDAR (Space Interface Device for Artificial Reality). An original self-calibration method is proposed. A vibro-tactile glove was developed and integrated with the SPIDAR to provide tactile cues to the operator. The purpose of the first experiment was (1) to examine the effect of tactile feedback in a task involving reach-and-touch of different parts of a digital mock-up, and (2) to investigate the use of sensory substitution in such tasks. The second experiment aimed to investigate the effect of visual and auditory feedback in a car-light maintenance task. Results of the first experiment indicate that users could easily and quickly access and finely touch the different parts of the digital mock-up when sensory feedback (either visual, auditory, or tactile) was present. Results of the second experiment show that visual and auditory feedback improve average placement accuracy by about 54% and 60%, respectively, compared to the open-loop case.


Author(s):  
Florian Klompmaker ◽  
Alexander Dridger ◽  
Karsten Nebe

Since 2010, when the Microsoft Kinect with its integrated depth-sensing camera appeared on the market, completely new kinds of interaction techniques have been integrated into console games. They require no instrumentation and no complicated calibration or time-consuming setup. But despite these benefits, some drawbacks exist. Most games only enable the user to perform very simple gestures like waving, jumping, or stooping, which is not the natural behavior of a user. In addition, depth-sensing technology lacks haptic feedback. We cannot solve the lack of haptic feedback, but we want to improve whole-body interaction. This work focuses on whole-body interaction in immersive virtual environments. We present 3D interaction techniques that give the user a maximum of freedom and enable her to interact precisely and immersively in virtual environments. Furthermore, we present a user study in which we analyzed how navigation and manipulation techniques can be performed through users' body interaction using a depth-sensing camera and a large projection screen. Three alternative approaches were developed and tested: classical gamepad interaction, an indirect pointer-based interaction, and a more direct whole-body interaction technique. We compared their effectiveness and precision. It turned out that users act faster while using the gamepad, but generate significantly more errors at the same time. Using depth-sensing-based whole-body interaction techniques, it became apparent that the interaction is much more immersive, natural, and intuitive, even if slower. We show the advantages of our approach and how it can be used in various domains, more effectively and efficiently for their users.


Author(s):  
Evagoras G. Xydas ◽  
Loucas S. Louca

In the area of rehabilitation robotics, many researchers have investigated the therapeutic effects of forces that are proportional to the difference between the user's hand trajectory and an optimal trajectory, usually based on the Minimum Jerk Model (MJM). Forces applied in different studies were based on MJM trajectory variables, e.g., velocity, acceleration, and position. Consequently, the MJM is a key component of upper-limb robotic rehabilitation. However, it is critical to establish the validity of this model in the working environment prior to employing it as a reference control function. This work investigates the validity of the MJM in a haptic-virtual environment. The original ‘real’ tests (with no obstacles) that were used to validate the MJM in planar motion are duplicated in a virtual environment. Haptic feedback is provided by a Phantom 1.5 High Force haptic interface. Experiments with healthy users are performed to investigate the validity of the MJM under virtual reality conditions. The experiments demonstrated that the MJM is also valid in virtual environments. Nevertheless, it was found that in the virtual world, longer durations are required for completing the tasks than in the real world. The results of this work will be used in the design of haptic-virtual environments for the rehabilitation of the upper limbs of people with neuro-disabilities. Therapeutic forces based on the MJM can be applied given that the model is valid in virtual environments.
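The Minimum Jerk Model referenced above has a standard closed form for point-to-point planar reaches: x(t) = x0 + (xf − x0)(10τ³ − 15τ⁴ + 6τ⁵) with τ = t/T. A minimal sketch of the reference trajectory such controllers compare against:

```python
def minimum_jerk(x0: float, xf: float, t: float, duration: float) -> float:
    """Minimum Jerk Model position at time t for a point-to-point reach.

    x(t) = x0 + (xf - x0) * (10*tau**3 - 15*tau**4 + 6*tau**5), tau = t/T.
    The resulting velocity profile is bell-shaped: the hand starts and
    ends at rest and passes the midpoint exactly halfway through.
    """
    tau = min(max(t / duration, 0.0), 1.0)
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s

mid = minimum_jerk(0.0, 0.2, t=0.5, duration=1.0)  # 0.1 m, the midpoint
```

A therapeutic controller of the kind described would apply a corrective force proportional to the gap between the measured hand position and this reference at each instant.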


1999 ◽  
Author(s):  
Wan-Chen Wu ◽  
Cagatay Basdogan ◽  
Mandayam A. Srinivasan

Abstract Human psychophysical experiments were designed and conducted to investigate the effect of 3D perspective visual images on the visual and haptic perception of size and stiffness in multimodal virtual environments (VEs). Virtual slots of varying length and buttons of varying stiffness were displayed to the subjects, who then were asked to discriminate their size and stiffness respectively using visual and/or haptic cues. The results of the size experiments show that under vision alone, farther objects are perceived to be smaller due to perspective cues and the addition of haptic feedback reduces this visual bias. Similarly, the results of the stiffness experiments show that compliant objects that are farther are perceived to be softer when there is only haptic feedback and the addition of visual feedback reduces this haptic bias. Hence, we conclude that our visual and haptic systems compensate for each other such that the sensory information that comes from visual and haptic channels is fused in an optimal manner.
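The "optimal manner" of fusion in the conclusion is commonly modeled as reliability-weighted (maximum-likelihood) combination of independent cues. The sketch below is our illustration of that standard model, not the authors' own analysis:

```python
def fuse_cues(mu_vision: float, var_vision: float,
              mu_haptic: float, var_haptic: float) -> tuple:
    """Variance-weighted fusion of two independent Gaussian cues.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate is less variable than either cue alone, which is why adding
    a second modality reduces the bias of the first.
    """
    w_v = (1.0 / var_vision) / (1.0 / var_vision + 1.0 / var_haptic)
    mu = w_v * mu_vision + (1.0 - w_v) * mu_haptic
    var = 1.0 / (1.0 / var_vision + 1.0 / var_haptic)
    return mu, var

# Equally reliable cues: the fused size estimate lands halfway between.
mu, var = fuse_cues(10.0, 1.0, 12.0, 1.0)  # (11.0, 0.5)
```

Under this model, a biased but low-variance cue (e.g. perspective-distorted vision) pulls the fused percept toward itself, consistent with the compensation the experiments report.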

