Evaluation of Whole-Body Navigation and Selection Techniques in Immersive 3D Environments

Author(s):  
Florian Klompmaker ◽  
Alexander Dridger ◽  
Karsten Nebe

Since the Microsoft Kinect with its integrated depth-sensing camera appeared on the market in 2010, entirely new kinds of interaction techniques have been integrated into console games. They require no instrumentation, no complicated calibration, and no time-consuming setup. Despite these benefits, drawbacks remain: most games only let the user perform very simple gestures such as waving, jumping, or stooping, which does not reflect natural behaviour, and the depth-sensing technology itself provides no haptic feedback. While we cannot remedy the missing haptic feedback, we aim to improve whole-body interaction. This work focuses on whole-body interaction in immersive virtual environments and presents 3D interaction techniques that give the user a maximum of freedom and enable precise, immersive operation. Furthermore, we present a user study in which we analysed how navigation and manipulation techniques can be performed through body interaction using a depth-sensing camera and a large projection screen. To this end, three alternative approaches were developed and tested: classical gamepad interaction, an indirect pointer-based interaction, and a more direct whole-body interaction technique. We compared their effectiveness and precision. Users acted faster with the gamepad but generated significantly more errors at the same time, whereas the depth-sensing-based whole-body interaction techniques proved much more immersive, natural, and intuitive, even if slower. We show the advantages of our approach and how it can be applied in various domains to serve users more effectively and efficiently.
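As a minimal illustrative sketch of the indirect pointer-based technique mentioned above (not the authors' implementation), the fragment below casts a ray from a tracked shoulder joint through the hand joint and intersects it with the projection-screen plane to obtain a 2D cursor; joint names, coordinates, and the screen depth are hypothetical placeholders for data a depth-sensing camera would supply.

def hand_ray_to_cursor(shoulder, hand, screen_z=2.0):
    # Cast a ray from the shoulder joint through the hand joint and
    # intersect it with a vertical screen plane at depth screen_z (metres).
    dx, dy, dz = (hand[i] - shoulder[i] for i in range(3))
    if abs(dz) < 1e-6:        # ray runs parallel to the screen plane
        return None
    t = (screen_z - shoulder[2]) / dz
    if t <= 0:                # screen lies behind the user
        return None
    return shoulder[0] + t * dx, shoulder[1] + t * dy   # 2D cursor (x, y)

# Example: right arm extended forward and slightly to the right.
print(hand_ray_to_cursor(shoulder=(0.2, 1.4, 0.0), hand=(0.45, 1.35, 0.6)))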

2016 ◽  
Vol 25 (1) ◽  
pp. 17-32 ◽  
Author(s):  
Merwan Achibet ◽  
Adrien Girard ◽  
Maud Marchal ◽  
Anatole Lécuyer

Haptic feedback is known to improve 3D interaction in virtual environments, but current haptic interfaces remain complex and tailored to desktop interaction. In this paper, we describe an alternative approach called “Elastic-Arm” for incorporating haptic feedback into immersive virtual environments in a simple and cost-effective way. The Elastic-Arm is based on a body-mounted elastic armature that links the user's hand to the body and generates a progressive egocentric force as the arm extends. A variety of designs can be proposed, with multiple links attached to various locations on the body to simulate different haptic properties and sensations such as different levels of stiffness, weight lifting, and bimanual interaction. Our passive haptic approach can be combined with various 3D interaction techniques, and we illustrate the possibilities offered by the Elastic-Arm through several use cases based on well-known techniques such as the Bubble technique, redirected touching, and pseudo-haptics. A user study showed the effectiveness of our pseudo-haptic technique as well as the general appreciation of the Elastic-Arm. We believe that the Elastic-Arm could be used in various VR applications that call for mobile haptic feedback or human-scale haptic sensations.
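The progressive egocentric force described above can be pictured with a simple Hooke-like restoring force that grows as the hand moves away from the body; the rest length and stiffness values below are illustrative assumptions, not parameters reported by the authors.

def elastic_arm_force(hand_to_torso_dist, rest_length=0.25, stiffness=120.0):
    # Magnitude (in newtons) of the elastic pull back toward the body.
    # Distances are in metres; no force is applied below the rest length.
    extension = max(0.0, hand_to_torso_dist - rest_length)
    return stiffness * extension   # force grows as the arm is extended

for d in (0.2, 0.4, 0.6):
    print(f"{d:.1f} m -> {elastic_arm_force(d):.1f} N")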


2016 ◽  
Vol 78 (12-3) ◽  
Author(s):  
Arief Hydayat ◽  
Haslina Arshad ◽  
Nazlena Mohamad Ali ◽  
Lam Meng Chun

In a 3D user interface, interaction plays an important role in helping users manipulate 3D objects in virtual environments. 3D devices such as data gloves and motion trackers can give users the opportunity to manipulate 3D objects in virtual reality environments, for example checking, zooming, translating, rotating, merging, and splitting 3D objects, in a more natural and easy manner through hand gestures. Hand gestures are often applied in 3D interaction techniques for switching the manipulation mode. This paper discusses an interaction technique for virtual environments that combines Push-and-Pull navigation with a rotation technique. The unimanual use of these 3D interaction techniques can improve users' effectiveness in interacting with and manipulating 3D objects. This study enhances the capability of the unimanual 3D interaction technique in terms of 3D interaction feedback in virtual environments.
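As a rough sketch of the gesture-driven mode switching mentioned above (the gesture names and modes are assumptions for illustration and may differ from the paper's actual gesture set), a unimanual mapping from recognised gestures to manipulation modes could look like this:

MODE_BY_GESTURE = {
    "fist":        "translate",   # grab and drag the object
    "pinch":       "rotate",      # twist the wrist to rotate
    "open_palm":   "navigate",    # Push-and-Pull style navigation
    "two_fingers": "zoom",
}

def update_mode(current_mode, recognised_gesture):
    # Switch manipulation mode on a known gesture, otherwise keep the current one.
    return MODE_BY_GESTURE.get(recognised_gesture, current_mode)

mode = "navigate"
for g in ["fist", "unknown", "pinch"]:
    mode = update_mode(mode, g)
    print(g, "->", mode)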


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3673
Author(s):  
Stefan Grushko ◽  
Aleš Vysocký ◽  
Petr Oščádal ◽  
Michal Vocetka ◽  
Petr Novák ◽  
...  

In a collaborative scenario, communication between humans and robots is fundamental to achieving good efficiency and ergonomics in task execution. Much research has addressed enabling a robot system to understand and predict human behaviour, allowing the robot to adapt its motion to avoid collisions with human workers. When the production task has a high degree of variability, the robot's movements can be difficult to predict, and a worker who has no information about the robot's planned movement may feel anxious when the robot changes its trajectory and approaches. Additionally, without information about the robot's movement, the human worker cannot effectively plan their own activity without forcing the robot to constantly replan its motion. We propose a novel approach to communicating the robot's intentions to a human worker. Collaboration is improved by introducing haptic feedback devices that notify the human worker about the robot's currently planned trajectory and changes in its status. To verify the effectiveness of the developed human-machine interface in a shared collaborative workspace, a user study was designed and conducted among 16 participants, whose objective was to accurately recognise the goal position of the robot during its movement. Data collected during the experiment included both objective and subjective parameters. Statistically significant results indicated that all participants improved their task completion time by over 45% and were generally more subjectively satisfied when completing the task with the haptic feedback devices. The results also suggest the usefulness of the developed notification system, as it improved users' awareness of the robot's motion plan.
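One plausible way to drive such a wearable notification, sketched here purely for illustration and not as the authors' design, is to map the distance between the worker's hand and the robot's planned goal position to a vibration intensity; the thresholds and the linear ramp are assumptions.

def vibration_intensity(hand_pos, robot_goal_pos, near=0.2, far=1.0):
    # Duty cycle in [0, 1]: silent beyond `far` metres from the planned goal,
    # full vibration within `near` metres, linear ramp in between.
    dist = sum((h - g) ** 2 for h, g in zip(hand_pos, robot_goal_pos)) ** 0.5
    if dist >= far:
        return 0.0
    if dist <= near:
        return 1.0
    return (far - dist) / (far - near)

print(vibration_intensity((0.1, 0.0, 0.5), (0.1, 0.0, 0.9)))  # -> 0.75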


2020 ◽  
Vol 4 (4) ◽  
pp. 78
Author(s):  
Andoni Rivera Pinto ◽  
Johan Kildal ◽  
Elena Lazkano

In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique needs the robot to be available for programming and not in operation, which means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic representation of the robot's digital twin, using augmented reality technologies. However, this approach is limited by the lack of tangibility of the visual holograms that the user tries to grab. We present an interface in which some of that tangibility is provided through ultrasound-based mid-air haptic actuation. We report a user study evaluating the impact of such haptic feedback on a pick-and-place task performed on the wrist of a holographic robot arm, which we found to be beneficial.
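As a hedged sketch of how such feedback could be gated (function and parameter names are hypothetical placeholders rather than any real mid-air haptics SDK), the ultrasound emitter would be activated while the tracked hand lies within a small grab zone around the holographic wrist:

import math

def should_emit_haptics(hand_pos, wrist_hologram_pos, grab_radius=0.08):
    # True while the hand is within grab_radius metres of the holographic
    # wrist, i.e. while a mid-air "grab" sensation should be rendered.
    return math.dist(hand_pos, wrist_hologram_pos) <= grab_radius

print(should_emit_haptics((0.02, 1.10, 0.40), (0.00, 1.12, 0.42)))  # True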

