A Smart Interface HUD Optimized for VR HMD and Leap Motion

Author(s):  
Keonhee Park ◽  
Seongah Chin

Virtual reality head-mounted displays (VR HMDs) can present three-dimensional content that substantially increases user immersion. However, user interactions with VR HMDs often fall short of what a virtual environment demands. In this article, the authors propose a smart human–computer interaction technique that runs on a VR HMD and Leap Motion. They focus on improving interaction methods for VR content and on providing the interfaces and information needed to operate such content in a user-friendly manner, which should reduce fatigue and dizziness. To this end, they propose and validate a smart head-up display (HUD) suited to VR content: a virtual HUD designed and optimized for use with VR HMD and Leap Motion technologies. Comparative performance tests are presented to validate the proposed system.
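The hand-driven HUD interaction described above can be sketched generically. This is a minimal illustration, not the authors' implementation: the fingertip coordinates, threshold value, and function names below are all hypothetical, standing in for the per-frame fingertip positions a hand tracker such as Leap Motion supplies.

```python
import math

PINCH_THRESHOLD_M = 0.03  # hypothetical pinch-distance threshold, in meters


def is_pinching(thumb_tip, index_tip) -> bool:
    """Detect a pinch from two fingertip positions (x, y, z) in meters.

    A hand tracker such as Leap Motion reports fingertip positions
    every frame; a pinch is simply the two tips coming close together.
    """
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_M


def toggle_hud(hud_visible: bool, pinching: bool, was_pinching: bool) -> bool:
    """Toggle HUD visibility on the rising edge of a pinch gesture,
    so holding the pinch does not flicker the panel on and off."""
    if pinching and not was_pinching:
        return not hud_visible
    return hud_visible


# One frame of interaction: fingertips 1 cm apart counts as a pinch.
visible = toggle_hud(False, is_pinching((0, 0, 0), (0.01, 0, 0)), False)
```

Edge-triggered toggling like this is a common way to keep gesture interfaces from reacting continuously while a gesture is held.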

2021 ◽  
Vol 15 (2) ◽  
pp. 49-55
Author(s):  
Dino Caesaron ◽  
Rio Prasetyo Lukodono ◽  
Yunita Nugrahaini Safrudin

User performance when interacting with three-dimensional (3D) objects has become an important issue in the recent development of virtual reality (VR) applications, and a central premise of current VR is that it supports viable interfaces between humans and machines. This research focuses on users' interaction techniques, comparing two approaches (direct and indirect) for interacting with three-dimensional objects. Many potential VR applications depend on a few fundamental visual and cognitive activities in the Virtual Environment (VE), such as the interpretation of space, which is not yet well established for either direct or indirect perception. The experiment was performed in a stereoscopic environment using a reciprocal tapping task: participants selected a stereoscopic spherical target using both a direct pointing technique and an indirect cursor technique. The results show that, with the direct interaction technique, users' perception of an object tends to converge toward the center of the simulated area; this convergence did not appear in the indirect cursor condition. Pointing estimation was more accurate with the indirect interaction approach. The findings offer insight into how users interact in a stereoscopic environment, and developers of virtual environments can use these results when designing effective user interfaces for specific interaction techniques.
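The reciprocal tapping task mentioned above is the classic Fitts paradigm, and interaction techniques are typically compared through the fitted Fitts' law constants. A minimal sketch, assuming the standard Shannon formulation (the intercept `a` and slope `b` below are placeholders that would be fitted per technique from the experimental data):

```python
import math


def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), for target distance D and target width W."""
    return math.log2(distance / width + 1.0)


def predicted_movement_time(a: float, b: float,
                            distance: float, width: float) -> float:
    """Fitts' law: MT = a + b * ID.  The constants a (seconds) and
    b (seconds/bit) are regression coefficients fitted separately for
    each interaction technique, e.g. direct pointing vs. indirect cursor."""
    return a + b * index_of_difficulty(distance, width)


# Example: targets 0.30 m apart, 0.05 m wide -> ID = log2(7) bits.
id_bits = index_of_difficulty(0.30, 0.05)
```

Comparing the fitted slopes `b` between the direct and indirect conditions is one conventional way to quantify which technique transmits pointing information faster.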


Author(s):  
Stefan Bittmann

Virtual reality (VR) is the term used to describe representation and perception in a computer-generated, virtual environment. The term was coined by author Damien Broderick in his 1982 novel "The Judas Mandala". The term "mixed reality" describes the mixing of virtual reality with pure reality; the term "hyper-reality" is also used. Immersion plays a major role here: it describes the embedding of the user in the virtual world. A virtual world is considered plausible if interaction within it is internally consistent; this interactivity creates the illusion that what seems to be happening is actually happening. A common problem with VR is motion sickness. To create a sense of immersion, special output devices are needed to display virtual worlds; head-mounted displays, CAVEs, and shutter glasses are mainly used. Input devices are needed for interaction: the 3D mouse, data glove, and flystick, as well as the omnidirectional treadmill, which maps real walking movements to walking in virtual space.


Author(s):  
Tushar H. Dani ◽  
Rajit Gadh

Abstract Despite advances in Computer-Aided Design (CAD) and the evolution of graphical user interfaces, rapid creation, editing, and visualization of three-dimensional (3D) shapes remains a tedious task. Although Virtual Reality (VR)-based systems allow enhanced three-dimensional interaction and visualization, the use of VR for ab initio shape design, as opposed to importing models from existing CAD systems, is a relatively new area of research. Of interest are computer–human interaction issues and the design and geometric tools for shape modeling in a Virtual Environment (VE). The focus of this paper is on the latter, i.e., defining the geometric tools required for a VR-CAD system and describing a framework that meets those requirements. This framework, the Virtual Design Software Framework (VDSF), consists of the interaction and design tools and an underlying geometric engine that provides the representation and algorithms required by these tools. The geometric engine, called the Virtual Modeler, uses a graph-based representation (Shape-Graph) for modeling the shapes created by the user. The Shape-Graph facilitates interactive editing by localizing the effect of editing operations and, in addition, provides constraint-based design and editing mechanisms that are useful in a 3D interactive virtual environment. The paper concludes with a description of the prototype system, called the Virtual Design Studio (VDS), that is currently being implemented.
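The localized-editing idea behind a graph-based shape representation can be sketched in a few lines. This is a hypothetical toy, not the actual Shape-Graph data structure: nodes stand in for geometric entities, edges record which entities constrain one another, and an edit only returns the neighbors that must be re-evaluated rather than the whole model.

```python
class ShapeGraph:
    """Toy graph-based shape representation illustrating localized editing.

    Nodes hold named geometric parameters; edges record constraint
    relationships, so an edit touches only a node's neighborhood.
    """

    def __init__(self):
        self.nodes = {}       # entity name -> parameter value
        self.adjacency = {}   # entity name -> set of constrained neighbors

    def add_node(self, name, value):
        self.nodes[name] = value
        self.adjacency.setdefault(name, set())

    def add_constraint(self, a, b):
        """Record that editing entity a affects entity b, and vice versa."""
        self.adjacency[a].add(b)
        self.adjacency[b].add(a)

    def edit(self, name, value):
        """Apply an edit and return only the entities needing re-evaluation."""
        self.nodes[name] = value
        return sorted(self.adjacency[name])


g = ShapeGraph()
for entity in ("face_top", "face_side", "fillet"):
    g.add_node(entity, 1.0)
g.add_constraint("face_top", "fillet")
affected = g.edit("face_top", 2.0)   # only "fillet", not the whole model
```

The payoff in an interactive VE is that the cost of an edit scales with the node's degree rather than with model size, which is what makes dragging a face with a tracked hand feasible at frame rate.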


2010 ◽  
pp. 180-193 ◽  
Author(s):  
F. Steinicke ◽  
G. Bruder ◽  
J. Jerald ◽  
H. Frenz

In recent years virtual environments (VEs) have become increasingly popular and widespread, driven by the requirements of numerous application areas, in particular 3D city visualization. Virtual reality (VR) systems, which use tracking technologies and stereoscopic projections of three-dimensional synthetic worlds, support better exploration of complex datasets. However, because the range of the tracking sensors limits the interaction space, users can explore only a portion of the virtual environment (VE). Redirected walking allows users to walk through large-scale immersive virtual environments (IVEs), such as virtual city models, while physically remaining in a reasonably small workspace, by intentionally injecting scene motion into the IVE. With redirected walking, users are guided on physical paths that may differ from the paths they perceive in the virtual world. The authors conducted experiments to quantify how much humans can be redirected without noticing. In this chapter they present the results of this study and its implications for virtual locomotion user interfaces that allow users to view arbitrary real-world locations before actually traveling there in a natural environment.
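The scene-motion injection at the heart of redirected walking comes down to applying small gains to the user's tracked motion. A minimal sketch of two standard manipulations, rotation gain and curvature gain (the numeric values below are illustrative; the detection thresholds are exactly what the experiments described here set out to measure):

```python
def redirect_heading(real_yaw_delta: float, rotation_gain: float) -> float:
    """Scale the user's real head rotation before applying it in the VE.

    A gain slightly above or below 1.0 makes the user turn more or less
    in the real world than they perceive virtually; gains close to 1.0
    can stay below the user's detection threshold.
    """
    return real_yaw_delta * rotation_gain


def inject_curvature(step_length: float, curvature_radius: float) -> float:
    """Extra virtual yaw (radians) injected per step so that a straight
    virtual path maps onto a circular real-world arc of the given radius,
    keeping the user inside a bounded tracking space."""
    return step_length / curvature_radius


# Illustrative: 0.7 m steps redirected onto a 22 m real-world circle
# injects only a small per-step rotation offset.
yaw_per_step = inject_curvature(0.7, 22.0)
```

The practical question, and the subject of the study, is how large these gains can grow before users consciously notice the mismatch between their real and virtual paths.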


Author(s):  
David Sproule ◽  
Rosemarie Figueroa Jacinto ◽  
Steve Rundell ◽  
Jacob Williams ◽  
Sam Perlmutter ◽  
...  

Virtual reality (VR) and personal head-mounted displays (HMDs) can be a viable tool for the presentation of scientifically accurate and valid demonstrative data in the courtroom. However, the capabilities and limitations of the technology need to be fully characterized. The current pilot study evaluated visual acuity and contrast sensitivity using two commercially available HMDs (Oculus Rift and HTC Vive Pro). Preliminary findings indicated that visual acuity and contrast sensitivity experienced in VR may be less than what is experienced in real-world scenarios. The current pilot study provides a quantitative approach for characterizing the limitations of VR with respect to visual acuity and contrast sensitivity, and provides recommendations for the appropriate use of this technology when performing forensic investigations and developing visualization tools.
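Why acuity in an HMD falls short of real-world acuity follows directly from angular pixel density. A back-of-the-envelope sketch (the display numbers below are illustrative approximations, not official device specifications, and the calculation ignores lens distortion and fill factor):

```python
def pixels_per_degree(horizontal_pixels: float,
                      horizontal_fov_deg: float) -> float:
    """Average angular pixel density of a display (pixels per degree),
    ignoring lens distortion, which varies density across the field."""
    return horizontal_pixels / horizontal_fov_deg


def snellen_denominator(ppd: float) -> float:
    """Best Snellen acuity a display can render.

    20/20 vision resolves 1 arcminute, i.e. 60 pixels per degree;
    a display with fewer pixels per degree scales the denominator:
    20/(20 * 60 / ppd).
    """
    return 20.0 * 60.0 / ppd


# Illustrative per-eye figures (approximate, for the sketch only):
ppd = pixels_per_degree(1440, 100.0)   # ~14.4 px/deg
limit = snellen_denominator(ppd)       # roughly 20/80-class acuity at best
```

This kind of estimate matches the pilot study's direction of finding: current consumer HMDs cannot render detail at the angular resolution the eye achieves in the real world, which matters when VR evidence must be held to forensic standards.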


2005 ◽  
Vol 32 (5) ◽  
pp. 777-785 ◽  
Author(s):  
Ebru Cubukcu ◽  
Jack L Nasar

Discrepancies between perceived and actual distance may affect people's spatial behavior. In a previous study, Nasar, using self-reported behavior, found that segmentation (measured by the number of buildings) along a route affected choice of parking garage and of path from the parking garage to a destination. We recreated that same environment in a three-dimensional virtual environment and tested whether the same factors emerged under these more controlled conditions, and whether spatial behavior in the virtual environment accurately reflected behavior in the real environment. The results confirmed similar patterns of response in the virtual and real environments. This supports the use of virtual reality as a tool for predicting behavior in the real world and confirms that increases in segmentation are related to increases in perceived distance.


2020 ◽  
Author(s):  
Simone Grassini ◽  
Karin Laumann ◽  
Ann Kristin Luzi

Many studies have attempted to understand which individual differences may be related to symptoms of discomfort during the virtual experience (simulator sickness) and to the generally positive sense of being inside the simulated scene (sense of presence). Nevertheless, due to rapid technological advancement in the field of virtual reality, most of these studies are now outdated. Modern virtual reality is commonly mediated by head-mounted displays (HMDs), which aim to increase the user's sense of presence, remove stimuli from the external environment, and provide high-definition, photorealistic, three-dimensional images. Our results showed that motion sickness susceptibility and simulator sickness are related, and that neuroticism may be associated with and predict simulator sickness. Furthermore, participants who play video games more frequently were less susceptible to simulator sickness, and female participants reported more simulator sickness than males (but only for nausea-related symptoms). Female participants also experienced a higher sense of presence than males. We suggest that published findings on simulator sickness and the sense of presence in virtual reality environments need to be replicated with modern HMDs.


Author(s):  
Mikhail Mikhaylyuk ◽  
Andrey Maltsev ◽  
Evgeny Strashnov

This paper presents original solutions for creating a training complex that teaches cosmonauts to control a space jet pack for self-rescue in an emergency. An approach is proposed in which training is carried out in a virtual environment using virtual reality gloves and a headset. The idea is that the virtual space jet pack model is controlled through interaction between virtual hands, which copy the movements of the cosmonaut's hands, and a three-dimensional model of the jet pack's control panel. To implement the training complex, methods and approaches were developed for simulating the synchronized movement of virtual and real hands, as well as for simulating the jet pack's control panel and thrusters. The proposed methods and approaches were tested as part of the authors' virtual environment system VirSim, developed at SRISA RAS. The results can be used to create a training complex for teaching cosmonauts to rescue themselves if they accidentally separate from the International Space Station.


2008 ◽  
Vol 575-578 ◽  
pp. 709-715 ◽  
Author(s):  
Zhen Xin Liang ◽  
Jian Xun Zhang ◽  
Yi Pei

Gas Tungsten Arc Welding (GTAW) is an important process in material processing, and welding quality is vital to product quality. Raising welders' skill and handling technique is an important way to improve product quality. In this investigation, to address the deficiencies of classic welding training schemes, new technologies were introduced into the traditional welding training field to improve training efficiency and reduce training cost. A computer simulation system for training novice welders was developed, combining three-dimensional stereoscopic vision with the OpenGL API, with virtual reality as the kernel technology. In this system, welders train not in a real operating environment but in a computer-simulated virtual environment that provides a first-person experience. The virtual environment consumes no welding material or welding energy, is highly automated and intelligent, and places lower demands on welding instructors. It is also healthier for welders, because the intense arc and harmful dust are absent. Using a virtual training system to train new welders is thus a low-cost, high-efficiency method.


Author(s):  
Hugo I. Medellín-Castillo ◽  
Germánico González-Badillo ◽  
Eder Govea ◽  
Raquel Espinosa-Castañeda ◽  
Enrique Gallegos

Technological growth in recent years has led to the development of virtual reality (VR) systems able to immerse the user in a three-dimensional (3D) virtual environment where the user can interact in real time with virtual objects. This interaction is mainly based on visualizing the virtual environment and its objects. However, with the recent advent of haptic systems, interaction with the virtual world has been extended to feeling, touching, and manipulating virtual objects. Virtual reality has been used successfully to develop applications in scientific areas ranging from basic sciences and social science to education and entertainment, and the use of haptics has grown over the last decade in domains from science and engineering to art and entertainment. Despite many developments, there is still relatively little knowledge about the confluence of software, enabling hardware, and visual and haptic representations needed to create an immersive sensory environment that conveys information about a particular subject domain. This paper presents the state of the art of the authors' recent research on virtual reality and haptic technologies. The aim is to demonstrate the potential of these technologies for building usable analysis and simulation systems in different areas of knowledge. The development of three systems, in the areas of engineering, medicine, and art, is presented. In engineering, a system for the planning, evaluation, and training of assembly and manufacturing tasks has been developed. The system, named HAMS (Haptic Assembly and Manufacturing System), can simulate assembly tasks of complex components with force feedback provided by the haptic device. In medicine, a surgical simulator for planning and training orthognathic surgeries has been developed.
The system, named VOSS (Virtual Osteotomy Simulator System), allows virtual osteotomies to be performed with force feedback. Finally, in art, an interactive cinema system for blind people has been developed; it plays a 3D virtual movie that the blind user can listen to and touch by means of the haptic device. The development of these applications and the results obtained are presented and discussed in this paper.

