Enhancing User Immersion and Virtual Presence in Interactive Multiuser Virtual Environments through the Development and Integration of a Gesture-Centric Natural User Interface Developed from Existing Virtual Reality Technologies

Author(s): Chika Emma-Ogbangwo, Nick Cope, Reinhold Behringer, Marc Fabri
2017, Vol 2017 (3), pp. 60-63
Author(s): Xin Tong, Serkan Pekcetin, Diane Gromala, Frederico Machuca

2021
Author(s): Yi Gao, Cheng Chang, Xiaxia Yu, Pengjin Pang, Nian Xiong, et al.

Abstract: Volume rendering produces informative two-dimensional (2D) images from a three-dimensional (3D) volume. It highlights the region of interest and facilitates a good comprehension of the entire data set. However, volume rendering faces a few challenges. First, a high-dimensional transfer function is usually required to differentiate the target from neighboring objects with subtle variance. Unfortunately, designing such a transfer function is a strenuous trial-and-error process. Second, manipulating and visualizing a 3D volume with a traditional 2D input/output device suffers from dimensional limitations. To address these challenges, we design NUI-VR², a natural user interface-enabled volume rendering system in the virtual reality space. NUI-VR² marries volume rendering and interactive image segmentation. It transforms the original volume into a probability map with image segmentation. A simple linear transfer function then highlights the target well in the probability map. More importantly, we set the entire image segmentation and volume rendering pipeline in an immersive virtual reality environment with a natural user interface. NUI-VR² eliminates the dimensional limitations in manipulating and perceiving 3D volumes and dramatically improves the user experience.
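The key simplification described in this abstract is that, once segmentation turns the raw volume into a per-voxel probability map, a plain linear ramp is enough to map probability to opacity and color. Below is a minimal sketch of that idea in Python with NumPy; the array shape, threshold-free linear ramp, and color values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def linear_transfer_function(prob_map, color=(1.0, 0.2, 0.2), max_opacity=0.8):
    """Map a per-voxel probability map (values in [0, 1]) to an RGBA volume.

    Opacity rises linearly with the probability that a voxel belongs to the
    target, so no hand-tuned high-dimensional transfer function is needed.
    """
    prob = np.clip(prob_map, 0.0, 1.0)
    rgba = np.empty(prob.shape + (4,), dtype=np.float32)
    rgba[..., 0] = color[0]            # constant target color (red channel)
    rgba[..., 1] = color[1]
    rgba[..., 2] = color[2]
    rgba[..., 3] = max_opacity * prob  # linear ramp: opacity proportional to probability
    return rgba

# Illustrative usage with a synthetic 64^3 probability map.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prob_map = rng.random((64, 64, 64)).astype(np.float32)
    rgba_volume = linear_transfer_function(prob_map)
    print(rgba_volume.shape)  # (64, 64, 64, 4)
```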


Author(s): Daniele Regazzoni, Caterina Rizzi, Andrea Vitali

The Natural User Interface (NUI), which permits simple and consistent user interaction, represents a meaningful challenge for developing virtual/augmented reality applications. This paper presents a set of guidelines for designing an optimal NUI as well as a software framework, named FrameworkVR, which encapsulates the rules of the presented guidelines. FrameworkVR allows developing NUIs for VR/AR applications based on the Oculus Rift, the Leap Motion device, and the VTK open-source library. An example of a VR application for prosthesis design developed using FrameworkVR is also described. Tests have been carried out to validate the approach and the designed NUI, and the results reached so far are presented and discussed.
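FrameworkVR itself is not reproduced here; the following is only a minimal, hypothetical sketch of the general idea of encapsulating guideline rules as reusable gesture-to-action bindings. All names (GestureBinding, NuiMapper, the "pinch" label, the confidence threshold) are invented for illustration and are not the FrameworkVR API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

# Hypothetical sketch: a guideline such as "one gesture maps to exactly one
# action, with a confidence threshold to avoid accidental triggers" can be
# captured once in a small binding table and reused across applications.

@dataclass
class GestureBinding:
    action: Callable[[], None]      # what to do when the gesture is recognized
    min_confidence: float = 0.8     # guideline: ignore low-confidence detections

@dataclass
class NuiMapper:
    bindings: Dict[str, GestureBinding] = field(default_factory=dict)

    def bind(self, gesture_name: str, action: Callable[[], None],
             min_confidence: float = 0.8) -> None:
        self.bindings[gesture_name] = GestureBinding(action, min_confidence)

    def on_gesture(self, gesture_name: str, confidence: float) -> None:
        binding = self.bindings.get(gesture_name)
        if binding and confidence >= binding.min_confidence:
            binding.action()

# Illustrative usage: a "pinch" gesture scales the prosthesis model.
if __name__ == "__main__":
    mapper = NuiMapper()
    mapper.bind("pinch", lambda: print("scale model"))
    mapper.on_gesture("pinch", confidence=0.93)   # triggers the action
    mapper.on_gesture("pinch", confidence=0.40)   # ignored (below threshold)
```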


2015, Vol 3 (7), pp. 147-164
Author(s): Kenneth A. Ritter, Christoph W. Borst, Terrence L. Chambers

As interest in Virtual Reality (VR) increases, so does the number of software toolkits available for various VR applications. Given that more games are being made with the Unity game engine than with any other game technology, several of these toolkits are developed to be imported directly into Unity. Unity developers need a feature and interaction comparison of the toolkits to select the one best suited to a specific application. This paper presents an overview and comparison of several virtual reality toolkits available to developers using the Unity game engine. getReal3D, MiddleVR, and the Reality-based User Interface System (RUIS) are analysed for VR interaction and display on multi-projection immersive environments such as Cave Automatic Virtual Environments (CAVEs). MiddleVR was found to be the highest-performing and most versatile toolkit for CAVE display and interaction. However, taking cost into account, RUIS is the clear winner, as it is available for free under the Lesser General Public License (LGPL) Version 3.


2016, Vol 3 (1)
Author(s): Stefan Stavrev

Education and self-improvement are key features of human behavior. However, learning in the physical world is not always desirable or achievable, which is why simulators came to be. In some domains, purely virtual simulators can be created instead of physical ones. In this research we present a novel learning environment that uses a natural user interface. We humans are not designed to operate and manipulate objects via a keyboard, mouse, or controller. Our natural way of interaction and communication works through our actuators (hands and feet) and our sensors (hearing, vision, touch, smell, and taste). That is why it makes more sense to use sensors that can track our skeletal movements, estimate our pose, and interpret our gestures. After acquiring and processing this natural input, a system can analyze those gestures and translate them into movement signals.
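The last step, turning tracked skeletal data into movement signals, can be as simple as differentiating joint positions over time and thresholding the result. The sketch below, in Python with NumPy, assumes a stream of per-frame 3D joint positions; the frame rate, velocity threshold, and gesture labels are illustrative assumptions rather than values from the paper.

```python
import numpy as np

FRAME_RATE_HZ = 30.0          # assumed skeletal tracking rate
SWIPE_SPEED_M_S = 1.2         # assumed velocity threshold for a deliberate gesture

def hand_velocity(prev_pos, curr_pos, dt=1.0 / FRAME_RATE_HZ):
    """Finite-difference velocity (m/s) of a tracked joint between two frames."""
    return (np.asarray(curr_pos) - np.asarray(prev_pos)) / dt

def interpret_gesture(prev_pos, curr_pos):
    """Translate raw joint motion into a coarse movement signal."""
    v = hand_velocity(prev_pos, curr_pos)
    if np.linalg.norm(v) < SWIPE_SPEED_M_S:
        return "idle"
    # The dominant axis of motion decides which signal is emitted.
    axis = int(np.argmax(np.abs(v)))
    names = {0: ("swipe_left", "swipe_right"),
             1: ("swipe_down", "swipe_up"),
             2: ("push", "pull")}[axis]
    return names[1] if v[axis] > 0 else names[0]

if __name__ == "__main__":
    # Hand moves 6 cm to the right in one frame (~1.8 m/s): a rightward swipe.
    print(interpret_gesture([0.00, 1.0, 0.5], [0.06, 1.0, 0.5]))  # swipe_right
```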


Author(s): K-S Hsu, M-Y Cheng, M-G Her

For most virtual reality systems, one of the major aims is to provide a vivid interaction platform between the human operator and the haptic devices. Through the user interface, a skilful operator can control the haptic devices to accomplish relatively complicated jobs in real time. Generally, the main components of a virtual reality system include dynamic simulations, haptic devices, and the user interface, which is composed of virtual environments and visual equipment. This study focuses on developing a virtual tennis entertainment system with haptic behaviour. A parallel-type robot and a serial-type robot are employed as the haptic device handlers, controlled directly by the operator's arm through the user interface. The operator can sense changes in the virtual environment provided by the dynamic simulations. In addition, the operator can ‘see’ the environment change in real time on the screen during operation. A virtual spring model and a virtual damper model were constructed to simulate the process of playing tennis. Experimental results verify the feasibility of the proposed virtual tennis entertainment system.
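The virtual spring and damper mentioned in the abstract correspond to the standard impedance model used for haptic force rendering, F = -k·x - c·v, where x is the racket's penetration into the virtual ball and v its velocity along the contact normal. Below is a minimal sketch of that force law in Python; the stiffness and damping constants are illustrative assumptions, not values from the study.

```python
import numpy as np

# Assumed (illustrative) impedance parameters for ball-racket contact.
STIFFNESS_N_PER_M = 800.0   # virtual spring constant k
DAMPING_NS_PER_M = 5.0      # virtual damper coefficient c

def contact_force(penetration, relative_velocity):
    """Spring-damper haptic force: F = -k*x - c*v.

    `penetration` is how far the racket has pushed into the virtual ball (m);
    `relative_velocity` is the racket velocity along the contact normal (m/s).
    The force is zero when there is no contact (penetration <= 0).
    """
    x = np.asarray(penetration, dtype=float)
    v = np.asarray(relative_velocity, dtype=float)
    force = -STIFFNESS_N_PER_M * x - DAMPING_NS_PER_M * v
    return np.where(x > 0.0, force, 0.0)

if __name__ == "__main__":
    # 5 mm of penetration while the racket still moves 0.3 m/s into the ball.
    print(contact_force(0.005, 0.3))   # -5.5 N, pushing back against the arm
```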


2020, Vol 11 (1), pp. 99-106
Author(s): Marián Hudák, Štefan Korečko, Branislav Sobota

Abstract: Recent advances in the field of web technologies, including increasing support for virtual reality hardware, have allowed for shared virtual environments reachable by simply entering a URL in a browser. One contemporary solution that provides such a shared virtual reality is LIRKIS Global Collaborative Virtual Environments (LIRKIS G-CVE). It is a web-based software system built on top of the A-Frame and Networked-Aframe frameworks. This paper describes LIRKIS G-CVE and introduces two of its original components. The first is the Smart-Client Interface, which turns smart devices, such as smartphones and tablets, into input devices. The advantage of this component over the standard way of user input is demonstrated by a series of experiments. The second component is the Enhanced Client Access layer, which provides access to the positions and orientations of clients that share a virtual environment. The layer also stores a history of connected clients and provides limited control over them. The paper also outlines an ongoing experiment aimed at evaluating LIRKIS G-CVE in the area of virtual prototype testing.
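As a rough illustration of what a client-access layer like the one described above has to track, here is a hypothetical sketch of a pose registry with connection history. LIRKIS G-CVE itself is a web-based (A-Frame / Networked-Aframe) system, so this Python data model is language-swapped and all names (ClientPose, ClientRegistry) are invented for illustration, not the LIRKIS G-CVE API.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ClientPose:
    position: Vec3
    orientation: Vec3           # e.g. Euler angles in degrees

@dataclass
class ClientRegistry:
    current: Dict[str, ClientPose] = field(default_factory=dict)
    history: List[str] = field(default_factory=list)   # every client that ever connected

    def connect(self, client_id: str) -> None:
        self.history.append(client_id)
        self.current[client_id] = ClientPose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def update_pose(self, client_id: str, position: Vec3, orientation: Vec3) -> None:
        self.current[client_id] = ClientPose(position, orientation)

    def disconnect(self, client_id: str) -> None:
        self.current.pop(client_id, None)

if __name__ == "__main__":
    registry = ClientRegistry()
    registry.connect("client-42")
    registry.update_pose("client-42", (1.0, 1.6, -2.0), (0.0, 90.0, 0.0))
    print(registry.current["client-42"], registry.history)
```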


Author(s): Sarah Beadle, Randall Spain, Benjamin Goldberg, Mahdi Ebnali, Shannon Bailey, et al.

Virtual environments and immersive technologies are growing in popularity for human factors purposes. Whether it is training in a low-risk environment or using simulated environments for testing future automated vehicles, virtual environments show promise for the future of our field. The purpose of this session is to have current human factors practitioners and researchers demonstrate their immersive technologies. This is the eighth iteration of the “Me and My VE” interactive session. Presenters in this session will provide a brief introduction to their virtual reality, augmented reality, or virtual environment work before engaging with attendees in an interactive demonstration period. During this period, the presenters will each provide a multimedia display of their immersive technology and discuss their work and development efforts. The selected demonstrations cover issues of designing immersive interfaces, military and medical training, and using simulation to better understand complex tasks, and include a mix of government, industry, and academic work. Attendees will be virtually immersed in the technologies and research presented, allowing for interaction with the work being done in this field.

