A Comparison of Gesture and Controller-based User Interfaces for 3D Design Reviews in Virtual Reality

2022 ◽  
Author(s):  
Taneli Nyyssönen ◽  
Seppo Helle ◽  
Teijo Lehtonen ◽  
Jouni Smed


2021 ◽
Vol 5 (EICS) ◽  
pp. 1-26
Author(s):  
Carlos Bermejo ◽  
Lik Hang Lee ◽  
Paul Chojecki ◽  
David Przewozny ◽  
Pan Hui

The continued advancement of user interfaces into the era of virtual reality requires a better understanding of how users interact with 3D buttons in mid-air. Although virtual reality offers high expressiveness and can simulate everyday objects from the physical environment, the fundamental issue of designing virtual buttons has been surprisingly neglected. To this end, this paper presents four variants of virtual buttons, spanning two design dimensions: key representation and multi-modal cues (audio, visual, haptic). We conduct two multi-metric assessments to evaluate the four virtual variants against physical-button baselines. Our results indicate that 3D-lookalike buttons support more refined and subtle mid-air interactions (i.e., smaller press depth) when haptic cues are available, while users of 2D-lookalike buttons counterintuitively achieve better keystroke performance than with their 3D counterparts. We summarize the findings and, accordingly, suggest design choices for virtual reality buttons along the two proposed design dimensions.
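The press-depth metric discussed above can be illustrated with a minimal sketch: a mid-air button fires a keystroke once the fingertip travels past an activation depth, and press depth is the deepest penetration during a press. The class name, threshold value, and reset logic here are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of mid-air button activation by press depth.
# The activation threshold and class design are illustrative assumptions.

class MidAirButton:
    def __init__(self, surface_z: float, activation_depth: float = 0.01):
        self.surface_z = surface_z                # button surface along the press axis (m)
        self.activation_depth = activation_depth  # penetration that triggers a keystroke (m)
        self.max_depth = 0.0                      # deepest penetration during this press (m)
        self.pressed = False

    def update(self, fingertip_z: float) -> bool:
        """Feed a fingertip position; return True on the frame the press fires."""
        depth = self.surface_z - fingertip_z      # penetration past the surface
        if depth <= 0:                            # finger above the surface: reset the press
            self.max_depth = 0.0
            self.pressed = False
            return False
        self.max_depth = max(self.max_depth, depth)
        if not self.pressed and depth >= self.activation_depth:
            self.pressed = True
            return True
        return False

btn = MidAirButton(surface_z=1.0, activation_depth=0.01)
# Fingertip approaches, presses through the surface, then withdraws:
events = [btn.update(z) for z in (1.02, 1.005, 0.985, 0.98, 1.02)]
```

Under this sketch, a smaller recorded `max_depth` per keystroke corresponds to the "more refined and subtle" interaction the abstract attributes to 3D-lookalike buttons with haptic cues.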


Author(s):  
Randall Spain ◽  
Jason Saville ◽  
Barry Lui ◽  
Donia Slack ◽  
Edward Hill ◽  
...  

Because advances in broadband capabilities will soon allow first responders to access and use many forms of data when responding to emergencies, it is becoming critically important to design heads-up displays that present first responders with information in a manner that does not induce extraneous mental workload or cause undue interaction errors. Virtual reality offers a unique medium for envisioning and testing user interface concepts in a realistic and controlled environment. In this paper, we describe a virtual reality-based emergency response scenario designed to support user experience research on the efficacy of intelligent user interfaces for firefighters. We describe the results of a usability test that captured firefighters' feedback and reactions to the VR scenario and to the prototype intelligent user interface that presented them with task-critical information through the VR headset. The paper concludes with lessons learned from our development process and a discussion of plans for future research.


2021 ◽  
pp. 59-80
Author(s):  
Benjamin Knoke ◽  
Moritz Quandt ◽  
Michael Freitag ◽  
Klaus-Dieter Thoben

The purpose of this research is to aggregate and discuss the validity of challenges and design guidelines for industrial Virtual Reality (VR) training applications. Although VR has seen significant advancements in the last 20 years, the technology still faces multiple research challenges. The challenges facing industrial VR applications stem from limited technological maturity and the need to gain technology acceptance among industrial stakeholders. Technology acceptance is closely connected with the consideration of individual user requirements for user interfaces in virtual environments. This paper analyses the current state of the art in industrial VR applications and provides a structured overview of the existing challenges and of applicable guidelines for user interface design, such as ISO 9241-110. The validity of the identified challenges and guidelines is discussed in the context of an industrial training scenario on electrical safety during maintenance tasks.


2021 ◽  
Vol 58 (3) ◽  
pp. 137-142
Author(s):  
A.O. Dauitbayeva ◽  
A.A. Myrzamuratova ◽  
A.B. Bexeitova ◽  
...  

This article is devoted to issues of visualization and information processing, in particular improving the visualization of three-dimensional objects using augmented reality and virtual reality technologies. The spread of virtual reality has led to the introduction of the new term "augmented reality" into scientific circulation. Whereas current user interface technologies focus mainly on the interaction between a person and a computer, augmented reality uses computer technology to improve the interface between a person and the surrounding real world. In the synthesized image, computer graphics convey depth through monocular observation cues: the apparent volume of the image, the spatial arrangement of objects in linear perspective, the occlusion of one object by another, and changes in the character of shadows and tones across the image field. Prior observational experience is important for the perception of volume and space, as the user mentally "completes" the volumetric structure of the observed representation. Thus, the visualization offered by augmented reality, situated in a real environment familiar to the user, contributes to a better perception of three-dimensional objects.


Author(s):  
Tushar H. Dani ◽  
Rajit Gadh

Abstract
Despite advances in Computer-Aided Design (CAD) and the evolution of graphical user interfaces, rapid creation, editing, and visualization of three-dimensional (3D) shapes remains a tedious task. Though the availability of Virtual Reality (VR)-based systems allows enhanced three-dimensional interaction and visualization, the use of VR for ab initio shape design, as opposed to 'importing' models from existing CAD systems, is a relatively new area of research. Of interest are computer-human interaction issues and the design and geometric tools for shape modeling in a Virtual Environment (VE). The focus of this paper is on the latter, i.e., on defining the geometric tools required for a VR-CAD system and on describing a framework that meets those requirements. This framework, the Virtual Design Software Framework (VDSF), consists of the interaction and design tools and an underlying geometric engine that provides the representations and algorithms required by these tools. The geometric engine, called the Virtual Modeler, uses a graph-based representation (Shape-Graph) for modeling the shapes created by the user. The Shape-Graph facilitates interactive editing by localizing the effect of editing operations, and in addition provides constraint-based design and editing mechanisms that are useful in a 3D interactive virtual environment. The paper concludes with a description of the prototype system, called the Virtual Design Studio (VDS), that is currently being implemented.
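The localized-editing idea behind the Shape-Graph can be sketched in a few lines: if shape features are nodes and adjacency relations are edges, an edit needs to re-evaluate only the edited node and its graph neighbors. This is a loose illustration of the principle only; the data layout, feature names, and method names are assumptions, not the VDSF representation.

```python
# Illustrative sketch of a graph-based shape representation in which an
# edit only touches a node and its neighbors. All names are hypothetical.

class ShapeGraph:
    def __init__(self):
        self.nodes = {}   # feature id -> parameters (e.g. a face offset)
        self.adj = {}     # feature id -> set of adjacent feature ids

    def add_feature(self, fid, params):
        self.nodes[fid] = dict(params)
        self.adj.setdefault(fid, set())

    def connect(self, a, b):
        self.adj[a].add(b)
        self.adj[b].add(a)

    def edit(self, fid, **changes):
        """Apply an edit; return the set of features to re-evaluate,
        which is just the edited node plus its graph neighbors."""
        self.nodes[fid].update(changes)
        return {fid} | self.adj[fid]

g = ShapeGraph()
for fid in ("base", "boss", "hole", "fillet"):
    g.add_feature(fid, {"offset": 0.0})
g.connect("base", "boss")
g.connect("boss", "hole")
g.connect("boss", "fillet")

dirty = g.edit("hole", offset=2.5)   # only "hole" and its neighbor "boss" are affected
```

In an interactive VE this locality matters because re-evaluating the whole model on every drag of a 3D widget would break interactive frame rates.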


2018 ◽  
pp. 119-137
Author(s):  
Alan Radley

A new philosophy of user interface design is described. Named the “Lookable User Interface,” or LUI, the approach is based on the concept of a Personal Reality (PR) system. Here the computer adapts to the user's worldview in a personalized way, and according to the specific requirements, behaviors, and perceptive skills of the individual. Typically, a PR system creates and adjusts (in real-time) 3D perspective view(s) of a data-set, including (potentially) the field of view of a scene and the apparent distance and scale of objects, whilst also creating an aesthetic “eye-friendly” context for computing operations. A Lookable User Interface (LUI) affords the maximum degree of visual accessibility to digital content. The authors examine the results of testing a Lookable User Interface. Spectasia is one example of a Personal Virtual Reality (PVR) that can be used to visualize links between universals and particulars within digital worlds.


2010 ◽  
pp. 180-193 ◽  
Author(s):  
F. Steinicke ◽  
G. Bruder ◽  
J. Jerald ◽  
H. Frenz

In recent years, virtual environments (VEs) have become increasingly popular and widespread due to the requirements of numerous application areas, in particular the 3D city visualization domain. Virtual reality (VR) systems, which make use of tracking technologies and stereoscopic projections of three-dimensional synthetic worlds, support better exploration of complex datasets. However, due to the limited interaction space usually provided by the range of the tracking sensors, users can explore only a portion of the virtual environment (VE). Redirected walking allows users to walk through large-scale immersive virtual environments (IVEs), such as virtual city models, while physically remaining within a reasonably small workspace, by intentionally injecting scene motion into the IVE. With redirected walking, users are guided along physical paths that may differ from the paths they perceive in the virtual world. The authors have conducted experiments to quantify how much humans can be redirected without noticing. In this chapter they present the results of this study and its implications for virtual locomotion user interfaces that allow users to view arbitrary real-world locations before they actually travel there in a natural environment.
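The scene-motion injection described above is commonly expressed as gains applied to the user's tracked motion, e.g. a rotation gain that scales virtual turning relative to physical turning, or a curvature gain that bends a virtually straight walk into a real-world arc. The sketch below illustrates both; the gain values are arbitrary assumptions, not the detection thresholds measured in the study.

```python
import math

# Minimal sketch of redirected walking gains. Gain values are illustrative
# assumptions, not thresholds reported by the experiments.

def redirect_rotation(physical_yaw_delta: float, gain: float = 1.2) -> float:
    """Rotation gain: return the virtual yaw change (radians) shown to the
    user for a given physical yaw change."""
    return physical_yaw_delta * gain

def curvature_gain_step(x, y, heading, step_len=0.7, curvature=0.05):
    """Advance one physical step while injecting a small heading offset,
    so a virtually straight walk follows a real-world arc of radius
    1/curvature (here 20 m)."""
    heading += step_len * curvature        # injected, ideally imperceptible, turn
    x += step_len * math.cos(heading)
    y += step_len * math.sin(heading)
    return x, y, heading

# Walking "straight" for 20 physical steps traces a gentle real-world arc,
# keeping the user inside a bounded tracked workspace:
x = y = heading = 0.0
for _ in range(20):
    x, y, heading = curvature_gain_step(x, y, heading)
```

The experiments summarized in the chapter ask, in effect, how large such gains can be before users notice the mismatch between physical and virtual motion.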

