Using handheld user interface and direct manipulation for architectural modeling in immersive virtual reality: An exploratory study

Author(s):  
Hasan Tastan ◽  
Cetin Tuker ◽  
Togan Tong
2019 ◽  
Vol 9 (22) ◽  
pp. 4861 ◽  
Author(s):  
Hind Kharoub ◽  
Mohammed Lataifeh ◽  
Naveed Ahmed

This work presents a novel 3D user interface for an immersive virtual reality desktop, together with an empirical analysis of the proposed interface across three interaction modes. The dual-layer 3D user interface lets the user interact with multiple screens arranged within a curved 360-degree effective field of view. A downward gaze raises the interaction layer, which supports several traditional desktop tasks. The interface is analyzed with three interaction modes: point-and-click, controller-based direct manipulation, and a gesture-based user interface. A comprehensive user study, following a mixed-methods approach, assesses the usability and user experience of all three interaction modes. Each mode is analyzed quantitatively and qualitatively for simple and compound tasks in both standing and seated positions. The mixed-methods design allows us to collect, evaluate, and validate evidence for the viability of the new 3D user interface, and the results inform conclusions about the suitability of each interaction mode for a variety of tasks in an immersive virtual reality 3D desktop environment.
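The two mechanisms described above, a gaze-triggered interaction layer and per-mode input routing, can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation; the pitch threshold, event fields, and handler strings are all assumptions made for the example.

```python
from enum import Enum, auto

class InteractionMode(Enum):
    POINT_AND_CLICK = auto()       # ray-cast selection with a pointer
    DIRECT_MANIPULATION = auto()   # controller-based grabbing
    GESTURE = auto()               # free-hand gesture input

# Assumed value: looking down past -30 degrees of pitch raises the layer.
GAZE_PITCH_THRESHOLD_DEG = -30.0

def interaction_layer_raised(gaze_pitch_deg: float) -> bool:
    """Return True when the user's downward gaze should raise the
    interaction layer that hosts the traditional desktop tasks."""
    return gaze_pitch_deg <= GAZE_PITCH_THRESHOLD_DEG

def handle_input(mode: InteractionMode, event: dict) -> str:
    """Route one raw input event to the handler for the active mode.
    The event keys ('target', 'gesture') are hypothetical."""
    if mode is InteractionMode.POINT_AND_CLICK:
        return f"ray-cast select at {event['target']}"
    if mode is InteractionMode.DIRECT_MANIPULATION:
        return f"grab {event['target']} with controller"
    return f"gesture '{event['gesture']}' on {event['target']}"
```

Keeping the mode as a single enum value makes it straightforward to run the same task set under each of the three modes, as the study's comparative design requires.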


2002 ◽  
Vol 11 (2) ◽  
pp. 119-133 ◽  
Author(s):  
Nicholas R. Hedley ◽  
Mark Billinghurst ◽  
Lori Postner ◽  
Richard May ◽  
Hirokazu Kato

In this paper, we describe two explorations in the use of hybrid user interfaces for collaborative geographic data visualization. Our first interface combines three technologies: augmented reality (AR), immersive virtual reality (VR), and computer vision-based hand and object tracking. Wearing a lightweight display with an attached camera, users can look at a real map and see three-dimensional virtual terrain models overlaid on the map. From this AR interface, they can fly in and experience the model immersively, or use free hand gestures or physical markers to change the data representation. Building on this work, our second interface explores alternative interface techniques, including a zoomable user interface, paddle interactions, and pen annotations. We describe the system hardware and software and the implications for GIS and spatial science applications.
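The core of the AR interface above is anchoring a virtual terrain model to a tracked real map: the vision system estimates a pose for the map (or a physical marker on it), and the terrain's vertices are transformed by that pose each frame. The following is a minimal geometric sketch of that step, assuming a simple yaw-plus-translation pose; it is not the paper's tracking pipeline.

```python
import math

def marker_pose(yaw_deg: float, tx: float, ty: float, tz: float):
    """Build a 4x4 homogeneous pose matrix (rotation about the vertical
    axis plus translation), standing in for the pose a computer-vision
    tracker would estimate for the map/marker each frame."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [
        [c, -s, 0.0, tx],
        [s,  c, 0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform(pose, point):
    """Apply the pose to a terrain vertex in model space, yielding its
    position in the tracked map's frame for overlay rendering."""
    x, y, z = point
    return tuple(
        pose[i][0] * x + pose[i][1] * y + pose[i][2] * z + pose[i][3]
        for i in range(3)
    )
```

Because the overlay is just a per-frame transform of the same model, "flying in" to experience the terrain immersively amounts to switching which pose drives the camera rather than rebuilding the scene.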

