Digital Map Table VR: Bringing an Interactive System to Virtual Reality

Author(s):  
Gunnar Strentzsch ◽  
Florian van de Camp ◽  
Rainer Stiefelhagen

Author(s):  
Hao Song ◽  
Fangyuan Chen ◽  
Qingjin Peng ◽  
Jian Zhang ◽  
Peihua Gu

User experience has a significant impact on effective product design and improvement, especially for personalized products that must meet users' individual needs. The development of personalized products requires user experience data to evaluate product function and performance. Existing methods, such as Internet-based interactive platforms and direct market user surveys, cannot give users a full experience of product features. This research proposes a user interactive system based on virtual reality technologies to provide users a close-to-real experience in the development of open-architecture products. The system provides users an interface built on a virtual environment in which they can review a product design by virtually operating and evaluating it. The system records users' operations and feedback for designers to improve the product. Food trucks designed using the open-architecture concept are used as an application to verify the proposed method, and a user survey is conducted to examine the system's effectiveness.


Information ◽  
2019 ◽  
Vol 10 (5) ◽  
pp. 170
Author(s):  
Jian Lv ◽  
Xiaoping Xu ◽  
Ning Ding

To address the problem of objectively obtaining the threshold of a user's cognitive load in a virtual reality interactive system, a method for quantifying user cognitive load based on an eye movement experiment is proposed. Eye movement data were collected during the virtual reality interaction process using an eye tracker. Taking the number of fixation points, the average fixation duration, the average saccade length, and the number of fixation points before the first mouse click as independent variables, and the number of backward-looking (regression) events and the user's cognitive load value as dependent variables, a cognitive load evaluation model was established based on a probabilistic neural network. The model was validated using eye movement data and subjective cognitive load ratings. The results show that the absolute error and relative mean square error were 6.52%–16.01% and 6.64%–23.21%, respectively, indicating that the model is feasible.
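As a rough illustration of the modeling step, the sketch below approximates a PNN-style estimator with its regression counterpart, a generalized regression neural network (a Gaussian-kernel weighted average of training targets). The feature names mirror the abstract, but the data values, smoothing parameter, and standardization step are invented for illustration and are not the authors' implementation.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=1.0):
    """Estimate cognitive load for one sample x as a Gaussian-kernel
    weighted average of training targets (GRNN, the regression
    counterpart of a probabilistic neural network)."""
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # kernel weights
    return float(w @ y_train / w.sum())          # weighted mean

# Toy rows: [fixation count, avg fixation duration (s),
#            avg saccade length (deg), fixations before first click]
X = np.array([[12, 0.25, 4.1, 3],
              [30, 0.40, 2.0, 7],
              [18, 0.30, 3.2, 4]], dtype=float)
y = np.array([0.2, 0.8, 0.4])                    # subjective load scores

# Standardize features so no single measurement scale dominates.
mu, sd = X.mean(axis=0), X.std(axis=0)
query = (np.array([20.0, 0.32, 3.0, 5.0]) - mu) / sd
load = grnn_predict((X - mu) / sd, y, query)
print(round(load, 3))
```

In this toy setup the query is closest to the third training row, so the prediction falls near its load score of 0.4; the smoothing parameter sigma controls how sharply nearby samples dominate.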


2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Hyungki Kim ◽  
Yuna Kang ◽  
Soonhung Han

Three-dimensional city models are becoming a valuable resource because of their close geospatial, geometrical, and visual relationship with the physical world. However, ground-oriented applications in virtual reality, 3D navigation, and civil engineering require a novel modeling approach, because existing large-scale 3D city modeling methods do not provide rich visual information at ground level. This paper proposes a new framework for generating 3D city models that satisfy both the visual and the physical requirements of ground-oriented virtual reality applications. To ensure its usability, the framework must be cost-effective and allow for automated creation. To achieve these goals, we leverage a mobile mapping system that automatically gathers high-resolution images along with supplementary sensor information such as the position and direction of the captured images. To resolve problems stemming from sensor noise and occlusions, we develop a fusion technique that incorporates digital map data. This paper describes the major processes of the overall framework and the proposed techniques for each step and presents experimental results from a comparison with an existing 3D city model.
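One concrete step such a fusion typically requires is relating digital-map geometry to the captured imagery, for example projecting a building-footprint corner given in world coordinates into an image using the logged camera position and direction. The sketch below shows a standard pinhole-camera projection; the intrinsics, pose, and corner coordinates are invented for illustration and are not the paper's actual calibration or data.

```python
import numpy as np

def project(point_w, cam_pos, R, K):
    """Pinhole projection of a world point into pixel coordinates,
    given camera position cam_pos, rotation R, and intrinsics K."""
    p_cam = R @ (point_w - cam_pos)          # world -> camera frame
    assert p_cam[2] > 0, "point must lie in front of the camera"
    uvw = K @ p_cam                          # camera frame -> image plane
    return uvw[:2] / uvw[2]                  # perspective divide

K = np.array([[800.0,   0.0, 320.0],         # fx, principal point cx
              [  0.0, 800.0, 240.0],         # fy, principal point cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                                # camera looks along +Z
cam = np.array([0.0, 0.0, 0.0])
corner = np.array([2.0, 1.0, 10.0])          # map corner 10 m ahead

pix = project(corner, cam, R, K)
print(pix)                                   # pixel (u, v) of the corner
```

Comparing such projected map features against what is actually visible in the image is one simple way to flag occlusions or noisy pose estimates.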


2019 ◽  
Vol 17 (2) ◽  
pp. 220-235 ◽  
Author(s):  
Bess Krietemeyer ◽  
Amber Bartosh ◽  
Lorne Covington

This article presents the ongoing development and testing of a “shared realities” computational workflow to support iterative user-centered design with an interactive system. The broader aim is to address the challenges associated with observing and recording user interactions within the context of use for improving the performance of an interactive system. A museum installation is used as an initial test bed to validate the following hypothesis: by integrating three-dimensional depth sensing and virtual reality for interaction design and user behavior observations, the shared realities workflow provides an iterative feedback loop that allows remote observation and recording for faster and more effective decision-making. The methods presented focus on the software development for gestural interaction and user point cloud observations, as well as the integration of virtual reality tools for iterative design of the interface and system performance assessment. Experimental testing demonstrates the viability of the shared realities workflow for observing and recording user interaction behaviors and evaluating system performance. Contributions to computational design, technical challenges, and ethical considerations are discussed, as well as directions for future work.
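The feedback loop hinges on recording observations so that a remote designer can replay them later. The minimal sketch below captures only that idea, timestamped frames of downsampled user point-cloud data plus any gesture recognized at capture time; the frame schema is an assumption for illustration, not the authors' actual format.

```python
import time

def log_frame(session, points, gesture=None):
    """Append one timestamped observation frame: a downsampled point
    list and any gesture recognized at capture time."""
    session.append({"t": time.time(), "points": points, "gesture": gesture})

session = []                                            # one interaction session
log_frame(session, [[0.1, 1.2, 2.0], [0.2, 1.1, 2.1]], gesture="swipe")
log_frame(session, [[0.1, 1.3, 2.0]])                   # no gesture this frame
print(len(session), session[0]["gesture"])              # 2 swipe
```

Serializing such a session (e.g. to JSON) would let a designer step through the recorded frames in VR without being physically present at the installation.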


2019 ◽  
Vol 9 (21) ◽  
pp. 4477 ◽  
Author(s):  
Hai Chien Pham ◽  
Nhu-Ngoc Dao ◽  
Sungrae Cho ◽  
Phong Thanh Nguyen ◽  
Anh-Tuan Pham-Hang

Hazard investigation education plays a crucial role in equipping students with adequate knowledge and skills to avoid or eliminate construction hazards at workplaces. With the emergence of various visualization technologies, virtual photoreality as well as 3D virtual reality have been adopted and proved advantageous in various educational disciplines. Despite the significant benefits of providing an engaging and immersive learning environment for construction education, recent research has also pointed out that virtual photoreality lacks 3D object anatomization tools to support learning, while 3D virtual reality cannot provide a real-world environment. In recent years, research efforts have studied these virtual reality applications separately, and there is a lack of research integrating the two technologies to overcome their limitations and maximize their advantages for enhancing learning outcomes. In this regard, the paper develops a construction hazard investigation system leveraging object anatomization on an Interactive Augmented Photoreality platform (iAPR). The proposed iAPR system integrates virtual photoreality with 3D virtual reality and consists of three key learning modules, namely the Hazard Understanding Module (HUM), the Hazard Recognition Module (HRM), and the Safety Performance Module (SPM), which adopt the revised Bloom’s taxonomy. A prototype is developed and evaluated objectively through interactive system trials with educators, construction professionals, and learners. The findings demonstrate that the iAPR platform offers effective pedagogic methods to improve learners’ construction hazard investigation knowledge and skills, thereby improving safety performance.
