Measuring user preferences in virtual reality (VR): 2D versus 3D urban geovisualizations of topographic data

2019 ◽ Vol 1 ◽ pp. 1-1
Author(s): Alexander J. Kent, Łukasz Halik

Abstract. Virtual reality (VR) is a display and control technology that provides an interactive computer-generated three-dimensional environment to a user, often via a Head Mounted Display (HMD). VR delivers an immediate and immersive sensory experience of simulated worlds (which may or may not resemble reality), particularly of environments that might otherwise be physically inaccessible to the user due to their location, scale, time or danger. Although the first VR systems began to emerge in the 1960s, their relevance to cartographic applications has only recently become an explicit focus of research. Moreover, the potential of VR technology to visualize topographic databases has yet to be explored by cartographers.

In this experiment, we designed a VR application of a fictitious city derived from state 1:10,000 topographic data (Polish Database of Topographic Objects, BDOT10k) to test user preferences for 2D or 3D urban geovisualizations. The app allows the user to switch between 2D and 3D representations of buildings in the simulation using a remote controller. This functionality enabled participants of the experiment to freely select 2D or 3D mode and for their preferences to be recorded and measured.

Our experiment involved two groups, one based in Poland and one in the UK, each comprising 30 participants (students enrolled on a Geography undergraduate course at each author’s institution). Participants performed spatial ability tests to help ensure consistency in the sample and each group was divided into two sub-groups. Participants in the first sub-group were each given a navigation task that required their movement across the simulated city from point A to point B in the shortest possible time. Those in the second sub-group were given the freedom to explore the simulated city without being given a specific navigational task. We then interviewed participants in order to understand their own perception of their experiences in using the app.

The results indicate the preferences of the two groups and sub-groups of participants. In particular, we establish whether users preferred the 2D mode for the navigational task and the 3D mode for free exploration. The findings suggest how producers of topographic datasets might develop the functionality of their products using VR.
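The abstract does not describe how preferences were quantified; as a rough illustration only, the following Python sketch shows one way per-participant preference could be measured from controller toggles, by logging timestamped 2D/3D switches and reporting the share of session time spent in each mode. The class and method names are hypothetical, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): log 2D/3D mode toggles
# per participant and summarise the share of session time spent in each mode.
import time

class ModePreferenceLogger:
    """Records timestamped 2D/3D toggles and derives dwell-time shares."""

    def __init__(self, participant_id, initial_mode="2D"):
        self.participant_id = participant_id
        self.events = [(time.monotonic(), initial_mode)]  # (timestamp, mode)

    def toggle(self):
        """Called whenever the participant presses the controller toggle button."""
        now = time.monotonic()
        new_mode = "3D" if self.events[-1][1] == "2D" else "2D"
        self.events.append((now, new_mode))

    def dwell_shares(self):
        """Fraction of elapsed session time spent in each mode so far."""
        end = time.monotonic()
        totals = {"2D": 0.0, "3D": 0.0}
        for (t0, mode), (t1, _) in zip(self.events, self.events[1:] + [(end, None)]):
            totals[mode] += t1 - t0
        elapsed = sum(totals.values()) or 1.0
        return {m: d / elapsed for m, d in totals.items()}
```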

2020 ◽ Vol 10 (5) ◽ pp. 1668
Author(s): Pavan Kumar B. N., Adithya Balasubramanyam, Ashok Kumar Patil, Chethana B., Young Ho Chai

Over the years, gaze input has been an accessible and in-demand human–computer interaction (HCI) modality for various applications. Research on gaze-based interactive applications has advanced considerably, as HCIs are no longer constrained to traditional input devices. In this paper, we propose a novel immersive eye-gaze-guided camera system (called GazeGuide) that can seamlessly control the movements of a camera mounted on an unmanned aerial vehicle (UAV) from the eye-gaze of a remote user. The video stream captured by the camera is fed into a head-mounted display (HMD) with a binocular eye tracker. The user’s eye-gaze is the sole input modality used to maneuver the camera. A user study considering static and moving targets of interest in three-dimensional (3D) space was conducted to evaluate the proposed framework. GazeGuide was compared with a state-of-the-art input modality, a remote controller. The qualitative and quantitative results showed that the proposed GazeGuide performed significantly better than the remote controller.
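The abstract does not give implementation details; purely as an illustration of the general idea, the sketch below maps normalised gaze coordinates from an eye tracker to pan/tilt rate commands for a camera gimbal, with a central dead zone so small fixational jitter does not move the camera. All names, thresholds, and rates are assumptions, not GazeGuide's actual design.

```python
# Illustrative sketch (not the GazeGuide implementation): map normalised gaze
# coordinates to pan/tilt rate commands, with a dead zone for fixational jitter.

def gaze_to_pan_tilt(gaze_x, gaze_y, max_rate_deg_s=30.0, dead_zone=0.15):
    """gaze_x, gaze_y in [-1, 1] relative to the image centre.

    Returns (pan_rate, tilt_rate) in degrees per second: zero inside the dead
    zone, then scaled linearly up to max_rate_deg_s at the image edge.
    """
    def axis_rate(g):
        if abs(g) < dead_zone:
            return 0.0
        # Rescale the remaining range [dead_zone, 1] onto [0, 1].
        scaled = (abs(g) - dead_zone) / (1.0 - dead_zone)
        return max_rate_deg_s * scaled * (1 if g > 0 else -1)

    return axis_rate(gaze_x), axis_rate(gaze_y)

# Example: a fixation slightly right of centre yields a slow rightward pan.
print(gaze_to_pan_tilt(0.3, 0.05))  # -> (~5.3, 0.0)
```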


2014 ◽ Vol 7 (1)
Author(s): Claudio Pensieri, Maddalena Pennacchini

Background: Virtual Reality (VR) was defined as a collection of technological devices: “a computer capable of interactive 3D visualization, a head-mounted display and data gloves equipped with one or more position trackers”. Today, many scientists define VR as a simulation of the real world based on computer graphics: a three-dimensional world in which communities of real people interact, create content, items and services, producing real economic value through e-Commerce.

Objective: To report the results of a systematic review of articles and reviews published on the theme “Virtual Reality in Medicine”.

Methods: We used the search terms “Virtual Reality”, “Metaverse”, “Second Life”, “Virtual World” and “Virtual Life” in order to find out how many articles had been written on these themes. For the meta-review we used only “Virtual Reality” AND “Review”. We searched the following databases: PsycINFO, the Journal of Medical Internet Research and Isiknowledge up to September 2011, and PubMed up to February 2012. We included any source published either in print or on the Internet, available in any language, and containing texts that define or attempt to define VR in explicit terms.

Results: We retrieved 3,443 articles on PubMed in 2012 and 8,237 on Isiknowledge in 2011. This large number of articles covered a wide range of themes but showed no clear consensus about VR. We identified 4 general uses of VR in Medicine and searched for the existing reviews about them. We found 364 reviews in 2011, although only 197 were pertinent to our aims: 1. Communication Interface (11 reviews); 2. Medical Education (49 reviews); 3. Surgical Simulation (49 reviews); and 4. Psychotherapy (88 reviews).

Conclusion: We found a large number of articles, but no clear consensus about the meaning of the term VR in Medicine. We found numerous articles published on these topics, and many of them have been reviewed. We grouped these reviews into 4 areas in order to provide a systematic overview of the subject matter and to enable those interested to learn more about these particular topics.


1992 ◽ Vol 1 (1) ◽ pp. 45-62
Author(s): Warren Robinett, Jannick P. Rolland

For stereoscopic photography or telepresence, orthostereoscopy occurs when the perceived size, shape, and relative position of objects in the three-dimensional scene being viewed match those of the physical objects in front of the camera. In virtual reality, the simulated scene has no physical counterpart, so orthostereoscopy must be defined in this case as constancy, as the head moves around, of the perceived size, shape, and relative positions of the simulated objects. Achieving this constancy requires that the computational model used to generate the graphics matches the physical geometry of the head-mounted display being used. This geometry includes the optics used to image the displays and the placement of the displays with respect to the eyes. The model may fail to match the geometry because model parameters are difficult to measure accurately, or because the model itself is in error. Two common modeling errors are ignoring the distortion caused by the optics and ignoring the variation in interpupillary distance across different users. A computational model for the geometry of a head-mounted display is presented, and the parameters of this model for the VPL EyePhone are calculated.
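As a rough numeric illustration of the two modelling errors named in the abstract (ignoring optical distortion and ignoring interpupillary distance), the sketch below shows how a measured IPD can enter the per-eye view transform and how a polynomial radial remap can pre-correct for lens distortion. The coefficient values are placeholders, not the VPL EyePhone parameters derived in the paper.

```python
# Minimal sketch of two HMD model parameters: per-eye view offsets from the
# user's interpupillary distance (IPD), and a radial pre-distortion of image
# coordinates. Coefficients are illustrative placeholders only.
import numpy as np

def eye_view_offsets(ipd_m):
    """Per-eye viewpoint translation from the head centre, in metres."""
    half = ipd_m / 2.0
    return np.array([-half, 0.0, 0.0]), np.array([+half, 0.0, 0.0])

def radial_predistort(x, y, k1=-0.20, k2=-0.02):
    """Polynomial radial remap of normalised image coordinates (centre = 0,0).

    Negative coefficients compress the image towards the centre (barrel
    distortion), the usual software pre-correction for the pincushion
    distortion introduced by HMD magnifying optics.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

left, right = eye_view_offsets(0.058)   # a user with a 58 mm IPD
print(left, right)                      # approx. [-0.029 0 0] and [0.029 0 0]
print(radial_predistort(0.5, 0.0))      # edge point pulled slightly inward
```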


2022 ◽ Vol 27 ◽ pp. 48-69
Author(s): Sahar Y. Ghanem

As the industry transitions towards incorporating Building Information Modeling (BIM) in construction projects, adequately qualified students and specialists are essential to this transition, and it has become apparent that construction management programs need to integrate BIM into their curricula. By bringing Virtual Reality (VR) technology to BIM, VR-BIM could transform the architectural, engineering, and construction (AEC) industry, and three-dimensional (3D) immersive learning can be a valuable platform for enhancing students' ability to recognize a variety of building principles. This study presents a methodology for implementing VR-BIM in a construction management undergraduate program. Based on a review of the previous literature, an in-depth analysis of the program, and accreditation requirements, VR-BIM is implemented throughout the curriculum by combining a stand-alone class with integration into existing courses. The challenges that a program planning to implement VR-BIM may face are discussed, and a few solutions are proposed. The lab classroom layout appropriate for these applications is designed so that it can be adjusted into several configurations to accommodate all learning styles and objectives. A comparison between different Head-Mounted Display (HMD) headsets is carried out to choose the appropriate equipment for the lab.


Author(s): Vishant J. Shahnawaz, Judy M. Vance, Sasikumar V. Kutti

Abstract: This paper discusses the development of a virtual reality (VR) interface for the visualization of Computational Fluid Dynamics (CFD) data. The application, VR-CFD, provides an immersive and interactive graphical environment in which users can examine the results of a CFD analysis of a flow field in three-dimensional space. It has been tested and implemented with virtual reality devices such as the C2, a head-mounted display (HMD), and desktop VR. The application is designed to read PLOT3D structured grid data and to display the flow field parameters using features such as streamlines, cutting planes, iso-surfaces, rakes, vector fields and scalar fields. The Visualization Toolkit (VTK), a data visualization library, is used along with OpenGL and the C2 VR interface libraries to develop the application. Analysts and designers have used VR-CFD to visualize and understand complex three-dimensional fluid flow phenomena. The combination of three-dimensional interaction capability and the C2 virtual reality environment has been shown to facilitate collaborative discussions between analysts and engineers concerning the appropriateness of the CFD model and the characteristics of the fluid flow.
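VR-CFD itself was built on VTK, OpenGL and the C2 interface libraries; purely as an illustration of the same kind of pipeline, the sketch below uses VTK's Python bindings to read a PLOT3D structured grid and trace streamlines through the velocity field. The file names are placeholders, and the rendering here targets a desktop window rather than an immersive display.

```python
# Hedged sketch (not the VR-CFD source): read PLOT3D data and trace
# streamlines with VTK, rendering to a plain desktop window.
import vtk

reader = vtk.vtkMultiBlockPLOT3DReader()
reader.SetXYZFileName("grid.xyz")     # hypothetical PLOT3D geometry file
reader.SetQFileName("solution.q")     # hypothetical PLOT3D solution file
reader.SetScalarFunctionNumber(100)   # density
reader.SetVectorFunctionNumber(200)   # velocity
reader.Update()
grid = reader.GetOutput().GetBlock(0)

seeds = vtk.vtkPointSource()          # a simple spherical "rake" of seed points
seeds.SetCenter(grid.GetCenter())
seeds.SetRadius(1.0)
seeds.SetNumberOfPoints(50)

tracer = vtk.vtkStreamTracer()
tracer.SetInputData(grid)
tracer.SetSourceConnection(seeds.GetOutputPort())
tracer.SetIntegratorTypeToRungeKutta45()
tracer.SetIntegrationDirectionToBoth()
tracer.SetMaximumPropagation(100.0)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(tracer.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```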


2018
Author(s): Yoshihito Masuoka, Hiroyuki Morikawa, Takashi Kawai, Toshio Nakagohri

BACKGROUND: Virtual reality (VR) technology has started to gain attention as a form of surgical support in medical settings. Likewise, the widespread use of smartphones has resulted in the development of various medical applications; for example, Google Cardboard can be used to build simple head-mounted displays (HMDs). However, because of the absence of observed and reported outcomes of the use of three-dimensional (3D) organ models in relevant environments, we have yet to determine the effects of, or issues with, the use of such VR technology.

OBJECTIVE: The aim of this paper was to study the issues that arise while observing a 3D model of an organ, created from an actual surgical case, through a smartphone-based simple HMD. Upon completion, we evaluated and gathered feedback on the performance and usability of the simple observation environment we had created.

METHODS: We downloaded our data to a smartphone (Galaxy S6; Samsung, Seoul, Korea) and created a simple HMD system using Google Cardboard (Google). A total of 17 medical students performed 2 experiments: an observation conducted by a single observer and another carried out by multiple observers using a simple HMD. Afterward, they assessed the results by responding to a questionnaire survey.

RESULTS: We received a largely favorable response in the evaluation of the dissection model, but also low scores because of visually induced motion sickness and eye fatigue. In an introspective report on simultaneous observations made by multiple observers, positive opinions indicated clear image quality and shared understanding, but displeasure caused by visually induced motion sickness, eye fatigue, and hardware problems was also expressed.

CONCLUSIONS: We established a simple system that enables multiple persons to observe a 3D model. Although the observation conducted by multiple observers was successful, problems likely arose because of poor smartphone performance. Therefore, improving smartphone performance may be a key factor in establishing a low-cost and user-friendly 3D observation environment.


2017
Author(s): Shaun W Jerdan, Mark Grindle, Hugo C van Woerden, Maged N Kamel Boulos

BACKGROUND: eHealth interventions are becoming increasingly used in public health, with virtual reality (VR) being one of the most exciting recent developments. VR consists of a three-dimensional, computer-generated environment viewed through a head-mounted display. This medium has provided new possibilities for adapting problematic behaviors that affect mental health. VR is no longer unaffordable for individuals, and with mobile phone technology able to track movements and project images through mobile head-mounted devices, VR is now a mobile tool that can be used at work, at home, or on the move.

OBJECTIVE: In line with recent advances in technology, this review aimed to critically assess the current state of research on VR and mental health.

METHODS: We compiled a table of 82 studies that made use of head-mounted devices in their interventions.

RESULTS: Our review demonstrated that VR is effective in provoking realistic reactions to feared stimuli, particularly for anxiety; moreover, it showed that the immersive nature of VR is an ideal fit for the management of pain. However, the lack of studies on depression and stress highlights the gaps that still exist in the literature.

CONCLUSIONS: Virtual environments that promote positive stimuli combined with health knowledge could prove to be a valuable tool for public health and mental health. The current state of research highlights the importance of the nature and content of VR interventions for improved mental health. While future research should look to incorporate more mobile forms of VR, more rigorous reporting of VR and computer hardware and software may help us understand the relationship (if any) between increased specifications and the efficacy of treatment.


Author(s): Wei Zhu, Shaoyu Guo, Jinhua Zhao

Immersive virtual reality is a promising technology for planning participation. This paper contributes to the literature by comparing the latest virtual reality technology, using a head-mounted display, with conventional graphic representation (in this case, pictures of rendered three-dimensional environments) in terms of their effects on participants’ preferences for the plans and the underlying decision mechanisms. Using a stated choice experiment based on a real-world street renewal project, we collected choice data from 48 university students from non-design majors. We found significant quantitative but limited qualitative differences between the aggregate preferences under virtual reality and conventional graphic representation, and some features that were generally unappealing under conventional graphic representation were more favored under virtual reality. Results of the discrete choice modeling showed that individual decision mechanisms became more homogeneous under virtual reality. Virtual reality had stronger impacts on the female participants than on the male participants: the females showed more aggregate preference reversals, larger preference differences, and stronger changes in the decision mechanism, but the mechanisms of the two genders converged under virtual reality. The findings can be used to design better participatory processes, with virtual reality and conventional graphic representation applied appropriately according to their respective capabilities.
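The abstract does not specify the authors' model; as a generic illustration of discrete choice modeling of stated choices, the sketch below fits a simple conditional logit to synthetic data, the kind of estimation one might run separately for the VR and conventional-graphics conditions to compare preference weights. All data and parameter values are fabricated for the example.

```python
# Illustrative conditional-logit sketch (not the authors' specification):
# estimate attribute weights from stated choices over several alternatives.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_obs, n_alts, n_attrs = 200, 3, 4                 # tasks, alternatives, attributes
X = rng.normal(size=(n_obs, n_alts, n_attrs))      # alternative attributes
true_beta = np.array([1.0, -0.5, 0.8, 0.2])        # "true" preference weights
utility = X @ true_beta + rng.gumbel(size=(n_obs, n_alts))
choices = utility.argmax(axis=1)                   # simulated stated choices

def neg_log_likelihood(beta):
    v = X @ beta                                   # systematic utilities
    log_probs = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(n_obs), choices].sum()

result = minimize(neg_log_likelihood, x0=np.zeros(n_attrs), method="BFGS")
print(result.x)   # estimated weights, close to true_beta for this sample size
```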

