Landmarks: A solution for spatial navigation and memory experiments in virtual reality

Author(s):  
Michael J. Starrett ◽  
Andrew S. McAvan ◽  
Derek J. Huffman ◽  
Jared D. Stokes ◽  
Colin T. Kyle ◽  
...  

Abstract: Research into the behavioral and neural correlates of spatial cognition and navigation has benefited greatly from recent advances in virtual reality (VR) technology. Devices such as head-mounted displays (HMDs) and omnidirectional treadmills provide research participants with access to a more complete range of body-based cues, which facilitate the naturalistic study of learning and memory in three-dimensional (3D) spaces. One limitation to using these technologies for research applications is that they almost ubiquitously require integration with video game development platforms, also known as game engines. While powerful, game engines do not provide an intrinsic framework for experimental design and require at least a working proficiency with the software and any associated programming languages or integrated development environments (IDEs). Here, we present a new asset package, called Landmarks, for designing and building 3D navigation experiments in the Unity game engine. Landmarks combines the ease of building drag-and-drop experiments using no code with the flexibility of allowing users to modify existing aspects, create new content, and even contribute their work to the open-source repository via GitHub, if they so choose. Landmarks is actively maintained and is supplemented by a wiki with resources for users including links, tutorials, videos, and more. We compare several alternatives to Landmarks for building navigation experiments and 3D experiments more generally, provide an overview of the package and its structure in the context of the Unity game engine, and discuss benefits relating to the ongoing and future development of Landmarks.

2021 ◽  
Author(s):  
Mohsin Ali

<p>The technology of today, such as the Oculus Rift, can provide immersion in ways that were unachievable in the past. The Oculus Rift is a virtual reality headset that allows the user to see the three-dimensional world without the use of a traditional monitor. Unlike television, computer, and mobile screens, a virtual reality headset digitally transports the user into the environment. Functionality such as depth tracking and rotational head tracking provides immersion unlike anything experienced to date.</p>
<p>My interest is to investigate interactive storytelling in combination with the Oculus Rift, to determine if virtual reality headsets can enrich storytelling experiences. This will be achieved by developing an application where interactive storytelling is compatible with the Oculus Rift, and testing that application with participants. Finally, a conclusion will be drawn from the data collected from participants.</p>
<p>Alongside the written thesis, a digital application will be produced in Unreal Engine 4 (a video game engine). The application will be an Oculus Rift-driven experience, meaning that users can only experience it through an Oculus Rift. The application will have an interactive plot, which allows the user to influence the storyline. The design will be iterative and will be refined after each user testing session. The application aims to strengthen the theories and concepts found in the written section of the thesis.</p>


Author(s):  
Cecile Meier ◽  
Jose Luis Saorín ◽  
Alejandro Bonnet de León ◽  
Alberto Guerrero Cobos

This paper describes a classroom experience incorporating virtual routes around the sculptural heritage of a city, achieved by developing a simulation of the urban environment with a video game engine. Video game engines allow not only the creation of video games but also the creation of, and navigation through, interactive three-dimensional worlds. For this research, Roblox Studio was used, a simple and intuitive program that requires no previous programming skills. During the 2018/2019 academic year, a pilot experience was carried out with 53 secondary school students, who were given the task of designing a virtual environment that had to include 3D models of the sculptural heritage of the city of Santa Cruz de Tenerife. Before starting the experience, the participants answered a questionnaire to gauge the students' prior knowledge about the creation of video games. Once the activity was finished, and in order to evaluate its results, the participants answered a final questionnaire. The students emphasized that after the activity they were more aware of the sculptural heritage of Santa Cruz and that they considered themselves capable of creating their own interactive worlds with Roblox.



2020 ◽  
Vol 10 (2) ◽  
pp. 597 ◽  
Author(s):  
Ovidia Soto-Martin ◽  
Alba Fuentes-Porto ◽  
Jorge Martin-Gutierrez

Nowadays, virtual reality technologies and immersive virtual reality (VR) apps allow people to view, explore, engage with and learn about historic monuments and buildings, historic sites, and even historic scenes. To preserve our cultural heritage for future generations, it is essential that damaged and dilapidated historic artefacts are accurately documented, and that steps are taken to improve user experiences in the areas of virtual visits, science and education. This paper describes an approach to reconstruct and restore historic buildings and mural paintings. The work process uses digital models that are then inserted into an interactive and immersive VR environment, visualized using Windows Mixed Reality. The work method was applied at a United Nations Educational, Scientific and Cultural Organisation (UNESCO) World Heritage Site in Tenerife (Canary Islands, Spain), thereby creating a virtual three-dimensional (3D) rendering of the architectural structures of the St Augustine Church in La Laguna and its murals. A combination of topography and terrestrial photogrammetry was used to reconstruct its architectural features, and the digital imaging tool DStretch® to recover its murals. The resulting 3D model was then inserted into an immersive and interactive VR environment created using the cross-platform game engine Unity. One of the greatest challenges of this project revolved around recovering the dilapidated and virtually nonexistent mural paintings using DStretch®. However, the final result is an immersive and interactive VR environment containing architectural and artistic information created within the video game engine Unity, which thereby allows the user to explore, observe and interact with a cultural heritage site in real time.
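DStretch® is built on the decorrelation-stretch technique for exaggerating subtle colour differences in faded imagery. The abstract does not detail the tool's internals, but the underlying transform (not DStretch itself) can be sketched with NumPy: rotate pixel values into the covariance eigenbasis, equalize the spread along each axis, and rotate back.

```python
import numpy as np

def decorrelation_stretch(img, target_sigma=50.0):
    """Decorrelation stretch of an RGB image (H x W x 3 float array).

    Rotates pixel values into the eigenbasis of their covariance,
    rescales every principal axis to the same spread, and rotates
    back, which exaggerates subtle colour differences.
    """
    h, w, c = img.shape
    flat = img.reshape(-1, c).astype(np.float64)
    mean = flat.mean(axis=0)
    cov = np.cov(flat, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Whiten along the principal axes, then give each axis the same
    # target standard deviation (small epsilon guards flat channels).
    transform = eigvecs @ np.diag(target_sigma / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    stretched = (flat - mean) @ transform + mean
    return stretched.reshape(h, w, c)
```

After the transform, the channels are decorrelated and share the same variance, which is what makes near-invisible pigment traces stand out once the result is clipped back to a displayable range.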


2020 ◽  
Author(s):  
Simone Grassini ◽  
Karin Laumann ◽  
Ann Kristin Luzi

Many studies have attempted to understand which individual differences may be related to the symptoms of discomfort during the virtual experience (simulator sickness) and the generally considered positive sense of being inside the simulated scene (sense of presence). Nevertheless, due to the quick technological advancement in the field of virtual reality, most of these studies are now outdated. Advanced technology for virtual reality is commonly mediated by head-mounted displays (HMDs), which aim to increase the sense of presence of the user, remove stimuli from the external environment, and provide high-definition, photo-realistic, three-dimensional images. Our results showed that motion sickness susceptibility and simulator sickness are related, and that neuroticism may be associated with, and predict, simulator sickness. Furthermore, the results showed that people who are more used to playing video games are less susceptible to simulator sickness; female participants reported more simulator sickness compared to males (but only for nausea-related symptoms). Female participants also experienced a higher sense of presence compared to males. We suggest that published findings on simulator sickness and the sense of presence in virtual reality environments need to be replicated with the use of modern HMDs.


2020 ◽  
Vol 79 (17-18) ◽  
pp. 12307-12328
Author(s):  
Miguel Chover ◽  
Carlos Marín ◽  
Cristina Rebollo ◽  
Inmaculada Remolar

2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Dario Maggiorini ◽  
Laura Anna Ripamonti ◽  
Federico Sauro

Video games are (also) real-time interactive graphic simulations: hence, providing a convincing physics simulation for each specific game environment is of paramount importance in the process of achieving a satisfying player experience. While the existing game engines appropriately address many aspects of physics simulation, some others are still in need of improvements. In particular, several specific physics properties of bodies not usually involved in the main game mechanics (e.g., properties useful to represent systems composed of soft bodies) are often poorly rendered by general-purpose engines. This issue may limit game designers when imagining innovative and compelling video games and game mechanics. For this reason, we dug into the problem of appropriately representing soft bodies. Subsequently, we extended the approach developed for soft bodies to rigid ones, proposing and developing a unified approach in a game engine: Sulfur. To test the engine, we also designed and developed "Escape from Quaoar," a prototype video game whose main game mechanic exploits an elastic rope, and a level editor for the game.
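The abstract does not spell out Sulfur's internals, but elastic ropes and soft bodies in game engines are commonly approximated as mass-spring systems. A minimal sketch of that general idea (illustrative only, not Sulfur's actual implementation) models the rope as a chain of unit point masses joined by Hooke's-law springs, advanced with semi-implicit Euler integration:

```python
import math

def step_rope(positions, velocities, rest_len, dt=0.01,
              k=200.0, damping=0.98, gravity=-9.81):
    """One semi-implicit Euler step for a rope modelled as a chain of
    unit point masses joined by linear springs.

    positions/velocities are lists of [x, y] pairs; node 0 is pinned.
    """
    n = len(positions)
    forces = [[0.0, gravity] for _ in range(n)]
    for i in range(n - 1):
        dx = positions[i + 1][0] - positions[i][0]
        dy = positions[i + 1][1] - positions[i][1]
        dist = math.hypot(dx, dy) or 1e-9
        # Spring force proportional to stretch beyond the rest length,
        # applied equally and oppositely to the two endpoints.
        f = k * (dist - rest_len)
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx; forces[i][1] += fy
        forces[i + 1][0] -= fx; forces[i + 1][1] -= fy
    for i in range(1, n):  # node 0 stays pinned in place
        velocities[i][0] = (velocities[i][0] + forces[i][0] * dt) * damping
        velocities[i][1] = (velocities[i][1] + forces[i][1] * dt) * damping
        positions[i][0] += velocities[i][0] * dt
        positions[i][1] += velocities[i][1] * dt
    return positions, velocities
```

Iterating this step makes an initially horizontal rope sag under gravity while the pinned end stays fixed; production engines add constraints, collision handling, and stiffer integrators on top of this basic scheme.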


Author(s):  
Julian Keil ◽  
Dennis Edler ◽  
Thomas Schmitt ◽  
Frank Dickmann

Abstract: Modern game engines like Unity allow users to create realistic 3D environments containing terrains as well as natural and artificial objects easily and swiftly. In addition, recent advances in game engine capabilities enable effortless implementation of virtual reality (VR) compatibility. 3D environments created with VR compatibility can be experienced from an egocentric and stereoscopic perspective that surpasses the immersion of the 'classical' screen-based perception of 3D environments. Game developers are not the only ones who benefit from the possibilities provided by game engines. The ability to use geospatial data to shape virtual 3D environments opens a multitude of possibilities for geographic applications, such as construction planning, spatial hazard simulations or representation of historical places. The multi-perspective, multimodal reconstruction of three-dimensional space based on game engine technology now makes it possible to link different approaches to geographic work more closely. Free geospatial data that can be used for spatial reconstructions is provided by numerous national and regional official institutions. However, the file formats of these data sources are not standardized, and game engines only support a limited number of file formats. Therefore, format transformation is usually required to apply geospatial data to virtual 3D environments. This paper presents several workflows to apply digital elevation data and 3D city model data from OpenStreetMap and the Open.NRW initiative to Unity-based 3D environments. Advantages and disadvantages of different sources of geospatial data are discussed. In addition, the implementation of VR compatibility is described. Finally, benefits of immersive VR implementation and characteristics of current VR hardware are discussed in the context of specific geographic application scenarios.
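The paper's specific workflows are not reproduced here, but the core transformation they rely on, turning a regular grid of elevation values into mesh geometry a game engine can render, can be sketched in a few lines. The following is an illustrative Python version (a Unity workflow would build the equivalent C# `Mesh` from the same vertex and triangle-index layout):

```python
def heightmap_to_mesh(heights, cell_size=1.0):
    """Convert a 2D grid of elevations (rows x cols) into the vertex
    and triangle lists game-engine meshes expect.

    Returns (vertices, triangles): vertices are (x, y, z) tuples with
    y as elevation; triangles are index triples, two per grid cell.
    """
    rows, cols = len(heights), len(heights[0])
    vertices = [(c * cell_size, heights[r][c], r * cell_size)
                for r in range(rows) for c in range(cols)]
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            # Split the quad (i, i+1, i+cols, i+cols+1) into two triangles.
            triangles.append((i, i + cols, i + 1))
            triangles.append((i + 1, i + cols, i + cols + 1))
    return vertices, triangles
```

In practice, the format-transformation step the paper describes amounts to parsing the source file (e.g. an official DEM export) into such a grid first; the meshing itself is uniform across data sources.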


2018 ◽  
Author(s):  
Marcus. R. Watson ◽  
Voloh Benjamin ◽  
Thomas Christopher ◽  
Hasan Asif ◽  
Womelsdorf Thilo

Abstract

Background: There is a growing interest in complex, active, and immersive behavioral neuroscience tasks. However, the development and control of such tasks present unique challenges.

New Method: The Unified Suite for Experiments (USE) is an integrated set of hardware and software tools for the design and control of behavioral neuroscience experiments. The software, developed using the Unity video game engine, supports both active tasks in immersive 3D environments and static 2D tasks used in more traditional visual experiments. The custom USE SyncBox hardware, based around an Arduino Mega2560 board, integrates and synchronizes multiple data streams from different pieces of experimental hardware. The suite addresses three key issues in developing cognitive neuroscience experiments in Unity: tight experimental control, accurate sub-millisecond timing, and accurate gaze target identification.

Results: USE is a flexible framework for realizing experiments, enabling (i) nested control over complex tasks, (ii) flexible use of 3D or 2D scenes and objects, (iii) touchscreen-, button-, joystick- and gaze-based interaction, and (iv) complete offline reconstruction of experiments for post-processing and temporal alignment of data streams.

Comparison with Existing Methods: Most existing experiment-creation tools are not designed to support the development of video-game-like tasks. Those that do use older or less popular video game engines as their base, and are not as feature-rich, nor enable as precise control over timing, as USE.

Conclusions: USE provides an integrated, open-source framework for a wide variety of active behavioral neuroscience experiments using human and nonhuman participants, as well as artificially intelligent agents.

Glossary

Active task: Experimental tasks which involve some combination of realistic, usually moving, stimuli, continuous opportunities for action, ecologically valid tasks, complex behaviours, etc. Here, they are contrasted with static tasks (see below).

Arduino: A multi-purpose generic microprocessor, here used to control inter-device communication and time synchronization.

Raycast: A game-engine method that sends a vector between two points in a virtual three-dimensional environment and returns the first object in that environment it hits. Often used to determine whether a character in a game can see or shoot another character.

State machine (also finite state machine): A way of conceptualizing and implementing control in software, such that at any one moment the software is in one, and only one, state. In hierarchical state machines, as used in the present software suite, states can be organized into levels, such that each level can only be in one state, but a state can pass control to a lower level.

Static task: Experimental tasks like those traditionally used in the cognitive neurosciences: simple, usually stationary, stimuli, limited opportunities for action, simple behaviours, etc. Here, they are contrasted with active tasks (see above).

Unity: One of the most popular video game engines. Freely available.

Video game engine: A software development kit designed to handle many of the common issues involved in creating video games, such as interfacing with controllers and simulating physical collisions and lighting.
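The glossary's hierarchical-state-machine concept, where each level is in exactly one state but may delegate control downward, can be illustrated with a minimal sketch. This is a hypothetical toy, not USE's actual API; the class name `State` and its methods are invented for illustration:

```python
class State:
    """A state that may delegate control to a child state machine, so
    each level of the hierarchy is in exactly one state at a time."""

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        # By convention, the first child is the initial active state.
        self.current = self.children[0] if self.children else None

    def active_path(self):
        """Names of the active state at each level, top to bottom."""
        path = [self.name]
        if self.current:
            path += self.current.active_path()
        return path

    def transition(self, child_name):
        """Switch this level's active child state by name."""
        for child in self.children:
            if child.name == child_name:
                self.current = child
                return
        raise ValueError(f"no child state named {child_name!r}")
```

For example, an experiment session might nest a Trial level that in turn switches between Fixation and Stimulus states; transitioning at the Trial level leaves the Session level's state untouched, which is the property the glossary describes.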


