UNDERSTANDING UNDERWATER PHOTOGRAMMETRY FOR MARITIME ARCHAEOLOGY THROUGH IMMERSIVE VIRTUAL REALITY

Author(s):  
M. Doležal
M. Vlachos
M. Secci
S. Demesticha
D. Skarlatos
...

Abstract. Underwater archaeological discoveries bring new challenges to the field: such sites are more difficult to reach and, due to natural influences, they tend to deteriorate quickly. Photogrammetry is one of the most powerful tools used for archaeological fieldwork. Photogrammetric techniques document the state of a site in digital form for later analysis, without the risk of damaging the artefacts or the site itself. Archaeologists use this technology to record discovered artefacts or even whole archaeological sites. Gathering data underwater brings several problems and limitations, so specific steps should be taken to obtain the best possible results, and divers should be well prepared, with knowledge of measurement and photo-capture methods, before starting work at an underwater site. Using immersive virtual reality, we have developed educational software to introduce maritime archaeology students to photogrammetric techniques. To test the feasibility of the software, a user study was performed and evaluated by experts. In the software, the user is tasked to place markers on the site, measure distances between them, and then take photos of the site, from which a 3D mesh is generated offline. Initial results show that the system is useful for understanding the basics of underwater photogrammetry.
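The abstract does not describe how the measured marker distances are used downstream; a common step is scaling the initially arbitrary-scale photogrammetric model to real-world units. The sketch below illustrates only that generic step under stated assumptions; the function, marker coordinates, and measured distance are hypothetical and are not taken from the paper's software.

```python
import numpy as np

def scale_factor(marker_a_model, marker_b_model, measured_distance_m):
    """Scale factor mapping model units to metres, given one tape-measured
    distance between two markers placed on the site (assumed workflow)."""
    model_distance = np.linalg.norm(
        np.asarray(marker_a_model, dtype=float) - np.asarray(marker_b_model, dtype=float)
    )
    return measured_distance_m / model_distance

# Hypothetical example: marker coordinates from an unscaled reconstruction
# and a measured distance of 2.40 m between the same two markers.
a = (0.12, 0.05, 1.30)
b = (0.98, 0.11, 1.27)
print(f"scale factor: {scale_factor(a, b, 2.40):.3f} (model units -> metres)")
```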

Author(s):  
Kevin Lesniak
Conrad S. Tucker
Sven Bilen
Janis Terpenny
Chimay Anumba

Immersive virtual reality systems have the potential to transform the manner in which designers create prototypes and collaborate in teams. Using technologies such as the Oculus Rift or the HTC Vive, a designer can attain a sense of “presence” and “immersion” typically not experienced by traditional CAD-based platforms. However, one of the fundamental challenges of creating a high-quality immersive virtual reality experience is creating the immersive virtual reality environment itself. Typically, designers spend a considerable amount of time manually designing virtual models that replicate physical, real-world artifacts. While standard 3D models can be imported into these immersive virtual reality environments, such models are typically generic in nature and do not represent the designer’s intent. To mitigate these challenges, the authors of this work propose the real-time translation of physical objects into an immersive virtual reality environment using readily available RGB-D sensing systems and standard networking connections. The emergence of commercial, off-the-shelf RGB-D sensing systems such as the Microsoft Kinect has enabled the rapid 3D reconstruction of physical environments. The authors present a methodology that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstruction in an immersive virtual reality environment with which the user can then interact. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections is presented to demonstrate the viability of the proposed methodology.
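The abstract does not give implementation details of the networking step; as a rough illustration of streaming reconstructed geometry between machines over TCP, the sketch below serializes a vertex buffer with a length prefix and sends it to a rendering host. The host, port, framing, and function name are assumptions made for illustration, not the authors' protocol.

```python
import socket
import struct
import numpy as np

def send_vertices(vertices, host="127.0.0.1", port=9000):
    """Send an (N, 3) float32 vertex array over TCP, length-prefixed so the
    receiver knows how many bytes to read. Host and port are placeholders."""
    payload = np.asarray(vertices, dtype=np.float32).tobytes()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(payload)))  # 4-byte length header
        sock.sendall(payload)                          # raw vertex data

# Hypothetical usage: a single triangle produced by a reconstruction step.
triangle = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]], dtype=np.float32)
# send_vertices(triangle)  # requires a listener on the receiving machine
```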


2021, Vol. 11 (24), pp. 11613
Author(s):  
Agapi Chrysanthakopoulou
Konstantinos Kalatzis
Konstantinos Moustakas

Virtual reality (VR) and 3D modeling technologies have become increasingly powerful tools in multiple fields, such as education, architecture, and cultural heritage. Museums are no longer places that merely house and exhibit collections and artworks; they use such technologies to offer a new way of communicating art and history to their visitors. In this paper, we present the initial results of a proposed workflow for highlighting and interpreting a historic event through an immersive and interactive VR experience that engages multiple senses of the user. Using a treadmill for navigation and haptic gloves for interacting with the environment, combined with detailed 3D models, deepens the sense of immersion. The results of our study show that engaging multiple senses and visual manipulation in an immersive 3D environment can effectively enhance the perception of visual realism and evoke a stronger sense of presence, amplifying the educational and informative experience in a museum.


2011, Vol. 20 (1), pp. 78-92
Author(s):  
Samantha Finkelstein
Evan A. Suma

We present the design and evaluation of Astrojumper, an immersive virtual reality exergame developed to motivate players to engage in rigorous, full-body exercise. We performed a user study with 30 people between the ages of 6 and 50 who played the game for 15 min. Regardless of differences in age, gender, activity level, and video game experience, participants rated Astrojumper extremely positively and experienced a significant increase in heart rate after gameplay. Additionally, we found that participants' ratings of perceived workout intensity positively correlated with their level of motivation. Overall, our results demonstrate that Astrojumper effectively motivates both children and adults to exercise through immersive virtual reality technology and a simple, yet engaging, game design.
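The abstract reports a significant post-gameplay increase in heart rate and a positive correlation between perceived workout intensity and motivation. The sketch below shows, with made-up numbers, how such an analysis is commonly computed (a paired t-test and a Pearson correlation); the data values are placeholders, not the study's measurements.

```python
from scipy import stats

# Placeholder data: resting vs. post-gameplay heart rates (bpm) for a few
# participants, plus perceived-intensity and motivation ratings (1-7 scale).
hr_before  = [72, 80, 68, 75, 90, 77]
hr_after   = [110, 125, 102, 118, 140, 121]
intensity  = [4, 6, 3, 5, 7, 5]
motivation = [5, 7, 4, 5, 6, 6]

t_stat, p_paired = stats.ttest_rel(hr_after, hr_before)  # paired t-test on heart rate
r, p_corr = stats.pearsonr(intensity, motivation)        # Pearson correlation

print(f"heart-rate change: t = {t_stat:.2f}, p = {p_paired:.4f}")
print(f"intensity vs. motivation: r = {r:.2f}, p = {p_corr:.4f}")
```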


2017, Vol. 11 (46), pp. 1-8
Author(s):  
Diana Marcela Robayo Calderon
Diego Mauricio Rivera

2019, Vol. 9 (22), pp. 4861
Author(s):  
Hind Kharoub
Mohammed Lataifeh
Naveed Ahmed

This work presents a novel design of a new 3D user interface for an immersive virtual reality desktop and a new empirical analysis of the proposed interface using three interaction modes. The proposed dual-layer 3D user interface allows users to interact with multiple screens portrayed within a curved 360-degree effective field of view. A downward gaze raises the interaction layer, which facilitates several traditional desktop tasks. The 3D user interface is analyzed using three different interaction modes: point-and-click, controller-based direct manipulation, and a gesture-based user interface. A comprehensive user study is performed within a mixed-methods approach for the usability and user experience analysis of all three interaction modes. Each interaction mode is quantitatively and qualitatively analyzed for simple and compound tasks in both standing and seated positions. The crafted mixed approach for this study allows us to collect, evaluate, and validate the viability of the new 3D user interface. The results are used to draw conclusions about the suitability of the interaction modes for a variety of tasks in an immersive virtual reality 3D desktop environment.
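The abstract does not specify how the downward gaze is detected; as a hypothetical illustration of a "gaze below a pitch threshold raises the interaction layer" mechanic, the snippet below checks the head's forward vector against an assumed angle. The threshold value and function name are illustrative assumptions, not the authors' implementation.

```python
import math

PITCH_THRESHOLD_DEG = -30.0  # assumed: looking more than 30 degrees downward

def should_raise_layer(head_forward):
    """Return True when the gaze direction points sufficiently downward.
    `head_forward` is a unit vector (x, y, z) with y pointing up."""
    x, y, z = head_forward
    pitch_deg = math.degrees(math.asin(max(-1.0, min(1.0, y))))
    return pitch_deg < PITCH_THRESHOLD_DEG

# Hypothetical usage: a gaze direction tilted downward by roughly 35 degrees.
print(should_raise_layer((0.0, -0.57, 0.82)))  # True
```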

