Auditable Augmented/Mixed/Virtual Reality

Author(s):  
Richard Cloete ◽  
Chris Norval ◽  
Jatinder Singh

Virtual, Augmented and Mixed Reality (XR) technologies are becoming increasingly pervasive. However, the contextual nature of XR, and its tight coupling of the digital and physical environments, brings a real propensity for loss and harm. This means that auditability, the ability to inspect how a system operates, will be crucial for dealing with incidents as they occur, by providing the information enabling rectification, repair and recourse. However, supporting audit in XR brings its own considerations, as the process of capturing audit data itself has implications and challenges, both for the application (e.g., overheads) and more broadly. This paper explores the practicalities of auditing XR systems, characterises the tensions between audit and other considerations, and argues the need for flexible tools enabling the management of these tensions. In doing so, we introduce Droiditor, a configurable open-source Android toolkit that enables the runtime capture of audit-relevant data from mobile applications. We use Droiditor to indicate some potential implications of audit data capture, demonstrate how greater configurability can assist in managing audit-related concerns, and discuss the considerations that result. Given the societal demand for more transparent and accountable systems, our broader aim is to draw attention to auditability, highlighting tangible ways forward and areas for future work.
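
Droiditor's own interface is not reproduced in the abstract above; purely as a hedged illustration of the idea of configurable, runtime audit-data capture in a mobile application, a minimal Kotlin sketch might look as follows. The AuditEvent, AuditConfig and Auditor names, the category filter and the sampling knob are assumptions made for this sketch, not the toolkit's actual API.

```kotlin
import java.time.Instant

// Hypothetical audit-event record: what happened, when, and contextual detail.
data class AuditEvent(
    val category: String,                        // e.g. "sensor", "network", "render"
    val action: String,                          // e.g. "camera_frame_accessed"
    val detail: Map<String, String> = emptyMap(),
    val timestamp: Instant = Instant.now()
)

// Hypothetical configuration: which categories to capture and how densely to
// sample them, so that audit overhead can be tuned per deployment.
data class AuditConfig(
    val enabledCategories: Set<String>,
    val sampleEveryN: Int = 1
)

// Minimal auditor: filters events by configuration, buffers them, and flushes
// to a sink (stdout here; a real toolkit might write to protected storage or
// an append-only remote log).
class Auditor(private val config: AuditConfig) {
    private val buffer = mutableListOf<AuditEvent>()
    private var counter = 0L

    fun record(event: AuditEvent) {
        if (event.category !in config.enabledCategories) return
        if (counter++ % config.sampleEveryN != 0L) return
        buffer.add(event)
        if (buffer.size >= 100) flush()
    }

    fun flush() {
        buffer.forEach { println("${it.timestamp} [${it.category}] ${it.action} ${it.detail}") }
        buffer.clear()
    }
}

fun main() {
    val auditor = Auditor(AuditConfig(enabledCategories = setOf("sensor", "network")))
    auditor.record(AuditEvent("sensor", "camera_frame_accessed", mapOf("camera" to "rear")))
    auditor.record(AuditEvent("render", "frame_drawn"))  // dropped: category not enabled
    auditor.flush()
}
```

The configurability (enabled categories, sampling) is where the abstract's tension between audit coverage and application overheads becomes a concrete, tunable trade-off.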

Author(s):  
A. Kharroubi ◽  
R. Hajji ◽  
R. Billen ◽  
F. Poux

Abstract. With the increasing number of 3D applications using immersive technologies such as virtual, augmented and mixed reality, there is strong interest in better ways to integrate unstructured 3D data such as point clouds as a data source. Indeed, this can lead to an efficient workflow from 3D capture to 3D immersive environment creation without the need to derive 3D models or run lengthy optimization pipelines. In this paper, the main focus is on the direct classification and integration of massive 3D point clouds in a virtual reality (VR) environment. The emphasis is put on leveraging open-source frameworks for easy replication of the findings. First, we develop a semi-automatic segmentation approach to provide semantic descriptors (mainly classes) to groups of points. We then build an octree data structure, leveraged through out-of-core algorithms, to load in real time and continuously only the points that are in the VR user's field of view. Then, we provide an open-source solution using Unity with a user interface for VR point cloud interaction and visualisation. Finally, we provide a full semantic VR data integration enhanced through developed shaders for future spatio-semantic queries. We tested our approach on several datasets, including a point cloud composed of 2.3 billion points representing the heritage site of the castle of Jehay (Belgium). The results underline the efficiency and performance of the solution for visualizing classified massive point clouds in virtual environments at more than 100 frames per second.
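
The abstract does not spell out the loading logic, but the core idea, an octree traversed against the viewer's frustum so that only visible cells are loaded from disk up to a point budget, can be sketched roughly as follows. OctreeNode, the plane-based visibility test and the budgeted traversal are illustrative assumptions for this sketch, not the authors' Unity implementation.

```kotlin
import java.util.ArrayDeque

// Axis-aligned bounding box of an octree cell (min and max corners, x/y/z).
class AABB(val min: DoubleArray, val max: DoubleArray)

// A frustum plane in the form n·x + d >= 0 for points on the visible side.
class Plane(val n: DoubleArray, val d: Double)

// Hypothetical octree node: the node's own (coarse) points stay resident,
// while children are only materialised (e.g. read from disk) when they are
// actually needed, which is the essence of an out-of-core structure.
class OctreeNode(
    val bounds: AABB,
    val pointCount: Long,
    loadChildren: () -> List<OctreeNode> = { emptyList() }
) {
    val children: List<OctreeNode> by lazy(loadChildren)
}

// Conservative AABB-vs-frustum test: a box is culled only if it lies
// entirely on the invisible side of some frustum plane.
fun intersects(box: AABB, frustum: List<Plane>): Boolean =
    frustum.none { p ->
        // corner of the box furthest along the plane normal
        val c = DoubleArray(3) { i -> if (p.n[i] >= 0) box.max[i] else box.min[i] }
        p.n[0] * c[0] + p.n[1] * c[1] + p.n[2] * c[2] + p.d < 0
    }

// Breadth-first traversal gathering visible nodes until a point budget is
// reached, keeping the per-frame load bounded regardless of total cloud size.
fun visibleNodes(root: OctreeNode, frustum: List<Plane>, budget: Long): List<OctreeNode> {
    val selected = mutableListOf<OctreeNode>()
    var loaded = 0L
    val queue = ArrayDeque(listOf(root))
    while (queue.isNotEmpty() && loaded < budget) {
        val node = queue.poll()
        if (!intersects(node.bounds, frustum)) continue
        selected.add(node)
        loaded += node.pointCount
        queue.addAll(node.children)  // lazy access triggers the out-of-core load
    }
    return selected
}
```

The point budget is what allows a 2.3-billion-point cloud to be viewed at interactive frame rates: only a bounded, view-dependent subset is ever resident at once.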


Author(s):  
S Leinster-Evans ◽  
J Newell ◽  
S Luck

This paper looks to expand on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’ presented by Ross Basketter, Craig Birchmore and Abbi Fisher from BAE Systems in May 2016 and the EAAW VII paper ‘Testing the boundaries of virtual reality within ship support’ presented by John Newell from BAE Systems and Simon Luck from BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC Aircraft Carriers in Portsmouth, and this work was presented at EAAW VII. Since then, the work has been extended to demonstrate the art of the possible on Type 26. This latter piece of work is designed to explore the role of 3D immersive environments in the development and fielding of support and training solutions, across the range of support disciplines. The combined team are looking at how this digital thread leads from the design of platforms, both surface and subsurface, through build into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (for spatial acceptance, HazID, etc.) through to operational support and maintenance (in conjunction with big data coming off the ship, coupled with digital tech docs for maintenance procedures), using constantly developing technologies such as 3D, Virtual Reality, Augmented Reality and Mixed Reality. The drive towards gamification in the training environment, to keep younger recruits interested and shorten course lengths, is also explored. The paper develops the options and looks at how this technology can be used and where the value proposition lies.


Author(s):  
Stefan Bittmann

Virtual reality (VR) is the term used to describe representation and perception in a computer-generated, virtual environment. The term was coined by the author Damien Broderick in his 1982 novel "The Judas Mandala". The term "Mixed Reality" describes the mixing of virtual reality with pure reality; the term "hyper-reality" is also used. Immersion plays a major role here: it describes the embedding of the user in the virtual world. A virtual world is considered plausible if the interaction within it is logically consistent, and this interactivity creates the illusion that what seems to be happening is actually happening. A common problem with VR is "motion sickness". To create a sense of immersion, special output devices are needed to display virtual worlds; head-mounted displays, CAVE systems and shutter glasses are mainly used. Input devices are needed for interaction: the 3D mouse, data glove and flystick play a role here, as does the omnidirectional treadmill, with which walking in virtual space is controlled by real walking movements.


Author(s):  
Randall Spain ◽  
Benjamin Goldberg ◽  
Jeffrey Hansberger ◽  
Tami Griffith ◽  
Jeremy Flynn ◽  
...  

Recent advances in technology have made virtual environments, virtual reality, augmented reality, and simulations more affordable and accessible to researchers, companies, and the general public, which has led to many novel use cases and applications. A key objective of human factors research and practice is determining how these technology-rich applications can be designed and applied to improve human performance across a variety of contexts. This session will demonstrate some of the distinct and diverse uses of virtual environments and mixed reality environments in an alternative format. The session will begin with each demonstrator providing a brief overview of their virtual environment (VE) and a description of how it has been used to address a particular problem or research need. Following the description portion of the session, each VE will be set up at a demonstration station in the room, and session attendees will be encouraged to interact directly with the virtual environments, ask demonstrators questions about their research, and inquire about the effectiveness of using VEs for research, training, and evaluation purposes. The overall objective of this alternative session is to increase awareness of how human factors professionals use VE technologies and of the capabilities and limitations of VEs in supporting the work of HF professionals.


2019 ◽  
Vol 12 (1) ◽  
pp. 50-71
Author(s):  
María Vanessa Villasana ◽  
Ivan Miguel Pires ◽  
Juliana Sá ◽  
Nuno M. Garcia ◽  
Eftim Zdravevski ◽  
...  

Background: Mobile applications can be used for the monitoring of lifestyles and physical activity. They can be installed on commodity mobile devices, which are currently used by different types of people in their daily activities worldwide. Objective: This paper reviews and categorizes mobile applications related to diet, nutrition, health, physical activity and education, presenting an analysis of 73 mobile applications available in the Google Play Store and extracting their different features. Methods: The mobile applications were analyzed in relation to each proposed category and their features, starting with the definition of the search keywords used in the Google Play Store. Each mobile application was installed on a smartphone and checked for whether it had been examined in scientific studies. Finally, all mobile applications and features were categorized. Results: The mobile applications were clustered into four groups: diet and nutrition, health, physical activity and education. The features of the mobile applications were also categorized into six groups: diet, anthropometric parameters, social, physical activity, medical parameters and vital parameters. The most common features of the mobile applications are weight, height, age, gender, goals, calories needed calculation, diet diary, food database with calories, calories burned and calorie intake. Conclusion: With this review, it was concluded that most mobile applications available in the market are related to diet, and they are important for different types of people. A promising idea for future work is to evaluate the acceptance of such mobile applications by young people.
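
As a hedged illustration of the feature grouping described above (not code or data from the study), mapping extracted app features to the review's feature categories could look like the following sketch. The assignment of individual features to groups is an assumption based on the feature names listed in the abstract, and the social, medical and vital groups are omitted because the abstract does not name example features for them.

```kotlin
// Feature groups named in the review; the assignment of individual features
// below follows the feature names listed in the abstract and is otherwise an
// illustrative assumption. Social, medical and vital groups are omitted here.
val featureCategories = mapOf(
    "diet" to setOf(
        "diet diary", "food database with calories",
        "calorie intake", "calories needed calculation", "goals"
    ),
    "anthropometric parameters" to setOf("weight", "height", "age", "gender"),
    "physical activity" to setOf("calories burned")
)

// Group an app's extracted features by category, ignoring unrecognised ones.
fun categorise(features: List<String>): Map<String, List<String>> =
    features.mapNotNull { f ->
        featureCategories.entries.firstOrNull { f in it.value }?.let { it.key to f }
    }.groupBy({ it.first }, { it.second })

fun main() {
    println(categorise(listOf("weight", "calorie intake", "calories burned", "barcode scanner")))
    // {anthropometric parameters=[weight], diet=[calorie intake], physical activity=[calories burned]}
}
```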


2018 ◽  
Author(s):  
Jorge A Fuentes ◽  
Rodrigo Nieto ◽  
Francisca Melis ◽  
Luz María González ◽  
Gonzalo Mauricio Rojas ◽  
...  

To feel fear in a specific situation is a normal human experience; however, when this fear or aversion becomes excessive and disrupts the day-to-day life of an individual, the person is said to suffer from a type of anxiety disorder called a phobia. One common type of treatment for phobias is exposure therapy, in which professionals expose the patient gradually to the feared object or situation. The objective of this paper is to implement a Virtual Reality system that simulates a real highway environment, allowing patients affected by highway phobias to be treated in a safe place. In cooperation with psychologists and psychiatrists, an action protocol was established to create and recreate the variables of the virtual environment to which the patient will be subjected. Once this was completed, a Virtual Reality application was built that simulates a realistic highway including exits, overpasses, underpasses and tunnels, among other features. This hardware/software system includes Oculus Rift DK2 VR glasses in order to create an immersive environment that the patient can consider real and interact with. The performance of the vehicle was programmed through physical responses similar to reality, as well as artificial intelligence techniques for the vehicles that interact with the one controlled by the patient. The system also includes a steering wheel, pedals, and a gearshift (manual or automatic). We think that this system will contribute to treating highway phobias, allowing the psychiatrist or psychologist to carry out therapy in an appropriate manner; with the support of technology, the professional will be able to simulate the anxiogenic environment realistically so as to achieve effective treatment. In future work, we must quantify the possible benefits of this type of VR system in phobia patients.


2021 ◽  
Vol 82 (4) ◽  
pp. 186
Author(s):  
Kathleen Phillips ◽  
Valerie A. Lynn ◽  
Amie Yenser ◽  
Christina Wissinger

Current teaching practice in undergraduate higher education anatomy and physiology courses incorporates the use of various instructional methodologies to reinforce the anatomical relationships between structures.1,2 These methods can include basic hands-on physical models, human and animal dissection labs, and interactive technology. Technological advances continue to drive the production of innovative anatomy and physiology electronic tools, including: virtual dissection in 3-D (e.g., Virtual Dissection Boards from Anatomage, 3D4Medical, and Anatomy.TV), augmented reality (AR) (e.g., Human Anatomy Atlas), mixed reality (e.g., the Microsoft HoloLens Case Western Reserve Medical School and Cleveland Clinic digital anatomy app), and 3-D virtual reality (VR) (e.g., the 3D Organon VR Anatomy and YOU by Sharecare apps).

