Building Trust in Autonomous Vehicles: Role of Virtual Reality Driving Simulators in HMI Design

2019 ◽  
Vol 68 (10) ◽  
pp. 9438-9450 ◽  
Author(s):  
Lia Morra ◽  
Fabrizio Lamberti ◽  
F. Gabriele Prattico ◽  
Salvatore La Rosa ◽  
Paolo Montuschi
Author(s):  
Alexandre M. Nascimento ◽  
Anna Carolina M. Queiroz ◽  
Lucio F. Vismari ◽  
Jeremy N. Bailenson ◽  
Paulo S. Cugnasca ◽  
...  

2004 ◽  
Vol 63 (3) ◽  
pp. 143-149 ◽  
Author(s):  
Fred W. Mast ◽  
Charles M. Oman

The role of top-down processing in the horizontal-vertical line length illusion was examined by means of an ambiguous room with dual visual verticals. In one of the test conditions, the subjects were cued to one of the two verticals and instructed to cognitively reassign the apparent vertical to the cued orientation. Once they had mentally adjusted their perception, two lines in a plus-sign configuration appeared and the subjects had to judge which line was longer. The results showed that a line appeared longer when it was aligned with the direction of the vertical currently perceived by the subject. This study provides a demonstration that top-down processing influences lower-level visual processing mechanisms. In another test condition, in which the subjects had all perceptual cues available, the influence was even stronger.


Author(s):  
Daniela Mazzaccaro ◽  
Rim Miri ◽  
Bilel Derbel ◽  
Paolo Righini ◽  
Giovanni Nano

Author(s):  
S Leinster-Evans ◽  
J Newell ◽  
S Luck

This paper expands on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’, presented by Ross Basketter, Craig Birchmore and Abbi Fisher from BAE Systems in May 2016, and the EAAW VII paper ‘Testing the boundaries of virtual reality within ship support’, presented by John Newell from BAE Systems and Simon Luck from BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC Aircraft Carriers in Portsmouth; this work was presented at EAAW VII and has since been extended to demonstrate the art of the possible on Type 26. This latter piece of work is designed to explore the role of 3D immersive environments in the development and fielding of support and training solutions across the range of support disciplines. The combined team are looking at how this digital thread runs from the design of platforms, both surface and subsurface, through build and into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (spatial acceptance, HazID, etc.) through to operational support and maintenance (in conjunction with big data coming off the ship, coupled with digital tech docs for maintenance procedures), using constantly developing technologies such as 3D, Virtual Reality, Augmented Reality and Mixed Reality. The drive towards gamification in the training environment, to keep younger recruits engaged and shorten course lengths, is also explored. The paper develops these options and considers how the technology can be used and where the value proposition lies.


2018 ◽  
Author(s):  
Lorraine Tudor Car ◽  
Bhone Myint Kyaw ◽  
Josip Car

BACKGROUND Virtual Reality (VR) is a digital technology increasingly employed in health professions’ education. Yet, based on the current evidence, its use remains narrow, concentrated on a few applications and disciplines. There is no overview capturing the diversity of VR applications in health professions’ education to inform its use and research. OBJECTIVE This narrative review aims to explore the different potential applications of VR in health professions’ education. METHODS The narrative synthesis approach to literature review was used to analyse the existing evidence. RESULTS We outline the role of VR features such as immersion, interactivity and feedback, and explain the role of VR devices. Based on the type and scope of the educational content, VR can represent space, individuals, objects, structures or a combination of these. Applications of VR in medical education span three levels: environmental, organ and micro. Environmental VR focuses on training related to health professionals’ environments and human interactions; organ VR targets primarily human body anatomy; and micro VR addresses microscopic structures at the level of cells, molecules and atoms. We examine how different VR features and health professions’ education areas match these three VR types. CONCLUSIONS We conclude by highlighting the gaps in the literature and providing suggestions for future research.
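As a purely illustrative aid (not part of the review itself), the three VR content levels and the features discussed above could be recorded in a small data structure; the class names, field names and example entries below are assumptions made for illustration only.

```python
from dataclasses import dataclass
from enum import Enum


class VRType(Enum):
    """The three levels of VR educational content described in the review."""
    ENVIRONMENTAL = "environmental"  # health professionals' environment and human interactions
    ORGAN = "organ"                  # human body anatomy
    MICRO = "micro"                  # cells, molecules and atoms


@dataclass
class VRApplication:
    """Hypothetical record pairing an educational use case with VR features."""
    name: str
    vr_type: VRType
    immersion: bool      # head-mounted display vs. screen-based
    interactivity: bool  # learner can manipulate the scene
    feedback: bool       # system gives performance feedback


# Hypothetical entries showing how education areas might map to VR types.
examples = [
    VRApplication("ward communication training", VRType.ENVIRONMENTAL, True, True, True),
    VRApplication("3D heart anatomy viewer", VRType.ORGAN, True, True, False),
    VRApplication("molecular structure walkthrough", VRType.MICRO, True, False, False),
]

for app in examples:
    print(f"{app.name}: {app.vr_type.value} VR")
```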


2021 ◽  
pp. 0887302X2199428
Author(s):  
Hyejune Park ◽  
Seeun Kim

The purpose of this study is to examine the effects of “virtual try-on” technology (augmented reality, AR) and a “3D virtual store” (virtual reality, VR) incorporated into an apparel retail website on purchase intentions. The study highlights the mediating role of cognitive elaboration in the process through which these technologies influence purchase intentions, and examines how consumers’ shopping goals (searching vs. browsing) interact with the website technology to shape their responses. Two experiments demonstrated that, for browsers, the website with VR was more effective in increasing purchase intentions than either the website with AR or a regular website with no technology, while for searchers, both the website with AR and the website with VR were more effective than a regular website. In addition, cognitive elaboration mediated the interaction effect of website technology and shopping goal on purchase intentions for browsers, while no such mediating effect was found for searchers.
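A minimal sketch of how a design like the one reported (three website conditions crossed with two shopping goals, with cognitive elaboration as a mediator) might be analysed; the variable names, simulated data and model specifications below are assumptions for illustration, not the authors’ actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300

# Simulated stand-in data: 'technology' has three levels (none / AR / VR),
# 'goal' has two (searching / browsing); all names are hypothetical.
df = pd.DataFrame({
    "technology": rng.choice(["none", "AR", "VR"], n),
    "goal": rng.choice(["searching", "browsing"], n),
})
df["elaboration"] = rng.normal(4, 1, n)          # cognitive elaboration (mediator)
df["purchase_intention"] = rng.normal(4, 1, n)   # dependent variable

# Step 1: test the technology x shopping-goal interaction on purchase intentions.
interaction = smf.ols("purchase_intention ~ C(technology) * C(goal)", data=df).fit()
print(interaction.summary().tables[1])

# Step 2 (browsers only, Baron-Kenny style check): does the technology effect
# on purchase intentions shrink once elaboration is controlled for?
browsers = df[df["goal"] == "browsing"]
total = smf.ols("purchase_intention ~ C(technology)", data=browsers).fit()
direct = smf.ols("purchase_intention ~ C(technology) + elaboration", data=browsers).fit()
print(total.params, direct.params, sep="\n")
```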


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 26
Author(s):  
David González-Ortega ◽  
Francisco Javier Díaz-Pernas ◽  
Mario Martínez-Zarzuela ◽  
Míriam Antón-Rodríguez

Drivers’ gaze information can be crucial in driving research because of its relation to driver attention. In particular, including gaze data in driving simulators broadens the scope of research studies, as drivers’ gaze patterns can be related to their features and performance. In this paper, we present two gaze region estimation modules integrated into a driving simulator: one uses the Kinect 3D sensor and the other the Oculus Rift virtual reality headset. The modules detect which of the seven regions, into which the driving scene was divided, the driver is gazing at in every processed frame of the route. Four gaze estimation methods, which learn the relation between gaze displacement and head movement, were implemented and compared: two are simpler, point-based methods that try to capture this relation directly, and two are based on classifiers such as an MLP and an SVM. Experiments were carried out with 12 users who drove the same scenario twice, each time with a different visualization display: first a big screen and later the Oculus Rift. Overall, the Oculus Rift outperformed the Kinect as hardware for gaze estimation; the best-performing Oculus-based gaze region estimation method achieved an accuracy of 97.94%. The information provided by the Oculus Rift module enriches the driving simulator data and makes a multimodal driving performance analysis possible, in addition to the immersion and realism of the virtual reality experience provided by the Oculus.
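As a rough sketch of the classifier-based approach described above (learning the relation between head movement and the gazed region), the snippet below trains an MLP and an SVM on synthetic head-pose features and seven region labels; the feature set, labelling rule and data are assumptions for illustration, not the authors’ implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 2000

# Synthetic stand-in for per-frame head-movement features (e.g. yaw, pitch and
# their displacements from a neutral pose); real features would come from the
# Kinect or Oculus Rift tracking data.
X = rng.normal(size=(n, 4))
# Seven gaze regions of the driving scene (labels 0..6), assigned here by a
# simple hypothetical rule on the first two features so the task is learnable.
y = (np.digitize(X[:, 0], np.linspace(-2, 2, 6)) + (X[:, 1] > 1).astype(int)) % 7

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "MLP": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name} gaze-region accuracy: {model.score(X_test, y_test):.3f}")
```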

