Static Rest Frame to Improve Postural Stability in Virtual and Augmented Reality

2021 ◽  
Vol 1 ◽  
Author(s):  
Sharif Mohammad Shahnewaz Ferdous ◽  
Tanvir Irfan Chowdhury ◽  
Imtiaz Muhammad Arafat ◽  
John Quarles

Many users have shown increased postural instability while using Head-Mounted Displays (HMDs), because HMDs block their real-world vision. People with balance impairments are especially affected, as they depend more on visual cues to maintain their balance. In addition, according to postural instability theory, balance is a good indicator of cybersickness. In this research, we investigated how additional visual cues can be used to improve postural stability. Through a user study in Virtual Reality (VR) and Augmented Reality (AR), we examined the effect of a Static Rest Frame (SRF) on postural stability in persons with balance impairments due to Multiple Sclerosis (MS). Results indicate that an SRF significantly improves postural stability in both VR and AR for users with MS. Based on these results, we propose guidelines for designing more accessible VR and AR systems for persons with balance impairments.
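Postural stability in studies like this one is typically quantified from center-of-pressure (CoP) excursions recorded on a force platform. As a minimal illustration (a generic sketch, not the authors' actual analysis pipeline), two common sway metrics can be computed from a list of CoP samples:

```python
import math

def sway_metrics(cop):
    """Two common postural-sway metrics from a list of (x, y)
    center-of-pressure samples. Larger values mean less stability."""
    xs = [p[0] for p in cop]
    ys = [p[1] for p in cop]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    # RMS distance of the CoP from its mean position
    rms = math.sqrt(sum((x - mx) ** 2 + (y - my) ** 2 for x, y in cop) / len(cop))
    # Total path length traced by the CoP over the trial
    path = sum(math.dist(a, b) for a, b in zip(cop, cop[1:]))
    return rms, path
```

A smaller RMS sway and shorter sway path under the SRF condition would be consistent with the improvement the study reports.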

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2234
Author(s):  
Sebastian Kapp ◽  
Michael Barz ◽  
Sergey Mukhametov ◽  
Daniel Sonntag ◽  
Jochen Kuhn

Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye-tracking research in, for example, the cognitive or educational sciences. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR, based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is at rest, which is on par with state-of-the-art mobile eye trackers.
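The accuracy and precision figures above follow standard definitions in eye-tracking work: accuracy is the mean angular offset between gaze and a known target, and precision is the sample-to-sample angular dispersion. A hedged sketch of those definitions (not the toolkit's actual evaluation code), assuming gaze samples and targets are given as 3D direction vectors:

```python
import math

def angle_deg(u, v):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp against rounding
    return math.degrees(math.acos(c))

def accuracy_deg(gaze_dirs, target_dir):
    # Accuracy: mean angular offset between gaze and the known target
    return sum(angle_deg(g, target_dir) for g in gaze_dirs) / len(gaze_dirs)

def precision_deg(gaze_dirs):
    # Precision: RMS of successive sample-to-sample angular deviations
    diffs = [angle_deg(a, b) for a, b in zip(gaze_dirs, gaze_dirs[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

With these definitions, a stable fixation on the target yields values near zero for both metrics.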


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Pei-Luen Patrick Rau ◽  
Jian Zheng ◽  
Zhi Guo

Purpose: This study aims to investigate "immersive reading," which occurs when individuals read text while in a virtual reality (VR) or augmented reality (AR) environment.

Design/methodology/approach: In Experiment 1, 64 participants read text passages and answered multiple-choice questions in VR and AR head-mounted displays (HMDs), compared with doing the same task on a liquid crystal display (LCD). In Experiment 2, 31 participants performed the same reading tasks but with two VR HMDs of different display quality.

Findings: Compared with reading on the LCD as the baseline, participants reading in VR and AR HMDs got 82% (VR) and 88% (AR) of the information accurately. Participants tended to respond more accurately and faster, though the differences were not statistically significant, with the VR HMD of higher pixel density in the speed-reading task.

Originality/value: The authors observed the speed and accuracy of reading in VR and AR environments, compared with the reading speed and accuracy on an LCD monitor. The authors also compared the reading performance on two VR HMDs that differed in display quality but were otherwise similar in every way.
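The two outcome measures in this study, reading speed and comprehension accuracy, reduce to simple ratios. A sketch with illustrative variable names (not the authors' analysis code):

```python
def reading_speed_wpm(words_read, seconds):
    """Reading speed in words per minute for one trial."""
    return words_read / (seconds / 60.0)

def relative_accuracy(cond_correct, cond_total, base_correct, base_total):
    """Comprehension accuracy of a display condition expressed as a
    fraction of the baseline condition (the study reports 0.82 for VR
    and 0.88 for AR relative to the LCD baseline)."""
    return (cond_correct / cond_total) / (base_correct / base_total)
```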


2019 ◽  
Vol 9 (23) ◽  
pp. 5123 ◽  
Author(s):  
Diego Vaquero-Melchor ◽  
Ana M. Bernardos

Nowadays, Augmented-Reality (AR) head-mounted displays (HMDs) deliver a more immersive visualization of virtual contents, but the available means of interaction, mainly based on gesture and/or voice, are still limited and lack the realism and expressivity of traditional physical means. In this sense, the integration of haptics within AR may help to deliver an enriched experience, while facilitating the performance of specific actions, such as repositioning or resizing tasks, that are still dependent on the user's skills. In this direction, this paper describes a flexible architecture designed to deploy haptically enabled AR applications for both mobile and wearable visualization devices. The haptic feedback may be generated through a variety of devices (e.g., wearable, graspable, or mid-air ones), and the architecture facilitates handling the specificity of each. For this reason, the paper discusses how to generate a haptic representation of a 3D digital object depending on the application and the target device. Additionally, the paper includes an analysis of practical, relevant issues that arise when setting up a system to work with specific devices like HMDs (e.g., HoloLens) and mid-air haptic devices (e.g., Ultrahaptics), such as the alignment between the real world and the virtual one. The architecture's applicability is demonstrated through the implementation of two applications: (a) Form Inspector and (b) Simon Game, built for HoloLens and iOS mobile phones for visualization and for UHK for mid-air haptics delivery. These applications have been used to explore with nine users the efficiency, meaningfulness, and usefulness of mid-air haptics for form perception, object resizing, and push interaction tasks.
Results show that, although mobile interaction is preferred when this option is available, haptics turn out to be more meaningful in identifying shapes than users initially expect, and contribute to the execution of resizing tasks. Moreover, this preliminary user study reveals some design issues when working with haptic AR. For example, users may expect a tailored interface metaphor, not necessarily inspired by natural interaction. This was the case for our proposed virtual pressable buttons, built to mimic real buttons by using haptics but interpreted differently by the study participants.
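The device-agnostic layer such an architecture needs can be sketched as a small abstraction: one interface that every haptic backend implements, so application code stays independent of the target device. All class and method names below are illustrative, not the paper's actual API:

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Device-agnostic interface; concrete backends (wearable, graspable,
    mid-air) each translate a 3D contact point into their own commands."""

    @abstractmethod
    def render_point(self, position, intensity):
        ...

class MidAirDevice(HapticDevice):
    """Stand-in for an ultrasound mid-air backend; a real one would
    drive the transducer array instead of recording commands."""

    def __init__(self):
        self.commands = []

    def render_point(self, position, intensity):
        self.commands.append(("focal_point", position, intensity))

def render_surface_contact(device, contact_points):
    # The architecture's job: map a 3D object's contact set onto
    # whatever the currently active device can express.
    for p in contact_points:
        device.render_point(p, intensity=1.0)
```

Swapping in a wearable or graspable backend then only requires another `HapticDevice` subclass, which is the kind of device-specificity handling the abstract describes.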


Author(s):  
Pierre-Yves Laffont ◽  
Ali Hasnain ◽  
Shukri B. Jalil ◽  
Kutluhan Buyukburc ◽  
Pierre-Yves Guillemet ◽  
...  

Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5765
Author(s):  
Soram Kim ◽  
Seungyun Lee ◽  
Hyunsuk Kang ◽  
Sion Kim ◽  
Minkyu Ahn

Since the emergence of head-mounted displays (HMDs), researchers have attempted to introduce virtual and augmented reality (VR, AR) into brain–computer interface (BCI) studies. However, there is a lack of studies that incorporate both AR and VR to compare performance in the two environments. Therefore, it is necessary to develop a BCI application that can be used in both VR and AR, allowing BCI performance to be compared in the two environments. In this study, we developed an open-source drone control application using a P300-based BCI, which can be used in both VR and AR. Twenty healthy subjects participated in the experiment with this application. They were asked to control the drone in both environments and filled out questionnaires before and after the experiment. We found no significant (p > 0.05) difference in online performance (classification accuracy and amplitude/latency of the P300 component) or user experience (satisfaction with time length, program, environment, interest, difficulty, immersion, and feeling of self-control) between VR and AR. This indicates that the P300 BCI paradigm is relatively reliable and may work well in various situations.
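A P300-based BCI like this one typically averages EEG epochs per stimulus and selects the stimulus that evokes the strongest event-related response. A deliberately crude sketch of that idea (the paper's actual online classifier is more sophisticated):

```python
def average_epochs(epochs):
    """Average a list of equal-length EEG epochs sample by sample."""
    n = len(epochs)
    return [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]

def detect_target(epochs_by_stimulus):
    """Pick the stimulus whose averaged epoch shows the largest positive
    peak, a crude stand-in for a P300 classifier: the attended stimulus
    should elicit the P300 component, non-targets should not."""
    best, best_peak = None, float("-inf")
    for stim, epochs in epochs_by_stimulus.items():
        peak = max(average_epochs(epochs))
        if peak > best_peak:
            best, best_peak = stim, peak
    return best
```

In a drone-control setting, each stimulus would correspond to a flight command, and the detected target drives the drone.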


10.29007/7jch ◽  
2019 ◽  
Author(s):  
James Stigall ◽  
Sharad Sharma

Building occupants must know how to properly exit a building should the need ever arise. Being aware of appropriate evacuation procedures eliminates (or reduces) the risk of injury and death during a catastrophe. Augmented reality (AR) is increasingly sought after as a teaching and training tool because it offers visualization and interaction capabilities that capture the learner's attention and enhance the learner's capacity to retain what was learned. Motivated by these capabilities and the need for emergency evacuation training, this paper explores a mobile AR application (MARA) constructed to help users evacuate a building in the event of an emergency such as a building fire, active shooter situation, earthquake, or similar circumstance. The MARA was built for Android-based devices using Unity and Vuforia. Its features include intelligent signs (i.e., visual cues to guide users to the exits) to help users evacuate a building. Inter alia, this paper discusses the MARA's implementation and its evaluation through a user study utilizing the Technology Acceptance Model (TAM) and the System Usability Scale (SUS) frameworks. The results demonstrate the participants' opinions that the MARA is both usable and effective in helping users evacuate a building.
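The SUS instrument mentioned above has a fixed, well-documented scoring rule: on each of its ten 1-to-5 items, odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the total is scaled by 2.5 onto a 0-100 range:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses,
    in questionnaire order (item 1 first)."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5
```

A score around 68 is conventionally treated as average usability, which gives context to SUS results like those reported for the MARA.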


2021 ◽  
Author(s):  
Trevor Nelson

Virtual Reality (VR) and Augmented Reality (AR) provide immersive experiences that are increasingly considered for implementation within theme parks. This paper seeks to determine the impact of virtual technologies on theme parks. The method for this paper involved interviews with industry-leading experts from the theme park industry. The interviews were structured to elicit more detailed information on how they are approaching VR/AR in theme park attractions. Theme parks need to provide guests with something they can't get at home. There are many challenges with head-mounted displays (HMDs) in theme parks; as a result, several participants pointed to Mixed Reality (MR) as a better current solution, since it mixes physical spaces with digital overlays using less complicated and less operationally challenging technology. New attractions using VR/AR/MR technologies need to carefully consider the content they will use, the mechanics of the experience, and the business case to ultimately achieve overall success.


Author(s):  
Chih-Hsing Chu ◽  
Yi-An Chen ◽  
Ying-Yin Huang ◽  
Yun-Ju Lee

Abstract Virtual try-on (VTO) technology in virtual reality (VR) and augmented reality (AR) has been developed for years to create novel shopping experiences by allowing users to virtually wear fashion products. Compared to garments or facial accessories, fewer studies have focused on virtual footwear try-on, whether in user studies or technical development. Thus, it is necessary to examine the effectiveness of existing VTO applications on the user's affective responses. In this study, we compared the user experience of three different footwear try-on methods (real, VR, and AR) with both physiological and psychological measures. Subjects conducted a try-on experiment on different pairs of sneakers. Each subject's gaze trajectory was recorded using an eye tracker and analyzed to show his/her visual attention in each method. Afterward, the subjects completed questionnaires assessing the sense of presence, usability, and user experience score for the try-on processes, and subsequently took part in a think-aloud procedure to express their thoughts. The analysis of the collected data showed that the user experience produced by the VR and AR try-on is not comparable to that of the real environment. The results also revealed factors that negatively affect the quality of the user's interaction with the processes. These findings may provide insights into further improvements in VTO technology.
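Gaze trajectories like those recorded here are commonly summarized as dwell time per area of interest (AOI), e.g. how long a subject looked at the shoe versus elsewhere. A minimal sketch (a generic reduction, not the authors' analysis), assuming timestamped 2D gaze samples and rectangular AOIs:

```python
def dwell_times(samples, aois):
    """Total dwell time per area of interest from timestamped gaze data.

    samples: list of (t, x, y) tuples in time order
    aois:    dict mapping an AOI name to an (xmin, ymin, xmax, ymax) box
    """
    totals = {name: 0.0 for name in aois}
    # Attribute each inter-sample interval to the AOI the gaze was in
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += t1 - t0
                break
    return totals
```

Comparing such per-AOI dwell times across the real, VR, and AR conditions is one straightforward way to show where visual attention differed between try-on methods.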

