Estimating Distances in Action Space in Augmented Reality

2021 ◽  
Vol 18 (2) ◽  
pp. 1-16
Author(s):  
Holly C. Gagnon ◽  
Carlos Salas Rosales ◽  
Ryan Mileris ◽  
Jeanine K. Stefanucci ◽  
Sarah H. Creem-Regehr ◽  
...  

Augmented reality (AR) is important for training complex tasks, such as navigation, assembly, and medical procedures. The effectiveness of such training may depend on accurate spatial localization of AR objects in the environment. This article presents two experiments that test egocentric distance perception in augmented reality within and at the boundaries of action space (up to 35 m), in comparison with distance perception in a matched real-world (RW) environment. Using the Microsoft HoloLens, in Experiment 1, participants in two different RW settings judged egocentric distances (ranging from 10 to 35 m) to an AR avatar or a real person using a visual matching measure. Distances to augmented targets were underestimated compared to real targets in the two indoor RW contexts. Experiment 2 aimed to generalize the results to an absolute distance measure using verbal reports in one of the indoor environments. Similar to Experiment 1, distances to augmented targets were underestimated compared to real targets. We discuss these findings with respect to the importance of methodologies that directly compare performance in real and mediated environments, as well as the inherent differences present in mediated environments that are “matched” to the real world.

2019 ◽  
Vol 11 (2) ◽  
pp. 1-18 ◽  
Author(s):  
Lee Lisle ◽  
Coleman Merenda ◽  
Kyle Tanous ◽  
Hyungil Kim ◽  
Joseph L. Gabbard ◽  
...  

Many driving scenarios involve correctly perceiving road elements in depth and manually responding as appropriate. Recently, augmented reality (AR) head-up displays (HUDs) have been explored to assist drivers in identifying road elements, using a variety of AR interface designs that include world-fixed graphics perceptually placed in the forward driving scene. Volumetric AR HUDs purportedly offer increased accuracy of distance perception through natural presentation of oculomotor cues as compared to traditional HUDs. In this article, the authors quantify participant performance in matching virtual objects to real-world counterparts at egocentric distances of 7-12 meters while using both volumetric and fixed focal-plane AR HUDs. The authors found the volumetric HUD to be associated with faster and more accurate depth judgements at far distances, and that participants performed depth judgements more quickly as the experiment progressed. The authors observed no differences between the two displays in terms of reported simulator sickness or eye strain.


Frameless ◽  
2019 ◽  
Vol 1 (1) ◽  
pp. 37-38
Author(s):  
Michael J. Murdoch ◽  
Nargess Hassani ◽  
Sara Leary ◽  
...  

This presentation will summarize recent work on the visual perception of color appearance and object properties in optical see-through (OST) augmented reality (AR) systems. OST systems, such as the Microsoft HoloLens, use a see-through display to superimpose virtual content onto a user’s view of the real world. With careful tracking of both display and world coordinates, synthetic objects can be added to the real world, and real objects can be manipulated via synthetic overlays. Ongoing research studies how combinations of real and virtual stimuli are perceived and how users’ visual adaptation is affected; two specific examples will be explained.
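One reason color appearance is hard to control in OST AR: the display can only add light to the scene, never subtract it, so the stimulus reaching the eye is roughly the sum of the real background and the rendered graphic. The sketch below illustrates this with a simplified additive linear-RGB model; the function name and gain parameter are illustrative assumptions, not part of the work summarized above.

```python
import numpy as np

def ost_blend(background_rgb, virtual_rgb, display_gain=1.0):
    """Approximate the stimulus reaching the eye through an OST display.

    An OST display adds light to the real-world background instead of
    occluding it. Inputs are linear RGB in [0, 1]; the purely additive
    model is a simplification.
    """
    blended = (np.asarray(background_rgb, dtype=float)
               + display_gain * np.asarray(virtual_rgb, dtype=float))
    return np.clip(blended, 0.0, 1.0)

# The same virtual red patch, seen against black vs. a bright gray wall:
print(ost_blend([0.0, 0.0, 0.0], [0.4, 0.05, 0.05]))  # intended appearance
print(ost_blend([0.6, 0.6, 0.6], [0.4, 0.05, 0.05]))  # lighter, desaturated
```

The second call shows why a virtual object viewed against a bright real surface appears lighter and less saturated than intended, which is exactly the class of appearance shift such perception studies measure.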


Information ◽  
2021 ◽  
Vol 12 (12) ◽  
pp. 519
Author(s):  
Inma García-Pereira ◽  
Pablo Casanova-Salas ◽  
Jesús Gimeno ◽  
Pedro Morillo ◽  
Dirk Reiners

Augmented Reality (AR) annotations are a powerful means of communication when collaborators cannot be present in a given environment at the same time. However, this situation presents several challenges: for example, how to record AR annotations for later consumption, how to align the virtual and real worlds in unprepared environments, and how to offer the annotations to users with different AR devices. In this paper we present a cross-device AR annotation method that allows users to create and display annotations asynchronously, in environments that require no prior preparation (AR markers, point cloud capture, etc.). This is achieved through a simple user-assisted calibration process and a data model that allows any type of annotation to be stored on any device. An experimental study carried out with 40 participants verified our two hypotheses: AR annotations can be visualized in indoor environments without prior preparation, regardless of the device used, and the overall usability of the system is satisfactory.
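By way of illustration only (the paper’s actual schema is not reproduced here), a device-agnostic annotation record of the kind the abstract describes might pair an arbitrary payload with a pose expressed relative to a user-calibrated anchor, so that any device that re-runs the calibration can place the annotation in the same real-world spot:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List
import time
import uuid

@dataclass
class ARAnnotation:
    """Hypothetical cross-device annotation record, not the authors' schema."""
    annotation_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    annotation_type: str = "text"  # e.g. "text", "image", "audio", "sketch"
    payload: Dict[str, Any] = field(default_factory=dict)
    # Pose is stored relative to a user-calibrated anchor rather than to
    # device coordinates, so annotations survive device changes.
    position: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])
    rotation: List[float] = field(
        default_factory=lambda: [0.0, 0.0, 0.0, 1.0])  # quaternion (x, y, z, w)
    author: str = ""
    created_at: float = field(default_factory=time.time)
```

Keeping the payload generic and the pose anchor-relative is what makes a record like this consumable by headsets, tablets, and phones alike.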


2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous emitting radiations. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on a real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found when viewing the virtual stimulus overlaid on the real world.
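The “degree of chromatic adaptation” finding can be made concrete with a von Kries-style transform, in which a factor D blends between no adaptation (D = 0) and complete adaptation to the ambient white (D = 1). The sketch below uses the standard CAT02 matrix from CIECAM02; it is a generic illustration of the mechanism, not the model fitted in this study, and the example tristimulus values are assumptions.

```python
import numpy as np

# CAT02 matrix from CIECAM02: maps XYZ to sharpened cone-like responses.
M_CAT02 = np.array([[ 0.7328,  0.4296, -0.1624],
                    [-0.7036,  1.6975,  0.0061],
                    [ 0.0030,  0.0136,  0.9834]])

def adapt(xyz, white_src, white_dst, D=1.0):
    """Von Kries-style chromatic adaptation with degree of adaptation D.

    D = 1.0 models complete adaptation to the ambient white; D = 0.0
    models no adaptation. A lower fitted D, as reported above for AR
    stimuli, means colors shift less toward the adapting white.
    """
    rgb     = M_CAT02 @ np.asarray(xyz, dtype=float)
    rgb_src = M_CAT02 @ np.asarray(white_src, dtype=float)
    rgb_dst = M_CAT02 @ np.asarray(white_dst, dtype=float)
    # Blend the full von Kries gains toward unity according to D.
    gain = D * (rgb_dst / rgb_src) + (1.0 - D)
    return np.linalg.inv(M_CAT02) @ (gain * rgb)

# Example: a stimulus seen under a warm (illuminant A) white, partially
# adapted (D = 0.6) toward D65.
print(adapt([40.0, 42.0, 30.0],
            white_src=[109.85, 100.0, 35.58],   # illuminant A white point
            white_dst=[95.05, 100.0, 108.88],   # D65 white point
            D=0.6))
```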


2018 ◽  
Author(s):  
Kyle Plunkett

This manuscript provides two demonstrations of how Augmented Reality (AR), which is the projection of virtual information onto a real-world object, can be applied in the classroom and in the laboratory. Using only a smartphone and the free HP Reveal app, content-rich AR notecards were prepared. The physical notecards are based on Organic Chemistry I reactions and show only a reagent and substrate. Upon interacting with the HP Reveal app, an AR video projection shows the product of the reaction as well as a real-time, hand-drawn, curved-arrow mechanism of how the product is formed. Thirty AR notecards based on common Organic Chemistry I reactions and mechanisms are provided in the Supporting Information and are available for widespread use. In addition, the HP Reveal app was used to create AR video projections onto laboratory instrumentation so that a virtual expert can guide the user during equipment setup and operation.


10.28945/2207 ◽  
2015 ◽  
Vol 10 ◽  
pp. 021-035 ◽  
Author(s):  
Yan Lu ◽  
Joseph T. Chao ◽  
Kevin R. Parker

This project takes a creative approach to the familiar scavenger hunt game. It involved the implementation of an iPhone application, HUNT, with Augmented Reality (AR) capability that lets users play the game, as well as an administrative website that game organizers can use to create games and make them available to players. Using the HUNT mobile app, users first select from a list of games and are then shown a list of objects that they must seek. Once the user finds a candidate object and scans it with the smartphone’s built-in camera, the application attempts to verify that it is the correct object and then displays the associated multimedia AR content, which may include images and videos overlaid on top of real-world views. HUNT not only provides entertaining activities within an environment that players can explore, but its AR content can also serve as an educational tool. The project is designed to increase user involvement by using a familiar and enjoyable game as a basis and adding an educational dimension by incorporating AR technology and engaging, interactive multimedia to provide users with facts about the objects that they have located.
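As a rough sketch of the scan-and-verify loop the abstract describes (class and method names are hypothetical, and the image-recognition backend is left abstract rather than tied to the app’s actual implementation):

```python
class HuntGame:
    """Illustrative sketch of the HUNT scan-and-verify loop; names are
    hypothetical, not the app's actual API."""

    def __init__(self, target_objects, recognizer, min_confidence=0.8):
        self.remaining = set(target_objects)   # objects still to be found
        self.recognizer = recognizer           # any image-recognition backend
        self.min_confidence = min_confidence

    def scan(self, camera_frame):
        """Check a scanned frame against the remaining targets; return
        the AR content to overlay on the camera view, or None."""
        label, confidence = self.recognizer.classify(camera_frame)
        if label in self.remaining and confidence >= self.min_confidence:
            self.remaining.discard(label)
            # Stub lookup: the real app would fetch the object's images,
            # videos, and educational facts for the AR overlay.
            return {"object": label, "media": f"content/{label}.mp4"}
        return None
```

The confidence threshold captures the “attempts to verify” step: a scan of the wrong object, or a poor-quality scan of the right one, yields no overlay.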


Author(s):  
Kaori Kashimura ◽  
Takafumi Kawasaki Jr. ◽  
Nozomi Ikeya ◽  
Dave Randall

This chapter provides an ethnography of a complex scenario involving the construction of a power plant and, in so doing, tries to show the importance of a practice-based approach to the problem of technical and organizational change. The chapter reports on fieldwork conducted in a highly complex and tightly coupled environment: power plant construction. The ethnography describes work practices at three different sites, analysing their interlocking dependencies and showing the difficulties encountered at each location and the way in which the resulting delays cascade through the different sites. It goes on to describe some technological solutions, associated with augmented reality, that are being designed in response to the insights gained from the fieldwork. The chapter also reflects more generally on the relationship between fieldwork and design in real-world contexts.


Author(s):  
Christen E. Sushereba ◽  
Laura G. Militello

In this session, we will demonstrate the Virtual Patient Immersive Trainer (VPIT). The VPIT system uses augmented reality (AR) to allow medics and medical students to experience a photorealistic, life-sized virtual patient. The VPIT supports learners in acquiring the perceptual skills required to recognize and interpret the subtle perceptual cues critical to assessing a patient’s condition. We will conduct an interactive demonstration of the virtual patient using both a tablet (for group interaction) and an AR-enabled headset, the Microsoft HoloLens (for individual interaction). In addition, we will demonstrate use of the instructor tablet to control what the learner sees (e.g., injury types, severity of injury) and to monitor student performance.

