A Wearable Augmented Reality Navigation System for Surgical Telementoring Based on Microsoft HoloLens

2020, Vol 49 (1), pp. 287-298
Author(s): Peng Liu, Chenmeng Li, Changlin Xiao, Zeshu Zhang, Junqi Ma, ...

Electronics, 2019, Vol 8 (10), pp. 1178
Author(s): Sara Condino, Giuseppe Turini, Rosanna Viglialoro, Marco Gesi, Vincenzo Ferrari

Augmented reality (AR) technology is gaining popularity and scholarly interest in the rehabilitation sector because it can generate controlled, user-specific environmental and perceptual stimuli that motivate the patient, while preserving the ability to interact with the real environment and with other people, including the rehabilitation specialist. This paper presents the first wearable AR application for shoulder rehabilitation, based on Microsoft HoloLens, with real-time markerless tracking of the user’s hand. The potential and current limitations of commercial head-mounted displays (HMDs) are described for the target medical field, and details of the proposed application are reported. A serious game was designed starting from the analysis of a traditional rehabilitation exercise, taking HoloLens specifications into account to maximize user comfort during the AR rehabilitation session. The implemented AR application consistently meets the recommended target frame rate for immersive applications on the HoloLens: 60 fps. Moreover, the ergonomics and motivational value of the proposed application were positively evaluated by a group of five rehabilitation specialists and 20 healthy subjects. Although a larger study including real patients is necessary for clinical validation, the results obtained encourage further investigation and the integration of additional technical features into the proposed AR application.
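
As a rough worked example of the frame-rate target mentioned above (an illustration, not part of the published work), 60 fps corresponds to a per-frame budget of 1000/60 ≈ 16.7 ms; a minimal Python sketch for checking a logged frame-time trace against that budget:

```python
# Minimal sketch (not from the paper): checking a logged frame-time trace
# against the 60 fps budget recommended for immersive HoloLens applications.

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~16.7 ms per frame

def meets_target(frame_times_ms, tolerance=0.05):
    """True if at most `tolerance` of frames exceed the per-frame budget."""
    over_budget = sum(1 for t in frame_times_ms if t > FRAME_BUDGET_MS)
    return over_budget / len(frame_times_ms) <= tolerance

# Hypothetical frame-time log (milliseconds) from one rehabilitation session.
log = [15.9, 16.2, 16.4, 16.1, 16.0, 15.8]
print(meets_target(log))  # True: every frame is under ~16.7 ms
```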


2011, Vol 131 (7), pp. 897-906
Author(s): Kengo Akaho, Takashi Nakagawa, Yoshihisa Yamaguchi, Katsuya Kawai, Hirokazu Kato, ...

Author(s): Christen E. Sushereba, Laura G. Militello

In this session, we will demonstrate the Virtual Patient Immersive Trainer (VPIT). The VPIT system uses augmented reality (AR) to allow medics and medical students to experience a photorealistic, life-sized virtual patient. The VPIT supports learners in acquiring the perceptual skills required to recognize and interpret the subtle cues critical to assessing a patient’s condition. We will conduct an interactive demonstration of the virtual patient using both a tablet (for group interaction) and an AR-enabled headset, the Microsoft HoloLens (for individual interaction). In addition, we will demonstrate the use of the instructor tablet to control what the learner sees (e.g., injury type and severity) and to monitor student performance.
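
To make the instructor-tablet-to-headset control concrete, here is a purely hypothetical sketch; the message fields, values, and names below are assumptions for illustration, not the actual VPIT protocol or API:

```python
# Purely hypothetical sketch of an instructor-to-headset control message for
# a VPIT-style setup; field names and values are assumptions, not the actual
# VPIT protocol or API.
import json
from dataclasses import dataclass, asdict

@dataclass
class ScenarioUpdate:
    injury_type: str   # e.g., "penetrating_trauma" (illustrative value)
    severity: str      # e.g., "moderate"
    show_vitals: bool  # whether vital-sign cues are shown to the learner

def to_message(update: ScenarioUpdate) -> str:
    """Serialize an update so a headset application could apply it."""
    return json.dumps(asdict(update))

print(to_message(ScenarioUpdate("penetrating_trauma", "moderate", True)))
```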


Sensors, 2021, Vol 21 (6), pp. 2234
Author(s): Sebastian Kapp, Michael Barz, Sergey Mukhametov, Daniel Sonntag, Jochen Kuhn

Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases for these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye-tracking research in, for example, the cognitive and educational sciences. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye-tracking toolkit for reliable gaze data acquisition in AR, based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is at rest, which is on par with state-of-the-art mobile eye trackers.
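
To illustrate how such accuracy and precision figures are typically computed from raw gaze samples (this sketch is not the authors' Unity toolkit or R package; its definitions, mean angular offset for accuracy and sample-to-sample RMS for precision, are common conventions that may differ from those in the paper):

```python
# Illustrative computation of angular accuracy and precision from gaze data;
# not the authors' toolkit. Accuracy = mean angular offset from the target,
# precision = RMS of sample-to-sample angular distances.
import numpy as np

def angle_deg(a, b):
    """Angle in degrees between two 3D direction vectors."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))

def accuracy(gaze_dirs, target_dir):
    """Mean angular offset between gaze samples and the fixation target."""
    return float(np.mean([angle_deg(g, target_dir) for g in gaze_dirs]))

def precision_rms(gaze_dirs):
    """RMS of angular distances between successive gaze samples."""
    steps = [angle_deg(gaze_dirs[i], gaze_dirs[i + 1])
             for i in range(len(gaze_dirs) - 1)]
    return float(np.sqrt(np.mean(np.square(steps))))

# Hypothetical gaze direction samples for one fixation target straight ahead.
gaze = np.array([[0.010, 0.002, 1.0], [0.012, 0.001, 1.0], [0.009, 0.003, 1.0]])
target = np.array([0.0, 0.0, 1.0])
print(accuracy(gaze, target), precision_rms(gaze))
```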


2020
Author(s): Faiella Eliodoro, Pacella Giuseppina, Altomare Carlo, Andresciani Flavio, Zobel Beomonte Bruno, ...

2021, Vol 11 (5), pp. 2315
Author(s): Yu-Cheng Lo, Guan-An Chen, Yin Chun Liu, Yuan-Hou Chen, Jui-Ting Hsu, ...

To improve the accuracy of bracket placement in vivo, a protocol and device were introduced, consisting of operative procedures for accurate control, a computer-aided design, and an augmented reality-assisted bracket navigation system. The present study evaluated the accuracy of this protocol. Methods: Thirty-one incisor teeth from four participants were tested. The teeth were bonded by a novice and an expert orthodontist. In the control group, brackets were positioned with a Boone gauge; in the experimental group, they were positioned with the augmented reality-assisted bracket navigation system. To evaluate accuracy, the deviations of the bracket placement positions were measured. Results: The augmented reality-assisted bracket navigation system and the control technique were applied to the same 31 teeth. The order of bonding (control or experimental group first) was decided by coin toss; the teeth were then debonded and the other technique was used. The median vertical (incisogingival) position deviation in the control and AR groups was 0.90 ± 0.06 mm and 0.51 ± 0.24 mm, respectively, for the novice orthodontist (p < 0.05), and 0.40 ± 0.29 mm and 0.29 ± 0.08 mm, respectively, for the expert orthodontist (p < 0.05). No significant differences in horizontal position deviation were noted regardless of orthodontist experience or use of the augmented reality-assisted bracket navigation system. Conclusion: The augmented reality-assisted bracket navigation system increased the placement accuracy of the expert orthodontist in the incisogingival direction and helped the novice orthodontist guide the bracket position to within an acceptable clinical error of approximately 0.5 mm.
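
As an illustration of how the reported deviations and paired comparisons might be computed (a sketch with assumed data, not the authors' analysis; a Wilcoxon signed-rank test is chosen here because medians are reported, though the paper may have used a different test):

```python
# Sketch of the deviation analysis with hypothetical numbers (not the
# authors' data or code): paired comparison of vertical placement errors for
# the same teeth bonded with the Boone gauge vs. the AR-assisted system.
import numpy as np
from scipy import stats

def vertical_deviation(placed_y, planned_y):
    """Absolute incisogingival deviation (mm) for each bracket."""
    return np.abs(np.asarray(placed_y) - np.asarray(planned_y))

# Hypothetical incisogingival positions (mm) for five teeth.
planned   = [4.00, 4.00, 4.50, 4.00, 4.50]
control   = [4.90, 4.85, 5.30, 4.92, 5.38]   # bonded with the Boone gauge
ar_guided = [4.50, 4.42, 5.02, 4.55, 4.96]   # bonded with the AR system

dev_control = vertical_deviation(control, planned)
dev_ar = vertical_deviation(ar_guided, planned)

# Paired, non-parametric comparison, since medians are reported in the paper.
stat, p = stats.wilcoxon(dev_control, dev_ar)
print(np.median(dev_control), np.median(dev_ar), p)
```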


2021, Vol 18 (2), pp. 1-16
Author(s): Holly C. Gagnon, Carlos Salas Rosales, Ryan Mileris, Jeanine K. Stefanucci, Sarah H. Creem-Regehr, ...

Augmented reality (AR) is important for training complex tasks, such as navigation, assembly, and medical procedures. The effectiveness of such training may depend on accurate spatial localization of AR objects in the environment. This article presents two experiments that test egocentric distance perception in augmented reality within and at the boundaries of action space (up to 35 m), in comparison with distance perception in a matched real-world (RW) environment. In Experiment 1, using the Microsoft HoloLens, participants in two different RW settings judged egocentric distances (ranging from 10 to 35 m) to an AR avatar or a real person using a visual matching measure. Distances to augmented targets were underestimated compared to real targets in both indoor RW contexts. Experiment 2 aimed to generalize the results to an absolute distance measure using verbal reports in one of the indoor environments. As in Experiment 1, distances to augmented targets were underestimated compared to real targets. We discuss these findings with respect to the importance of methodologies that directly compare performance in real and mediated environments, as well as the inherent differences present in mediated environments that are “matched” to the real world.
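
As a simple illustration of how underestimation is often quantified in matching studies of this kind (a sketch with hypothetical numbers, not the authors' analysis), judged distances can be expressed as a ratio of the actual distance and compared between AR and RW targets:

```python
# Sketch with hypothetical numbers (not the authors' data): quantifying
# distance underestimation as the mean judged/actual ratio for AR vs. real
# targets; values below 1.0 indicate underestimation.
import numpy as np

def judgment_ratio(judged_m, actual_m):
    """Mean judged/actual distance ratio across target distances."""
    return float(np.mean(np.asarray(judged_m) / np.asarray(actual_m)))

# Hypothetical matching judgments (metres) at three target distances.
actual      = [10.0, 20.0, 35.0]
judged_ar   = [8.5, 16.8, 28.9]   # AR avatar targets
judged_real = [9.7, 19.4, 33.8]   # real-person targets

print(judgment_ratio(judged_ar, actual))    # ~0.84 -> underestimated
print(judgment_ratio(judged_real, actual))  # ~0.97 -> near veridical
```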

