Cloud rendering-based volumetric video streaming system for mixed reality services

Author(s):  
Serhan Gül ◽  
Dimitri Podborski ◽  
Jangwoo Son ◽  
Gurdeep Singh Bhullar ◽  
Thomas Buchholz ◽  
...  
2022 ◽  
Author(s):  
Daniar Estu Widiyanti ◽  
Krisma Asmoro ◽  
Soo Young Shin

A ground control station (GCS) is a system for controlling and monitoring an unmanned aerial vehicle (UAV). In current GCSs, the devices involved constitute a complex operating environment. This paper proposes video streaming and speech command control to support a mixed reality-based UAV GCS using the Microsoft HoloLens. Video streaming transmits the raw UAV camera view to the HoloLens, while the HoloLens steers the UAV based on the displayed UAV field of view (FoV). UAV speech control from the HoloLens was successfully implemented using the HoloLens Mixed Reality Toolkit (MRTK) speech input. Finally, experimental results for the video streaming and speech commands, covering throughput, round-trip time, latency, and speech accuracy tests, are discussed to demonstrate the feasibility of the proposed scheme.
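
The abstract reports throughput, round-trip time, and latency measurements without implementation detail. As a minimal sketch of how such link metrics can be gathered, assuming a UDP video/command link and an echo responder on the UAV side (the address, port, and probe protocol below are hypothetical):

```python
import socket
import time

# Hypothetical endpoint for the UAV link; an echo responder is assumed there.
UAV_ADDR = ("192.168.0.10", 5000)
PROBE = b"ping"

def measure_rtt(sock: socket.socket, trials: int = 10) -> float:
    """Average round-trip time (s) of small probe packets."""
    rtts = []
    for _ in range(trials):
        start = time.perf_counter()
        sock.sendto(PROBE, UAV_ADDR)
        try:
            sock.recvfrom(64)  # echo expected back from the UAV side
        except socket.timeout:
            continue           # lost probe: skip it
        rtts.append(time.perf_counter() - start)
    return sum(rtts) / len(rtts) if rtts else float("inf")

def measure_throughput(sock: socket.socket, duration: float = 5.0) -> float:
    """Received bytes per second over a fixed window, in Mbit/s."""
    received = 0
    end = time.perf_counter() + duration
    while time.perf_counter() < end:
        try:
            data, _ = sock.recvfrom(65535)  # one raw video datagram
        except socket.timeout:
            break
        received += len(data)
    return received * 8 / duration / 1e6

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(1.0)
        print(f"RTT: {measure_rtt(s) * 1e3:.1f} ms")
        print(f"Throughput: {measure_throughput(s):.2f} Mbit/s")
```

Speech accuracy, by contrast, is typically scored offline as the fraction of MRTK-recognized commands that match the intended command.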


Author(s):  
Jacqueline A. Towson ◽  
Matthew S. Taylor ◽  
Diana L. Abarca ◽  
Claire Donehower Paul ◽  
Faith Ezekiel-Wilder

Purpose Communication between allied health professionals, teachers, and family members is a critical skill when addressing and providing for the individual needs of patients. Graduate students in speech-language pathology programs often have limited opportunities to practice these skills prior to or during externship placements. The purpose of this study was to investigate a mixed reality simulator as a viable option for speech-language pathology graduate students to practice interprofessional communication (IPC) skills when delivering diagnostic information to different stakeholders, compared to traditional role-play scenarios. Method Eighty graduate students (N = 80) completing their third semester in one speech-language pathology program were randomly assigned to one of four conditions: mixed reality simulation with or without coaching, or role play with or without coaching. Data were collected on students' self-efficacy, IPC skills pre- and postintervention, and perceptions of the intervention. Results The students in the two coaching groups scored significantly higher than the students in the noncoaching groups on observed IPC skills. There were no significant differences in students' self-efficacy. Students' responses on social validity measures showed that both interventions, including coaching, were acceptable and feasible. Conclusions Findings indicated that coaching paired with either mixed reality simulation or role play is a viable method to target improvement of IPC skills for graduate students in speech-language pathology. These findings are particularly relevant given the recent approval for students to obtain clinical hours in simulated environments.
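
The study's 2 × 2 design (intervention type × coaching) maps onto standard group comparisons. A sketch in Python with entirely hypothetical scores (the study's raw data are not reproduced here), illustrating only the coaching main effect reported above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical IPC rubric scores, for illustration only: four cells of the
# 2x2 design (intervention x coaching), n = 20 per cell so that N = 80.
sim_coach     = rng.normal(85, 5, 20)   # mixed reality simulation + coaching
sim_no_coach  = rng.normal(78, 5, 20)   # mixed reality simulation alone
role_coach    = rng.normal(84, 5, 20)   # role play + coaching
role_no_coach = rng.normal(77, 5, 20)   # role play alone

# Main effect of coaching, collapsing across intervention type
coached   = np.concatenate([sim_coach, role_coach])
uncoached = np.concatenate([sim_no_coach, role_no_coach])
res = stats.ttest_ind(coached, uncoached)
print(f"coaching main effect: t = {res.statistic:.2f}, p = {res.pvalue:.4f}")
```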


2012 ◽  
Vol 2 (2) ◽  
pp. 134-136
Author(s):  
Ch. Divya ◽  
Dr. P. Govardhan

2009 ◽  
Vol E92-B (12) ◽  
pp. 3893-3902
Author(s):  
Hyeong-Min NAM ◽  
Chun-Su PARK ◽  
Seung-Won JUNG ◽  
Sung-Jea KO

2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous stimuli emitting radiation. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on the real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found when viewing the virtual stimulus overlaid on the real world.
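
Both quantities manipulated and reported here have standard closed-form estimates; the following Python sketch uses McCamy's CCT approximation and the CIECAM02 degree-of-adaptation formula, which are standard colorimetric tools rather than the authors' own model:

```python
import math

def mccamy_cct(x: float, y: float) -> float:
    """Correlated color temperature (K) from CIE 1931 (x, y) chromaticity,
    via McCamy's cubic approximation."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

def degree_of_adaptation(la: float, f: float = 1.0) -> float:
    """CIECAM02 degree of adaptation D for adapting luminance la (cd/m^2);
    f = 1.0 corresponds to an average surround."""
    return f * (1.0 - (1.0 / 3.6) * math.exp(-(la + 42.0) / 92.0))

# Sanity check: the D65 white point should come out near 6504 K.
print(f"CCT(D65) ~ {mccamy_cct(0.3127, 0.3290):.0f} K")

# D approaches 1 (complete adaptation) as the adapting field gets brighter.
for la in (10, 100, 1000):
    print(f"L_A = {la:4d} cd/m^2 -> D = {degree_of_adaptation(la):.3f}")
```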


2017 ◽  
Author(s):  
Dirk Schart ◽  
Nathaly Tschanz

Author(s):  
S Leinster-Evans ◽  
J Newell ◽  
S Luck

This paper looks to expand on the INEC 2016 paper 'The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers', presented by Ross Basketter, Craig Birchmore and Abbi Fisher from BAE Systems in May 2016, and the EAAW VII paper 'Testing the boundaries of virtual reality within ship support', presented by John Newell from BAE Systems and Simon Luck from BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC aircraft carriers in Portsmouth, and this work was presented at EAAW VII. Since then the work has been extended to demonstrate the art of the possible on Type 26. This latter piece of work is designed to explore the role of 3D immersive environments in the development and fielding of support and training solutions across the range of support disciplines. The combined team is looking at how this digital thread leads from the design of platforms, both surface and subsurface, through build and into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (spatial acceptance, HazID, etc.) through to operational support and maintenance (combining big data coming off the ship with digital tech docs for maintenance procedures), using constantly developing technologies such as 3D, virtual reality, augmented reality and mixed reality. The drive towards gamification in the training environment, to keep younger recruits interested and to shorten course lengths, is also explored. The paper develops the options and looks at how this technology can be used and where the value proposition lies.

