A High Fidelity Mixed Reality System for Remote Collaboration

2021
Author(s): Stephen Thompson

This thesis presents a novel system for enabling remote collaboration within a mixed reality environment. With the growing availability of virtual and augmented reality headsets, there has been increasing interest in improving remote collaboration. Systems have been proposed that use 3D geometry or 360° video to provide remotely collaborating users with a view of the local, real-world environment. However, many systems offer limited interaction in the local environment and couple the views of all users rather than simulating face-to-face interaction, or place the remote user in a virtual environment, losing visual realism. The presented system enables a user in a remote location to join a local user and collaborate on a task. Video from an omni-directional camera is streamed to the remote user in real time to provide a live view of the local space. The 360° video is also used to provide believable lighting when compositing virtual objects into the real-world view. Remote users are displayed to local users as an abstracted avatar that conveys basic body gestures and social presence. Voice chat is provided for verbal communication. The system has been evaluated for technical performance and user experience. The evaluation found that the performance of the system was suitable for real-time collaboration. Remote and local users reported similar satisfaction with the system, experiencing high levels of presence, social presence and telepresence. Shared cinematic viewing and remote presentations are suggested as possible applications to guide further development of the system.
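The abstract leaves the lighting pipeline unspecified. As a minimal sketch of one way a live 360° frame can drive believable lighting for composited virtual objects, the snippet below estimates a solid-angle-weighted ambient colour from an equirectangular frame; the function name and the linear-RGB assumption are illustrative, not taken from the thesis.

```python
import numpy as np

def ambient_from_equirect(frame: np.ndarray) -> np.ndarray:
    """Estimate an ambient light colour from an equirectangular 360 frame.

    Rows of an equirectangular image cover different solid angles, so each
    row is weighted by cos(latitude) before averaging.

    frame: H x W x 3 float array, assumed linear RGB in [0, 1].
    Returns a 3-vector ambient RGB colour.
    """
    h, _, _ = frame.shape
    # Latitude of each pixel-row centre: +pi/2 at the top, -pi/2 at the bottom.
    lat = np.pi * (0.5 - (np.arange(h) + 0.5) / h)
    weights = np.cos(lat)                 # solid-angle weight per row
    row_mean = frame.mean(axis=1)         # H x 3, average over longitude
    return (row_mean * weights[:, None]).sum(axis=0) / weights.sum()

# Hypothetical per-frame update in a compositing loop:
# frame = next_equirect_frame()                      # decoded 360 frame
# scene.ambient_light.color = ambient_from_equirect(frame)
```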


2021
Author(s): Hye Jin Kim

Telepresence systems enable people to feel present in a remote space while their bodies remain in their local space. To enhance telepresence, the remote environment needs to be captured and visualised in an immersive way. For instance, 360-degree videos (360-videos) shown on head-mounted displays (HMDs) provide high-fidelity telepresence in a remote place. Mixed reality (MR) in 360-videos enables interaction with virtual objects blended into the captured remote environment, but it allows telepresence only for a single user wearing an HMD. It is therefore limited when multiple users want to experience telepresence together and collaborate naturally within a teleported space.

This thesis presents TeleGate, a novel multi-user teleportation platform for remote collaboration in an MR space. TeleGate provides "semi-teleportation" into the MR space using large-scale displays, acting as a bridge between the local physical communication space and the remote collaboration space created by MR with captured 360-videos. Our proposed platform enables multi-user semi-teleportation to perform collaborative tasks in the remote MR collaboration (MRC) space while allowing natural communication between collaborators in the same local physical space.

We implemented a working prototype of TeleGate and conducted a user study to evaluate our concept of semi-teleportation. We measured spatial presence and social presence while participants performed remote collaborative tasks in the MRC space. We also explored different control mechanisms within the platform in the remote MR collaboration scenario.

In conclusion, TeleGate enabled multiple co-located users to semi-teleport together using large-scale displays for remote collaboration in MR 360-videos.
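TeleGate's rendering pipeline is not detailed in the abstract. As a hedged sketch, assuming the 360-videos are stored as equirectangular frames, the snippet below extracts a perspective view for a flat large-scale display given a view direction; all names and conventions are illustrative.

```python
import numpy as np

def equirect_to_perspective(frame, yaw, pitch, fov_deg, out_w, out_h):
    """Sample a perspective view (e.g. for a flat large-scale display)
    out of an equirectangular 360-video frame.

    frame: H x W x 3 image array; yaw/pitch in radians; fov_deg: horizontal FOV.
    """
    H, W, _ = frame.shape
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)   # focal length in pixels
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    # Ray directions in camera space (x right, y down, z forward).
    dirs = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate by pitch (about x), then yaw (about y).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = dirs @ (Ry @ Rx).T
    # Direction -> equirectangular pixel coordinates.
    lon = np.arctan2(d[..., 0], d[..., 2])              # [-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))          # [-pi/2, pi/2]
    u = ((lon / np.pi + 1) / 2 * (W - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (H - 1)).astype(int)
    return frame[v, u]
```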


Sensors
2021
Vol. 21 (4), pp. 1123
Author(s): David Jurado, Juan M. Jurado, Lidia Ortega, Francisco R. Feito

Mixed reality (MR) enables a novel way to visualize virtual objects in real scenarios while respecting physical constraints. The technology has emerged alongside other significant advances in sensor fusion for human-centric 3D capture. Recent advances in scanning the user's environment, real-time visualization and 3D vision using ubiquitous systems like smartphones allow us to capture 3D data from the real world. In this paper, a disruptive application for assessing the status of indoor infrastructure is proposed. The installation and maintenance of hidden facilities such as water pipes, electrical lines and air-conditioning ducts, which are usually occluded behind walls, are tedious and inefficient tasks. Most of these infrastructures are digitized, but they cannot be visualized on site. In this research, we focused on the development of a new application (GEUINF) to be launched on smartphones that are capable of capturing 3D data of the real world by depth sensing. This information is used to determine the user's position and orientation. Although previous approaches used fixed markers for this purpose, our application estimates both parameters with centimeter accuracy without them. This is possible because our method matches reconstructed walls of the real world against 3D planes of the replicated world in a virtual environment. Our markerless approach scans planar surfaces of the user's environment and then geometrically aligns them with their corresponding virtual 3D entities. In a preprocessing phase, the 2D CAD geometry available from an architectural project is used to generate 3D models of the indoor building structure. At run time, these virtual elements are tracked against the real ones detected using the ARCore library. Once the alignment between the virtual and real worlds is done, the application enables visualization, navigation and interaction with the virtual facility networks in real time. Thus, our method may be used by private companies and public institutions responsible for indoor facility management, and it may also be integrated with other applications focused on indoor navigation.
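The paper's exact matching procedure is not reproduced here. As an illustrative sketch, given point correspondences between detected wall footprints and their CAD counterparts on the floor plan, the rigid 2D alignment can be estimated with the standard Kabsch/Procrustes method; the function name and data layout are assumptions.

```python
import numpy as np

def align_walls_2d(detected: np.ndarray, cad: np.ndarray):
    """Estimate the rigid 2D transform (rotation R, translation t) that best
    maps detected wall points onto corresponding CAD wall points, so that
    cad ~ R @ detected + t (least squares, Kabsch/Procrustes).

    detected, cad: N x 2 arrays of corresponding floor-plan points.
    """
    mu_d, mu_c = detected.mean(axis=0), cad.mean(axis=0)
    D, C = detected - mu_d, cad - mu_c          # centred point sets
    U, _, Vt = np.linalg.svd(D.T @ C)           # cross-covariance SVD
    S = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ S @ U.T
    t = mu_c - R @ mu_d
    return R, t
```

Applying the resulting transform to the AR session pose would register the CAD-derived facility models with the scanned walls.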


Author(s): Eduardo Veras, Karan Khokar, Kathryn De Laurentis, Rajiv Dubey

In this paper we describe the implementation of a system that gives assistive force feedback to the remote user in a teleoperation-based environment. The force feedback helps the user in trajectory-following exercises for stroke rehabilitation. Users can perform exercises in virtual environments on a PC as well as real-world exercises with a remote robotic arm that follows a trajectory in real time as the user moves the master device. Augmenting such real-world exercises with real-time force feedback can make them more effective, as the user receives force assistance along the desired path. Moreover, the system can be applied to remote therapy, where the therapist is away from the user, as it can be teleoperated and uses internet-based protocols. The assistive force feedback has been implemented using simple sensors, such as a camera and a laser, and a PC-based real-time multithreaded control system. Real-time force feedback from the remote robot to the master device is made possible by effective multithreaded programming strategies in the control system design and novel sensor integration. The system also supports autonomous as well as supervisory control of the remote robot, and it is modular with respect to the integration of different master devices for stroke rehabilitation exercises.
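The control law is not given in the abstract. A minimal sketch of one common form of assistive feedback, a virtual spring pulling the master device toward the nearest point of the desired trajectory, is shown below; the stiffness value and the device API in the usage comment are hypothetical.

```python
import numpy as np

def assistive_force(pos: np.ndarray, path: np.ndarray, k: float = 200.0):
    """Spring-like assistive force pulling the master device toward the
    nearest point on the desired trajectory.

    pos:  current 3D position of the device end point.
    path: N x 3 array of densely sampled waypoints of the desired trajectory.
    k:    virtual spring stiffness (N/m); the value here is illustrative only.
    """
    # Nearest waypoint on the desired path.
    dists = np.linalg.norm(path - pos, axis=1)
    nearest = path[np.argmin(dists)]
    # Force proportional to the deviation, directed back toward the path.
    return k * (nearest - pos)

# Hypothetical 1 kHz haptic loop:
# while running:
#     pos = master.read_position()
#     master.apply_force(assistive_force(pos, desired_path))
```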


2019
Vol. 40 (Supplement_1)
Author(s): J. D. Kasprzak, M. Kierepka, J. Z. Peruga, D. Dudek, B. Machura, et al.

Abstract
Background: Three-dimensional (3D) echocardiographic data acquired from the transesophageal (TEE) window are commonly used in planning and during percutaneous structural cardiac interventions (PSCI).
Purpose: We hypothesized that an innovative, interactive mixed reality display can be integrated into the procedural PSCI workflow to improve perception and interpretation of 3D data representing cardiac anatomy.
Methods: 3D TEE datasets were acquired before, during and after the completion of PSCI in 8 patients (occluders: 2 atrial appendage, 2 patent foramen ovale and 3 atrial septal implantations, and percutaneous mitral commissurotomy). 30 Cartesian DICOM files were used to test the feasibility of mixed reality with a commercially available head-mounted device (overlaying a hologram of 3D TEE data onto the real-world view) as a display for the interventional or imaging operator. Dedicated software was used for file conversion, 3D rendering of the data on the display device (in 1 case with real-time Wi-Fi streaming from the echocardiograph) and spatial manipulation of the hologram during PSCI. A custom viewer was used to perform volume rendering and adjustment (cropping, transparency and shading control).
Results: Pre- and intraprocedural 3D TEE was performed in all 8 patients (5 women, age 40–83). Thirty selected 3D TEE datasets were successfully transferred and displayed in the mixed reality head-mounted device as a holographic image overlaying the real-world view. The analysis was performed both before and during the procedure and compared with the flat-screen 2D display of the echocardiograph. In one case, real-time data transfer was successfully implemented during mitral balloon commissurotomy. The quality of visualization was judged good, without loss of diagnostic content, in all (100%) datasets. Both target structures and additional anatomical details were clearly presented, including fenestrations of an atrial septal defect, a prominent Eustachian valve and earlier cardiac implants. Volume-rendered views were manipulated touchlessly and displayed with a selection of intensity windows, transfer functions and filters. Detail display was judged comparable to current 2D volume rendering on commercial workstations, and the touchless user interface was comfortable for optimizing views during PSCI.
Conclusions: Mixed reality display using a commercially available head-mounted device can be successfully integrated into the preparation and execution of PSCI. The benefits of this solution include touchless image control and unobstructed real-world viewing, facilitating intraprocedural use and thus showing superiority over virtual or enhanced reality solutions. Expected progress includes integration of color flow data and optimization of the real-time streaming option.
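The intensity windows and transfer functions mentioned in the Results are standard volume-rendering controls. As a hedged illustration (not the authors' software), the snippet below applies a DICOM-style intensity window and a simple opacity transfer function to a Cartesian echo volume; parameter names are illustrative.

```python
import numpy as np

def window_and_opacity(volume, center, width, opacity_gamma=1.5):
    """Apply an intensity window and a simple opacity transfer function
    to a Cartesian 3D echo volume before volume rendering.

    volume: 3D array of raw voxel intensities.
    center, width: intensity window (like DICOM WindowCenter/WindowWidth).
    Returns (intensity, opacity), both in [0, 1].
    """
    lo, hi = center - width / 2, center + width / 2
    intensity = np.clip((volume - lo) / (hi - lo), 0.0, 1.0)
    # Low intensities (blood pool, noise) become transparent; tissue opaque.
    opacity = intensity ** opacity_gamma
    return intensity, opacity
```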


2020
Vol. 10 (2)
Author(s): Fazliaty Edora Fadzli, Ajune Wanis Ismail

Mixed reality (MR) is a technology that brings virtual elements into the real-world environment, aiming to blend the virtual world immersively into real-world space. MR has steadily improved as display technologies have advanced. In an MR collaborative interface, local and remote users work together on a collaborative task while sensing the immersive environment in the cooperative application. User telepresence is an immersive form of telepresence in which a reconstruction of a human appears life-size in the real environment. To date, producing full telepresence of a life-size human body has required high internet transmission bandwidth. Therefore, this paper explores a robust real-time 3D reconstruction method for MR telepresence. It discusses previous work on full-body human reconstruction and existing research that has proposed reconstruction methods for telepresence. Besides the 3D reconstruction method, this paper also presents our recent findings on an MR framework for transporting a full-body human from a local location to a remote location. MR telepresence is discussed, along with the robust 3D reconstruction method, which has been implemented with a user telepresence feature in which the user experiences an accurate 3D representation of a remote person. The paper ends with a discussion of the results of MR telepresence with the robust 3D reconstruction method for delivering user telepresence.
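The reconstruction method itself is the subject of the paper; as a small, generic building block of most real-time RGB-D human reconstruction pipelines, the sketch below back-projects a depth image into a 3D point cloud with the pinhole camera model. Intrinsics are assumed known; names are illustrative.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into a 3D point cloud using the pinhole
    model - the first step of typical RGB-D body reconstruction.

    depth: H x W array of depths in metres (0 = no measurement).
    fx, fy, cx, cy: camera intrinsics (focal lengths, principal point).
    Returns an N x 3 array of points in the camera frame.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    valid = z > 0                      # discard missing measurements
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)
```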

