3DGAM: using 3D gesture and CAD models for training on mixed reality remote collaboration

Author(s):  
Peng Wang ◽  
Xiaoliang Bai ◽  
Mark Billinghurst ◽  
Shusheng Zhang ◽  
Sili Wei ◽  
...  
Author(s):  
Yuko Chinone ◽  
Hideki Aoyama ◽  
Tetsuo Oya

Three-dimensional CAD models are constructed during product design because they support design evaluation with CAE systems and manufacturing with CAM systems. However, mock-ups or prototypes are still required to evaluate the designability and operability of a product, because operating a real product is essential to such evaluation. Making prototypes or trial products for evaluation is, however, time-consuming and costly. To address this problem, considerable research has been conducted on mixed reality technology that overlays an image of the design model onto a physical model using a head-mounted display (HMD) to evaluate the designability and operability of a product. Such technology reduces the need for physical mock-ups (prototypes and trial products), but HMDs have drawbacks such as motion sickness, physical weight, display bulkiness, and high cost. In this paper, a projector-based method is proposed to realize mixed reality without the drawbacks of HMDs. A mixed reality system was constructed according to the proposed method and applied to evaluating the designability and operability of products without physical mock-ups. In the mixed reality space built by the system, a product can be held in the hand and its functions evaluated as if it were a real product.
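The projector-based approach described above amounts to warping a rendered image of the CAD model so that it lands correctly on the surface of the physical model. For a planar surface patch, that warp is a homography. The following is a minimal sketch of the idea, assuming a pre-calibrated set of corresponding points between the rendered image and projector coordinates; all point values here are hypothetical and not taken from the paper:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (>= 4 point pairs) via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (smallest singular value) holds the homography entries.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Apply homography H to a 2D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: corners of a rendered CAD image (pixels)
# and where they must appear in projector coordinates to align with
# the physical model's surface.
render_pts = [(0, 0), (640, 0), (640, 480), (0, 480)]
proj_pts = [(102, 88), (710, 95), (700, 560), (95, 548)]

H = homography(render_pts, proj_pts)
# Any feature of the CAD render can now be warped into projector space:
print(project(H, (320, 240)))
```

In a real system the correspondences would come from camera-based calibration rather than hand-picked points, and curved surfaces would need a dense mapping instead of a single homography.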


2021 ◽  
Vol 2 ◽  
Author(s):  
Prasanth Sasikumar ◽  
Soumith Chittajallu ◽  
Navindd Raj ◽  
Huidong Bai ◽  
Mark Billinghurst

Conventional training and remote collaboration systems allow users to see each other's faces, heightening the sense of presence while sharing content such as videos or slideshows. However, these methods lack depth information and a free 3D perspective on the training content. This paper investigates the impact of volumetric playback in a Mixed Reality (MR) spatial training system. We describe the MR system in a mechanical assembly scenario that incorporates various instruction delivery cues. Building upon previous research, four spatial instruction cues were explored: "Annotation", "Hand gestures", "Avatar", and "Volumetric playback". Through two user studies simulating a real-world mechanical assembly task, we found that the volumetric visual cue enhanced spatial perception in the tested MR training tasks, increasing co-presence and system usability while reducing mental workload and frustration. We also found that the given tasks required less effort and mental load when eye gaze was incorporated. Eye gaze on its own was not perceived as very useful, but it helped to complement the hand gesture cues. Finally, we discuss limitations, future work, and potential applications of our system.


2021 ◽  
Author(s):  
Hye Jin Kim

Telepresence systems enable people to feel present in a remote space while their bodies remain in their local space. To enhance telepresence, the remote environment needs to be captured and visualised in an immersive way. For instance, 360-degree videos (360-videos) shown on head-mounted displays (HMDs) provide high-fidelity telepresence in a remote place. Mixed reality (MR) in 360-videos enables interaction with virtual objects blended into the captured remote environment, but it allows telepresence only for a single user wearing an HMD. It is therefore limited when multiple users want to experience telepresence together and collaborate naturally within a teleported space.

This thesis presents TeleGate, a novel multi-user teleportation platform for remote collaboration in an MR space. TeleGate provides "semi-teleportation" into the MR space using large-scale displays, acting as a bridge between the local physical communication space and the remote collaboration space created by MR with captured 360-videos. Our proposed platform enables multi-user semi-teleportation for collaborative tasks in the remote MR collaboration (MRC) space while allowing natural communication between collaborators in the same local physical space.

We implemented a working prototype of TeleGate and conducted a user study to evaluate our concept of semi-teleportation. We measured spatial presence and social presence while participants performed remote collaborative tasks in the MRC space. We also explored different control mechanisms within the platform in the remote MR collaboration scenario.

In conclusion, TeleGate enabled multiple co-located users to semi-teleport together using large-scale displays for remote collaboration in MR 360-videos.


2019 ◽  
Vol 13 (4) ◽  
pp. 482-489 ◽  
Author(s):  
Fumiki Tanaka ◽  
Makoto Tsuchida ◽  
Masahiko Onosato ◽  

Virtual reality (VR), augmented reality (AR), and mixed reality technologies are utilized at various stages of the product lifecycle. For products with long lifecycles, such as bridges and dams, the maintenance and inspection stages are crucial to keeping the product safe and well-functioning. One of the advantages of VR/AR is the ability to add important information, such as past inspection data, to the view. Past inspection information is summarized in a document consisting of 2D sketches of bridge degradation drawings. However, this degradation sketch is two-dimensional and has no correspondence with the 3D world. In this study, we propose a method to associate the important information of 2D sketches with a 3D Industry Foundation Classes (IFC) model, a standardized computer-aided design model. To display a VR image of a bridge during the inspection process, the proposed method is applied to the 3D IFC model of the bridge and the 2D degradation sketch from the inspection report.
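The association step described above can be thought of as a lookup from a region of the 2D degradation sketch to the identifier of the corresponding IFC element, so that an annotation drawn on the sketch can be attached to 3D geometry. The following is a minimal sketch of such a correspondence table, assuming hand-made rectangular regions; the region bounds, GlobalId strings, and degradation labels are all hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class SketchRegion:
    """Axis-aligned region on the 2D degradation sketch, in sketch coordinates."""
    x0: float
    y0: float
    x1: float
    y1: float
    ifc_global_id: str  # GlobalId of the IFC element this region depicts
    degradation: str    # e.g. "crack", "spalling"

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical table built when the sketch is registered to the IFC model.
regions = [
    SketchRegion(0, 0, 120, 40, "2O2Fr$t4X7Zf8NOew3FLOH", "crack"),
    SketchRegion(120, 0, 300, 40, "1hqIFTRjfV6u9kUcIB_opQ", "spalling"),
]

def lookup(x, y):
    """Map a point annotated on the 2D sketch to (IFC GlobalId, degradation type)."""
    for r in regions:
        if r.contains(x, y):
            return r.ifc_global_id, r.degradation
    return None

print(lookup(60, 20))  # -> ('2O2Fr$t4X7Zf8NOew3FLOH', 'crack')
```

A real implementation would trace region outlines rather than use rectangles and would resolve the GlobalId against the IFC file (e.g. with an IFC toolkit) to retrieve the element's 3D geometry for the VR display.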

