User Behaviour Analysis of Mixed Reality Remote Collaboration with a Hybrid View Interface

Author(s): Lei Gao, Huidong Bai, Mark Billinghurst, Robert W. Lindeman

Conventional training and remote collaboration systems allow users to see each other’s faces, heightening the sense of presence while sharing content such as videos or slideshows. However, these methods lack depth information and a free 3D perspective of the training content. This paper investigates the impact of volumetric playback in a Mixed Reality (MR) spatial training system. We describe the MR system in a mechanical assembly scenario that incorporates various instruction delivery cues. Building upon previous research, four spatial instruction cues were explored: “Annotation”, “Hand gestures”, “Avatar”, and “Volumetric playback”. Through two user studies that simulated a real-world mechanical assembly task, we found that the volumetric visual cue enhanced spatial perception in the tested MR training tasks, increasing co-presence and system usability while reducing mental workload and frustration. We also found that the given tasks required less effort and mental load when eye gaze was incorporated. Eye gaze on its own was not perceived to be very useful, but it helped to complement the hand gesture cues. Finally, we discuss limitations, future work, and potential applications of our system.

