The Impact of Gaze Cues in Mixed Reality Collaborations

Author(s):  
Allison Jing
2020, Vol 4 (4), pp. 78
Author(s):  
Andoni Rivera Pinto ◽  
Johan Kildal ◽  
Elena Lazkano

In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique needs the robot to be available for programming and not in operation. This means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic representation of the robot's digital twin, using augmented reality technologies. However, this approach suffers from a lack of tangibility: the visual holograms the user tries to grab offer no physical resistance. We present an interface in which some of that tangibility is provided through ultrasound-based mid-air haptic actuation. We report a user study evaluating the impact of such haptic feedback on a pick-and-place task involving the wrist of a holographic robot arm, and we found the feedback to be beneficial.


Author(s):  
Ingrid Richardson ◽  
Larissa Hjorth

This article provides a critical overview of mobile gaming, from discrete offline casual games to location-based, mixed reality, cross-platform, and urban games and, more recently, the array of downloadable playful and social applications for the touchscreen smartphone and handheld tablet or iPad. The discussion begins by casting mobile games as one of the most significant trajectories of an emerging app-based media ecology. The authors consider the way mobile gaming and play have become intrinsic to our everyday practices and challenge the distinction that is often made between casual and hardcore (or ‘real’) gamers. The article then explores how location-based mobile games generate hybrid experiences of place and presence, requiring the player to integrate their own situated and embodied perception of the world with dynamic GPS-enabled information. Finally, the overview turns to the relation between mobile media and social media games, a category that includes mobile games and apps that embed social networking and sharing features, as well as games accessed and played through social networking services via a mobile device. On a broader scale, in terms of the impact of mobile games on our daily lives, the proliferation of mobile interfaces, games, and playful apps is playing a key role in what is termed the ‘lusory’ or playful turn in culture.


Author(s):  
Fil J. Arenas ◽  
Andrew S. Clayton ◽  
Kate D. Simmons

Several schools and colleges under Air University have found utility in using a mixed-reality approach to developing leadership acumen in this unique risk-free environment. This chapter describes the power of collaboration between Air University and Auburn University at Montgomery while demonstrating the impact of this mixed-reality approach on developing military leaders. These mixed-reality exercises (MRXs) leverage an environment that establishes a practical laboratory in which developing leaders interact with avatars using a combination of virtual immersion and human intelligence (live simulation engagement). This innovative approach to “real play” allows learning to take place in real time through virtual immersion.


2021, Vol 2
Author(s):  
Prasanth Sasikumar ◽  
Soumith Chittajallu ◽  
Navindd Raj ◽  
Huidong Bai ◽  
Mark Billinghurst

Conventional training and remote collaboration systems allow users to see each other’s faces, heightening the sense of presence while sharing content such as videos or slideshows. However, these methods lack depth information and a free 3D perspective on the training content. This paper investigates the impact of volumetric playback in a Mixed Reality (MR) spatial training system. We describe the MR system in a mechanical assembly scenario that incorporates various instruction delivery cues. Building upon previous research, four spatial instruction cues were explored: “Annotation”, “Hand gestures”, “Avatar”, and “Volumetric playback”. Through two user studies that simulated a real-world mechanical assembly task, we found that the volumetric visual cue enhanced spatial perception in the tested MR training tasks, exhibiting increased co-presence and system usability while reducing mental workload and frustration. We also found that the given tasks required less effort and mental load when eye gaze was incorporated. Eye gaze on its own was not perceived to be very useful, but it helped to complement the hand gesture cues. Finally, we discuss limitations, future work, and potential applications of our system.


2021, Vol 4
Author(s):  
Basel Alhaji ◽  
Michael Prilla ◽  
Andreas Rausch

Trust is the foundation of successful human collaboration. This has also been found to be true for human-robot collaboration, where trust also influences over- and under-reliance issues. Correspondingly, the study of trust in robots is usually concerned with detecting the human collaborator's current trust level and keeping it within certain limits to avoid undesired consequences, a process known as trust calibration. However, while there is intensive research on human-robot trust, there is a lack of knowledge about the factors that affect it in synchronous, co-located teamwork, and hardly any knowledge about how these factors impact the dynamics of trust during the collaboration. These factors, along with the characteristics of trust evolvement, are prerequisites for a computational model that allows robots to adapt their behavior dynamically based on the current human trust level, which in turn is needed to enable dynamic and spontaneous cooperation. To address this, we conducted a two-phase lab experiment in a mixed-reality environment, in which thirty-two participants collaborated with a virtual CoBot on disassembling traction batteries in a recycling context. In the first phase, we explored the (dynamics of) relevant trust factors during physical human-robot collaboration. In the second phase, we investigated the impact of the robot's reliability and feedback on human trust in robots. Results show stronger trust dynamics while trust dissipates than while it accumulates, and highlight different relevant factors as more interactions occur. Moreover, the factors that show relevance as trust accumulates differ from those that appear as trust dissipates: we detected four factors while trust accumulates (perceived reliability, perceived dependability, perceived predictability, and faith) that do not appear while it dissipates.
This points to the interesting conclusion that, depending on the stage of the collaboration and the direction of trust evolvement, different factors might shape trust. Further, the accuracy of the robot's feedback has a conditional effect on trust depending on the robot's reliability level: it preserves human trust when a failure is expected but does not affect it when the robot works reliably. This provides a hint to designers on when assurances are necessary and when they are redundant.
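The abstract does not specify the computational trust model the authors call for, but its shape can be illustrated. The sketch below is not the authors' model; it is a minimal illustrative example in which a scalar trust estimate is updated after each robot action, with asymmetric rates chosen to reflect the reported finding that trust dissipates faster than it accumulates. All names and rate values are hypothetical.

```python
# Illustrative sketch (not the authors' model): a scalar trust estimate in
# [0, 1], updated after each robot action. The loss rate is larger than the
# gain rate to mirror the finding that trust dissipates faster than it builds.

def update_trust(trust: float, success: bool,
                 gain: float = 0.05, loss: float = 0.20) -> float:
    """Move trust toward 1.0 after a successful action, toward 0.0 after a failure."""
    if success:
        trust += gain * (1.0 - trust)   # slow accumulation
    else:
        trust -= loss * trust           # fast dissipation
    return min(max(trust, 0.0), 1.0)    # clamp to [0, 1]

# Example interaction history: two successes, one failure, one success.
trust = 0.5
for outcome in [True, True, False, True]:
    trust = update_trust(trust, outcome)
print(round(trust, 5))  # the single failure outweighs two successes
```

A robot driven by such an estimate could, for instance, switch to more explicit feedback (assurances) only when the estimate drops below a threshold, matching the paper's point that assurances matter mainly when failures are expected.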


2020, Vol 312, pp. 04001
Author(s):  
Abhinesh Prabhakaran ◽  
Abdul-Majeed Mahamadu ◽  
Lamine Mahdjoubi ◽  
Patrick Manu

Building Information Modelling (BIM) and its associated technologies have proved to be one of the most promising developments in the Architectural, Engineering and Construction (AEC) industry. Over the past few decades, the AEC sector has been restricted in its communication of design as a result of single-interface methods based on 2D and 3D visualization of information. Thus, most issues with respect to construction are identified fairly late, resulting in costly changes. With the introduction of BIM, many other approaches to data visualization can be leveraged, including Mixed Reality (MR) applications for the virtual representation of spaces and objects beyond 3D. MR offers a revolution in the virtual representation of objects and space through context awareness as well as the incorporation of information beyond 3D, offering countless opportunities for more effective design visualization and coordination. Despite the capability of MR, however, few examples exist of its application to design coordination in the AEC sector. In addressing this gap, this study proposes a novel methodology for the application of MR in design coordination and investigates the impact of introducing MR into the BIM workflow, with a focus on the identification and avoidance of clashes. A prototypical model of MR design coordination is presented and discussed. Findings indicate that MR improves design productivity and quality, but also highlight potential infrastructure issues inhibiting the mainstreaming of MR in design practice.
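The clash identification the study focuses on ultimately reduces to geometric intersection tests between model elements. As a minimal sketch of that core test, assuming elements simplified to axis-aligned bounding boxes (real BIM coordination tools work on full element geometry; the element names and coordinates below are illustrative):

```python
# Minimal sketch of a "hard clash" test between two building elements,
# each simplified to an axis-aligned bounding box (AABB). Coordinates are
# illustrative; production clash detection uses full element geometry.
from dataclasses import dataclass

@dataclass
class AABB:
    min_pt: tuple  # (x, y, z) minimum corner
    max_pt: tuple  # (x, y, z) maximum corner

def clashes(a: AABB, b: AABB) -> bool:
    """True if the two boxes overlap on every axis simultaneously."""
    return all(
        a.min_pt[i] < b.max_pt[i] and b.min_pt[i] < a.max_pt[i]
        for i in range(3)
    )

# A duct running through the volume occupied by a structural beam.
duct = AABB((0.0, 0.0, 2.4), (4.0, 0.4, 2.8))
beam = AABB((3.5, -1.0, 2.6), (3.8, 2.0, 3.0))
print(clashes(duct, beam))  # the two volumes intersect
```

An MR overlay can then render only the clashing pairs in situ, which is the kind of early, spatially grounded identification the study argues reduces costly late changes.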


Author(s):  
Twyla Perryman ◽  
Carlie Sandefur ◽  
Chelsea T. Morris

Purpose: Simulation is increasingly becoming a valuable tool for training and educating students in communication sciences and disorders (CSD). The purpose of this study is to examine the impact of a mixed-reality simulation on CSD students' perceptions of their ability to apply clinical and counseling skills. Additionally, this study sought to investigate the overall efficacy and acceptance of this type of clinical simulation experience for undergraduate CSD students.
Method: A total of 29 undergraduate students participated in a clinical simulation experience that used actor-controlled avatars in a mixed-reality simulation environment to practice collecting case history information and delivering diagnostic news to parents of a child client. All students completed pre- and postsimulation questionnaires that rated the impact of the clinical simulation experience on their confidence in demonstrating targeted clinical skills and their general attitudes about their participation. Additionally, five lead participants took part in follow-up interviews to gather data that best describe students' perceptions. Quantitative and qualitative data were recorded and analyzed.
Results: The majority of the students exhibited positive attitudes toward the mixed-reality clinical simulation experience and reported an increase in their perceived ability to apply several counseling skills (e.g., listening and selective feedback) following the session. Analysis showed that perceived confidence levels on seven of the 17 targeted skill items increased on the postevent questionnaire to a level of statistical significance, and that the simulation experience was described as meaningful and supportive for increasing confidence.
Conclusions: Mixed-reality clinical simulation may be a useful tool for teaching interpersonal communication and counseling skills to students, including undergraduates, in CSD. Additionally, the use of mixed-reality technology in this study produced results similar to those seen with other clinical simulation methods such as standardized patients or computer-based simulations.


2021, Vol ahead-of-print (ahead-of-print)
Author(s):  
Michael Tscholl ◽  
Jason Morphew ◽  
Robb Lindgren

Purpose: This study aims to advance the proposal to use immersive virtual learning environments to stimulate and reveal deep-seated knowledge about science, giving instructors and researchers unique possibilities for assessing and identifying intuitive physical science knowledge. Aside from the ability to present rich and dynamic stimuli, these environments afford bodily enactment of people’s understanding, which draws less from declarative knowledge stores and more from everyday experiences with the physical world.
Design/methodology/approach: The authors ground their proposal in a critical review of the impact of stimulus and task characteristics of traditional physics inventories. Using a grounded theory approach, the authors present classifications and interpretations of observed bodily enactments of physics understandings in a study where participants enacted their understanding of force and motion in space in an immersive, interactive mixed reality (MR) environment.
Findings: The authors find that instances of these action categories can be interpreted as relating to underlying knowledge, often identified by other studies. The authors thus replicate a number of prior findings, which provides evidence validating the use of MR simulation as a tool for identifying people’s physical intuitions.
Research limitations/implications: This study targeted only a few specific physical science scenarios. Further, while a number of key insights about student knowledge came from the analysis, many of the observations are leads in need of further investigation and interpretation rather than core findings.
Originality/value: Immersive digital learning environments are primarily used for instruction. The authors propose to use and design them for assessment as well. This paper should prompt more research and development in this direction.


2016, Vol 4 (3), pp. 203-216
Author(s):  
Jeffrey Haber ◽  
Joon Chung

Multi-touch computer inputs allow users to interact with a virtual environment through gesture commands on a monitor instead of a mouse and keyboard. This style of input is easy for the human mind to adapt to because gestures directly reflect how one interacts with the natural environment. This paper presents and assesses a personal-computer-based unmanned aerial vehicle ground control station that utilizes multi-touch gesture inputs and system reconfigurability to enhance operator performance. The system was developed at Ryerson University’s Mixed-Reality Immersive Motion Simulation Laboratory using commercial off-the-shelf Presagis software. The ground control station was then evaluated using NASA’s task load index to determine whether the inclusion of multi-touch gestures and reconfigurability improved operator workload over the more traditional mouse and keyboard inputs. To conduct this assessment, participants were tasked with flying a simulated aircraft through a specified number of waypoints and had to utilize a payload controller within a predetermined area. Initial task load index results from these flight tests show that the developed touch-capable ground control station reduced operator workload, lowering the impact of all six related human factors.
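The NASA task load index used in this evaluation combines ratings on six dimensions (mental demand, physical demand, temporal demand, performance, effort, frustration) with weights derived from 15 pairwise comparisons between those dimensions. A minimal sketch of the standard weighted scoring follows; the ratings and weights shown are illustrative, not the study's data.

```python
# Standard NASA-TLX weighted scoring: each dimension gets a 0-100 rating,
# and a weight equal to the number of times it was chosen across the 15
# pairwise comparisons (weights therefore sum to 15). Values are illustrative.

TLX_DIMENSIONS = [
    "Mental Demand", "Physical Demand", "Temporal Demand",
    "Performance", "Effort", "Frustration",
]

def tlx_score(ratings: dict, weights: dict) -> float:
    """Weighted workload score: sum of rating x weight over the 15 comparisons."""
    if sum(weights.values()) != 15:
        raise ValueError("weights from the 15 pairwise comparisons must sum to 15")
    return sum(ratings[d] * weights[d] for d in TLX_DIMENSIONS) / 15.0

ratings = {"Mental Demand": 60, "Physical Demand": 20, "Temporal Demand": 55,
           "Performance": 30, "Effort": 50, "Frustration": 25}
weights = {"Mental Demand": 5, "Physical Demand": 1, "Temporal Demand": 3,
           "Performance": 2, "Effort": 3, "Frustration": 1}
print(tlx_score(ratings, weights))  # prints 48.0
```

Comparing such scores between the touch-based and mouse-and-keyboard conditions, per dimension and overall, is how a workload reduction "across all six factors" would typically be established.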

