Exploring Interaction Design Considerations for Trustworthy Language-Capable Robotic Wheelchairs in Virtual Reality

2020 ◽  
Author(s):  
Nicholas Woodward ◽  
Teresa Nguyen ◽  
Lixiao Zhu ◽  
Carter Fowler ◽  
Taewoo Kim ◽  
...  

In previous work, researchers in Human-Robot Interaction (HRI) have demonstrated that user trust in robots depends on effective and transparent communication. This may be particularly true for robots used for transportation, due to user reliance on such robots for physical movement and safety. In this paper, we present the design of an experiment examining the importance of proactive communication by robotic wheelchairs, as compared to non-vehicular mobile robots, within a Virtual Reality (VR) environment. Furthermore, we describe the specific advantages -- and limitations -- of conducting this type of HRI experiment in VR.


2008 ◽  
Vol 5 (4) ◽  
pp. 235-241 ◽  
Author(s):  
Rajesh Elara Mohan ◽  
Carlos Antonio Acosta Calderon ◽  
Changjiu Zhou ◽  
Pik Kong Yue

In the field of human-computer interaction, the Natural Goals, Operators, Methods, and Selection rules Language (NGOMSL) model is one of the most popular methods for modelling knowledge and cognitive processes for rapid usability evaluation. The NGOMSL model describes the knowledge a user must possess to operate the system, represented as elementary actions, for effective usability evaluation. In the last few years, mobile robots have been exhibiting a stronger presence in commercial markets, yet very little work has been done with NGOMSL modelling for usability evaluations in the human-robot interaction discipline. This paper focuses on extending the NGOMSL model for usability evaluation of human-humanoid robot interaction in the soccer robotics domain. The NGOMSL-modelled human-humanoid interaction design of Robo-Erectus Junior was evaluated, and the experimental results showed that the interaction design was able to find faults in an average time of 23.84 s. The interaction design was also able to detect the fault within 60 s in 100% of the cases. The evaluated interaction design was adopted by our Robo-Erectus Junior humanoid robots in the RoboCup 2007 humanoid soccer league.
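The abstract does not reproduce the model itself; as a rough illustration of the NGOMSL idea, a goal can be decomposed into a method made of elementary operators, and execution time estimated by summing nominal operator times. The task names and timings below are hypothetical and are not taken from the Robo-Erectus Junior design.

```python
# Hypothetical NGOMSL-style sketch: a goal is decomposed into a method made of
# elementary operators, and execution time is estimated by summing operator times.
# Operator names and timings are illustrative, not from the paper.

OPERATOR_TIME_S = {          # nominal time per elementary operator (assumed)
    "press_button": 1.2,
    "read_status_led": 0.5,
    "speak_command": 2.0,
    "mental_step": 1.35,     # commonly used NGOMSL mental-operator estimate
}

METHODS = {
    "diagnose_fault": [      # method for goal: diagnose a fault on the robot
        "mental_step",       # decide which subsystem to check
        "read_status_led",   # inspect the status indicator
        "press_button",      # open the diagnostics menu
        "read_status_led",   # confirm the reported fault code
    ],
}

def estimated_execution_time(goal: str) -> float:
    """Sum the nominal operator times for every step of the goal's method."""
    return sum(OPERATOR_TIME_S[step] for step in METHODS[goal])

if __name__ == "__main__":
    print(f"diagnose_fault: ~{estimated_execution_time('diagnose_fault'):.2f} s")
```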


Author(s):  
Margot M. E. Neggers ◽  
Raymond H. Cuijpers ◽  
Peter A. M. Ruijten ◽  
Wijnand A. IJsselsteijn

Abstract Autonomous mobile robots that operate in environments with people are expected to be able to deal with human proxemics and social distances. Previous research investigated how robots can approach persons or how to implement human-aware navigation algorithms. However, experimental research on how robots can avoid a person in a comfortable way is largely missing. The aim of the current work is to experimentally determine the shape and size of the personal space of a human passed by a robot. In two studies, both a humanoid and a non-humanoid robot were used to pass a person at different sides and distances, after which participants were asked to rate their perceived comfort. As expected, perceived comfort increases with distance. However, the shape was not circular: passing at the back of a person is more uncomfortable than passing at the front, especially in the case of the humanoid robot. These results give us more insight into the shape and size of personal space in human–robot interaction. Furthermore, they can serve as necessary input to human-aware navigation algorithms for autonomous mobile robots in which human comfort is traded off against efficiency goals.
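The paper reports the empirical shape of personal space rather than a formula; a common way to feed such findings into human-aware navigation is an asymmetric proxemic cost around the person that is wider behind than in front. The sketch below is a generic asymmetric-Gaussian cost with arbitrarily chosen widths, not the values measured in these studies.

```python
import math

def passing_discomfort(dx: float, dy: float,
                       sigma_front: float = 0.9,
                       sigma_back: float = 1.4,
                       sigma_side: float = 0.8) -> float:
    """Asymmetric Gaussian discomfort cost around a person standing at the
    origin and facing +x. A larger sigma_back encodes that passing behind is
    less comfortable than passing in front. All widths are illustrative."""
    sigma_x = sigma_front if dx >= 0 else sigma_back
    return math.exp(-(dx**2 / (2 * sigma_x**2) + dy**2 / (2 * sigma_side**2)))

# Example: same lateral offset, passing in front of vs behind the person.
print(passing_discomfort(0.8, 0.5))    # in front  -> lower discomfort
print(passing_discomfort(-0.8, 0.5))   # behind    -> higher discomfort
```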


AI & Society ◽  
2021 ◽  
Author(s):  
Nora Fronemann ◽  
Kathrin Pollmann ◽  
Wulf Loh

Abstract To integrate social robots in real-life contexts, it is crucial that they are accepted by the users. Acceptance is not only related to the functionality of the robot but also strongly depends on how the user experiences the interaction. Established design principles from usability and user experience research can be applied to the realm of human–robot interaction to design robot behavior for the comfort and well-being of the user. Focusing the design on these aspects alone, however, comes with certain ethical challenges, especially regarding the user's privacy and autonomy. Based on an example scenario of human–robot interaction in elder care, this paper discusses how established design principles can be used in social robotic design. It then juxtaposes these with ethical considerations such as privacy and user autonomy. Combining user experience and ethical perspectives, we propose adjustments to the original design principles and canvass our own design recommendations for a positive and ethically acceptable social human–robot interaction design. In doing so, we show that positive user experience and ethical design may sometimes be at odds, but can be reconciled in many cases if designers are willing to adjust and amend time-tested design principles.


Author(s):  
Shiyang Dong ◽  
Takafumi Matsumaru

Abstract This paper presents a novel walking training system for foot-eye coordination. To conveniently design customizable trajectories for different users in walking training, we developed a new system that can track and record the actual walking trajectories of a tutor and use these trajectories for the walking training of a trainee. We set four items as its human-robot interaction design concept: feedback, synchronization, ingenuity, and adaptability. A foot model is proposed to define the position and direction of a foot. The errors of the detection method used in the system are less than 40 mm in position and 15 deg in direction. On this basis, three parts are structured to achieve the system functions: Trajectory Designer, Trajectory Viewer, and Mobile Walking Trainer. According to the experimental results, we confirmed that the system works as intended and designed: the steps recorded in Trajectory Designer could be used successfully as the footmarks projected in Mobile Walking Trainer, and foot-eye coordination training was conducted smoothly.
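The paper's foot model is defined by a position and a direction; as a rough illustration, a recorded tutor trajectory can be stored as a sequence of such foot poses and compared step-by-step against detections, which is one way the reported 40 mm / 15 deg detection errors could be quantified. The data layout, field names, and units below are assumed and are not the system's actual format.

```python
import math
from dataclasses import dataclass

@dataclass
class FootPose:
    """Foot model: planar position (mm) and direction (deg).
    Field names and units are assumed for illustration."""
    x: float
    y: float
    theta: float  # heading of the foot, degrees
    side: str     # "left" or "right"

def pose_error(detected: FootPose, reference: FootPose) -> tuple[float, float]:
    """Position error (mm) and smallest absolute direction error (deg)."""
    pos_err = math.hypot(detected.x - reference.x, detected.y - reference.y)
    ang_err = abs((detected.theta - reference.theta + 180) % 360 - 180)
    return pos_err, ang_err

# Example: compare one detected step against the tutor's recorded step.
ref = FootPose(x=1200.0, y=300.0, theta=10.0, side="right")
det = FootPose(x=1225.0, y=310.0, theta=18.0, side="right")
print(pose_error(det, ref))  # -> (~26.9 mm, 8.0 deg), within 40 mm / 15 deg
```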


Author(s):  
Stefan Schiffer ◽  
Alexander Ferrein

In this work we report on our effort to design and implement an early introduction to basic robotics principles for children at kindergarten age. The humanoid robot Pepper, a well-suited platform for human-robot interaction experiments, presented the lecture by reading out the contents to the children using its speech synthesis capability. One of the main challenges of this effort was to explain complex robotics contents in a way that pre-school children could follow the basic principles and ideas, using examples from their world of experience. A quiz in Runaround game-show style after the lecture prompted the children to recap what they had learned about how mobile robots work in principle. Besides the thrill of being exposed to a mobile robot that would also react to them, the children were very excited and at the same time very focused. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that the quiz at the end of the lesson is also run with robots. To the best of our knowledge, this is one of only a few attempts to use Pepper not as a tele-teaching tool but as the teacher itself, in order to engage pre-school children with complex robotics contents. We received very positive feedback from the children as well as from their educators.
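The abstract mentions that Pepper read out the lecture via its speech synthesis; on Pepper this is typically done through the NAOqi text-to-speech services. The snippet below is a minimal sketch of that pattern, assuming the NAOqi Python SDK; the robot address and the lecture text are placeholders, not the authors' actual lecture code.

```python
# Minimal sketch of having Pepper read lecture content aloud via NAOqi.
# Requires the NAOqi Python SDK; the IP address and slide texts are placeholders.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"   # placeholder robot address
PORT = 9559                  # default NAOqi port

# ALAnimatedSpeech speaks the text and adds matching gestures automatically.
speech = ALProxy("ALAnimatedSpeech", PEPPER_IP, PORT)

lecture_slides = [
    "A mobile robot senses its surroundings with cameras and distance sensors.",
    "It uses these measurements to decide where it can drive safely.",
]

for slide_text in lecture_slides:
    speech.say(slide_text)
```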


Author(s):  
Roberta Etzi ◽  
Siyuan Huang ◽  
Giulia Wally Scurati ◽  
Shilei Lyu ◽  
Francesco Ferrise ◽  
...  

Abstract The use of collaborative robots in the manufacturing industry has spread widely over the last decade. To be efficient, human-robot collaboration needs to be properly designed, taking into account also the operator's psychophysiological reactions. Virtual Reality can be used as a tool to simulate human-robot collaboration in a safe and cheap way. Here, we present a virtual collaborative platform in which the human operator and a simulated robot coordinate their actions to accomplish a simple assembly task. In this study, the robot moved either slowly or quickly in order to assess the effect of its velocity on the human's responses. Ten participants tested this application using an Oculus Rift head-mounted display; ARTracking cameras and a Kinect system were used to track the operator's right arm movements and hand gestures, respectively. Performance, user experience, and physiological responses were recorded. The results showed that while humans' performance and evaluations varied as a function of the robot's velocity, no differences were found in the physiological responses. Taken together, these data highlight the relevance of the kinematic aspects of the robot's motion within a human-robot collaboration and provide valuable insights to further develop our virtual human-machine interactive platform.
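The abstract compares a slower and a faster robot motion condition; one simple way to implement such a manipulation in a simulated or VR robot is to scale the motion speed per trial and log task metrics per condition. The sketch below is a generic illustration with made-up parameter values, not the authors' platform code.

```python
import random

# Hypothetical two-condition velocity manipulation for a simulated robot:
# the same reach motion is replayed with a "slow" or "fast" speed scale and
# the resulting motion duration is logged per trial. Values are illustrative.
SPEED_SCALE = {"slow": 0.4, "fast": 1.0}   # fraction of nominal motion speed
NOMINAL_REACH_TIME_S = 2.5                 # duration of the reach at full speed

def run_trial(condition: str) -> dict:
    duration = NOMINAL_REACH_TIME_S / SPEED_SCALE[condition]
    return {"condition": condition, "robot_motion_s": round(duration, 2)}

conditions = ["slow", "fast"] * 5          # 10 trials, 5 per condition
random.shuffle(conditions)                 # randomize presentation order
log = [run_trial(c) for c in conditions]
print(log[:3])
```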

