INFORMATION ACQUISITION USING EYE-GAZE TRACKING FOR PERSON-FOLLOWING WITH MOBILE ROBOTS

2009 ◽  
Vol 06 (03) ◽  
pp. 147-157
Author(s):  
HEMIN OMER LATIF ◽  
NASSER SHERKAT ◽  
AHMAD LOTFI

In the effort to develop natural means of human-robot interaction (HRI), a significant amount of research has focused on Person-Following (PF) for mobile robots. PF, which generally consists of detecting, recognizing and following people, is believed to be a required functionality for most future robots that share their environments with human companions. Research in this field is mostly directed towards fully automating this functionality, which makes the challenge even harder and diverts attention from other challenges that coexist in any PF system. A natural PF functionality consists of a number of tasks that must be implemented in the system. In realistic scenarios, however, not all of these tasks need to be automated; some can be carried out by human operators and therefore require natural means of interaction and information acquisition. To highlight all the tasks believed to exist in any PF system, this paper introduces a novel taxonomy for PF. To provide a natural means of HRI, TeleGaze, previously developed by the authors for natural teleoperation through eye-gaze tracking, is used for information acquisition in the implementation of the taxonomy. Using TeleGaze to support PF systems is intended to show that realistic information acquisition can be achieved in a natural way.
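
As an illustration of the task split described above, the following is a minimal Python sketch of how PF tasks might be divided between automation and a human operator working through a gaze-based interface such as TeleGaze. The task names and callbacks are hypothetical and do not reflect the authors' implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict

class Operator(Enum):
    """Who carries out a Person-Following (PF) task."""
    ROBOT = auto()   # fully automated on the robot
    HUMAN = auto()   # delegated to a human operator (e.g. via eye-gaze input)

@dataclass
class PFTask:
    name: str
    operator: Operator
    run: Callable[[], None]

# Illustrative split: target detection/recognition handled by the operator
# through a gaze-based interface, motion control automated on the robot.
def build_pf_taxonomy(gaze_select_target, follow_target) -> Dict[str, PFTask]:
    return {
        "detect":    PFTask("detect person",    Operator.HUMAN, gaze_select_target),
        "recognize": PFTask("recognize person", Operator.HUMAN, gaze_select_target),
        "follow":    PFTask("follow person",    Operator.ROBOT, follow_target),
    }

if __name__ == "__main__":
    taxonomy = build_pf_taxonomy(
        lambda: print("operator picks the target by gaze"),
        lambda: print("robot follows the selected target"),
    )
    for task in taxonomy.values():
        print(f"{task.name}: handled by {task.operator.name}")
        task.run()
```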

2019 ◽  
pp. 177
Author(s):  
Hanaa Mohsin Ahmed ◽  
Salma Hameedi Abdullah

Author(s):  
Christoph Bartneck ◽  
Michael J. Lyons

The human face plays a central role in most forms of natural human interaction, so we may expect that computational methods for analyzing facial information, modeling internal emotional states, and graphically synthesizing faces and facial expressions will play a growing role in human-computer and human-robot interaction. However, certain areas of face-based HCI, such as facial expression recognition and robotic facial display, have lagged behind others, such as eye-gaze tracking, facial recognition, and conversational characters. Our goal in this paper is to review the situation in HCI with regard to the human face and to discuss strategies that could bring the more slowly developing areas up to speed. In particular, we propose “The Art of the Soluble” as a strategy forward and provide examples of its successful application.


2019 ◽  
Vol 1 (1) ◽  
pp. 37-53
Author(s):  
Kerstin Thurow ◽  
Lei Zhang ◽  
Hui Liu ◽  
Steffen Junginger ◽  
Norbert Stoll ◽  
...  

Transportation technologies for mobile robots include indoor navigation, intelligent collision avoidance and target manipulation. This paper discusses the research process and development of these interrelated technologies. An efficient multi-floor laboratory transportation system for mobile robots developed by the group at the Center for Life Science Automation (CELISCA) is then introduced. This system is integrated with the multi-floor navigation and intelligent collision avoidance systems, as well as a labware manipulation system. A multi-floor navigation technology is proposed, comprising sub-systems for mapping and localization, path planning, door control and elevator operation. Based on human–robot interaction technology, a collision avoidance system is proposed that improves the navigation of the robots and ensures the safety of the transportation process. Grasping and placing operation technologies using the dual arms of the robots are investigated and integrated into the multi-floor transportation system. The proposed transportation system is installed on the H20 mobile robots and tested at the CELISCA laboratory. The results show that the proposed system enables the mobile robots to perform multi-floor laboratory transportation tasks successfully.
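
To make the interplay of the sub-systems concrete, here is a minimal Python sketch of how a multi-floor labware transport run might be sequenced (navigation, door and elevator handling, dual-arm grasping and placing). All function names and parameters are illustrative placeholders, not the CELISCA/H20 API.

```python
# Hypothetical sequencing of a multi-floor labware transport task.
# The step functions below are placeholders, not the CELISCA/H20 interfaces.

def navigate_to(position):
    print(f"navigating to {position}")           # mapping, localization, path planning

def pass_door(door_id):
    print(f"requesting door {door_id} to open")  # door control sub-system

def ride_elevator(from_floor, to_floor):
    print(f"taking elevator from floor {from_floor} to {to_floor}")  # elevator operation

def manipulate(action, labware):
    print(f"{action} {labware} with dual arms")  # grasping/placing operations

def transport_labware(labware, pickup, dropoff):
    """One transport run; human-aware collision avoidance is assumed to run
    continuously underneath the navigation calls."""
    navigate_to(pickup["position"])
    manipulate("grasping", labware)
    if pickup["floor"] != dropoff["floor"]:
        pass_door(pickup["elevator_door"])
        ride_elevator(pickup["floor"], dropoff["floor"])
    navigate_to(dropoff["position"])
    manipulate("placing", labware)

if __name__ == "__main__":
    transport_labware(
        "sample rack",
        {"position": "lab A, floor 2", "floor": 2, "elevator_door": "E2"},
        {"position": "analysis station, floor 3", "floor": 3},
    )
```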


Sensors ◽  
2021 ◽  
Vol 21 (5) ◽  
pp. 1571
Author(s):  
Andrea Bonci ◽  
Pangcheng David Cen Cheng ◽  
Marina Indri ◽  
Giacomo Nabissi ◽  
Fiorella Sibona

Perception capability is of central importance for human–robot interaction. Forthcoming industrial environments will require a high level of automation to be flexible and adaptive enough to meet increasingly fast-paced and low-cost market demands. Autonomous and collaborative robots able to adapt to varying and dynamic conditions of the environment, including the presence of human beings, will have an ever-greater role in this context. However, if the robot is not aware of the human position and intention, a shared workspace between robots and humans may decrease productivity and lead to human safety issues. This paper presents a survey on sensory equipment useful for human detection and action recognition in industrial environments. An overview of different sensors and perception techniques is presented. Various types of robotic systems commonly used in industry, such as fixed-base manipulators, collaborative robots, mobile robots and mobile manipulators, are considered, analyzing the most useful sensors and methods to perceive and react to the presence of human operators in industrial cooperative and collaborative applications. The paper also introduces two proofs of concept developed by the authors for future collaborative robotic applications that benefit from enhanced capabilities of human perception and interaction. The first concerns fixed-base collaborative robots and proposes a solution for human safety in tasks requiring human collision avoidance or moving obstacle detection. The second proposes a collaborative behavior implementable on autonomous mobile robots pursuing assigned tasks within an industrial space shared with human operators.
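
As a concrete illustration of distance-based human collision avoidance of the kind discussed for fixed-base collaborative robots, the following Python sketch scales the commanded robot speed with the measured human separation. The thresholds, the linear ramp and the sensing hook are assumptions for illustration, not the proof of concept described in the paper.

```python
# Illustrative speed-and-separation style scaling for a collaborative robot:
# the commanded velocity is reduced as a detected human gets closer.
# Threshold values are arbitrary examples, not taken from the paper.

STOP_DISTANCE = 0.5   # metres: halt below this separation
SLOW_DISTANCE = 1.5   # metres: start scaling down below this separation

def velocity_scale(human_distance_m: float) -> float:
    """Return a factor in [0, 1] to multiply the nominal robot speed by."""
    if human_distance_m <= STOP_DISTANCE:
        return 0.0
    if human_distance_m >= SLOW_DISTANCE:
        return 1.0
    # Linear ramp between the stop and slow-down distances.
    return (human_distance_m - STOP_DISTANCE) / (SLOW_DISTANCE - STOP_DISTANCE)

if __name__ == "__main__":
    for d in (0.3, 0.8, 1.2, 2.0):
        print(f"human at {d:.1f} m -> speed scale {velocity_scale(d):.2f}")
```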


Author(s):  
Margot M. E. Neggers ◽  
Raymond H. Cuijpers ◽  
Peter A. M. Ruijten ◽  
Wijnand A. IJsselsteijn

Autonomous mobile robots that operate in environments with people are expected to be able to deal with human proxemics and social distances. Previous research investigated how robots can approach persons or how to implement human-aware navigation algorithms. However, experimental research on how robots can avoid a person in a comfortable way is largely missing. The aim of the current work is to experimentally determine the shape and size of the personal space of a human passed by a robot. In two studies, both a humanoid and a non-humanoid robot were used to pass a person at different sides and distances, after which the participants were asked to rate their perceived comfort. As expected, perceived comfort increased with distance. However, the shape was not circular: passing at the back of a person is more uncomfortable than passing at the front, especially in the case of the humanoid robot. These results give more insight into the shape and size of personal space in human–robot interaction. Furthermore, they can serve as necessary input to human-aware navigation algorithms for autonomous mobile robots, in which human comfort is traded off against efficiency goals.
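
The reported asymmetry (rear passes less comfortable than frontal ones) maps naturally onto the asymmetric personal-space cost functions used in human-aware navigation. Below is a Python sketch of such a cost with a wider rear lobe; the Gaussian form and the sigma values are illustrative assumptions, not a model fitted to the studies' data.

```python
import math

# Illustrative asymmetric personal-space cost around a person at the origin,
# facing along +x. A larger rear sigma encodes the reported result that
# passing behind a person is less comfortable than passing in front.
# The sigma values are example numbers, not fitted to the studies' data.

SIGMA_FRONT = 1.0   # metres
SIGMA_REAR  = 1.6   # metres (wider lobe: rear passes penalised more)
SIGMA_SIDE  = 1.0   # metres

def personal_space_cost(x: float, y: float) -> float:
    """Cost in (0, 1] that a planner could add to grid cells near a person."""
    sigma_x = SIGMA_FRONT if x >= 0.0 else SIGMA_REAR
    return math.exp(-(x**2 / (2 * sigma_x**2) + y**2 / (2 * SIGMA_SIDE**2)))

if __name__ == "__main__":
    # The same 1 m passing distance costs more behind the person than in front.
    print(f"1 m in front: {personal_space_cost( 1.0, 0.0):.2f}")
    print(f"1 m behind:   {personal_space_cost(-1.0, 0.0):.2f}")
```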


2009 ◽  
Vol 30 (12) ◽  
pp. 1144-1150 ◽  
Author(s):  
Diego Torricelli ◽  
Michela Goffredo ◽  
Silvia Conforto ◽  
Maurizio Schmid
