An Advanced Human-Robot Interaction Interface for Collaborative Robotic Assembly Tasks

Author(s):  
Christos Papadopoulos ◽  
Ioannis Mariolis ◽  
Angeliki Topalidou-Kyniazopoulou ◽  
Grigorios Piperagkas ◽  
Dimosthenis Ioannidis ◽  
...  

This article introduces an advanced human-robot interaction (HRI) interface for teaching new assembly tasks to collaborative robotic systems. Using advanced perception and simulation technologies, the interface provides the tools a non-expert user needs to teach a robot a new assembly task in a short amount of time. An RGBD camera allows the user to demonstrate the task, and the system extracts the information needed for the assembly to be simulated and performed by the robot while the user guides the process. The HRI interface is integrated with the ROS framework and is built as a web application, allowing operation through portable devices such as a tablet PC. The interface is evaluated through user experience ratings from test subjects who are asked to teach a folding assembly task to the robot.
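The abstract does not detail how the web application talks to ROS, but web front ends for ROS commonly communicate over the rosbridge JSON protocol. The sketch below builds the advertise/publish frames for a hypothetical task-start topic; the topic name and message contents are illustrative assumptions, not the authors' API.

```python
import json

def rosbridge_publish(topic, msg_type, msg):
    """Build the rosbridge JSON frames to advertise and then publish on a topic."""
    return [
        json.dumps({"op": "advertise", "topic": topic, "type": msg_type}),
        json.dumps({"op": "publish", "topic": topic, "msg": msg}),
    ]

# Hypothetical topic and message for starting a taught assembly task
frames = rosbridge_publish("/hri/start_task", "std_msgs/String",
                           {"data": "folding_assembly"})
print(frames[1])
```

In a real deployment these frames would be sent over a WebSocket to a rosbridge server, which relays them onto the ROS graph.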

2019 ◽  
pp. 794-812

Author(s):  
Roberta Etzi ◽  
Siyuan Huang ◽  
Giulia Wally Scurati ◽  
Shilei Lyu ◽  
Francesco Ferrise ◽  
...  

Abstract: The use of collaborative robots in the manufacturing industry has spread widely over the last decade. To be efficient, human-robot collaboration needs to be properly designed, taking into account the operator's psychophysiological reactions. Virtual Reality can be used as a tool to simulate human-robot collaboration in a safe and inexpensive way. Here, we present a virtual collaborative platform in which a human operator and a simulated robot coordinate their actions to accomplish a simple assembly task. In this study, the robot moved either slowly or quickly in order to assess the effect of its velocity on the human's responses. Ten participants tested this application using an Oculus Rift head-mounted display; ARTracking cameras and a Kinect system were used to track the operator's right-arm movements and hand gestures, respectively. Performance, user experience, and physiological responses were recorded. The results showed that while the participants' performance and evaluations varied as a function of the robot's velocity, no differences were found in the physiological responses. Taken together, these data highlight the relevance of the kinematic aspects of a robot's motion within human-robot collaboration and provide valuable insights for further developing our virtual human-machine interactive platform.


2020 ◽  
Vol 10 (17) ◽  
pp. 5757
Author(s):  
Elena Laudante ◽  
Alessandro Greco ◽  
Mario Caterino ◽  
Marcello Fera

In current industrial systems, automation is a very important aspect for assessing manufacturing production performance in terms of working times, accuracy of operations, and quality. In particular, the introduction of a robotic system into the working area should guarantee improvements such as reduced risks for human operators, better quality results, and faster production processes. In this context, human action remains necessary to carry out some of the subtasks, as in the case of composite assembly processes. This study presents a case study on the reorganization of the working activity in a workstation where a composite fuselage panel is assembled, in order to demonstrate, by means of a simulation tool, that some of the advantages listed above can also be achieved in the aerospace industry. In particular, an entire working process for composite fuselage panel assembly is simulated and analyzed to verify the applicability and effectiveness of human-robot interaction (HRI), focusing on working times and ergonomics while respecting the constraints imposed by the ISO 10218 and ISO/TS 15066 standards. Results show the effectiveness of HRI both in terms of assembly performance, by reducing working times, and in terms of ergonomics, for which the simulation reports a very low risk index.


Procedia CIRP ◽  
2019 ◽  
Vol 81 ◽  
pp. 1429-1434 ◽  
Author(s):  
Niki Kousi ◽  
Christos Stoubos ◽  
Christos Gkournelos ◽  
George Michalos ◽  
Sotiris Makris

Author(s):  
Mingdong Tang ◽  
Youlin Gu ◽  
Yunjian Zhang ◽  
Shigang Wang

Purpose: This paper presents a dual-manipulator system for aloft hot-line assembly of connection fittings in a 110-kV intelligent substation, which is significant for research on hot-line working robots. Design/methodology/approach: The paper addresses the challenges of the task and presents a dual-manipulator system that overcomes them to achieve robotic assembly of connection fittings in a narrow space without violating the safe phase-to-phase and phase-to-ground distances. The two manipulators share the same global reference frame. The mission of Manipulator 1 is to position the fixed part of the connection fittings and screw the bolts onto it. Visual computing provides an approximate position for the end-effector of Manipulator 2, which then carries the removable part of the connection fittings to this position. The assembly task is then completed by adjusting the posture of Manipulator 2 under the guidance of force-position control. Findings: The dual-manipulator system can position the target under different illumination conditions and complete fast assembly of connection fittings in a 110-kV substation. No strong arc discharge or surface erosion was observed. Practical implications: The system will be particularly useful for hot-line assembly of connection fittings in 110-kV intelligent substations, as well as for assembly tasks involving uncertain target positions and complex contact surfaces such as cylindrical holes. Originality/value: This study presents a dual-manipulator system used by a field robot working in a 110-kV intelligent substation. The system achieves the connection-fitting assembly task in an energized simulation experimental system. Unlike other peg-in-hole assembly strategies, it does not require high manipulator stability or a planar contact surface around the hole.
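The force-position guidance described for Manipulator 2 can be illustrated with a simple admittance-style correction: the commanded pose is shifted in proportion to the measured lateral contact force. This is a minimal sketch under assumed gains and thresholds, not the authors' controller.

```python
def adjust_pose(cmd_pos, force_xy, gain=0.001, deadband=2.0):
    """Shift the commanded XY position toward compliance with lateral
    contact forces (newtons mapped to metres); gain and deadband are
    illustrative assumptions."""
    out = list(cmd_pos)
    for i, f in enumerate(force_xy):
        if abs(f) > deadband:      # ignore sensor noise below the deadband
            out[i] += gain * f     # comply with the measured contact force
    return out

# Example: a 5 N lateral force in x nudges the commanded target ~5 mm in x
print(adjust_pose([0.30, 0.10], [5.0, 0.0]))
```

In practice such a correction would run inside the manipulator's control loop, with the vision system supplying the initial commanded position.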


Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4586 ◽  
Author(s):  
Chunxu Li ◽  
Ashraf Fahmy ◽  
Johann Sienz

In this paper, an Augmented Reality (AR) application for the control and adjustment of robots is developed, with the aim of making interaction with and adjustment of robots easier and more accurate from a remote location. A LeapMotion-based controller is used to track the movement of the operator's hands. The data from the controller allow gestures and the position of the palm's central point to be detected and tracked. A Kinect V2 camera measures the corresponding motion velocities in the x, y, and z directions after our post-processing algorithm is applied. Unreal Engine 4 is used to create an AR environment in which the user can monitor the control process immersively. A Kalman filtering (KF) algorithm is employed to fuse the position signals from the LeapMotion sensor with the velocity signals from the Kinect camera. The fused (optimal) data are sent to teleoperate a Baxter robot in real time via the User Datagram Protocol (UDP). Several experiments have been conducted to validate the proposed method.
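As a rough illustration of the described fusion (not the authors' implementation), a constant-velocity Kalman filter can combine a position measurement (as from the LeapMotion) with a velocity measurement (as from the Kinect) along one axis; all noise parameters below are assumed.

```python
import numpy as np

def kf_step(x, P, z, dt=0.02, q=1e-3, r_pos=1e-2, r_vel=5e-2):
    """One predict/update cycle; x = [position, velocity], z = [pos, vel]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
    Q = q * np.eye(2)                      # process noise (assumed)
    H = np.eye(2)                          # both states measured directly
    R = np.diag([r_pos, r_vel])            # sensor noise (assumed)
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the stacked position/velocity measurement
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
for t in range(50):  # noiseless ramp: position grows at 0.1 m/s
    z = np.array([0.1 * (t + 1) * 0.02, 0.1])
    x, P = kf_step(x, P, z)
# The fused state estimate converges toward the true 0.1 m/s velocity
```

The fused estimate `x` is what would then be packed into a UDP datagram for the robot-side controller.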


2019 ◽  
Vol 40 ◽  
pp. 541-547 ◽  
Author(s):  
Vladimír Tlach ◽  
Ivan Kuric ◽  
Zuzana Ságová ◽  
Ivan Zajačko

Author(s):  
Carlos Morato ◽  
Krishnanand Kaipa ◽  
Boxuan Zhao ◽  
Satyandra K. Gupta

In this paper, we propose an exteroceptive-sensing-based framework to achieve safe human-robot interaction during shared tasks. Our approach allows a human to operate in close proximity to the robot, pausing the robot's motion whenever a collision between the human and the robot is imminent. The human's presence is sensed by an N-range-sensor system consisting of multiple range sensors mounted at various points on the periphery of the work cell. Each range sensor is based on a Microsoft Kinect sensor; each sensor observes the human and outputs a 20-DOF human model. Positional data from these models are fused to generate a refined human model. Next, the robot and the human model are approximated by dynamic bounding spheres, and the robot's motion is controlled by tracking collisions between these spheres. Whereas most previous exteroceptive methods relied on depth data from camera images, our approach is one of the first successful attempts to build an explicit human model online and use it to evaluate human-robot interference. Real-time behavior observed during experiments, with a 5-DOF robot and a human safely performing shared assembly tasks, validates our approach.
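The sphere-based interference test described above amounts to a pairwise distance check between the bounding spheres of the human model and those of the robot; the positions, radii, and safety margin below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def spheres_collide(centers_a, radii_a, centers_b, radii_b, margin=0.05):
    """True if any sphere in set A overlaps any sphere in set B,
    padded by a safety margin (all distances in metres, assumed)."""
    for ca, ra in zip(centers_a, radii_a):
        for cb, rb in zip(centers_b, radii_b):
            if np.linalg.norm(np.asarray(ca) - np.asarray(cb)) < ra + rb + margin:
                return True
    return False

# Hypothetical human-joint spheres vs. robot-link spheres
human = [(0.5, 0.0, 1.0), (0.5, 0.2, 0.8)]
robot = [(1.5, 0.0, 1.0)]
pause_robot = spheres_collide(human, [0.15, 0.12], robot, [0.2])
print(pause_robot)  # spheres are far apart here -> False
```

When the check returns True, the framework described above would pause the robot's motion until the spheres separate again.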

