Modelling and Control of a 2-DOF Robot Arm with Elastic Joints for Safe Human-Robot Interaction

2021, Vol 8
Author(s): Hua Minh Tuan, Filippo Sanfilippo, Nguyen Vinh Hao

Collaborative robots (or cobots) are robots that can safely work together or interact with humans in a shared space, and they are becoming increasingly common. Compliant actuators are highly relevant to the design of cobots, since this type of actuation scheme mitigates the damage caused by unexpected collisions. Elastic joints are therefore considered to outperform rigid joints when operating in a dynamic environment. However, most available elastic robots are relatively costly or difficult to construct. To give researchers a solution that is inexpensive, easily customisable, and fast to fabricate, a newly designed low-cost, open-source elastic joint is presented in this work. Based on this joint, a highly compliant multi-purpose 2-DOF robot arm for safe human-robot interaction is also introduced. The mechanical design of the robot and a position control algorithm are presented. The mechanical prototype is 3D-printed. The control algorithm is a two-loop scheme: the inner loop is a model reference adaptive controller (MRAC) that deals with uncertainties in the system parameters, while the outer loop uses a fuzzy proportional-integral controller to reduce the effect of external disturbances on the load. The control algorithm is first validated in simulation, and the effectiveness of the controller is then demonstrated in experiments on the mechanical prototype.
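As a rough illustration of this two-loop structure (not the authors' implementation), the following Python sketch couples a simplified MIT-rule MRAC inner loop with a crude fuzzy-interpolated PI outer loop on a toy elastic-joint model; the plant parameters, gains, and membership functions are all assumptions chosen only to show how the loops fit together.

```python
import numpy as np

DT = 0.001  # simulation step [s] (assumed)

class MRAC:
    """Inner loop: model reference adaptive control of a first-order motor model
    dx/dt = -a*x + b*u with unknown a, b (simplified MIT rule; gains assumed)."""
    def __init__(self, am=10.0, bm=10.0, gamma=2.0):
        self.am, self.bm, self.gamma = am, bm, gamma  # reference model + adaptation gain
        self.xm = 0.0                   # reference-model state
        self.th1, self.th2 = 0.0, 0.0   # adaptive feedforward / feedback gains

    def step(self, x, r):
        e = x - self.xm                        # error w.r.t. the reference model
        self.th1 -= self.gamma * e * r * DT    # MIT rule: feedforward gain update
        self.th2 += self.gamma * e * x * DT    # MIT rule: feedback gain update
        self.xm += (-self.am * self.xm + self.bm * r) * DT
        return self.th1 * r - self.th2 * x     # command sent to the motor

def fuzzy_pi(err, ierr, kp_small=1.0, kp_large=2.0, ki=0.5, band=0.2):
    """Outer loop: crude fuzzy PI - the proportional gain is interpolated between
    a 'small error' and a 'large error' rule with triangular memberships (assumed)."""
    mu_large = min(abs(err) / band, 1.0)       # membership of the 'large error' set
    kp = (1.0 - mu_large) * kp_small + mu_large * kp_large
    return kp * err + ki * ierr

# Toy closed loop on an assumed elastic joint: the motor angle drives the load
# through a spring of stiffness K_SPRING, and a step disturbance acts on the load.
A_TRUE, B_TRUE, K_SPRING = 4.0, 8.0, 3.0       # "unknown" plant parameters (assumed)
motor, load, ierr = 0.0, 0.0, 0.0
ctrl = MRAC()
setpoint = 1.0                                 # desired load angle [rad]

for t in np.arange(0.0, 10.0, DT):
    err = setpoint - load
    ierr += err * DT
    r = fuzzy_pi(err, ierr)                    # outer loop: motor-angle reference
    u = ctrl.step(motor, r)                    # inner loop: MRAC motor command
    disturbance = 0.3 if t > 5.0 else 0.0      # external load disturbance (assumed)
    motor += (-A_TRUE * motor + B_TRUE * u) * DT
    load += (K_SPRING * (motor - load) - disturbance) * DT

print(f"final load angle: {load:.3f} rad (setpoint {setpoint} rad)")
```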

Symmetry, 2018, Vol 10 (12), pp. 680
Author(s): Ethan Jones, Winyu Chinthammit, Weidong Huang, Ulrich Engelke, Christopher Lueg

Control of robot arms is often required in engineering and can be performed by using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first task, but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement of the pieces. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improves performance over the eye gaze tracker alone (p < 0.05) and is competitive with the controller-only setup, although it does not outperform it (p > 0.05).
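For readers unfamiliar with this kind of within-subjects comparison, the sketch below shows how paired tests across modalities might be computed; the completion times are random placeholders rather than the study's data, and the choice of a paired t-test is an assumption, not a statement of the authors' analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20  # participants, as in the study

# Placeholder task-completion times [s] per participant and modality (assumed).
times = {
    "gaze_only":  rng.normal(80.0, 12.0, n),
    "controller": rng.normal(65.0, 10.0, n),
    "multimodal": rng.normal(68.0, 10.0, n),
}

# Pairwise paired t-tests (each participant used every modality).
for a, b in [("multimodal", "gaze_only"), ("multimodal", "controller")]:
    t, p = stats.ttest_rel(times[a], times[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}")
```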


Sensor Review, 2015, Vol 35 (3), pp. 244-250
Author(s): Pedro Neto, Nuno Mendes, A. Paulo Moreira

Purpose – The purpose of this paper is to achieve reliable estimation of the yaw angle by fusing data from low-cost inertial and magnetic sensing.
Design/methodology/approach – The yaw angle is estimated by fusing magnetic and inertial sensing from a digital compass and a gyroscope, respectively. A Kalman filter estimates the error produced by the gyroscope.
Findings – The drift produced by the gyroscope is significantly reduced while, at the same time, the system retains the ability to react quickly to orientation changes. The system combines the best of each sensor: the stability of the magnetic sensor and the fast response of the inertial sensor.
Research limitations/implications – The system does not behave stably in the presence of large vibrations, and considerable calibration effort is needed.
Practical implications – Most human–robot interaction technologies today need to estimate orientation, especially the yaw angle, from small-sized, low-cost sensors.
Originality/value – Existing methods for inertial and magnetic sensor fusion are combined to achieve reliable estimation of the yaw angle. Experimental tests in a human–robot interaction scenario demonstrate the performance of the system.
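A minimal sketch of this kind of gyro/compass yaw fusion is shown below; it is not the authors' implementation. The filter state is the yaw angle plus the gyroscope bias, the gyro rate drives the prediction step, and the compass reading provides the correction; the sample rates, noise covariances, and the simulated sensor errors are assumed values.

```python
import numpy as np

DT = 0.01  # gyro sample period [s] (assumed)

class YawKalman:
    def __init__(self):
        self.x = np.zeros(2)                 # state: [yaw (rad), gyro bias (rad/s)]
        self.P = np.eye(2) * 0.1             # state covariance
        self.Q = np.diag([1e-5, 1e-7])       # process noise (assumed)
        self.R = np.array([[0.05]])          # compass measurement noise (assumed)
        self.F = np.array([[1.0, -DT],       # yaw integrates (rate - bias) * DT
                           [0.0, 1.0]])
        self.H = np.array([[1.0, 0.0]])      # the compass observes yaw directly

    def predict(self, gyro_rate):
        self.x = self.F @ self.x + np.array([gyro_rate * DT, 0.0])
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, compass_yaw):
        y = np.array([compass_yaw]) - self.H @ self.x          # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)               # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

# Toy run: a drifting gyro (constant bias) and a noisy but unbiased compass.
rng = np.random.default_rng(1)
kf, true_yaw, bias = YawKalman(), 0.0, 0.02
for k in range(2000):
    rate = 0.5 * np.sin(0.01 * k)                      # true yaw rate [rad/s]
    true_yaw += rate * DT
    kf.predict(rate + bias + rng.normal(0, 0.01))      # gyro: biased and noisy
    if k % 10 == 0:                                    # compass at a lower rate
        kf.update(true_yaw + rng.normal(0, 0.05))
print(f"true yaw {true_yaw:.3f} rad, estimate {kf.x[0]:.3f} rad, bias estimate {kf.x[1]:.3f} rad/s")
```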


Author(s): Xiaoran Fan, Daewon Lee, Lawrence Jackel, Richard Howard, Daniel Lee, ...

Robotica, 2019, Vol 38 (10), pp. 1807-1823
Author(s): Leon Žlajpah, Tadej Petrič

In this paper, we propose a novel unified framework for virtual guides. The human–robot interaction is based on a virtual robot, which is controlled by admittance control. The unified framework combines virtual guides, control of the dynamic behavior, and path tracking. Different virtual guides and active constraints can be realized by using dead-zones in the position part of the admittance controller. The proposed algorithm can act in a changing task space and allows selection of the task-space and redundant degrees-of-freedom during task execution. The admittance control algorithm can be implemented either at the velocity or at the acceleration level. The proposed framework has been validated by an experiment on a KUKA LWR robot performing the Buzz-Wire task.
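The sketch below illustrates, in one dimension, how a dead-zone in the position term of an admittance controller can produce a virtual guide: motion is free inside a corridor around the guide path and resisted outside it. This is only an illustrative reading of the idea; the virtual mass, damping, stiffness, corridor width, and force profile are assumed values, not the paper's.

```python
import numpy as np

DT = 0.002                   # control period [s] (assumed)
M, D, K = 2.0, 20.0, 400.0   # virtual mass, damping, stiffness (assumed)
HALF_WIDTH = 0.02            # dead-zone half-width around the guide path [m] (assumed)

def dead_zone(e, w):
    """Zero inside the corridor |e| <= w, shifted error outside it."""
    return 0.0 if abs(e) <= w else e - np.sign(e) * w

x, v = 0.0, 0.0       # virtual robot position/velocity along the constrained axis
x_guide = 0.0         # guide path (kept constant here for simplicity)

for k in range(3000):
    t = k * DT
    f_human = 15.0 if 1.0 < t < 3.0 else 0.0          # operator pushes sideways [N]
    e = dead_zone(x - x_guide, HALF_WIDTH)            # position error after the dead-zone
    a = (f_human - D * v - K * e) / M                 # admittance dynamics
    v += a * DT
    x += v * DT                                       # command sent to the real robot
print(f"final lateral offset: {x * 1000:.1f} mm (corridor half-width {HALF_WIDTH * 1000:.0f} mm)")
```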


2009, Vol 6 (3-4), pp. 369-397
Author(s): Kerstin Dautenhahn, Chrystopher L. Nehaniv, Michael L. Walters, Ben Robins, Hatice Kose-Bagci, ...

This paper provides a comprehensive introduction to the design of the minimally expressive robot KASPAR, which is particularly suitable for human–robot interaction studies. A low-cost design with off-the-shelf components has been used in a novel design inspired by a multi-disciplinary viewpoint, including comics design and Japanese Noh theatre. The design rationale of the robot and its technical features are described in detail. Three research studies that have used KASPAR extensively are presented. Firstly, we present its application in robot-assisted play and therapy for children with autism. Secondly, we illustrate its use in human–robot interaction studies investigating the role of interaction kinesics and gestures. Lastly, we describe a study in the field of developmental robotics on computational architectures based on interaction histories for robot ontogeny. The three areas differ in how the robot is operated and in its role in the social interaction scenarios. Each is introduced briefly and examples of the results are presented. We also reflect on the specific design features of KASPAR that were important in these studies and on the lessons learnt from them concerning the design of humanoid robots for social interaction. An assessment of the utility of the design for human–robot interaction experiments concludes the paper.

