Pressure distribution classification and segmentation of human hands in contact with the robot body

2020 ◽  
Vol 39 (6) ◽  
pp. 668-687
Author(s):  
Alessandro Albini ◽  
Giorgio Cannata

This article deals with the problem of recognizing human hand touch by a robot equipped with large-area tactile sensors covering its body. This problem is relevant in the domain of physical human–robot interaction for discriminating between human and non-human contacts, for triggering and driving cooperative tasks or robot motions, and for ensuring a safe interaction. The underlying assumption is that voluntary physical interaction tasks involve hand touch over the robot body, and therefore the capability to recognize hand contacts is a key element in discriminating a purposive human touch from other types of interaction. The proposed approach is based on a geometric transformation of the tactile data, formed by pressure measurements associated with a non-uniform cloud of 3D points (taxels) spread over a non-linear manifold corresponding to the robot body, into tactile images representing the contact pressure distribution in two dimensions. Tactile images can be processed using deep learning algorithms to recognize human hands and to compute the pressure distribution applied by the various hand segments: the palm and the individual fingers. Experimental results, obtained on a real robot covered with robot skin, show the effectiveness of the proposed methodology. Moreover, to evaluate its robustness, various types of failures have been simulated. A further analysis concerning the transferability of the system has been performed, considering contacts occurring on a different sensorized robot part.
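As a rough illustration of the taxel-to-image idea described in this abstract (not the authors' actual transformation), the sketch below unrolls taxels on a cylindrical skin patch into a 2D pressure image: the angle around the cylinder maps to a column and the height maps to a row. The function name, grid size, and cylindrical-patch assumption are all hypothetical.

```python
import math

def taxels_to_image(taxels, rows=8, cols=8, z_range=(0.0, 0.2)):
    """Rasterize a cloud of taxels on a cylindrical patch into a 2D
    pressure image by unrolling the cylinder (angle -> column,
    height -> row). Each taxel is (x, y, z, pressure); taxels that
    land in the same cell keep the maximum pressure.
    """
    img = [[0.0] * cols for _ in range(rows)]
    z_min, z_max = z_range
    for x, y, z, p in taxels:
        theta = math.atan2(y, x)  # angle around the cylinder, in [-pi, pi]
        c = min(cols - 1, int((theta + math.pi) / (2 * math.pi) * cols))
        r = min(rows - 1, int((z - z_min) / (z_max - z_min) * rows))
        img[r][c] = max(img[r][c], p)
    return img

# Example: a single taxel pressed at angle 0, mid patch height.
image = taxels_to_image([(0.05, 0.0, 0.1, 3.2)])
```

The resulting fixed-size image could then be fed to a standard convolutional classifier, which is the kind of deep learning pipeline the abstract alludes to.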

Author(s):  
AJung Moon ◽  
Shalaleh Rismani ◽  
H. F. Machiel Van der Loos

Abstract Purpose of Review To summarize the set of roboethics issues that uniquely arise due to the corporeality and physical interaction modalities afforded by robots, irrespective of the degree of artificial intelligence present in the system. Recent Findings One of the recent trends in the discussion of the ethics of emerging technologies has been the treatment of roboethics issues as those of “embodied AI,” a subset of AI ethics. In contrast to AI, however, robots leverage humans’ natural tendency to be influenced by their physical environment. Recent work in human-robot interaction highlights the impact that a robot’s presence, capacity to touch, and movement in our physical environment have on people, helping to articulate the ethical issues particular to the design of interactive robotic systems. Summary The corporeality of interactive robots poses a unique set of ethical challenges. These issues should be considered in the design irrespective of, and in addition to, the ethics of the artificial intelligence implemented in them.


Author(s):  
Mahdi Haghshenas-Jaryani ◽  
Muthu B. J. Wijesundara

This paper presents the development of a framework based on a quasi-statics concept for modeling and analyzing the physical human-robot interaction in soft robotic hand exoskeletons used for rehabilitation and human performance augmentation. The framework provides both forward and inverse quasi-static formulations for the interaction between a soft robotic digit and a human finger, which can be used to calculate angular motions, interaction forces, actuation torques, and stiffness at the human joints. This is achieved by decoupling the dynamics of the soft robotic digit and the human finger, with the same interaction forces acting on both sides. The presented theoretical models were validated by a series of numerical simulations based on a finite element model that replicates the same human-robot interaction. The comparison of the results obtained for the angular motion, interaction forces, and the estimated stiffness at the joints indicates the accuracy and effectiveness of the quasi-static models for predicting the human-robot interaction.
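A minimal sketch of the forward/inverse quasi-static idea, reduced to a single exoskeleton joint with a linear joint stiffness and a fixed moment arm for the contact force. The torque balance `tau_act = k*theta + r*f_int`, the function names, and the parameter values are illustrative assumptions, not the paper's full formulation.

```python
def forward_quasistatic(tau_act, k_joint, theta, moment_arm):
    """Forward problem: given the actuation torque, joint stiffness and
    joint angle, return the interaction force transmitted to the finger.
    Quasi-static balance (assumed): tau_act = k_joint*theta + moment_arm*f_int.
    """
    return (tau_act - k_joint * theta) / moment_arm

def inverse_quasistatic(f_int, k_joint, theta, moment_arm):
    """Inverse problem: the actuation torque needed to produce a desired
    interaction force at the current joint angle."""
    return k_joint * theta + moment_arm * f_int
```

Because the two formulations are exact inverses at a single joint, a desired interaction force fed through the inverse model and back through the forward model is recovered unchanged, which is a useful sanity check on any such quasi-static pair.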


2018 ◽  
Vol 23 (6) ◽  
pp. 2662-2670
Author(s):  
Kyeong Ha Lee ◽  
Seung Guk Baek ◽  
Hyuk Jin Lee ◽  
Hyouk Ryeol Choi ◽  
Hyungpil Moon ◽  
...  

2021 ◽  
pp. 027836492110536
Author(s):  
Niels Dehio ◽  
Joshua Smith ◽  
Dennis L. Wigand ◽  
Pouya Mohammadi ◽  
Michael Mistry ◽  
...  

Robotics research into multi-robot systems has so far concentrated on implementing intelligent swarm behavior and contact-less human interaction. Studies of haptic or physical human-robot interaction, by contrast, have primarily focused on the assistance offered by a single robot. Consequently, our understanding of the physical interaction and the implicit communication through contact forces between a human and a team of multiple collaborative robots is limited. We here introduce the term Physical Human Multi-Robot Collaboration (PHMRC) to describe this more complex situation, which we consider highly relevant to future service robotics. The scenario discussed in this article covers multiple manipulators in close proximity, coupled through physical contacts. We represent this set of robots as the fingers of an up-scaled agile robot hand. This perspective enables us to employ model-based grasping theory to deal with multi-contact situations. Our torque-control approach integrates dexterous multi-manipulator grasping skills, optimization of contact forces, compensation of object dynamics, and advanced impedance regulation into a coherent compliant control scheme. To achieve this, we contribute fundamental theoretical improvements. Finally, experiments with up to four collaborative KUKA LWR IV+ manipulators, performed both in simulation and in the real world, validate the model-based control approach. As a side effect, we note that our multi-manipulator control framework applies identically to multi-legged systems, and we also execute it on the quadruped ANYmal subject to non-coplanar contacts and human interaction.
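To give a flavor of the contact-force optimization mentioned in this abstract, here is a deliberately tiny 1D version (not the paper's controller): two opposing "fingers" squeeze an object along one axis, each contact is unilateral (it can only push), and we pick the smallest internal squeeze that realizes a desired net force while keeping both contacts loaded above a minimum normal force. All names and thresholds are illustrative.

```python
def distribute_contact_forces(net_force, min_normal=1.0):
    """Two opposing contacts push on an object along one axis.
    f1 pushes in +, f2 pushes in -, so the net force on the object is
    f1 - f2. Both normal forces must stay >= min_normal (unilateral
    contact). Return the solution with the smallest internal squeeze.
    """
    if net_force >= 0:
        f2 = min_normal                 # lightly loaded contact at the bound
        f1 = min_normal + net_force     # the other contact carries the rest
    else:
        f1 = min_normal
        f2 = min_normal - net_force
    return f1, f2
</```

In the full multi-manipulator setting this becomes a constrained optimization over a grasp matrix with friction cones, but the structure is the same: realize the desired object wrench while keeping every contact inside its admissible force set.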


2018 ◽  
Vol 9 (1) ◽  
pp. 221-234 ◽  
Author(s):  
João Avelino ◽  
Tiago Paulino ◽  
Carlos Cardoso ◽  
Ricardo Nunes ◽  
Plinio Moreno ◽  
...  

Abstract Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control in order to plan the arm’s motion and to achieve a confident but at the same time pleasant grasp of the human user’s hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments: with twenty human subjects in the first experiment, thirty-five in the second, and thirty-eight in the third. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide a skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; (iii) develop and evaluate a hand grip controller based on the previous data. In addition to the robot-human interactions, we also study handshakes that the robot executes with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertoire of social skills of our robot, fulfilling a demand previously identified by many users of the robot.
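A grip controller of the kind described above can be sketched as a simple feedback loop: nudge the hand closure until the measured tactile pressure matches a setpoint learned from user preferences. The controller gain, the setpoint, and the toy linear contact model below are all assumptions for illustration, not the paper's tuned controller.

```python
def grip_step(closure, pressure, target_pressure, gain=0.02,
              closure_limits=(0.0, 1.0)):
    """One control step: move the hand closure (0 = open, 1 = fully
    closed) toward the tactile pressure preferred by users."""
    error = target_pressure - pressure
    lo, hi = closure_limits
    return min(hi, max(lo, closure + gain * error))

def simulate(target=2.0, steps=200):
    """Toy plant: measured pressure grows linearly once the fingers
    make contact with the hand (here at 30% closure)."""
    closure, pressure = 0.0, 0.0
    for _ in range(steps):
        closure = grip_step(closure, pressure, target)
        pressure = max(0.0, 10.0 * (closure - 0.3))
    return closure, pressure

closure, pressure = simulate()
```

With these numbers the loop closes the hand until contact, then settles at the closure whose simulated pressure equals the target, mimicking how a learned "preferred grip closure" would be reached and held.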


Author(s):  
Adhau P ◽  
Kadwane S. G ◽  
Shital Telrandhe ◽  
Rajguru V. S ◽  
...  

Human-robot interaction has long been a topic of research owing to its importance in helping humanity. Robust human-interacting robots commanded by electromyogram (EMG) signals have recently been investigated. This article presents a system in which signals recorded directly from a human body are used to control a small robotic arm. The various gestures are recognized by placing electrodes or sensors on the human hand; these gestures are then identified using a neural network trained on the recorded signals. Offline control of the arm is performed by driving the motors of the robotic arm.
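Before a neural network can classify gestures, raw EMG is usually reduced to time-domain features. The sketch below computes three classic ones (mean absolute value, root mean square, and zero crossings with a noise threshold); the exact feature set used in this article is not specified, so treat this as a generic illustration.

```python
import math

def emg_features(signal, zc_threshold=0.01):
    """Classic time-domain EMG features often fed to a gesture
    classifier: mean absolute value (MAV), root mean square (RMS),
    and zero crossings (ZC) whose amplitude jump exceeds a small
    noise threshold."""
    n = len(signal)
    mav = sum(abs(s) for s in signal) / n
    rms = math.sqrt(sum(s * s for s in signal) / n)
    zc = sum(
        1
        for a, b in zip(signal, signal[1:])
        if a * b < 0 and abs(a - b) > zc_threshold
    )
    return mav, rms, zc
```

One such feature vector per electrode channel and time window would then form the input layer of the gesture-recognition network.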


Author(s):  
Sukhdeep S. Dhami ◽  
Ashutosh Sharma ◽  
Rohit Kumar ◽  
Parveen Kalra

The number of industrial and household robots is increasing fast. A simpler human-robot interaction is preferred in household robotic applications as well as in hazardous environments, and gesture-based control of robots is a step in this direction. In this work, a virtual model of a 3-DOF robotic manipulator is developed using V-Realm Builder in MATLAB, and the mathematical models of the forward and inverse kinematics of the manipulator are coded in MATLAB/Simulink. Human hand gestures are captured using a smartphone with accelerometer and orientation sensors. A wireless interface transfers the smartphone sensory data to a laptop running MATLAB/Simulink. The hand gestures are used as the reference signal for moving the wrist of the robot. A user interface shows the instantaneous joint angles of the robot manipulator and the spatial coordinates of the robot wrist. This simple yet effective tool aids in learning a number of aspects of robotics and mechatronics. The animated graphical model of the manipulator provides a better understanding of the forward and inverse kinematics of a robot manipulator, and controlling the robot with hand gestures generates curiosity in students about interfacing hardware with a computer. It may also stimulate new ideas in students for developing virtual learning tools.
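The forward and inverse kinematics mentioned above can be illustrated for a planar 3-R arm (the abstract does not specify the manipulator's geometry, so the planar structure and link lengths below are assumptions; the article itself uses MATLAB/Simulink rather than Python).

```python
import math

L1, L2, L3 = 0.3, 0.25, 0.1  # link lengths in meters (illustrative)

def fk(t1, t2, t3):
    """Forward kinematics of a planar 3-R arm: end pose (x, y, phi)."""
    x = L1*math.cos(t1) + L2*math.cos(t1+t2) + L3*math.cos(t1+t2+t3)
    y = L1*math.sin(t1) + L2*math.sin(t1+t2) + L3*math.sin(t1+t2+t3)
    return x, y, t1 + t2 + t3

def ik(x, y, phi, elbow_up=True):
    """Inverse kinematics: subtract the last link to locate the 2-link
    wrist point, solve the standard 2-link IK, then recover theta3
    from the end-effector orientation."""
    wx, wy = x - L3*math.cos(phi), y - L3*math.sin(phi)
    d2 = wx*wx + wy*wy
    c2 = (d2 - L1*L1 - L2*L2) / (2*L1*L2)
    s2 = math.sqrt(max(0.0, 1.0 - c2*c2)) * (1 if elbow_up else -1)
    t2 = math.atan2(s2, c2)
    t1 = math.atan2(wy, wx) - math.atan2(L2*s2, L1 + L2*c2)
    return t1, t2, phi - t1 - t2
```

Round-tripping a pose through `ik` and back through `fk` recovers the same pose, which is the basic consistency check one would also run on the Simulink models.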


Author(s):  
Nahian Rahman ◽  
Carlo Canali ◽  
Darwin G. Caldwell ◽  
Ferdinando Cannella

In-hand manipulation is a capability at which human hands are unique: the numerous sensors, degrees of freedom, and adaptability of our hands in dealing with a plurality of objects motivate researchers to replicate these abilities in robotic grippers. The development of grippers and grasping devices has been addressed from many perspectives: the materials used in gripper synthesis (rigid or flexible), the control approach, the use of under-actuated mechanisms, and so on. Mathematical formulations of grasp modeling and manipulation have also been addressed; however, due to the presence of non-holonomic motion, it is difficult to replicate in a physical gripper the behaviors achieved in the model. Moreover, achieving skills similar to those of the human hand urges the use of soft or non-rigid materials in the gripper design, which runs contrary to the speed and precision requirements of an industrial gripper. Faced with this dilemma, this paper addresses the problem by developing a modular-finger approach. The modular finger is built from two well-known mechanisms, and deploying such modular fingers in different numbers in a gripper arrangement can solve many outstanding issues of manipulation.


2021 ◽  
Vol 3 ◽  
Author(s):  
Alberto Martinetti ◽  
Peter K. Chemweno ◽  
Kostas Nizamis ◽  
Eduard Fosch-Villaronga

Policymakers need to consider the impacts that robots and artificial intelligence (AI) technologies have on humans beyond physical safety. Traditionally, the definition of safety has been interpreted to apply exclusively to risks with a physical impact on persons’ safety, such as, among others, mechanical or chemical risks. However, the current understanding is that the integration of AI in cyber-physical systems such as robots, which increases interconnectivity with several devices and cloud services and intensifies human-robot interaction, challenges the rather narrow way safety is currently conceptualised. Thus, to address safety comprehensively, AI demands a broader understanding of safety, extending beyond physical interaction to cover aspects such as cybersecurity and mental health. Moreover, the expanding use of machine learning techniques will more frequently demand evolving safety mechanisms to safeguard against the substantial modifications taking place over time as robots embed more AI features. In this sense, our contribution brings forward the different dimensions of the concept of safety, including the interaction (physical and social), psychosocial, cybersecurity, temporal, and societal dimensions. These dimensions aim to help policy and standard makers redefine the concept of safety in light of robots’ and AI’s increasing capabilities, including human-robot interactions, cybersecurity, and machine learning.


2020 ◽  
Author(s):  
Mehmet Ismet Can Dede ◽  
Gokhan Kiper ◽  
Tolga Ayav ◽  
Barbaros Özdemirel ◽  
Enver Tatlicioglu ◽  
...  

Abstract Endoscopic endonasal surgery is a commonly practiced minimally invasive neurosurgical operation for the treatment of a wide range of skull base pathologies, including pituitary tumors. A common shortcoming of this surgery is the need for a third hand when the endoscope has to be held while both hands of the main surgeon are in active use. The robotic surgery assistant NeuRoboScope has been developed to take over the endoscope from the main surgeon's hand while providing the surgeon with the means to control the location and direction of the endoscope. One of the main novelties of the NeuRoboScope system is its human-robot interface design, which regulates and facilitates the interaction between the surgeon and the robot assistant. The human-robot interaction design of the NeuRoboScope system is investigated in two domains: direct physical interaction and master-slave teleoperation. A user study indicating the learning curve and ease of use of the master-slave teleoperation is presented, and the paper concludes with an outlook on possible new human-robot interfaces for robot-assisted surgery systems.

