Human-Human Hand Interactions Aid Balance During Walking by Haptic Communication

2021
Vol 8
Author(s):
Mengnan Wu
Luke Drnach
Sistania M. Bong
Yun Seong Song
Lena H. Ting

Principles from human-human physical interaction may be necessary to design more intuitive and seamless robotic devices to aid human movement. Previous studies have shown that light touch can aid balance and that haptic communication can improve performance of physical tasks, but the effects of touch between two humans on walking balance have not been previously characterized. This study examines physical interaction between two persons when one person aids another in performing a beam-walking task. Twelve pairs of healthy young adults held a force sensor with one hand while one person walked on a narrow balance beam (2 cm wide × 3.7 m long) and the other walked overground by their side. We compare balance performance during partnered vs. solo beam-walking to examine the effects of haptic interaction, and we compare hand interaction mechanics during partnered beam-walking vs. overground walking to examine how the interaction aided balance. While holding the hand of a partner, participants were able to walk farther on the beam without falling, reduce lateral sway, and decrease angular momentum in the frontal plane. We measured small hand force magnitudes (mean of 2.2 N laterally and 3.4 N vertically) that created opposing torque components about the beam axis, and we calculated the interaction torque: the overlapping opposing torque that does not contribute to motion of the beam-walker’s body. We found higher interaction torque magnitudes during partnered beam-walking vs. partnered overground walking, and a correlation between interaction torque magnitude and reductions in lateral sway. To gain insight into feasible controller designs that emulate human-human physical interaction for aiding walking balance, we modeled the relationship between each torque component and the motion of the beam-walker’s body as a mass-spring-damper system. Our model results show opposite types of mechanical elements (active vs. passive) for the two torque components. Our results demonstrate that hand interactions aid balance during partnered beam-walking by creating opposing torques that primarily serve haptic communication, and our model of the torques suggests control parameters for implementing human-human balance aid in human-robot interactions.
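
To make the modeling step concrete, here is a minimal sketch of identifying a mass-spring-damper relationship between one torque component and the beam-walker’s motion by least squares. The signals are synthetic placeholders, not the study’s data, and all names and values are assumptions.

```python
# Sketch: fit tau = k*x + b*xdot + m*xddot to torque/motion data.
# Synthetic stand-in data; not the study's measurements.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)            # time (s)
x = 0.02 * np.sin(2 * np.pi * 0.5 * t)  # lateral sway (m), synthetic
dt = t[1] - t[0]
xdot = np.gradient(x, dt)               # sway velocity
xddot = np.gradient(xdot, dt)           # sway acceleration
tau = 80.0 * x + 6.0 * xdot + 1.5 * xddot + 0.05 * rng.standard_normal(t.size)

# Least-squares fit of [k, b, m]
A = np.column_stack([x, xdot, xddot])
(k, b, m), *_ = np.linalg.lstsq(A, tau, rcond=None)
print(f"stiffness k={k:.1f}, damping b={b:.1f}, mass m={m:.2f}")
# A negative fitted stiffness or damping would indicate an active,
# energy-injecting element rather than a passive one: the distinction
# the paper draws between the two torque components.
```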

Actuators
2022
Vol 11 (1)
pp. 24
Author(s):
Guan-Yang Liu
Yi Wang
Chao Huang
Chen Guan
Dong-Tao Ma
...

The goal of haptic feedback in robotic teleoperation is to enable users to accurately feel the interaction force measured at the slave side and thereby precisely understand what is happening in the slave environment. The accuracy of the feedback force, defined as the error between the actual feedback force felt by a user at the master side and the measured interaction force at the slave side, is the key performance indicator for haptic display in robotic teleoperation. In this paper, we evaluate haptic feedback accuracy in robotic teleoperation experimentally. A special interface, iHandle, and two haptic devices, iGrasp-T and iGrasp-R, designed for robotic teleoperation were developed for the evaluation. The iHandle integrates a high-performance force sensor and a micro attitude and heading reference system, which can be used to characterize human upper-limb motor abilities such as posture maintenance and force application. When a user is asked to grasp the iHandle and maintain a fixed position and posture, the fluctuation of hand posture is measured to be between 2 and 8 degrees. Based on the experimental results, human hand tremor, sensed by the haptic device as input noise, is found to be a major source of noise in the output force of the haptic device when a spring-damper model is used to render the feedback force. Haptic rendering algorithms should therefore be independent of hand motion information, to keep input noise from the human hand out of the haptic control loop in teleoperation. Moreover, the iHandle can be fixed at the end effector of the haptic devices iGrasp-T or iGrasp-R to measure the output force/torque delivered to the user. Experimental results show that the output force error of the haptic device iGrasp-T is approximately 0.92 N, and that using the force sensor in the iHandle can reduce this error to 0.1 N. Using a force sensor as the feedback link to form a closed-loop force control system is an effective way to improve the accuracy of the feedback force and guarantee high-fidelity feedback forces at the master side in robotic teleoperation.
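
To illustrate the contrast the abstract draws, here is a minimal sketch of the two rendering strategies: open-loop spring-damper rendering, which passes hand-motion noise straight into the output force, and sensor-based closed-loop correction. Gains and function names are assumptions, not the paper’s code.

```python
# Sketch of two force-rendering strategies; values are illustrative.

def render_spring_damper(x, xdot, k=500.0, b=5.0):
    """Open-loop rendering from hand position/velocity.
    Tremor in x and xdot feeds directly into the output force."""
    return -k * x - b * xdot

def render_closed_loop(f_desired, f_measured, f_cmd_prev, kp=0.8):
    """Closed-loop rendering: correct the commanded force using a
    force sensor at the handle, so accuracy no longer depends on
    hand motion."""
    error = f_desired - f_measured
    return f_cmd_prev + kp * error
```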


2010
Vol 31 (3)
pp. 380-384
Author(s):
Noah J. Rosenblatt
Mark D. Grabiner

2021
Vol 8 (1)
Author(s):
Mitsuhiro Kamezaki
Yusuke Uehara
Kohga Azuma
Shigeki Sugano

Disaster response robots are expected to perform complicated tasks such as traveling over unstable terrain, climbing slippery steps, and removing heavy debris. To complete such tasks safely, the robots must obtain not only visual-perceptual information (VPI), such as surface shape, but also haptic-perceptual information (HPI), such as the surface friction of objects in the environment. VPI can be obtained from laser sensors and cameras. In contrast, HPI can basically be obtained only from the results of physical interaction with the environment, e.g., reaction force and deformation. However, current robots have no function for estimating HPI. In this study, we propose a framework to estimate such physically interactive parameters (PIPs), including hardness, friction, and weight, which are vital parameters for safe robot-environment interaction. For effective estimation, we define a ground groping mode (GGM) and an object groping mode (OGM). The endpoint of the robot arm, which has a force sensor, actively touches, pushes, rubs, and lifts objects in the environment under hybrid position/force control, and the three kinds of PIPs are estimated from the measured reaction force and the displacement of the arm endpoint. The robot finally judges the accident risk based on the estimated PIPs, e.g., safe, attentional, or dangerous. We prepared environments that had the same surface shape but different hardness, friction, and weight. The experimental results indicated that the proposed framework could estimate PIPs adequately and was useful for judging risk and safely planning tasks.
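
The three PIPs correspond to standard first-order contact estimates. The sketch below shows those estimates plus an invented risk rule; the formulas are textbook approximations and the thresholds are assumptions, not the paper’s algorithm.

```python
# Sketch: first-order PIP estimates from endpoint force measurements.

def estimate_hardness(normal_force, indentation):
    # Contact stiffness from a push: k = F / delta (N/m)
    return normal_force / indentation

def estimate_friction(tangential_force, normal_force):
    # Coulomb friction coefficient from a rub: mu = F_t / F_n
    return tangential_force / normal_force

def estimate_weight(lift_force_after, lift_force_before):
    # Weight from a lift: load transferred to the arm (N)
    return lift_force_after - lift_force_before

def judge_risk(mu, mu_safe=0.6, mu_caution=0.3):
    # Example risk rule on friction alone; thresholds are invented
    if mu >= mu_safe:
        return "safe"
    if mu >= mu_caution:
        return "attentional"
    return "dangerous"
```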


2020
Vol 39 (6)
pp. 668-687
Author(s):
Alessandro Albini
Giorgio Cannata

This article deals with the problem of the recognition of human hand touch by a robot equipped with large-area tactile sensors covering its body. This problem is relevant in the domain of physical human–robot interaction for discriminating between human and non-human contacts, to trigger and drive cooperative tasks or robot motions, or to ensure a safe interaction. The underlying assumption used in this article is that voluntary physical interaction tasks involve hand touch over the robot body, and therefore the capability to recognize hand contacts is a key element in discriminating a purposive human touch from other types of interaction. The proposed approach is based on a geometric transformation of the tactile data, formed by pressure measurements associated with a non-uniform cloud of 3D points (taxels) spread over a non-linear manifold corresponding to the robot body, into tactile images representing the contact pressure distribution in two dimensions. Tactile images can be processed using deep learning algorithms to recognize human hands and to compute the pressure distribution applied by the various hand segments: the palm and individual fingers. Experimental results, performed on a real robot covered with robot skin, show the effectiveness of the proposed methodology. Moreover, to evaluate its robustness, various types of failures have been simulated. A further analysis concerning the transferability of the system has been performed, considering contacts occurring on a different sensorized robot part.
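
As a simplified stand-in for the geometric transformation described above, here is one way to flatten a taxel cloud into a 2D tactile image: a planar projection onto the cloud’s principal directions. The paper maps the actual body manifold; this projection, and all names below, are assumptions for illustration.

```python
# Sketch: project 3D taxels to a 2D pressure image.
import numpy as np

def taxels_to_image(points_3d, pressures, grid=(32, 32)):
    """points_3d: (N, 3) taxel positions; pressures: (N,) readings."""
    # Project onto the two principal directions of the taxel cloud
    centered = points_3d - points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    uv = centered @ vt[:2].T                 # (N, 2) surface coordinates
    # Bin pressures into a regular image grid (mean pressure per cell)
    img, _, _ = np.histogram2d(uv[:, 0], uv[:, 1], bins=grid,
                               weights=pressures)
    counts, _, _ = np.histogram2d(uv[:, 0], uv[:, 1], bins=grid)
    return img / np.maximum(counts, 1)
```

The resulting image could then be passed to a standard convolutional classifier to recognize hand contacts, in the spirit of the deep learning step the abstract describes.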


2011
Vol 20 (6)
pp. 577-590
Author(s):
Patrick J. Grabowski
Drew N. Rutherford
Andrea H. Mason

The modeling of human movement is vital for a complete understanding of complex human–computer interaction. As three-dimensional collaborative tangible user interfaces (TUIs) evolve, research is needed to understand how people physically interact with each other within a virtual environment. Previous work on physical collaboration in virtual environments has used Fitts’ law to model gross upper-extremity movement in a passing task. However, no study has modeled passing tasks that require a precision grasp with the human hand, an important feature of human–computer interaction in TUIs. The purpose of this study was to evaluate the validity of Fitts’ law in modeling movement time for a precision passing task in a 3D TUI, and to assess the coordination between passer and receiver using kinematic parameters. In this experiment, 12 participants (six male, mean age 22.6 years) performed a prehensile passing task within a desktop virtual environment. Results detail the kinematic events required to achieve the temporal and spatial coordination specific to the passing task. Further, results indicate that Fitts’ model does not adequately explain movement time for this task (R² = 0.51). This finding challenges the external validity of previous results. We argue that the task-specific complexity of human neuromotor control should be considered when using predictive models in 3D TUI design.
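
For reference, fitting Fitts’ law and computing the R² the study reports is a short calculation. The sketch below uses hypothetical placeholder data, not the experiment’s measurements.

```python
# Sketch: fit Fitts' law MT = a + b * ID, with ID = log2(2D / W),
# and report R^2. Data arrays are invented placeholders.
import numpy as np

D = np.array([0.10, 0.20, 0.30, 0.20, 0.10])   # movement distance (m)
W = np.array([0.02, 0.02, 0.04, 0.01, 0.01])   # target width (m)
MT = np.array([0.55, 0.71, 0.62, 0.86, 0.70])  # movement time (s)

ID = np.log2(2 * D / W)                        # index of difficulty (bits)
A = np.column_stack([np.ones_like(ID), ID])
(a, b), *_ = np.linalg.lstsq(A, MT, rcond=None)
pred = a + b * ID
r2 = 1 - np.sum((MT - pred) ** 2) / np.sum((MT - MT.mean()) ** 2)
print(f"a={a:.3f} s, b={b:.3f} s/bit, R^2={r2:.2f}")
```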


2013
Vol 4 (1)
pp. 1-9
Author(s):
Nayan M. Kakoty
Shyamanta M. Hazarika

This paper presents a two-layered control architecture, Superior Hand Control (SHC) followed by Local Hand Control (LHC), for an extreme upper-limb prosthesis. The architecture is for executing the grasping operations involved in 70% of daily living activities. The forearm-electromyogram-actuated SHC recognizes the user’s intended grasp; the LHC then controls the fingers to be actuated for the recognized grasp. Finger actuation is regulated through a proportional-integral-derivative (PID) controller customized with a fingertip force sensor. The LHC controls the joint angles and velocities of the fingers in the prosthetic hand, whose fingers emulate the dynamic constraints of human fingers. The joint angle trajectories and velocity profiles of the prosthetic hand’s fingers closely approximate those of the human finger.
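
As an illustration of the fingertip force regulation described above, here is a generic discrete PID step. The gains and the hardware interface functions are assumptions, not the paper’s implementation.

```python
# Sketch: one discrete PID update for fingertip force control.

def pid_step(setpoint, measured, state, kp=2.0, ki=0.5, kd=0.05, dt=0.01):
    """One PID update. state = (integral, previous_error)."""
    integral, prev_error = state
    error = setpoint - measured
    integral += error * dt
    derivative = (error - prev_error) / dt
    command = kp * error + ki * integral + kd * derivative
    return command, (integral, error)

# Usage inside a finger actuation loop (read_fingertip_force and
# drive_finger_motor are hypothetical hardware interfaces):
# state = (0.0, 0.0)
# while grasping:
#     command, state = pid_step(target_force, read_fingertip_force(), state)
#     drive_finger_motor(command)
```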


2019
Author(s):
Amir Boroomand-Tehrani
Andrew H. Huntley
David Jagroop
Jennifer L. Campos
Kara K. Patterson
...

Rapid motor learning may occur in situations where individuals perceive a threat of injury if they do not perform a task well. This rapid motor learning may be facilitated by improved motor performance and, consequently, more errorless practice. As a first step towards understanding the role of perceived threat in rapid motor learning, the purpose of this study was to determine how performance of a motor task is affected in situations where the perceived threat of injury is high. We hypothesized that perceived threat of injury in a virtual environment would result in improved performance of a walking task (i.e., walking on a narrow beam). Results demonstrated that increased perceived threat of injury yielded slightly better, but not statistically significant, balance performance in virtual environments (median percentage of successful steps: 78.8%, 48.3%, and 55.2% in the real low-threat, virtual low-threat, and virtual high-threat environments, respectively). These results may be partially attributed to habituation to the threat over time and practice. If implemented carefully, virtual reality technology can be an effective tool for investigating walking balance in environments that are perceived as threatening.


2021
Vol 15
Author(s):
Spencer W. Jensen
John L. Salmon
Marc D. Killpack

In this paper, we analyze and report observable trends in human-human dyads performing collaborative manipulation (co-manipulation) tasks with an extended object (an object of significant length). We present a detailed analysis relating trends in interaction forces and torques to other metrics, and we propose that these trends could provide a way of improving communication and efficiency for human-robot dyads. We find that the motion of the co-manipulated object has a measurable oscillatory component. We confirm that haptic feedback alone represents a sufficient communication channel for co-manipulation tasks; however, we find that the loss of the visual and auditory channels has a significant effect on interaction torque and velocity. The main objective of this paper is to lay the essential groundwork for defining principles of co-manipulation between human dyads. We propose that these principles could enable effective and intuitive human-robot collaborative manipulation in future co-manipulation research.
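
One plausible way to quantify the oscillatory component the authors report is a simple spectral check on the object’s velocity signal. This sketch uses a synthetic signal and an assumed sample rate, not the study’s recordings.

```python
# Sketch: detect a dominant oscillation in object motion via an FFT.
import numpy as np

fs = 100.0                                   # sample rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
velocity = 0.3 + 0.05 * np.sin(2 * np.pi * 0.9 * t)  # drift + oscillation

v = velocity - velocity.mean()               # remove the steady component
spectrum = np.abs(np.fft.rfft(v))
freqs = np.fft.rfftfreq(v.size, 1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(f"dominant oscillation at {dominant:.2f} Hz")
```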


2002
Vol 14 (5)
pp. 432-438
Author(s):
Yusuke Maeda
Takayuki Hara
Tamio Arai

In this paper, a control method for robots in cooperative human-robot handling of an object is investigated. We propose estimating human motion using the minimum jerk model for smooth cooperation. Using a nonlinear least-squares method, we identify the two parameters of a minimum-jerk trajectory of a human hand in real time. The estimated position of the human hand is used to determine the desired position of the manipulator’s end-effector in virtual compliance control. Motion estimation enables robots to coordinate actively even for unknown trajectories of manipulated objects intended by the human partners. We implemented the proposed method on an industrial manipulator with a force sensor. In experiments on cooperative manipulation of a rubber pipe, the motion estimation improved the human’s feeling of coordination. The improvement was quantitatively evaluated from the viewpoint of "unnecessary energy transfer."
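
The minimum jerk model has a closed-form position profile, shown in the sketch below. Fitting its two free parameters to partial hand observations could be done with a nonlinear least-squares routine such as scipy.optimize.curve_fit; the amplitude/duration parameterization and the numbers here are illustrative assumptions, not the paper’s identification scheme.

```python
# Sketch: minimum-jerk position profile for hand motion prediction.
import numpy as np

def minimum_jerk(t, x0, amplitude, duration):
    """Position along a minimum-jerk trajectory from x0 over `duration`."""
    s = np.clip(t / duration, 0.0, 1.0)      # normalized time in [0, 1]
    return x0 + amplitude * (10 * s**3 - 15 * s**4 + 6 * s**5)

# e.g., predict where the hand will be 0.3 s into a 1.2 s, 0.4 m reach:
print(minimum_jerk(0.3, x0=0.0, amplitude=0.4, duration=1.2))
```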


2016
Vol 860
pp. 1-6
Author(s):
Md Shad Rahman
Rasel A. Sultan
N.M. Hasan

This system is designed for advanced robotic control. It is based on sensor data acquisition and software data processing, which together control a robotic hand by hydraulic and electric means. The system has two sections: first, a data acquisition section that gathers readings from several sensors (gyro, flex, and pressure sensors); second, a software application system that drives the robotic hand from the processed data. A distinguishing feature of the system is that it gives precise control of the robotic arm by following human hand movement, and it also conveys touch and pressure sensations from the robotic hand back to the operator. This supports applications such as remote bomb disposal, hazardous environmental work, remote operation, and remote medical assistance.
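
As an illustration of the data-acquisition side described above, here is a minimal sketch that maps raw flex-sensor ADC readings to finger joint angles. The calibration constants and channel layout are invented for the example, not taken from the system.

```python
# Sketch: linearly map flex-sensor ADC readings to finger joint angles.

def flex_to_angle(raw, raw_straight=300, raw_bent=700, max_angle=90.0):
    """Map an ADC reading between the calibrated straight and fully
    bent values to a joint angle in degrees, clamped to [0, max_angle]."""
    fraction = (raw - raw_straight) / (raw_bent - raw_straight)
    return max_angle * min(max(fraction, 0.0), 1.0)

# Usage: one reading per finger, e.g. from a 5-channel ADC
readings = [310, 540, 700, 420, 380]
angles = [flex_to_angle(r) for r in readings]
print(angles)
```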

