Realization and Safety Measures of Patient Transfer by Nursing-Care Assistant Robot RIBA with Tactile Sensors

2011, Vol 23 (3), pp. 360-369
Author(s): Toshiharu Mukai, Shinya Hirano, Hiromichi Nakashima, Yuki Sakaida, ...

In aging societies, there is a strong demand for robotics to tackle problems caused by the aging population. Patient transfer, such as lifting and moving a bedridden patient from a bed to a wheelchair and back, is one of the most physically challenging tasks in nursing care, and its burden should be reduced by the introduction of robot technologies. To this end, we have developed a new prototype robot named RIBA, which has human-type arms with tactile sensors. RIBA succeeded in transferring a human from a bed to a wheelchair and back. The tactile sensors play important roles in sensor feedback and in detecting instructions from the operator. In this paper, after outlining the concept and specifications of RIBA, we explain the tactile information processing, its application to tactile feedback and instruction detection, and the safety measures required to realize patient transfer. The results of patient transfer experiments are also reported.
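The abstract does not describe the instruction-detection scheme in detail; purely as an illustration, the sketch below shows one plausible way a touch on an arm-mounted tactile array could be thresholded and mapped to a coarse command. The threshold, array size, and command labels are assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: classify an operator's touch on an arm-mounted
# tactile array as a coarse command. Thresholds and labels are assumptions,
# not taken from the RIBA paper.

PRESS_THRESHOLD = 5.0   # assumed activation threshold (arbitrary units)

def detect_instruction(tactile_frame: np.ndarray) -> str:
    """Map a 2D tactile pressure frame to a coarse operator command."""
    active = tactile_frame > PRESS_THRESHOLD
    if not active.any():
        return "none"
    # Centroid of the touched region decides the command.
    rows, _ = np.nonzero(active)
    row_c = rows.mean() / (tactile_frame.shape[0] - 1)  # 0 = near shoulder, 1 = near hand
    if row_c < 0.33:
        return "lift"
    elif row_c < 0.66:
        return "hold"
    return "lower"

# Example: a touch near the distal end of the array.
frame = np.zeros((16, 8))
frame[13:15, 3:5] = 8.0
print(detect_instruction(frame))  # -> "lower"
```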

2007, Vol 1 (3), pp. 217-224
Author(s): Saeed Sokhanvar, Mohammadreza Ramezanifard, Javad Dargahi, Muthukumaran Packirisamy

Minimally invasive surgery (MIS) is increasingly used in a variety of surgical routines despite significant shortcomings such as the lack of tactile feedback. Restoring this missing tactile information, particularly the information gained through tissue palpation, would significantly enhance MIS capabilities. Tissue palpation is particularly important and commonly used in locating embedded lumps. The present study is motivated by this major limitation of MIS procedures and aims to develop a system that effectively reconstructs the surgeon's lost palpation capability. By equipping MIS graspers with tactile sensors, the necessary information on the size and location of hidden features can be collected, processed, and graphically rendered to the surgeon. Using the proposed system, surgeons can therefore identify the presence or absence, location, and approximate size of hidden lumps simply by grasping the target organ with a smart endoscopic grasper. The results of experiments conducted on the prototyped MIS graspers, presented as graphical images, are compared with those of finite element models.
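As a rough illustration of how tactile readings from an instrumented grasper might be turned into lump location and size estimates, the following sketch applies simple baseline subtraction and thresholding to a 2D pressure map. The thresholding scheme and array dimensions are assumptions, not the authors' processing pipeline.

```python
import numpy as np

# Hypothetical sketch: estimate the location and approximate size of a lump
# from a grasper's 2D tactile pressure map. The baseline-subtraction and
# thresholding scheme here is an assumption, not the authors' method.

def locate_lump(pressure: np.ndarray, rel_threshold: float = 1.5):
    """Return (row, col) centroid and area (in taxels) of a stiff inclusion."""
    baseline = np.median(pressure)
    mask = pressure > rel_threshold * baseline
    if not mask.any():
        return None  # no lump detected
    rows, cols = np.nonzero(mask)
    centroid = (rows.mean(), cols.mean())
    area = int(mask.sum())
    return centroid, area

# Example: soft tissue with a stiffer inclusion raising local pressure.
tissue = np.full((10, 10), 2.0)
tissue[4:7, 5:8] += 4.0
print(locate_lump(tissue))  # -> ((5.0, 6.0), 9)
```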


2021, Vol 2021, pp. 1-11
Author(s): Yuxin Liu, Yuting Yin, Zhiwen Jiang, Shijie Guo

Patient transfer, such as carrying a bedridden patient from a bed to a pedestal pan or a wheelchair and back, is one of the most physically challenging tasks in nursing care facilities. To reduce the physical workload of nurses and caregivers, a piggyback transfer robot has been developed that imitates the motion of a person carrying another person on his/her back. As the chest holder supports most of the care-receiver's weight during transfer, a human-robot dynamic model was built to analyze how the motion of the chest holder influences comfort. Simulations and experiments were conducted, and the results demonstrated that the rotational motion of the chest holder is the key factor affecting comfort. A tactile-based impedance control law was developed to adjust the rotational motion. Subjective evaluations with ten healthy subjects showed that adjusting the rotational motion of the chest holder is an effective way to achieve a comfortable transfer.
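The paper reports a tactile-based impedance control law for the chest holder's rotation; the sketch below shows a generic single-axis impedance loop in which the holder yields when the measured chest contact force exceeds a reference. All gains, the reference force, and the integration scheme are assumptions, not the parameters used in the study.

```python
# Hypothetical sketch of a single-axis impedance law for the chest holder's
# rotation: the holder yields when the measured chest contact force exceeds
# a comfortable reference. The force error is treated as a generalized torque
# on the rotation axis (an illustrative simplification); all values are assumed.

def impedance_step(theta, omega, f_contact, dt,
                   f_ref=40.0,            # assumed comfortable contact force [N]
                   m=1.0, b=40.0, k=200.0):
    """One Euler step of  m*theta_dd + b*theta_d + k*theta = f_ref - f_contact."""
    alpha = ((f_ref - f_contact) - b * omega - k * theta) / m
    omega += alpha * dt
    theta += omega * dt
    return theta, omega

# Example: contact force is too high, so the holder rotates back slightly
# (negative angle) to relieve chest pressure.
theta, omega = 0.0, 0.0
for _ in range(200):
    theta, omega = impedance_step(theta, omega, f_contact=60.0, dt=0.01)
print(round(theta, 3))  # settles near -(60 - 40) / k = -0.1 rad
```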


2014, Vol 26 (6), pp. 743-749
Author(s): Yuki Mori, Ryojun Ikeura, Ming Ding, ...

[Figure: Position estimation by forearms]
For a robot that uses two arms to lift and transfer a care receiver from a bed to a wheelchair, we report a method of estimating the positioning of the care receiver. The maneuver for such a task involves a high number of degrees of freedom, and the robot is capable of executing the maneuver much like a human being. Depending on the positioning of contact between the robot’s arms and the care receiver, however, the care receiver may experience pain or become unstable when being carried. For this reason, nursing care robots must be able to recognize the positioning of contact with the care receiver and either modify it or alert the operator if it is unsuitable. We use the information obtained by tactile sensors on the robot’s arms when making contact with the care receiver to estimate the latter’s positioning. By dividing a care receiver’s position on a bed into nine zones and applying machine learning to the tactile sensor data and positioning, it is possible to estimate positioning with high accuracy.
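The abstract does not name the learning method; as an illustrative sketch only, the snippet below trains a generic classifier (a random forest from scikit-learn) on synthetic data to map flattened tactile frames to one of nine bed zones. The sensor dimensions, synthetic features, and choice of classifier are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch: classify the care-receiver's position into one of nine
# bed zones from arm-mounted tactile data. Sensor size, synthetic data, and
# the classifier choice are assumptions for illustration.

rng = np.random.default_rng(0)
N_SAMPLES, N_TAXELS, N_ZONES = 900, 64, 9

# Synthetic training data: each zone shifts the mean tactile pattern slightly.
zones = rng.integers(0, N_ZONES, size=N_SAMPLES)
X = rng.normal(size=(N_SAMPLES, N_TAXELS)) + zones[:, None] * 0.5
y = zones

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Estimate the position zone for a new tactile frame.
new_frame = rng.normal(size=(1, N_TAXELS)) + 4 * 0.5
print(clf.predict(new_frame))  # likely zone 4 on this toy data
```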


2021, Vol 6 (51), pp. eabc8801
Author(s): Youcan Yan, Zhe Hu, Zhengbao Yang, Wenzhen Yuan, Chaoyang Song, ...

Human skin can sense subtle changes in both normal and shear forces (i.e., it is self-decoupled) and perceive stimuli with finer resolution than the average spacing between mechanoreceptors (i.e., it is super-resolved). By contrast, existing tactile sensors for robotic applications are inferior, lacking accurate force decoupling and adequate spatial resolution at the same time. Here, we present a soft tactile sensor with self-decoupling and super-resolution abilities based on a sinusoidally magnetized flexible film (approximately 0.5 millimeter thick), whose deformation can be detected by a Hall sensor from the change in magnetic flux density under external forces. The sensor can accurately measure the normal force and the shear force (demonstrated in one dimension) with a single sensing unit and achieve a 60-fold super-resolved accuracy enhanced by deep learning. By mounting our sensor at the fingertip of a robotic gripper, we show that robots can accomplish challenging tasks such as stably grasping fragile objects under external disturbance and threading a needle via teleoperation. This research provides new insight into tactile sensor design and could benefit various applications in robotics, such as adaptive grasping, dexterous manipulation, and human-robot interaction.
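The decoupling principle can be illustrated with a toy calibration in which the normal force is read from the change in the out-of-plane flux component and the shear force from the ratio of in-plane to out-of-plane components. The linear coefficients and this specific mapping are illustrative assumptions, not the paper's calibration.

```python
# Hypothetical sketch of decoupled force estimation from Hall-sensor readings.
# In the described design, normal and shear loads affect different
# characteristics of the magnetic flux; the linear calibration below
# (coefficients C_N, C_S and the use of Bz and the Bx/Bz ratio) is an
# illustrative assumption.

C_N = 12.0   # assumed N per mT of normal flux change
C_S = 3.0    # assumed N per unit change of the Bx/Bz ratio

def estimate_forces(bx, bz, bz0):
    """Estimate (normal, shear) force from flux densities (mT)."""
    f_normal = C_N * (bz0 - bz)        # film compression reduces Bz
    f_shear = C_S * (bx / bz)          # lateral shift changes the ratio
    return f_normal, f_shear

print(estimate_forces(bx=0.4, bz=3.2, bz0=3.6))  # -> (4.8, 0.375)
```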


2021, pp. 1-54
Author(s): Yuxin Liu, Shijie Guo, Yuting Yin, Zhiwen Jiang, Teng Liu

Patient transfer, such as lifting and moving a bedridden patient from a bed to a wheelchair or a pedestal pan, is one of the most physically challenging tasks in nursing care. Although many transfer devices have been developed, they are rarely used because performing transfer tasks with them is time consuming and because of concerns about safety and comfort. We developed a piggyback transfer robot that conducts patient transfer by imitating the motion of a person carrying another person on his/her back. The robot consists of a chest holder that moves like a human back. In this paper, we present an active stiffness control approach for the motion control of the chest holder, combined with a passive cushion, for lifting a care-receiver comfortably. A human-robot dynamic model was built and a subjective evaluation was conducted to optimize the parameters of both the active stiffness control and the passive cushion of the chest holder. The test results of 10 subjects demonstrated that the robot could transfer a subject safely and that the combination of active and passive stiffness was essential to a comfortable transfer. The objective evaluation showed that an active stiffness of k = 4 kPa/mm, along with a passive stiffness lower than that of the human chest, contributed to a comfortable feeling.
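Only the stiffness value k = 4 kPa/mm comes from the abstract; the sketch below shows what a simple active-stiffness (virtual spring) command for the chest holder might look like, with an assumed saturation limit added for safety.

```python
# Hypothetical sketch of an active-stiffness command for the chest holder:
# the holder behaves as a virtual spring, commanding contact pressure in
# proportion to measured deflection. Only k = 4 kPa/mm comes from the
# abstract; the saturation cap and loop structure are assumed.

K_ACTIVE = 4.0      # kPa per mm of deflection (reported value)
P_MAX = 20.0        # assumed safety cap on commanded pressure [kPa]

def commanded_pressure(deflection_mm: float) -> float:
    """Virtual-spring pressure command, saturated for safety."""
    return min(K_ACTIVE * max(deflection_mm, 0.0), P_MAX)

for d in (0.0, 1.5, 3.0, 10.0):
    print(d, "mm ->", commanded_pressure(d), "kPa")
# 10 mm saturates at the assumed 20 kPa cap.
```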


2021, Vol 5 (ISS), pp. 1-17
Author(s): Yosra Rekik, Edward Lank, Adnane Guettaf, Laurent Grisoni

Alongside vision and sound, hardware systems can readily be designed to support various forms of tactile feedback; however, while a significant body of work has explored enriching visual and auditory communication with interactive systems, tactile information has not received the same level of attention. In this work, we explore increasing the expressivity of tactile feedback by allowing the user to dynamically select between several channels of tactile feedback using variations in finger speed. In a controlled experiment, we show that a user can learn the dynamics of eyes-free tactile channel selection among different channels and can reliably discriminate between different tactile patterns during multi-channel selection, with an accuracy of up to 90% when using two finger speed levels. We discuss the implications of this work for richer, more interactive tactile interfaces.
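A minimal sketch of speed-based channel selection follows, assuming a single speed threshold splitting two channels; the threshold value and channel patterns are placeholders, not the parameters studied in the experiment.

```python
# Hypothetical sketch: select a tactile feedback channel from finger speed.
# The two-level split mirrors the two speed levels evaluated in the study,
# but the threshold value and channel patterns are assumptions.

SPEED_THRESHOLD = 120.0   # assumed split point in mm/s

def select_channel(finger_speed_mm_s: float) -> str:
    """Map instantaneous finger speed to one of two tactile channels."""
    return "channel_slow" if finger_speed_mm_s < SPEED_THRESHOLD else "channel_fast"

def render_feedback(channel: str) -> str:
    # Each channel plays a distinct vibrotactile pattern (placeholder strings).
    patterns = {"channel_slow": "long low-frequency pulse",
                "channel_fast": "short high-frequency burst"}
    return patterns[channel]

print(render_feedback(select_channel(80.0)))    # slow swipe -> long pulse
print(render_feedback(select_channel(250.0)))   # fast swipe -> short burst
```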


2019, Vol 4 (27), pp. eaau8892
Author(s): Edoardo D’Anna, Giacomo Valle, Alberto Mazzoni, Ivo Strauss, Francesco Iberite, ...

Current myoelectric prostheses allow transradial amputees to regain voluntary motor control of their artificial limb by exploiting residual muscle function in the forearm. However, the overreliance on visual cues resulting from a lack of sensory feedback is a common complaint. Recently, several groups have provided tactile feedback to upper limb amputees using implanted electrodes, surface nerve stimulation, or sensory substitution. These approaches have led to improved function and prosthesis embodiment. Nevertheless, the provided information remains limited to a subset of the rich sensory cues available to healthy individuals. More specifically, proprioception, the sense of limb position and movement, is predominantly absent from current systems. Here, we show that sensory substitution based on intraneural stimulation can deliver position feedback in real time and in conjunction with somatotopic tactile feedback. This approach allowed two transradial amputees to regain high, close-to-natural remapped proprioceptive acuity, with a median joint angle reproduction precision of 9.1° and a median threshold to detection of passive movements of 9.5°, comparable with results obtained in healthy participants. The simultaneous delivery of position information and somatotopic tactile feedback allowed both amputees to discriminate the size and compliance of four objects with high levels of performance (75.5%). These results demonstrate that tactile information delivered via somatotopic neural stimulation and position information delivered via sensory substitution can be exploited simultaneously and efficiently by transradial amputees. This study paves the way toward more sophisticated bidirectional bionic limbs conveying richer, multimodal sensations.
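Sensory substitution of position feedback amounts to remapping a measured joint angle onto a stimulation parameter; the sketch below shows a generic linear remapping onto a bounded stimulation current. The angle range, current bounds, and linearity are assumptions, not the study's calibrated encoding.

```python
# Hypothetical sketch: remap a prosthesis joint angle to an intraneural
# stimulation amplitude (sensory substitution). The angle range, current
# bounds, and linear mapping are illustrative assumptions.

ANGLE_MIN, ANGLE_MAX = 0.0, 90.0     # assumed joint range [deg]
I_MIN, I_MAX = 20.0, 80.0            # assumed perceptual-to-maximum current [uA]

def angle_to_current(angle_deg: float) -> float:
    """Linearly remap joint angle to stimulation current, clamped to range."""
    a = min(max(angle_deg, ANGLE_MIN), ANGLE_MAX)
    frac = (a - ANGLE_MIN) / (ANGLE_MAX - ANGLE_MIN)
    return I_MIN + frac * (I_MAX - I_MIN)

print(angle_to_current(45.0))   # -> 50.0 uA at mid-range flexion
```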


2018, Vol 9 (1), pp. 221-234
Author(s): João Avelino, Tiago Paulino, Carlos Cardoso, Ricardo Nunes, Plinio Moreno, ...

Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control both to plan the arm’s motion and to achieve a confident yet pleasant grasp of the human user’s hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments with twenty, thirty-five, and thirty-eight human subjects, respectively. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; (iii) develop and evaluate a hand grip controller based on these data. In addition to the robot-human interactions, we also study the robot’s handshake interactions with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertoire of social skills of our robot, fulfilling a demand previously identified by many users of the robot.
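As an illustration of the kind of grip controller such data could drive, the sketch below closes the hand until the tactile sensors reach a preferred pressure and applies a crude compliance heuristic to guess whether the grasped item is a human hand. The preference values and heuristic are assumptions, not the controller developed in the paper.

```python
# Hypothetical sketch: close the robot hand until the tactile sensors reach a
# preferred grip pressure, and flag whether the grasped item feels like a
# human hand. Preference values and the compliance heuristic are assumptions.

PREFERRED_PRESSURE = {"light": 0.8, "medium": 1.2, "firm": 1.6}  # assumed [N/cm^2]

def grip_closure(tactile_pressure_stream, preference="medium", step_deg=2):
    """Return the closure angle at which the preferred pressure is reached."""
    target = PREFERRED_PRESSURE[preference]
    closure = 0
    for pressure in tactile_pressure_stream:
        if pressure >= target:
            break
        closure += step_deg
    return closure

def looks_like_human_hand(pressure_trace) -> bool:
    # Crude heuristic: a compliant human hand squeezes back, so the pressure
    # trace keeps varying after contact; a rigid object plateaus.
    tail = pressure_trace[-5:]
    return (max(tail) - min(tail)) > 0.1

stream = [0.2, 0.5, 0.9, 1.25, 1.4]
print(grip_closure(stream))                # closure angle when target reached
print(looks_like_human_hand(stream))       # True for this varying trace
```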

