Biomimetic Tactile Sensor

Author(s):  
Nicholas Wettels ◽  
Djordje Popovic ◽  
Gerald E. Loeb

The performance of prosthetic hands and robotic manipulators is severely limited by the little or no tactile information available to them compared with the human hand. Technologies such as MEMS, microfluidics, and nanoparticles have been used to produce arrays of force sensors, but these are generally not robust enough to mount on curved, deformable finger pads or to use in environments that include dust, fluids, sharp edges, and wide temperature swings. Furthermore, it is not clear how a prosthetic controller would use the tactile information, so it is difficult to generate specifications for these sensors.

Author(s):  
Wataru Fukui ◽  
Futoshi Kobayashi ◽  
Fumio Kojima ◽  
Hiroyuki Nakamoto ◽  
Tadashi Maeda ◽  
...  

1995 ◽  
Vol 200 (1) ◽  
pp. 25-28 ◽  
Author(s):  
Alfons Schnitzler ◽  
Riitta Salmelin ◽  
Stephan Salenius ◽  
Veikko Jousmäki ◽  
Riitta Hari

2019 ◽  
Vol 5 (1) ◽  
pp. 207-210
Author(s):  
Tolgay Kara ◽  
Ahmad Soliman Masri

Abstract
Millions of people around the world have lost their upper limbs, mainly due to accidents and wars. Recently, in the Middle East, demand for prosthetic limbs has increased dramatically because of ongoing wars in the region. Commercially available prosthetic limbs are expensive, while the most economical method for controlling them is electromyography (EMG). Research on EMG-controlled prosthetic limbs faces several challenges, including efficiency problems in terms of functionality, especially in prosthetic hands. A major unsolved issue is that currently available low-cost EMG-controlled prosthetic hands can neither grasp objects of various shapes nor provide efficient use of an object by deciding the necessary hand gesture. In this paper, a computer-vision-based mechanism is proposed for detecting and recognizing objects and applying the optimal hand gesture through visual feedback. Objects are classified into groups, and the hand gesture that grasps and uses the targeted object most efficiently for the user is applied. A simulation model of human hand kinematics is developed to reveal the efficacy of the proposed method. In simulation tests, 80 types of objects are detected, recognized, and classified, and the action can be performed using input from only two electrodes. Simulation results reveal the performance of the proposed EMG-controlled prosthetic hand in maintaining optimal hand gestures in a computer environment. The results are promising for helping disabled people handle and use objects more efficiently without higher costs.
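The core selection step described above, vision picks the grasp while EMG supplies only the open/close command, can be sketched as a lookup from a recognized object class to a grasp type. The class names, gesture labels, and two-state EMG trigger below are illustrative assumptions, not the paper's actual 80-object taxonomy.

```python
# Illustrative sketch: map recognized object classes to grasp gestures.
# Classes, gestures, and the single open/close EMG channel are assumptions
# for demonstration; the paper's actual taxonomy is not reproduced here.

GRASP_FOR_CLASS = {
    "bottle": "cylindrical",
    "coin":   "pinch",
    "ball":   "spherical",
    "key":    "lateral",
    "plate":  "palmar",
}

def select_gesture(object_class, emg_close):
    """Pick a grasp gesture from vision; EMG contributes only open/close."""
    gesture = GRASP_FOR_CLASS.get(object_class, "power")  # fallback grasp
    action = "close" if emg_close else "open"
    return gesture, action

print(select_gesture("bottle", True))    # -> ('cylindrical', 'close')
print(select_gesture("unknown", False))  # -> ('power', 'open')
```

The design point this illustrates is the division of labor: the hard many-class decision is moved to the camera, so two electrodes suffice for the residual binary command.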


2020 ◽  
Vol 17 (4) ◽  
pp. 172988142093232
Author(s):  
Bing Zhang ◽  
Bowen Wang ◽  
Yunkai Li ◽  
Shaowei Jin

Tactile information is valuable in determining properties of objects that are inaccessible to visual perception. A new magnetostrictive tactile sensor for tangential friction and normal contact force was developed based on the inverse magnetostrictive effect, and a force output model was established. It can measure exerted forces in the range of 0–4 N, and it responds well to dynamic forces with periods of 0.25–0.5 s. We present a tactile perception strategy in which a manipulator with tactile sensors in its grippers manipulates an object to measure a set of tactile features. The results show that the tactile sensing system can use these features and the extreme learning machine algorithm to recognize household objects, purely from tactile sensing, from a small training set. The confusion matrices show a recognition rate of up to 83%.
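An extreme learning machine, as used above, is a single-hidden-layer network whose hidden weights are random and fixed; only the output layer is solved by least squares. The sketch below is a minimal pure-Python version with invented toy "tactile feature" vectors, not the authors' implementation; the hidden size and ridge term are assumptions.

```python
# Minimal extreme learning machine (ELM) sketch: random fixed hidden layer,
# ridge-regularized least-squares output layer. Data and sizes are toy
# assumptions for illustration, not the paper's setup.
import math, random

random.seed(0)

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def solve(A, B):
    """Solve A X = B via Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [A[i][:] + B[i][:] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[c])]
    return [row[n:] for row in M]

def sigmoid_layer(X, W, b):
    return [[1.0 / (1.0 + math.exp(-(s + bi))) for s, bi in zip(row, b)]
            for row in matmul(X, W)]

def elm_fit(X, T, hidden=6, ridge=1e-3):
    d = len(X[0])
    W = [[random.uniform(-1, 1) for _ in range(hidden)] for _ in range(d)]
    b = [random.uniform(-1, 1) for _ in range(hidden)]
    H = sigmoid_layer(X, W, b)
    Ht = transpose(H)
    A = matmul(Ht, H)
    for i in range(hidden):
        A[i][i] += ridge  # ridge keeps the normal equations well-conditioned
    beta = solve(A, matmul(Ht, T))
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    scores = matmul(sigmoid_layer(X, W, b), beta)
    return [row.index(max(row)) for row in scores]

# Two well-separated toy "object" clusters with one-hot targets:
X = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
     [1.0, 1.0], [0.9, 1.0], [1.0, 0.9], [0.9, 0.9]]
T = [[1, 0]] * 4 + [[0, 1]] * 4
model = elm_fit(X, T)
print(elm_predict(model, X))
```

The appeal for small tactile training sets is that fitting is a single linear solve, with no iterative backpropagation.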


Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 523 ◽  
Author(s):  
Brayan Zapata-Impata ◽  
Pablo Gil ◽  
Fernando Torres

Robotic manipulators have to deal constantly with the complex task of detecting whether a grasp is stable or, in contrast, whether the grasped object is slipping. Recognising the type of slippage (translational or rotational) and its direction is more challenging than detecting stability alone, but it is simultaneously of greater use for correcting the aforementioned grasping issues. In this work, we propose a learning methodology for detecting the direction of a slip (seven categories) using spatio-temporal tactile features learnt from one tactile sensor. Tactile readings are therefore pre-processed and fed to a ConvLSTM that learns to detect these directions with just 50 ms of data. We have extensively evaluated the performance of the system and have achieved relatively high accuracy in detecting the direction of slip on unseen objects with familiar properties (82.56%).
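The paper's detector is a learned ConvLSTM over tactile frames; as a much simpler, hedged illustration of the same spatio-temporal idea, the sketch below tracks the pressure centroid across a short window of frames and labels its motion. The grid, threshold, and four-direction labels are invented for illustration and do not reproduce the paper's seven categories.

```python
# Simplified slip-direction cue from a short window of 2-D tactile frames:
# track the pressure centroid and label its motion. This is NOT the paper's
# ConvLSTM; it only illustrates what a spatio-temporal feature captures.
import math

def centroid(frame):
    """Pressure-weighted centroid of a 2-D tactile frame (list of rows)."""
    total = sum(sum(row) for row in frame)
    cx = sum(x * v for row in frame for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(frame) for v in row) / total
    return cx, cy

def slip_direction(frames, eps=0.05):
    """Classify centroid motion over the window into a coarse label."""
    (x0, y0), (x1, y1) = centroid(frames[0]), centroid(frames[-1])
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < eps:
        return "stable"
    angle = math.degrees(math.atan2(dy, dx)) % 360
    labels = ["east", "south", "west", "north"]  # y grows downward here
    return labels[int((angle + 45) % 360 // 90)]

frame_a = [[0, 1, 0],
           [0, 2, 0],
           [0, 0, 0]]
frame_b = [[0, 0, 1],
           [0, 0, 2],
           [0, 0, 0]]
print(slip_direction([frame_a, frame_b]))  # centroid moved +x -> "east"
```

A learned model replaces this hand-crafted rule precisely because real slips (especially rotational ones) do not reduce to a clean centroid translation.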


2020 ◽  
Vol 6 (16) ◽  
pp. eaaz1158 ◽  
Author(s):  
Yitian Shao ◽  
Vincent Hayward ◽  
Yon Visell

A key problem in the study of the senses is to describe how sense organs extract perceptual information from the physics of the environment. We previously observed that dynamic touch elicits mechanical waves that propagate throughout the hand. Here, we show that these waves produce an efficient encoding of tactile information. The computation of an optimal encoding of thousands of naturally occurring tactile stimuli yielded a compact lexicon of primitive wave patterns that sparsely represented the entire dataset, enabling touch interactions to be classified with an accuracy exceeding 95%. The primitive tactile patterns reflected the interplay of hand anatomy with wave physics. Notably, similar patterns emerged when we applied efficient encoding criteria to spiking data from populations of simulated tactile afferents. This finding suggests that the biomechanics of the hand enables efficient perceptual processing by effecting a preneuronal compression of tactile information.
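The "compact lexicon that sparsely represents the dataset" above is an instance of sparse coding: expressing each signal as a weighted sum of a few dictionary atoms. The toy sketch below uses greedy matching pursuit over a tiny hand-picked dictionary; the paper learned its lexicon from thousands of real tactile wave recordings, so both the atoms and the signal here are illustrative assumptions.

```python
# Toy sparse-coding sketch: greedy matching pursuit over a small dictionary
# of unit-norm "primitive patterns". Dictionary and signal are invented;
# the paper's lexicon was learned from measured wave data.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, dictionary, n_atoms=2):
    """Represent `signal` as a sparse sum of dictionary atoms."""
    residual = list(signal)
    code = {}
    for _ in range(n_atoms):
        # pick the atom most correlated with the current residual
        k = max(range(len(dictionary)),
                key=lambda i: abs(dot(residual, dictionary[i])))
        c = dot(residual, dictionary[k])
        code[k] = code.get(k, 0.0) + c
        residual = [r - c * a for r, a in zip(residual, dictionary[k])]
    return code, residual

atoms = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.7071, 0.7071],
]
signal = [2.0, 0.0, 1.0, 1.0]
code, residual = matching_pursuit(signal, atoms)
print(code)  # two nonzero coefficients reconstruct this signal
```

Sparsity is the point: each stimulus activates only a few primitives, which is what makes the 95%-accurate classification over the compact lexicon plausible.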


2007 ◽  
Vol 19 (1) ◽  
pp. 42-51 ◽  
Author(s):  
Tomoyuki Noda ◽  
Takahiro Miyashita ◽  
Hiroshi Ishiguro ◽  
Kiyoshi Kogure ◽  
...  

To extract information about users physically contacting robots, the distribution density of tactile sensor elements, the sampling rate, and the resolution must all be high, which increases the volume of tactile information. In the self-organized skin sensor network we propose for handling the large number of tactile sensors embedded throughout a humanoid robot, each network node has a processing unit and is connected to tactile sensor elements and to other nodes. By processing tactile information in the network according to the situation, individual nodes process and reduce information rapidly at high sampling rates. They also secure information transmission routes to the host PC using a data transmission protocol for self-organizing sensor networks. In this paper, we verify the effectiveness of our proposal through sensor network emulation and through basic experiments on spatiotemporal processing of tactile information using prototype hardware. In the emulation of the self-organized sensor network, routes to the host PC are secured at each node, and a tree-like network is constructed recursively with each node as a root. In the basic experiments, we describe edge detection as an example of data processing and feature extraction for haptic interaction. In conclusion, local information processing is effective for detecting features of haptic interaction.
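The route-securing behavior described above, where every node ends up with a path to the host and the network forms a tree, can be sketched as breadth-first parent selection from the host outward. The toy topology and the use of plain BFS are assumptions for illustration; the paper's protocol is a distributed self-organizing one, not a centralized search.

```python
# Sketch of the routing outcome: each node secures a route to the host by
# adopting a parent one hop closer, yielding a tree rooted at the host.
# The topology is an assumed toy graph, and BFS stands in for the paper's
# distributed self-organizing protocol.
from collections import deque

def build_routes(adjacency, host):
    """BFS from the host; each node's parent is its next hop toward the host."""
    parent = {host: None}
    queue = deque([host])
    while queue:
        node = queue.popleft()
        for nb in adjacency[node]:
            if nb not in parent:
                parent[nb] = node
                queue.append(nb)
    return parent

# Host connected to node "a", which fans out to "b" and "c":
adjacency = {"host": ["a"], "a": ["host", "b", "c"], "b": ["a"], "c": ["a"]}
routes = build_routes(adjacency, "host")
print(routes)  # {'host': None, 'a': 'host', 'b': 'a', 'c': 'a'}
```

Once such parents exist, each node can aggregate or reduce its children's tactile data locally before forwarding, which is the bandwidth argument the abstract makes.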


2006 ◽  
Vol 326-328 ◽  
pp. 1343-1346
Author(s):  
Jin Seok Heo ◽  
Jong Ha Cheung ◽  
Jung Ju Lee

In this paper, we present a newly designed flexible optical fiber force sensor that uses fiber Bragg gratings (FBGs) with a diaphragm-and-bridge transducer to detect a distributed normal force; it is a first step toward realizing a tactile sensor based on optical fiber sensors. The transducer is designed so that it is not affected by chirping or light loss, which enhances the performance of the sensor. We also present the design, the fabrication process, and the experimental verification of the prototype sensors.
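For readers unfamiliar with FBG sensing, the measurement chain is: force strains the grating, strain shifts the Bragg wavelength, and the shift is read out optically. The back-of-the-envelope sketch below uses the standard shift-to-strain relation with a typical textbook photo-elastic coefficient and an assumed linear force calibration; none of the constants are the paper's measured values.

```python
# Back-of-the-envelope FBG sketch: Bragg-wavelength shift -> strain -> force.
# Uses the standard relation d_lambda / lambda_B = (1 - p_e) * strain.
# Constants are typical textbook values and an assumed calibration,
# not this paper's measurements.
P_E = 0.22          # effective photo-elastic coefficient of silica (typical)
LAMBDA_B = 1550e-9  # Bragg wavelength in meters (assumed)

def strain_from_shift(d_lambda):
    """Axial strain from a Bragg-wavelength shift in meters."""
    return d_lambda / (LAMBDA_B * (1 - P_E))

def force_from_shift(d_lambda, n_per_strain=2.0e3):
    """Assumed linear transducer: force = k * strain (k in N per unit strain)."""
    return n_per_strain * strain_from_shift(d_lambda)

# A 1.2 pm shift at 1550 nm corresponds to roughly 1 microstrain:
eps = strain_from_shift(1.2e-12)
print(f"{eps:.3e} strain, {force_from_shift(1.2e-12):.4f} N")
```

The diaphragm-and-bridge transducer in the paper exists precisely to make the force-to-strain step linear and free of chirping, so that a single-wavelength readout like this remains valid.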


2020 ◽  
Author(s):  
Gang Liu ◽  
Lu Wang ◽  
Jing Wang

Myoelectric prosthetic hands create the possibility for amputees to control their prostheses like native hands. However, user acceptance of existing myoelectric prostheses is low; unnatural control, lack of sufficient feedback, and insufficient functionality are cited as the primary reasons. Although many multiple-degrees-of-freedom (DOF) prosthetic hands and tactile-sensitive electronic skins have recently been developed, no non-invasive myoelectric interface can decode both forces and motions for five fingers independently and simultaneously. This paper proposes a myoelectric interface based on an energy-allocation and fictitious-forces hypothesis that mimics the natural neuromuscular system. The energy-based interface uses continuous "energy modes" at the level of the entire hand. Depending on the task itself, each energy mode can adaptively and simultaneously implement multiple hand motions and exert continuous forces with a single finger. Moreover, a few learned energy modes can extend to unlearned energy modes, highlighting the extensibility of this interface. We evaluate the proposed system through off-line analysis and operational experiments on the expression of unlearned hand motions, the amount of finger energy, and real-time control. With active exploration, the participant proficiently exerted just enough energy with all five fingers on "fragile" or "heavy" objects independently, proportionally, and simultaneously in real time. The main contribution of this paper is the bionic energy-motion model of the hand: decoding a few muscle-energy modes of the human hand (only ten modes in this paper) to map a large number of tasks of the bionic hand.
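The mode idea above, a handful of hand-level energy profiles combining to cover many tasks, can be sketched as a linear blend of fixed per-finger profiles weighted by EMG-derived coefficients. The two modes, their profiles, and the blend weights below are invented for illustration; the paper's ten modes and decoding pipeline are not reproduced.

```python
# Sketch of the energy-mode idea: a small set of hand-level "modes", each a
# fixed per-finger energy profile, combined linearly by EMG-derived weights.
# The modes and numbers are illustrative assumptions, not the paper's model.
FINGERS = ["thumb", "index", "middle", "ring", "little"]

MODES = {
    "pinch": [0.8, 0.8, 0.1, 0.0, 0.0],  # thumb + index dominate
    "power": [0.6, 0.7, 0.7, 0.7, 0.6],  # whole-hand wrap
}

def finger_energy(weights):
    """Blend mode profiles by nonnegative weights into per-finger energy."""
    out = [0.0] * len(FINGERS)
    for mode, w in weights.items():
        for i, e in enumerate(MODES[mode]):
            out[i] += w * e
    return dict(zip(FINGERS, out))

# Mostly pinch, with a little whole-hand co-contraction mixed in:
print(finger_energy({"pinch": 1.0, "power": 0.25}))
```

The extensibility claim maps onto this picture naturally: a blend of learned modes can produce a per-finger pattern that was never trained as a mode of its own.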

