Design of a flexible tactile sensor for classification of rigid and deformable objects

2014 ◽  
Vol 62 (1) ◽  
pp. 3-15 ◽  
Author(s):  
Alin Drimus ◽  
Gert Kootstra ◽  
Arne Bilberg ◽  
Danica Kragic
2007 ◽  
Vol 19 (1) ◽  
pp. 85-96 ◽  
Author(s):  
Kenshi Watanabe ◽  
Kenichi Ohkubo ◽  
Sumiaki Ichikawa ◽  
Fumio Hara ◽  
...  

We propose classifying cylindrical objects with soft tactile sensor arrays mounted on a single five-link robotic finger. The front of each link is covered with a semicircular silicone rubber layer containing 235 small on-off switches. The on-off switch data recorded while an object is grasped are converted into a spatiotemporal matrix. The eight cells surrounding a contacted switch capture the local spatiotemporal contact behavior, so the frequency of each 8-cell binary pattern around the contacted switches is computed for every object and used to form a contact-feature vector. Ten such vectors, one per experimental trial, are obtained for each object. The vectors are classified by Mahalanobis distance over 12 objects, cylinders and regular polygonal prisms, which give rise to 14 types of grasp (14 classes). Using 6-dimensional feature vectors, classification accuracy above 95% is obtained for the 7 classes derived from the 5 objects that admit one or two types of stable grasp.
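
The processing pipeline described above can be illustrated with a short sketch. The code below is not the authors' implementation but a minimal Python illustration under stated assumptions: tactile frames arrive as binary (T, H, W) arrays, the 256 possible 8-neighbor patterns around every activated switch are counted into a frequency histogram, and the resulting vector is assigned to the class with the smallest Mahalanobis distance. The reduction to the 6-dimensional feature vectors used in the paper is not reproduced.

```python
import numpy as np

def eight_cell_histogram(frames):
    """Frequency of each 8-neighbor binary pattern around every activated
    switch, accumulated over all frames of one grasp.

    frames: binary array of shape (T, H, W), one 0/1 switch map per time step.
    Returns a normalized 256-bin histogram (one bin per possible pattern)."""
    # Offsets of the 8 neighbors, ordered clockwise from the top-left cell.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    hist = np.zeros(256)
    T, H, W = frames.shape
    for t in range(T):
        ys, xs = np.nonzero(frames[t])            # switches currently in contact
        for y, x in zip(ys, xs):
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                ny, nx = y + dy, x + dx
                if 0 <= ny < H and 0 <= nx < W and frames[t, ny, nx]:
                    code |= 1 << bit
            hist[code] += 1
    return hist / max(hist.sum(), 1.0)

def mahalanobis_sq(v, mean, cov_inv):
    """Squared Mahalanobis distance of feature vector v to one class model."""
    d = v - mean
    return float(d @ cov_inv @ d)

def classify(v, class_models):
    """Assign v to the class with the smallest Mahalanobis distance.
    class_models maps label -> (mean, inverse covariance), both estimated
    from that class's training trials (10 per object in the paper)."""
    return min(class_models, key=lambda c: mahalanobis_sq(v, *class_models[c]))
```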


Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1537
Author(s):  
Xingxing Zhang ◽  
Shaobo Li ◽  
Jing Yang ◽  
Qiang Bai ◽  
Yang Wang ◽  
...  

To improve the accuracy of manipulator operation, a tactile sensor must be mounted on the manipulator so that tactile information can be acquired and the target accurately classified. However, as tactile sensors continue to develop and the characteristics of tactile sensing data grow more uncertain and complex, typical machine-learning algorithms often cannot classify targets from tactile data alone. Here, we propose a new model, named ResNet10-v1, that combines a convolutional neural network with a residual network. We optimized the model's convolutional kernels, hyperparameters, and loss function, and further improved target classification accuracy using K-means clustering. We verified the feasibility and effectiveness of the proposed method through extensive experiments. We expect to further improve the generalization ability of this method and to provide a useful reference for research on tactile perception and classification.
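
The abstract names ResNet10-v1 but does not specify its layers, so the following PyTorch sketch only illustrates the general idea of a shallow convolutional network with residual (skip) connections for tactile image classification; the channel widths, depth, and class count are assumptions, and the K-means refinement step is not shown.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                 # residual (skip) connection

class TactileResNet(nn.Module):
    """Shallow residual CNN for tactile maps; layer sizes are illustrative,
    not the paper's ResNet10-v1 configuration."""
    def __init__(self, in_channels=1, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1),
            nn.BatchNorm2d(16), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(ResidualBlock(16), ResidualBlock(16))
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))

    def forward(self, x):                         # x: (batch, channels, H, W)
        return self.head(self.blocks(self.stem(x)))

# Example forward pass on a batch of 16x16 single-channel tactile maps.
logits = TactileResNet(num_classes=10)(torch.randn(8, 1, 16, 16))
```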


2020 ◽  
Vol 10 (12) ◽  
pp. 4088
Author(s):  
Andreas Verleysen ◽  
Thomas Holvoet ◽  
Remko Proesmans ◽  
Cedric Den Haese ◽  
Francis wyffels

Deformable objects such as ropes, wires, and clothing are omnipresent in society and industry but remain little studied in robotics. This is due to the practically infinite number of state configurations a deformable object can assume. Engineered approaches try to cope with this by implementing highly complex operations to estimate the state of the deformable object. This complexity can be circumvented with learning-based approaches, such as reinforcement learning, which can handle the intrinsically high-dimensional state space of deformable objects. However, the reward function in reinforcement learning must measure the state configuration of the highly deformable object, and vision-based reward functions are difficult to implement given the high dimensionality of the state and the complex dynamic behavior. In this work, we propose looking beyond vision and incorporating other modalities that can be extracted from deformable objects. By integrating tactile sensor cells into a textile piece, proprioceptive capabilities are gained that are valuable because they provide a reward function to a reinforcement learning agent. We demonstrate on a low-cost dual robotic arm setup that a physical agent can learn, on a single CPU core, to fold a rectangular patch of textile in the real world using a reward function learned from tactile information.
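
The tactile reward idea can be sketched compactly. The snippet below is a hypothetical Python stand-in, not the paper's learned reward: a logistic-regression probe (scikit-learn) is fit on snapshots of the tactile cells sewn into the textile, labeled folded versus unfolded, and its predicted probability of "folded" is returned as the scalar reward given to the reinforcement learning agent.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class TactileReward:
    """Hypothetical reward model: maps one reading of the textile's tactile
    cells to a scalar reward in [0, 1] via a classifier fit on examples
    labeled 'folded' (1) vs. 'unfolded' (0)."""

    def __init__(self):
        self.clf = LogisticRegression(max_iter=1000)

    def fit(self, tactile_snapshots, folded_labels):
        # tactile_snapshots: (N, n_cells) sensor readings
        # folded_labels:     (N,) binary labels for each snapshot
        self.clf.fit(tactile_snapshots, folded_labels)
        return self

    def __call__(self, tactile_reading):
        # Predicted probability that the current reading corresponds to a
        # folded patch, used directly as the RL reward signal.
        p = self.clf.predict_proba(np.atleast_2d(tactile_reading))[0, 1]
        return float(p)
```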

