Tactile Sensor Data Interpretation for Estimation of Wire Features

Electronics ◽  
2021 ◽  
Vol 10 (12) ◽  
pp. 1458
Author(s):  
Andrea Cirillo ◽  
Gianluca Laudante ◽  
Salvatore Pirozzi

At present, tactile perception is essential for robotic applications when performing complex manipulation tasks, e.g., grasping objects of different shapes and sizes, distinguishing between different textures, and avoiding slips by grasping an object with minimal force. Considering Deformable Linear Object manipulation applications, this paper presents an efficient and straightforward method that allows robots to autonomously work with thin objects, e.g., wires, and to recognize their features, i.e., diameter, by relying on tactile sensors developed by the authors. The method, based on machine learning algorithms, is described in depth in the paper to make it easily reproducible by readers. Experimental tests show the effectiveness of the approach, which recognizes the considered object’s features with a recognition rate of up to 99.9%. Moreover, a pick-and-place task, which uses the method to classify and organize a set of wires by diameter, is presented.
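The abstract does not name the specific learning algorithm, so the following is only a minimal illustrative sketch (Python with scikit-learn, not the authors' code) of how per-grasp tactile readings X and diameter labels y, both hypothetical names, could be turned into a wire-diameter classifier.

# Illustrative sketch only: classify wire diameter from tactile readings.
# X: (n_samples, n_features) flattened tactile frames; y: diameter class labels.
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

def train_diameter_classifier(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              stratify=y, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5)   # simple stand-in model
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    return clf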


2020 ◽  
Vol 17 (4) ◽  
pp. 172988142093232
Author(s):  
Bing Zhang ◽  
Bowen Wang ◽  
Yunkai Li ◽  
Shaowei Jin

Tactile information is valuable in determining properties of objects that are inaccessible to visual perception. A new type of magnetostrictive tactile sensor measuring tangential friction and normal contact force was developed based on the inverse magnetostrictive effect, and its force output model was established. It can measure exerted forces in the range of 0–4 N and responds well to dynamic forces with periods of 0.25–0.5 s. We present a tactile perception strategy in which a manipulator with tactile sensors in its gripper manipulates an object to measure a set of tactile features. The results show that the tactile sensing system can use these features and the extreme learning machine algorithm to recognize household objects, purely from tactile sensing, from a small training set. The confusion matrices show a recognition rate of up to 83%.
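As an illustration of the extreme learning machine mentioned above (a sketch under simplifying assumptions, not the authors' implementation), the classifier can be written in a few lines of NumPy: a fixed random hidden layer followed by output weights obtained in closed form.

import numpy as np

class SimpleELM:
    # Minimal extreme learning machine: random hidden layer, least-squares output weights.
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # X: (n_samples, n_features); y: integer class labels as a NumPy array
        T = np.eye(int(y.max()) + 1)[y]                    # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid hidden layer
        self.beta = np.linalg.pinv(H) @ T                  # closed-form output weights
        return self

    def predict(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return (H @ self.beta).argmax(axis=1)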



Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1537
Author(s):  
Xingxing Zhang ◽  
Shaobo Li ◽  
Jing Yang ◽  
Qiang Bai ◽  
Yang Wang ◽  
...  

In order to improve the accuracy of manipulator operation, it is necessary to install a tactile sensor on the manipulator to obtain tactile information and accurately classify a target. However, as tactile sensors continue to develop and the characteristics of tactile sensing data become more uncertain and complex, typical machine-learning algorithms often cannot solve the problem of classifying targets from purely tactile data. Here, we propose a new model, named ResNet10-v1, that combines a convolutional neural network with a residual network. We optimized the convolutional kernel, hyperparameters, and loss function of the model, and further improved the accuracy of target classification through the K-means clustering method. We verified the feasibility and effectiveness of the proposed method through a large number of experiments. We expect to further improve the generalization ability of this method and provide an important reference for research in the field of tactile perception classification.
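The abstract gives only the name ResNet10-v1, so the block below is a hypothetical, much smaller residual CNN in PyTorch that merely illustrates the idea of combining convolutions with skip connections for tactile-map classification; the layer counts and sizes are assumptions, not the published architecture.

import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                  # skip connection

class TinyTactileResNet(nn.Module):
    def __init__(self, n_classes, in_ch=1):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1),
                                  nn.BatchNorm2d(16), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(ResidualBlock(16), ResidualBlock(16))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(16, n_classes))

    def forward(self, x):                          # x: (batch, in_ch, H, W) tactile maps
        return self.head(self.blocks(self.stem(x)))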



2014 ◽  
Vol 26 (6) ◽  
pp. 743-749 ◽  
Author(s):  
Yuki Mori ◽  
Ryojun Ikeura ◽  
Ming Ding ◽  
...  

[Figure: Position estimation by forearms]
For a robot that uses two arms to lift and transfer a care receiver from a bed to a wheelchair, we report a method of estimating the positioning of the care receiver. The maneuver for such a task involves many degrees of freedom, and the robot is capable of executing the maneuver much like a human being. The care receiver may experience pain or become unstable when being carried, however, depending on the positioning of contact between the robot’s arms and the care receiver. For this reason, nursing care robots must be able to recognize the positioning of contact with the care receiver and either modify it or alert the operator if it is unsuitable. We use the information obtained by tactile sensors on the robot’s arms when making contact with the care receiver to estimate the care receiver’s positioning. By dividing a care receiver’s position on a bed into nine zones and applying machine learning to tactile sensor data and positioning, it is possible to estimate positioning with high accuracy.
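A minimal sketch (not the authors' code) of the classification step described above: concatenate the tactile readings of the two arms for each lifting trial and cross-validate a standard classifier over the nine zone labels; the array names and the choice of an SVM are assumptions.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def evaluate_zone_classifier(left_arm, right_arm, zone_labels):
    # left_arm, right_arm: (n_trials, n_taxels) tactile features per lifting trial
    # zone_labels: integer in 0..8 indicating which of the nine bed zones is occupied
    X = np.hstack([left_arm, right_arm])
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    return cross_val_score(clf, X, zone_labels, cv=5).mean()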



Sensors ◽  
2021 ◽  
Vol 21 (5) ◽  
pp. 1572
Author(s):  
Lukas Merker ◽  
Joachim Steigenberger ◽  
Rafael Marangoni ◽  
Carsten Behn

Just as the sense of touch complements vision in various species, several robots could benefit from advanced tactile sensors, in particular when operating under poor visibility. A prominent tactile sense organ, frequently serving as a natural paragon for developing tactile sensors, is the vibrissa of, e.g., rats. Within this study, we present a vibrissa-inspired sensor concept for 3D object scanning and reconstruction, to be used exemplarily in mobile robots. The setup consists of a highly flexible rod attached to a 3D force-torque transducer (measuring device). The scanning process is realized by translationally shifting the base of the rod relative to the object. Consequently, the rod sweeps over the object’s surface, undergoing large bending deflections. Then, the support reactions at the base of the rod are evaluated for contact localization. Presenting a method of theoretically generating these support reactions, we provide an important basis for future parameter studies. During scanning, lateral slip of the rod is not actively prevented, in contrast to the literature. In this way, we demonstrate the suitability of the sensor for being passively dragged on a mobile robot. Experimental scanning sweeps using an artificial vibrissa (a steel wire 50 mm in length) and a glass sphere 60 mm in diameter as a test object verify the theoretical results and serve as a proof of concept.
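The paper evaluates the support reactions of a highly flexible rod undergoing large deflections, which calls for an elastica model; purely as a simplified illustration of recovering a contact location from base reactions, the sketch below assumes a straight rigid probe and a single point contact, which is not the authors' method.

import numpy as np

def contact_distance_rigid_probe(F, M, axis=np.array([0.0, 0.0, 1.0])):
    # F, M: force and moment measured at the probe base (3-vectors).
    # Rigid-probe, single-point-contact assumption: M = (d * axis) x F,
    # so the contact distance d follows from a least-squares fit.
    a = np.cross(axis, F)          # moment produced per unit contact distance
    return float(a @ M / (a @ a))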



Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 405
Author(s):  
Marcos Lupión ◽  
Javier Medina-Quero ◽  
Juan F. Sanjuan ◽  
Pilar M. Ortigosa

Activity Recognition (AR) is an active research topic focused on detecting human actions and behaviours in smart environments. In this work, we present the on-line activity recognition platform DOLARS (Distributed On-line Activity Recognition System), where data from heterogeneous sensors, including binary, wearable and location sensors, are evaluated in real time. Different descriptors and metrics from the heterogeneous sensor data are integrated into a common feature vector, which is extracted by a sliding-window approach under real-time conditions. DOLARS provides a distributed architecture where: (i) stages for processing data in AR are deployed in distributed nodes; (ii) temporal cache modules compute metrics which aggregate sensor data for computing feature vectors in an efficient way; (iii) publish-subscribe models are integrated both to spread data from sensors and to orchestrate the nodes (communication and replication) for computing AR; and (iv) machine learning algorithms are used to classify and recognize the activities. A successful case study of daily activity recognition developed in the Smart Lab of the University of Almería (UAL) is presented in this paper. The results show encouraging performance in recognizing sequences of activities and demonstrate the need for distributed architectures to achieve real-time recognition.
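A toy sketch of the sliding-window feature extraction stage (the sensor names and descriptors are hypothetical; this is not the DOLARS code): incoming events are cached for a fixed time horizon and aggregated into a feature vector on demand.

from collections import deque

class SlidingWindow:
    def __init__(self, window_s=30.0, sensor_ids=("door", "bed", "wrist_acc")):
        self.window_s = window_s
        self.sensor_ids = sensor_ids
        self.events = deque()                     # (timestamp, sensor_id, value)

    def push(self, t, sensor_id, value):
        self.events.append((t, sensor_id, value))
        while self.events and t - self.events[0][0] > self.window_s:
            self.events.popleft()                 # drop events older than the window

    def features(self):
        # simple per-sensor descriptors: event count and last observed value
        feats = []
        for sid in self.sensor_ids:
            ev = [(t, v) for t, s, v in self.events if s == sid]
            feats.append(len(ev))
            feats.append(ev[-1][1] if ev else 0.0)
        return feats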



Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4324
Author(s):  
Moaed A. Abd ◽  
Rudy Paul ◽  
Aparna Aravelli ◽  
Ou Bai ◽  
Leonel Lagos ◽  
...  

Multifunctional flexible tactile sensors could be useful to improve the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMS) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMS. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four different machine learning algorithms were compared for their classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the LMSs were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger, and achieved 99.2 ± 0.8% accuracy in distinguishing between ten different multi-textured surfaces using four LMSs from four fingers simultaneously. The capability for hierarchical multi-finger tactile sensation integration could be useful for providing a higher level of intelligence to artificial hands.
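A hedged sketch of the evaluation pipeline summarized above, using SciPy and scikit-learn stand-ins; the exact time-frequency features and hyperparameters are not given in the abstract, so everything below is illustrative.

import numpy as np
from scipy.signal import spectrogram
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def spectro_features(signal, fs=1000):
    # log-power spectrogram of one sliding-contact recording, flattened
    _, _, Sxx = spectrogram(signal, fs=fs, nperseg=128)
    return np.log(Sxx + 1e-9).ravel()

def compare_classifiers(X, y):
    models = {"KNN": KNeighborsClassifier(n_neighbors=5),
              "SVM": SVC(kernel="rbf"),
              "RF": RandomForestClassifier(n_estimators=200),
              "NN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000)}
    return {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}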



2021 ◽  
Vol 6 (51) ◽  
pp. eabc8801
Author(s):  
Youcan Yan ◽  
Zhe Hu ◽  
Zhengbao Yang ◽  
Wenzhen Yuan ◽  
Chaoyang Song ◽  
...  

Human skin can sense subtle changes of both normal and shear forces (i.e., self-decoupled) and perceive stimuli with finer resolution than the average spacing between mechanoreceptors (i.e., super-resolved). By contrast, existing tactile sensors for robotic applications are inferior, lacking accurate force decoupling and proper spatial resolution at the same time. Here, we present a soft tactile sensor with self-decoupling and super-resolution abilities by designing a sinusoidally magnetized flexible film (with a thickness of ~0.5 millimeters), whose deformation can be detected by a Hall sensor according to the change of magnetic flux densities under external forces. The sensor can accurately measure the normal force and the shear force (demonstrated in one dimension) with a single unit and achieve a 60-fold super-resolved accuracy enhanced by deep learning. By mounting our sensor at the fingertip of a robotic gripper, we show that robots can accomplish challenging tasks such as stably grasping fragile objects under external disturbance and threading a needle via teleoperation. This research provides new insight into tactile sensor design and could benefit various applications in the robotics field, such as adaptive grasping, dexterous manipulation, and human-robot interaction.
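The paper applies deep learning to decouple and super-resolve the forces from the Hall-sensor readings; as a minimal stand-in (not the authors' network), one could fit a small multilayer-perceptron regressor from the measured flux-density components to the normal and shear force.

from sklearn.neural_network import MLPRegressor

def train_force_decoupler(B_xyz, forces_ns):
    # B_xyz: (n_samples, 3) magnetic flux-density readings from the Hall sensor
    # forces_ns: (n_samples, 2) ground-truth [normal, shear] forces
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000)
    model.fit(B_xyz, forces_ns)
    return model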



2011 ◽  
Vol 08 (03) ◽  
pp. 181-195
Author(s):  
ZHAOXIAN XIE ◽  
HISASHI YAMAGUCHI ◽  
MASAHITO TSUKANO ◽  
AIGUO MING ◽  
MAKOTO SHIMOJO

As one of the home services offered by a mobile manipulator system, we are aiming at the realization of stand-up motion support for elderly people. This work is characterized by the use of real-time feedback control based on information from high-speed tactile sensors that detect the contact force, as well as its center of pressure, between the assisted human and the robot arm. First, this paper introduces the design of the tactile sensor as well as initial experimental results showing the feasibility of the proposed system. Moreover, several fundamental tactile-sensing-based motion controllers necessary for stand-up motion support and their experimental verification are presented. Finally, an assist trajectory generation method for stand-up motion support that integrates fuzzy logic with tactile sensing is proposed and demonstrated experimentally.
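As a small illustration of the quantities fed back by the high-speed tactile sensors (the contact force and its center of pressure), the sketch below computes both from a single tactile-array frame; the cell pitch and array layout are assumptions, not the sensor's actual specification.

import numpy as np

def contact_force_and_cop(pressure_map, cell_pitch_mm=2.0):
    # pressure_map: 2-D array of per-cell contact forces (N)
    total = pressure_map.sum()
    if total <= 0:
        return 0.0, (float("nan"), float("nan"))
    rows, cols = np.indices(pressure_map.shape)
    cop_x = (cols * pressure_map).sum() / total * cell_pitch_mm
    cop_y = (rows * pressure_map).sum() / total * cell_pitch_mm
    return float(total), (float(cop_x), float(cop_y))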



Author(s):  
S. Unsal ◽  
A. Shirkhodaie ◽  
A. H. Soni

Adding sensing capability to a robot provides it with intelligent perception and flexibility in decision making. To perform intelligent tasks, robots must perceive their operating environment and react accordingly. In this regard, tactile sensors extend the scope of a robot's intelligence to tasks that require touching, recognizing, and manipulating objects. This paper presents the design of an inexpensive pneumatic binary-array tactile sensor for such robotic applications. The paper describes some of the techniques implemented for object recognition from binary sensory information. Furthermore, it details the development of software and hardware that enable the sensor to provide useful information to a robot so that it can perceive its operating environment during manipulation of objects.
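The recognition techniques are not detailed in the abstract, so the following is only a hypothetical sketch of how simple shape descriptors could be extracted from a binary contact image and matched against stored templates.

import numpy as np

def binary_shape_features(contact):
    # contact: 2-D boolean array from the binary tactile array
    ys, xs = np.nonzero(contact)
    if len(xs) == 0:
        return np.zeros(3)
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    area = float(len(xs))
    return np.array([area, area / (h * w), w / h])   # area, fill ratio, aspect ratio

def classify(contact, templates):
    # templates: dict mapping object label -> stored feature vector
    f = binary_shape_features(contact)
    return min(templates, key=lambda lbl: np.linalg.norm(f - templates[lbl]))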


