2P1-S-066 Recognition of Contact State of Arrayed Type Tactile Sensor by using Neural Network (Evolution and Learning for Robotics 5, Mega-Integration in Robotics and Mechatronics to Assist Our Daily Lives)

Author(s): Takaaki TANAKA, Kenji MAKIHIRA, Seiji AOYAGI, Masaharu TAKANO
2021, Vol 10 (4), pp. 1-27
Author(s): Shengxin Jia, Veronica J. Santos

The sense of touch is essential for locating buried objects when vision-based approaches are limited. We present an approach for tactile perception when sensorized robot fingertips are used to directly interact with granular media particles in teleoperated systems. We evaluate the effects of linear and nonlinear classifier model architectures and three tactile sensor modalities (vibration, internal fluid pressure, fingerpad deformation) on the accuracy of estimates of fingertip contact state. We propose an architecture called the Sparse-Fusion Recurrent Neural Network (SF-RNN) in which sparse features are autonomously extracted prior to fusing multimodal tactile data in a fully connected RNN input layer. The multimodal SF-RNN model achieved 98.7% test accuracy and was robust to modest variations in granular media type and particle size, fingertip orientation, fingertip speed, and object location. Fingerpad deformation was the most informative modality for haptic exploration within granular media, while vibration and internal fluid pressure provided additional information with appropriate signal processing. We introduce a real-time visualization of tactile percepts for remote exploration by constructing a belief map that combines probabilistic contact state estimates and fingertip location. The belief map visualizes the probability of an object being buried in the search region and could be used for planning.
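The abstract does not give implementation details for the SF-RNN, so the following is only a minimal sketch of the pattern it describes: per-modality feature extractors whose outputs are fused in a fully connected input layer and passed through a recurrent layer to classify fingertip contact state. The layer sizes, the GRU cell, the ReLU extractors, the class count, and the input dimensions are illustrative assumptions, not the paper's values, and no explicit sparsity penalty is modeled here.

```python
# Minimal sketch of a sparse-fusion recurrent classifier (PyTorch assumed).
# All hyperparameters are placeholders; the paper's SF-RNN details are not
# given in the abstract above.
import torch
import torch.nn as nn

class SparseFusionRNN(nn.Module):
    def __init__(self, modality_dims, feat_dim=16, hidden_dim=64, num_classes=2):
        super().__init__()
        # One feature extractor per tactile modality
        # (vibration, internal fluid pressure, fingerpad deformation).
        self.extractors = nn.ModuleList(
            [nn.Linear(d, feat_dim) for d in modality_dims]
        )
        # Fully connected fusion layer over the concatenated per-modality features.
        self.fusion = nn.Linear(feat_dim * len(modality_dims), hidden_dim)
        # Recurrent layer over the fused tactile time series.
        self.rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Contact-state classifier head.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, modalities):
        # modalities: list of tensors, each shaped (batch, time, modality_dim)
        feats = [torch.relu(f(x)) for f, x in zip(self.extractors, modalities)]
        fused = torch.relu(self.fusion(torch.cat(feats, dim=-1)))
        out, _ = self.rnn(fused)
        # Classify contact state from the final hidden state.
        return self.classifier(out[:, -1, :])

# Hypothetical usage with dummy data for three modalities.
model = SparseFusionRNN(modality_dims=[32, 1, 19])
batch = [torch.randn(4, 100, d) for d in (32, 1, 19)]
logits = model(batch)  # shape (4, num_classes)
```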

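The belief map described at the end of the abstract is likewise only sketched below: a fixed 2D probability grid over the search region whose cell under the current fingertip position is blended with the classifier's contact probability. The grid size, extent handling, and blending weight are assumptions for illustration; the paper's actual map resolution and update rule are not stated in the abstract.

```python
# Minimal sketch of a belief-map update combining contact-state probability
# and fingertip location (NumPy assumed; parameters are illustrative).
import numpy as np

def make_belief_map(shape=(50, 50), prior=0.5):
    # Uniform prior: each cell holds P(object buried in that cell).
    return np.full(shape, prior)

def update_belief(belief, fingertip_xy, p_contact, extent, alpha=0.5):
    """Blend a contact probability into the cell under the fingertip.

    belief       : (H, W) grid of buried-object probabilities
    fingertip_xy : (x, y) fingertip position in the search region
    p_contact    : classifier's probability of object contact at this position
    extent       : (x_min, x_max, y_min, y_max) bounds of the search region
    alpha        : blending weight for the new observation (assumed value)
    """
    h, w = belief.shape
    x_min, x_max, y_min, y_max = extent
    col = int(np.clip((fingertip_xy[0] - x_min) / (x_max - x_min) * (w - 1), 0, w - 1))
    row = int(np.clip((fingertip_xy[1] - y_min) / (y_max - y_min) * (h - 1), 0, h - 1))
    belief[row, col] = (1 - alpha) * belief[row, col] + alpha * p_contact
    return belief

# Hypothetical usage: one update at fingertip position (0.12, 0.30) m.
belief = make_belief_map()
belief = update_belief(belief, (0.12, 0.30), p_contact=0.9,
                       extent=(0.0, 0.5, 0.0, 0.5))
```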
