1A2-L09 Haptic object recognition by a robot hand covered with soft skin with tactile sensors

2009 ◽  
Vol 2009 (0) ◽  
pp. 1A2-L09-1 - 1A2-L09-4
Author(s):  
Takeshi ANMA ◽  
Koh HOSODA

Author(s):  
Shun OGASA ◽  
Shu MORIKUNI ◽  
Satoshi FUNABASHI ◽  
Alexander SCHMITZ ◽  
Tito Pradhono TOMO ◽  
...  

2020 ◽  
Vol 5 (49) ◽  
pp. eabc8134
Author(s):  
Guozhen Li ◽  
Shiqiang Liu ◽  
Liangqi Wang ◽  
Rong Zhu

Robot hands with tactile perception can improve the safety of object manipulation as well as the accuracy of object identification. Here, we report the integration of quadruple tactile sensors onto a robot hand to enable precise object recognition through grasping. Our quadruple tactile sensor consists of a skin-inspired multilayer microstructure. It works as a thermoreceptor, able to perceive the thermal conductivity of a material, measure contact pressure, and sense object temperature and environment temperature simultaneously and independently. By combining tactile sensing information with machine learning, our smart hand can precisely recognize the shapes, sizes, and materials of a diverse set of objects. We further apply the smart hand to the task of garbage sorting and demonstrate a classification accuracy of 94% in recognizing seven types of garbage.
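The pipeline the abstract describes, mapping a quadruple tactile reading (thermal conductivity, contact pressure, object temperature, ambient temperature) to an object class via machine learning, can be sketched minimally as a nearest-centroid classifier. All feature values, class names, and the classifier choice below are illustrative assumptions, not details from the paper:

```python
# Minimal sketch of classifying objects from quadruple tactile readings:
# (thermal conductivity, contact pressure, object temp, ambient temp).
# Feature values and class labels are made up for illustration.
import math

def classify(sample, centroids):
    """Return the label whose mean feature vector is nearest to the sample."""
    best, best_d = None, float("inf")
    for label, centroid in centroids.items():
        d = math.dist(sample, centroid)  # Euclidean distance in feature space
        if d < best_d:
            best, best_d = label, d
    return best

# Illustrative per-class mean feature vectors (hypothetical training output).
centroids = {
    "metal_can":   (50.0, 2.0, 24.0, 24.0),  # high thermal conductivity
    "plastic_cup": (0.2,  1.5, 24.0, 24.0),  # low thermal conductivity
    "paper_box":   (0.05, 0.8, 24.0, 24.0),
}

print(classify((45.0, 1.8, 23.5, 24.0), centroids))  # prints "metal_can"
```

In practice the paper's 94% result would come from a trained model over many grasps; the nearest-centroid rule here only illustrates how the four simultaneous, independent tactile channels form a feature vector that separates materials.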


Author(s):  
Satoshi Funabashi ◽  
Tomoki Isobe ◽  
Shun Ogasa ◽  
Tetsuya Ogata ◽  
Alexander Schmitz ◽  
...  
Keyword(s):  
Low Cost ◽  

Author(s):  
S. Unsal ◽  
A. Shirkhodaie ◽  
A. H. Soni

Abstract Adding sensing capability to a robot provides it with intelligent perception and flexibility in decision making. To perform intelligent tasks, robots must perceive their operating environment and react accordingly. In this regard, tactile sensors extend a robot's ability to perform tasks that require touching, recognizing, and manipulating objects. This paper presents the design of an inexpensive pneumatic binary-array tactile sensor for such robotic applications and describes some of the techniques implemented for object recognition from binary sensory information. Furthermore, it details the development of software and hardware that enable the sensor to provide useful information to a robot, so that the robot can perceive its operating environment while manipulating objects.
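Recognition from a binary tactile array of the kind this paper describes typically reduces the on/off contact map to simple shape features. The grid size and the particular features below (contact area and bounding-box fill ratio) are assumptions for illustration, not the paper's method:

```python
# Minimal sketch: shape features from a binary contact grid, as one might
# compute for object recognition with a binary-array tactile sensor.
# Grid size and feature choice are illustrative assumptions.

def features(grid):
    """Return (contact area, bounding-box fill ratio) for a binary grid."""
    cells = [(r, c) for r, row in enumerate(grid)
                    for c, v in enumerate(row) if v]
    area = len(cells)
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    h = max(rows) - min(rows) + 1   # bounding-box height
    w = max(cols) - min(cols) + 1   # bounding-box width
    fill = area / (h * w)           # 1.0 for a solid rectangular footprint
    return area, round(fill, 2)

# A 4x4 contact pattern from a square-footed object (illustrative).
square = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
print(features(square))  # prints (4, 1.0)
```

Features like these can then be matched against stored templates to identify which object is in contact, which is the kind of binary-information recognition technique the paper discusses.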


Sensors ◽  
2014 ◽  
Vol 14 (2) ◽  
pp. 3227-3266 ◽  
Author(s):  
Achint Aggarwal ◽  
Frank Kirchner

2012 ◽  
Vol 25 (0) ◽  
pp. 144
Author(s):  
Rebecca Lawson ◽  
Lauren Edwards ◽  
Amy Boylan

As we explore objects by touch we usually look towards our hands. Active touch (haptics) may therefore benefit from the simultaneous availability of visual information about the object being felt and from the alignment of spatial frames of reference centred on the head, eye, and hand. If haptic processing usually uses visual and spatial inputs, then even task-irrelevant visual and spatial manipulations may influence haptic shape identification. Scocchia et al. (2009) found that recognition of raised-line pictures of familiar objects was better if people looked towards the pictures as they felt them, even though people were blindfolded and so could not see their hand or the picture. We replicated their finding for 2D pictures and extended it to 3D, small-scale models of familiar objects. We also tested people's speeded naming of real, familiar objects using their right hand. Performance was better when people looked towards the objects. In contrast, the position of the left hand did not influence haptic naming. Thus the spatial reference frame defined by the eyes/head influenced haptic shape processing, but not that defined by an inactive hand. Furthermore, performance was the same whether people wore a mask and had their eyes closed, wore a mask but had their eyes open, or looked through a narrow tube so could see a small area of their environment but not their hand or the object. Thus where people looked had a small but reliable effect on haptic object recognition, whereas what task-irrelevant information they could see did not.


2012 ◽  
Vol 25 (0) ◽  
pp. 122
Author(s):  
Michael Barnett-Cowan ◽  
Jody C. Culham ◽  
Jacqueline C. Snow

The orientation at which objects are most easily recognized — the perceptual upright (PU) — is influenced by body orientation with respect to gravity. To date, the influence of these cues on object recognition has only been measured within the visual system. Here we investigate whether objects explored through touch alone are similarly influenced by body and gravitational information. Using the Oriented CHAracter Recognition Test (OCHART) adapted for haptics, blindfolded right-handed observers indicated whether the symbol ‘p’ presented in various orientations was the letter ‘p’ or ‘d’ following active touch. The average of ‘p-to-d’ and ‘d-to-p’ transitions was taken as the haptic PU. Sensory information was manipulated by positioning observers in different orientations relative to gravity with the head, body, and hand aligned. Results show that haptic object recognition is equally influenced by body and gravitational reference frames, but with a constant leftward bias. This leftward bias in the haptic PU resembles leftward biases reported for visual object recognition. The influence of body orientation and gravity on the haptic PU was well predicted by an equally weighted vectorial sum of the directions indicated by these cues. Our results demonstrate that information from different reference frames influences the perceptual upright in haptic object recognition. Taken together with similar investigations in vision, our findings suggest that reliance on body and gravitational frames of reference helps maintain optimal object recognition. Equally relying on body and gravitational information may facilitate haptic exploration with an upright posture, while compensating for poor vestibular sensitivity when tilted.
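The "equally weighted vectorial sum" model in the abstract above can be made concrete with a short sketch: represent the body and gravity cues as unit vectors along their indicated directions, add them, and take the resultant angle as the predicted PU. The angle convention and any bias value are illustrative assumptions, not the study's fitted parameters:

```python
# Minimal sketch of an equally weighted vector-sum prediction of the
# perceptual upright (PU). Angles in degrees, measured in a common plane;
# the convention and bias value are illustrative assumptions.
import math

def predicted_pu(body_deg, gravity_deg, bias_deg=0.0):
    """Sum unit vectors along each cue's direction; return resultant angle."""
    x = math.cos(math.radians(body_deg)) + math.cos(math.radians(gravity_deg))
    y = math.sin(math.radians(body_deg)) + math.sin(math.radians(gravity_deg))
    return math.degrees(math.atan2(y, x)) + bias_deg

# Upright observer: both cues point the same way, so the PU follows them.
print(predicted_pu(90.0, 90.0))   # approximately 90.0
# Observer tilted so the cues disagree by 90 deg: equal weighting puts
# the predicted PU midway between them (approximately 45.0 here).
print(predicted_pu(0.0, 90.0))
```

A constant leftward bias, like the one the study reports, would appear here as a fixed nonzero `bias_deg` added to every prediction.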

