Haptic object recognition based on shape relates to visual object recognition ability

Author(s):  
Jason K. Chow ◽  
Thomas J. Palmeri ◽  
Isabel Gauthier


2012 ◽  
Vol 25 (0) ◽  
pp. 122
Author(s):  
Michael Barnett-Cowan ◽  
Jody C. Culham ◽  
Jacqueline C. Snow

The orientation at which objects are most easily recognized (the perceptual upright, PU) is influenced by body orientation with respect to gravity. To date, the influence of these cues on object recognition has been measured only within the visual system. Here we investigate whether objects explored through touch alone are similarly influenced by body and gravitational information. Using the Oriented CHAracter Recognition Test (OCHART) adapted for haptics, blindfolded right-handed observers indicated, following active touch, whether the symbol ‘p’ presented in various orientations was the letter ‘p’ or ‘d’. The average of ‘p-to-d’ and ‘d-to-p’ transitions was taken as the haptic PU. Sensory information was manipulated by positioning observers in different orientations relative to gravity, with the head, body, and hand aligned. Results show that haptic object recognition is equally influenced by body and gravitational reference frames, but with a constant leftward bias. This leftward bias in the haptic PU resembles the leftward biases reported for visual object recognition. The influence of body orientation and gravity on the haptic PU was well predicted by an equally weighted vectorial sum of the directions indicated by these cues. Our results demonstrate that information from different reference frames influences the perceptual upright in haptic object recognition. Taken together with similar investigations in vision, our findings suggest that reliance on body and gravitational frames of reference helps maintain optimal object recognition. Equally relying on body and gravitational information may facilitate haptic exploration with an upright posture, while compensating for poor vestibular sensitivity when tilted.
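The equally weighted vectorial-sum account above can be stated compactly: the predicted PU is the direction of the sum of unit vectors pointing along the body and gravity cues. Below is a minimal sketch of that computation, assuming angles measured in degrees in the observer's frontal plane; the function name, angle convention, and the bias parameter (the abstract reports a constant leftward bias but does not give its magnitude) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def predicted_haptic_pu(body_deg, gravity_deg, bias_deg=0.0):
    """Predict the haptic perceptual upright (PU) as the direction of an
    equally weighted vector sum of the body and gravity cues.

    Angles are in degrees in the frontal plane (0 = earth upright).
    bias_deg stands in for the constant leftward bias reported in the
    abstract; its value is a placeholder, not a reported estimate.
    """
    angles = np.radians([body_deg, gravity_deg])
    # Sum the unit vectors for the two cues with equal weights.
    summed = np.sum([np.sin(angles), np.cos(angles)], axis=1)
    # Recover the direction of the summed vector.
    pu = np.degrees(np.arctan2(summed[0], summed[1]))
    return pu + bias_deg

# Example: observer tilted 90 deg while gravity stays at 0 deg.
print(predicted_haptic_pu(body_deg=90.0, gravity_deg=0.0))  # ~45.0
```

With equal weighting, an observer tilted 90° relative to gravity yields a predicted PU midway between the two cues (45°), before any constant leftward offset is applied.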


Perception ◽  
10.1068/p6038 ◽  
2008 ◽  
Vol 37 (12) ◽  
pp. 1867-1878 ◽  
Author(s):  
Andrew T Woods ◽  
Allison Moore ◽  
Fiona N Newell

Previous investigations of visual object recognition have found that some views of both familiar and unfamiliar objects promote more efficient recognition performance than others. These views are considered canonical and are often the views that present the most information about an object's 3-D structure and features in the image. Although objects can also be recognised efficiently by touch alone, little is known about whether some views promote more efficient recognition than others. This may seem unlikely, given that an object's structure and features are readily available to the hand during exploration. We conducted two experiments to investigate whether canonical views exist in haptic object recognition. In the first, participants were asked to position each object so as to present the best view for learning the object through touch alone. We found a large degree of consistency in the chosen viewpoints across participants for both familiar and unfamiliar objects. In the second experiment, we found that these consistent, or canonical, views promoted better haptic recognition performance than other, random views of the objects. Interestingly, these haptic canonical views were not necessarily the same as the canonical views typically found in visual perception. Nevertheless, our findings support the idea that the visual and tactile systems are functionally equivalent in how objects are represented in memory and subsequently recognised.


2020 ◽  
Vol 34 (6) ◽  
pp. 1369-1378
Author(s):  
Ann J. Carrigan ◽  
Paul Stoodley ◽  
Fernando Fernandez ◽  
Mackenzie A. Sunday ◽  
Mark W. Wiggins

2020 ◽  
Vol 20 (11) ◽  
pp. 139
Author(s):  
Ann Carrigan ◽  
Paul Stoodley ◽  
Fernando Fernandez ◽  
Mackenzie Sunday ◽  
Mark Wiggins

2007 ◽  
Author(s):  
K. Suzanne Scherf ◽  
Marlene Behrmann ◽  
Kate Humphreys ◽  
Beatriz Luna
