haptic exploration
Recently Published Documents

TOTAL DOCUMENTS: 156 (five years: 36)
H-INDEX: 24 (five years: 3)

2021
Author(s): Tae Myung Huh, Kate Sanders, Michael Danielczuk, Monica Li, Yunliang Chen, ...

2021
Author(s): Hannah Elbaggari, Rubia Guerra, Sabrina Knappe, Juliette Regimbal

2021
Author(s): Alexandra Moringen, Sascha Fleer, Helge Ritter

2021
Author(s): Keyhan Kouhkiloui Babarahmati, Carlo Tiseo, Quentin Rouxel, Zhibin Li, Michael Mistry

Author(s): María Silva‐Gago, Annapaola Fedato, Marcos Terradillos‐Bernal, Rodrigo Alonso‐Alcalde, Elena Martín‐Guerra, ...

2021, Vol 15
Author(s): Xiaogang Yan, Steven Mills, Alistair Knott

Humans initially learn about objects through the sense of touch, in a process called “haptic exploration.” In this paper, we present a neural network model of this learning process. The model implements two key assumptions. The first is that haptic exploration can be thought of as a type of navigation, where the exploring hand plays the role of an autonomous agent, and the explored object is this agent's “local environment.” In this scheme, the agent's movements are registered in the coordinate system of the hand, through slip sensors on the palm and fingers. Our second assumption is that the learning process rests heavily on a simple model of sequence learning, where frequently encountered sequences of hand movements are encoded declaratively, as “chunks.” The geometry of the object being explored places constraints on possible movement sequences: our proposal is that representations of possible, or frequently attested, sequences implicitly encode the shape of the explored object, along with its haptic affordances. We evaluate our model in two ways. We assess how much information about the hand's actual location is conveyed by its internal representations of movement sequences. We also assess how effective the model's representations are in a reinforcement learning task, where the agent must learn how to reach a given location on an explored object. Both metrics validate the basic claims of the model. We also show that the model learns better if objects are asymmetrical or contain tactile landmarks, or if the navigating hand is articulated, which further constrains the movement sequences supported by the explored object.
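As a rough illustration of the chunking assumption described in this abstract (not the authors' implementation), the sketch below stores frequently encountered movement n-grams as declarative "chunks". The movement alphabet, window length, and frequency threshold are all hypothetical choices made for the example.

```python
from collections import Counter

def ngrams(seq, n):
    """Yield successive n-length windows of a movement sequence."""
    for i in range(len(seq) - n + 1):
        yield tuple(seq[i:i + n])

def learn_chunks(exploration_traces, n=3, min_count=5):
    """Chunks = movement n-grams observed at least `min_count` times.

    The object's geometry constrains which sequences can occur, so the
    surviving chunks implicitly encode the explored object's shape.
    """
    counts = Counter()
    for trace in exploration_traces:
        counts.update(ngrams(trace, n))
    return {chunk for chunk, count in counts.items() if count >= min_count}

# Toy traces from exploring a box-like surface: edges rule out some moves,
# so only geometry-consistent sequences recur often enough to become chunks.
traces = [["right", "right", "up", "up", "left"],
          ["right", "right", "up", "up", "left"],
          ["right", "up", "up", "left", "left"]] * 3
print(learn_chunks(traces, n=2, min_count=4))
```

In this toy run, only bigrams compatible with the box-like geometry pass the threshold, which is the sense in which the chunk set implicitly encodes shape.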


2021, Vol 11 (1)
Author(s): C. Landelle, J. Danna, B. Nazarian, M. Amberg, F. Giraud, ...

Combining multisensory sources is crucial for interacting with our environment, especially for older people, who face sensory decline. Here, we examined the influence of textured sounds on haptic exploration of artificial textures in healthy younger and older adults by combining a tactile device (an ultrasonic display) with synthesized textured sounds. Participants had to discriminate simulated textures with their right index finger while they were distracted by three disturbing, more or less textured sounds. These sounds were presented as real-time auditory feedback based on finger movement sonification, and thus gave the sensation that the sounds were produced by the haptic exploration itself. Finger movement velocity increased in both groups in the presence of textured sounds (Rubbing or Squeaking) compared to a non-textured (Neutral) sound. While young adults had the same discrimination threshold regardless of the sound added, the older adults were more disturbed by the presence of the textured sounds with respect to the Neutral sound. Overall, these findings suggest that irrelevant auditory information was taken into account by all participants, but was appropriately segregated from tactile information by young adults. Older adults failed to segregate auditory information, supporting the hypothesis of a general facilitation of multisensory integration with aging.
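To make the feedback loop concrete, here is a minimal, hypothetical sketch of movement sonification in the spirit of the setup described above: finger velocity drives the gain of a noise-like "rubbing" sound. The sample rate, noise model, and velocity-to-gain mapping are assumptions for illustration, not the authors' synthesis method.

```python
import numpy as np

SR = 44100  # audio sample rate in Hz (an assumed value)

def sonify(velocity, duration=0.05, texture=0.5):
    """Return one audio frame whose loudness tracks finger velocity.

    velocity : finger speed in m/s (drives the gain)
    texture  : 0 = near-silent "Neutral" sound, 1 = strongly textured noise
    """
    n = int(SR * duration)
    noise = np.random.randn(n)          # broadband "rubbing" noise
    gain = texture * np.tanh(velocity)  # faster movement -> louder sound
    return gain * noise

# Example: one 50 ms frame for a fast, strongly textured stroke.
frame = sonify(velocity=0.12, texture=1.0)
print(frame.shape, float(np.abs(frame).max()))
```

Because each frame is computed from the current finger velocity, the sound tracks the exploration in real time, which is what gives the impression that the exploration itself produces it.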


2021
Author(s): Anna Metzger, Matteo Toscani

When touching the surface of an object, its spatial structure translates into a vibration on the skin. The perceptual system evolved to translate this pattern into a representation that allows different materials to be distinguished. Here we show that the perceptual haptic representation of materials emerges from efficient encoding of the vibratory patterns elicited by interaction with those materials. We trained a deep neural network with unsupervised learning (an autoencoder) to reconstruct vibratory patterns elicited by human haptic exploration of different materials. The learned compressed representation (i.e., the latent space) allows for classification of material categories (plastic, stone, wood, fabric, leather/wool, paper, and metal). More importantly, distances between these categories in the latent space resemble perceptual distances, suggesting a similar coding. We further show that the temporal tuning of the emergent latent dimensions is similar to the properties of human tactile receptors.
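Below is a minimal sketch of the unsupervised setup this abstract describes, using a fully connected autoencoder in PyTorch. The input length, layer sizes, and latent dimensionality are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class VibrationAutoencoder(nn.Module):
    """Compresses a vibratory pattern into a low-dimensional latent code."""
    def __init__(self, n_samples=1024, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_samples, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),      # compressed representation
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, n_samples),       # reconstructed vibration
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = VibrationAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 1024)                         # stand-in vibration batch
optimizer.zero_grad()
reconstruction, z = model(x)
loss = nn.functional.mse_loss(reconstruction, x)  # reconstruction objective
loss.backward()
optimizer.step()

# Distances between latent codes `z` for different materials can then be
# compared against perceptual distances, as the abstract describes.
```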

