Skin Deformation
Recently Published Documents


TOTAL DOCUMENTS: 167 (last five years: 45)
H-INDEX: 19 (last five years: 2)

2022 · Vol. 29 (3) · pp. 1-34
Author(s): Moritz Alexander Messerschmidt, Sachith Muthukumarana, Nur Al-Huda Hamdan, Adrian Wagner, Haimo Zhang, et al.

We present ANISMA, a software and hardware toolkit for prototyping on-skin haptic devices that generate skin deformation stimuli such as pressure, stretch, and motion using shape-memory alloys (SMAs). The toolkit embeds expert knowledge that makes SMA spring actuators more accessible to human–computer interaction (HCI) researchers. Using our software tool, users can design different actuator layouts, program their spatio-temporal actuation, and preview the resulting deformation behavior to verify a design at an early stage. The toolkit also allows exporting an actuator layout and 3D printing it directly on skin adhesive. To test different actuation sequences on the skin, users can connect the SMA actuators to our customized driver board and reprogram them through our visual programming interface. We report a technical analysis, verify the perceptibility of essential ANISMA skin deformation devices with 8 participants, and evaluate the usability and creativity support of ANISMA with 12 HCI researchers in a creative design task.
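To make the programming model concrete, here is a minimal sketch of what a spatio-temporal actuation sequence of the kind built in a visual programming interface like ANISMA's could look like. The SmaPulse data model, its fields, and the text-based preview are hypothetical illustrations, not the toolkit's actual API:

```python
# Hypothetical sketch of a spatio-temporal SMA actuation sequence.
from dataclasses import dataclass

@dataclass
class SmaPulse:
    actuator_id: int   # index of the SMA spring actuator in the layout
    start_ms: int      # onset relative to sequence start
    duration_ms: int   # heating time; longer pulses give stronger contraction
    duty: float        # PWM duty cycle (0.0-1.0) limiting actuator temperature

# Example: a "stroking" sensation, three actuators contracting in succession.
sequence = [
    SmaPulse(actuator_id=0, start_ms=0,   duration_ms=400, duty=0.6),
    SmaPulse(actuator_id=1, start_ms=200, duration_ms=400, duty=0.6),
    SmaPulse(actuator_id=2, start_ms=400, duration_ms=400, duty=0.6),
]

def preview(sequence, step_ms=100):
    """Print which actuators are active at each time step: a text stand-in
    for the kind of graphical deformation preview the toolkit provides."""
    end = max(p.start_ms + p.duration_ms for p in sequence)
    for t in range(0, end + 1, step_ms):
        active = [p.actuator_id for p in sequence
                  if p.start_ms <= t < p.start_ms + p.duration_ms]
        print(f"t={t:4d} ms: active actuators {active}")

preview(sequence)
```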


Author(s): Simone G.V.S. Smith, Maiya K. Yokich, Shawn M. Beaudette, Stephen H. M. Brown, Leah R. Bent

Understanding the processing of tactile information is crucial for the development of biofeedback interventions that target cutaneous mechanoreceptors. The mechanics of the skin have been shown to influence cutaneous tactile sensitivity. It is established that foot skin mechanics are altered by foot posture, but whether these changes affect cutaneous sensitivity is unknown. The purpose of this study was to investigate the potential effect of posture-mediated skin deformation about the ankle joint on perceptual measures of foot skin sensitivity. Participants (N = 20) underwent perceptual skin sensitivity testing on either the foot sole (N = 10) or dorsum (N = 10) with the foot positioned in maximal dorsiflexion/toe extension, maximal plantarflexion/toe flexion, and a neutral posture. Perceptual tests included touch sensitivity, stretch sensitivity, and spatial acuity. Regional differences in touch sensitivity were found across the foot sole (p < 0.001) and dorsum (p < 0.001). Touch sensitivity also significantly increased in postures where the skin was compressed (p = 0.001). Regional differences in spatial acuity were found on the foot sole (p = 0.002) but not the dorsum (p = 0.666). Spatial acuity was not significantly altered by posture on either the foot sole or the dorsum, apart from an increase in sensitivity at the medial arch in the dorsiflexion posture (p = 0.006). Posture × site interactions were found for stretch sensitivity on both the foot sole and dorsum, in both the transverse and longitudinal directions (p < 0.005). Stretch sensitivity increased in postures where the skin was pre-stretched, on both the foot sole and dorsum. Changes in sensitivity across locations and postures are believed to result from concurrent changes in skin mechanics, such as skin hardness and thickness, consistent with our previous findings. Future cutaneous biofeedback interventions should be applied with an awareness of these changes in skin sensitivity to maximize their effectiveness for foot sole and dorsum input.
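As background for the posture × site analyses reported above, the sketch below shows the general shape of such a test as a two-way repeated-measures ANOVA. The data are synthetic placeholders and the factor levels are illustrative assumptions, not the study's measurements or analysis code:

```python
# Hedged sketch of a posture x site repeated-measures ANOVA on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
postures = ["dorsiflexion", "neutral", "plantarflexion"]
sites = ["heel", "arch", "metatarsals", "toes"]   # illustrative site labels

rows = []
for subject in range(10):                 # N = 10 per foot surface in the study
    for posture in postures:
        for site in sites:
            rows.append({
                "subject": subject,
                "posture": posture,
                "site": site,
                # synthetic log threshold; real values would come from testing
                "log_threshold": rng.normal(loc=0.0, scale=0.3),
            })
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="log_threshold", subject="subject",
              within=["posture", "site"]).fit()
print(res.anova_table)   # F and p values for posture, site, and posture:site
```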


2021 · Vol. 118 (49) · pp. e2109109118
Author(s): Laurence Willemet, Khoubeib Kanzari, Jocelyn Monnoyer, Ingvars Birznieks, Michaël Wiertlewski

Humans efficiently estimate the grip force necessary to lift a variety of objects, including slippery ones. The regulation of grip force starts with the initial contact and takes into account the surface properties, such as friction. This estimation of the frictional strength has been shown to depend critically on cutaneous information. However, the physical and perceptual mechanism that provides such early tactile information remains elusive. In this study, we developed a friction-modulation apparatus to elucidate the effects of the frictional properties of objects during initial contact. We found a correlation between participants’ conscious perception of friction and radial strain patterns of skin deformation. The results provide insights into the tactile cues made available by contact mechanics to the sensorimotor regulation of grip, as well as to the conscious perception of the frictional properties of an object.
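The radial strain cue referred to above can be made concrete with a small sketch: given the radial displacement u_r(r) of skin within the contact area, the radial strain is the derivative du_r/dr. The displacement profile below is an illustrative toy assumption, not the authors' measured field:

```python
# Toy illustration of radial skin strain during initial contact.
import numpy as np

r = np.linspace(0.1, 5.0, 200)           # radial distance from contact center, mm

def radial_displacement(r, friction):
    """Toy profile: a stickier contact drags the surrounding skin further
    inward, producing larger displacement gradients (inward is negative)."""
    return -friction * r * np.exp(-r / 2.0)   # mm

for mu in (0.3, 0.9):                    # low- vs. high-friction contact
    u_r = radial_displacement(r, mu)
    e_rr = np.gradient(u_r, r)           # radial strain (dimensionless)
    print(f"mu={mu}: peak compressive strain {e_rr.min():+.3f}")
```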


2021
Author(s): Sung-Gwi Cho, Mayuki Toyoda, Ming Ding, Jun Takamatsu, Chiaki Yokota, et al.


Author(s): Vimal Kakaraparthi, Qijia Shao, Charles J. Carver, Tien Pham, Nam Bui, et al.

Face touch is an unconscious human habit. Frequent touching of sensitive/mucosal facial zones (eyes, nose, and mouth) increases health risks by transferring pathogens into the body and spreading diseases. Accurate monitoring of face touch is therefore critical for behavioral intervention. Existing monitoring systems only capture objects approaching the face rather than detecting actual touches, so they are prone to false positives when a hand or object merely moves near the face (e.g., picking up a phone). We present FaceSense, an ear-worn system capable of identifying actual touches and distinguishing touches to sensitive/mucosal areas from touches to other facial areas. Following a multimodal approach, FaceSense integrates low-resolution thermal images and physiological signals: thermal sensors sense the infrared signal emitted by an approaching hand, while physiological sensors monitor impedance changes caused by skin deformation during a touch. The processed thermal and physiological signals are fed into a deep learning model (TouchNet) that detects touches and identifies the facial zone of each touch. We fabricated prototypes using off-the-shelf hardware and conducted experiments with 14 participants while they performed various daily activities (e.g., drinking, talking). Results show a macro F1-score of 83.4% for touch detection with leave-one-user-out cross-validation and a macro F1-score of 90.1% for touch zone identification with a personalized model.
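As an illustration of the multimodal fusion idea, the following sketch pairs a small CNN over low-resolution thermal frames with a 1-D encoder for the impedance signal, feeding two output heads for touch detection and zone identification. The layer sizes, the 32x24 thermal resolution, and the 128-sample impedance window are assumptions for illustration, not the paper's TouchNet architecture:

```python
# Hedged sketch of a two-branch thermal + impedance fusion model.
import torch
import torch.nn as nn

class TouchNetSketch(nn.Module):
    def __init__(self, n_zones=4):
        super().__init__()
        self.thermal = nn.Sequential(            # 1x32x24 thermal frame
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),    # -> 16 features
        )
        self.physio = nn.Sequential(             # 1x128 impedance window
            nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),    # -> 8 features
        )
        self.touch_head = nn.Linear(16 + 8, 2)        # touch vs. no touch
        self.zone_head = nn.Linear(16 + 8, n_zones)   # facial zone of the touch

    def forward(self, thermal_frame, impedance):
        fused = torch.cat([self.thermal(thermal_frame),
                           self.physio(impedance)], dim=1)
        return self.touch_head(fused), self.zone_head(fused)

model = TouchNetSketch()
touch_logits, zone_logits = model(torch.randn(1, 1, 32, 24),
                                  torch.randn(1, 1, 128))
print(touch_logits.shape, zone_logits.shape)   # (1, 2) and (1, n_zones)
```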


2021
Author(s): Satoshi Yagi

In this paper, we propose the concept of Android Printing: printing a complete android, including its skin and mechanical components, in a single run on a multi-material 3D printer. Printing an android all at once both reduces assembly time and enables intricate designs with a high number of degrees of freedom. To prove this concept, we actually printed an android. First, we printed skin with multiple annular ridges to test skin deformation. By pulling the skin, we show that its deformation behavior can be adjusted through the ridge structure, a result essential to designing humanlike skin deformations. We then designed and fabricated a 3D-printed android head with 31 degrees of freedom. The skin and linkage mechanism were printed together before being connected to a unit combining several electric motors. To confirm the feasibility of our concept, we created several motions with the android based on human facial movement data. In the future, android printing might enable people to use an android as their own avatar.
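One way to picture driving the head's 31 degrees of freedom from recorded facial movement data is a linear retargeting from facial feature displacements to motor angles. The mapping matrix, signal names, and angle limits below are illustrative assumptions, not the paper's control method:

```python
# Hypothetical sketch: retargeting facial motion data to motor commands.
import numpy as np

n_features, n_motors = 10, 31            # 31 DoF as in the printed head
rng = np.random.default_rng(1)
retarget = rng.uniform(-1.0, 1.0, size=(n_motors, n_features))  # calibration

def motor_angles(feature_displacements, limit_deg=30.0):
    """Map measured facial feature displacements (e.g., lip-corner or
    eyebrow offsets) to clipped motor angle commands."""
    angles = retarget @ feature_displacements
    return np.clip(angles, -limit_deg, limit_deg)

frame = rng.normal(0.0, 1.0, size=n_features)   # one frame of motion data
print(motor_angles(frame)[:5])                   # first five motor commands
```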

