haptic interactions
Recently Published Documents

TOTAL DOCUMENTS: 75 (FIVE YEARS: 16)
H-INDEX: 10 (FIVE YEARS: 2)

2021 · Author(s): Soo Wan Chun, Jinsil Hwaryoung Seo, Caleb Kicklighter, Elizabeth Wells-Beede, Jack Greene, et al.

2021 · Author(s): Asuka Takai, Qiushi Fu, Yuzuru Doibata, Giuseppe Lisi, Toshiki Tsuchiya, et al.

Are leaders made or born? Leader-follower roles have been well characterized in social science, but they remain somewhat obscure in sensorimotor coordination. Furthermore, it is unknown how and why leader-follower relationships are acquired, including whether they are innate or learned. We developed a novel asymmetric coordination task in which two participants (a dyad) must collaborate to transport a simulated beam while keeping it horizontal. This experimental paradigm was implemented with twin robotic manipulanda, simulated beam dynamics, haptic interaction, and a projection screen. Clear leader-follower relationships were learned even though participants were not told they were interacting with each other, but only when strong haptic feedback was introduced. For the first time, we demonstrate the emergence of consistent leader-follower relationships in sensorimotor coordination, and we further show that haptic interaction is essential for dyadic co-adaptation. These results provide insight into the neural mechanisms responsible for the formation of leader-follower relationships in society.
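As a rough illustration of the task dynamics, the sketch below simulates a rigid beam with a vertical force applied by each partner at either end; the mass, length, forces, and function names are our own illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of the simulated-beam dyad task (parameters are illustrative,
# not the paper's): two agents apply vertical forces at opposite ends of a
# rigid beam and must keep its tilt near zero while supporting it.
M, LENGTH, G, DT = 2.0, 1.0, 9.81, 0.001   # beam mass [kg], length [m], gravity, step [s]
I = M * LENGTH**2 / 12.0                   # moment of inertia about the beam's center

def step(state, f_left, f_right):
    """Advance the beam state one time step given each agent's vertical force [N]."""
    y, vy, theta, omega = state
    f_net = f_left + f_right - M * G              # net vertical force on the beam
    tau = (f_right - f_left) * LENGTH / 2.0       # torque from asymmetric forces
    vy += (f_net / M) * DT
    y += vy * DT
    omega += (tau / I) * DT
    theta += omega * DT
    return np.array([y, vy, theta, omega])

# Any imbalance between the partners' forces tilts the beam; the tilt (felt
# through the haptic channel) is what the dyad must negotiate over.
state = np.zeros(4)
for _ in range(1000):
    state = step(state, f_left=10.0, f_right=9.6)  # slightly unequal forces
print(f"tilt after 1 s: {state[2]:.4f} rad")       # nonzero tilt from the asymmetry
```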


2021 · Vol 21 (9) · pp. 2903 · Author(s): Haeji Shin, Yuna Kwak, Chai-Youn Kim

2021 · Author(s): David Gueorguiev, Julien Lambert, Jean-Louis Thonnard, Katherine J. Kuchenbecker

Abstract: Humans need to accurately process the contact forces that arise as they perform everyday haptic interactions, but the mechanisms by which the forces on the skin are represented and integrated remain poorly understood. In this study, we used a force-controlled robotic platform and simultaneous ultrasonic modulation of finger-surface friction to briefly and independently manipulate the normal and tangential forces during passive haptic stimulation by a flat surface. When participants were asked whether the contact pressure on their finger had briefly increased or decreased, they could not distinguish changes in the normal force from changes in the tangential force. Instead, they integrated the normal and tangential components of the force vector into a multidimensional computation of the contact force. We additionally investigated whether participants relied on three common contact-force metrics. Interestingly, the change in the amplitude of the force vector predicted participants' responses better than the change in the coefficient of dynamic friction or the change in the angle of the contact-force vector. Thus, intensive cues related to the amplitude of the applied force may be meaningful for sensing contact pressure during haptic stimulation by a moving surface.
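The three candidate metrics can be written out directly from the normal and tangential force components; in the sketch below (variable names and example force values are ours, not the authors'), a friction-only change moves each metric by a different amount.

```python
import numpy as np

# The three contact-force metrics the study compares, computed from the
# normal (fn) and tangential (ft) force components; names are illustrative.
def force_amplitude(fn, ft):
    return np.hypot(fn, ft)          # |F| = sqrt(Fn^2 + Ft^2)

def friction_coefficient(fn, ft):
    return ft / fn                   # mu = Ft / Fn (dynamic friction)

def force_angle(fn, ft):
    return np.arctan2(ft, fn)        # angle of the contact-force vector

# Example: ultrasonic friction modulation lowers Ft while Fn stays constant,
# so all three metrics change, but by different amounts.
fn, ft_before, ft_after = 1.0, 0.8, 0.4      # forces in newtons (illustrative)
for metric, name in [(force_amplitude, "|F|"),
                     (friction_coefficient, "mu"),
                     (force_angle, "angle")]:
    print(f"delta {name}: {metric(fn, ft_after) - metric(fn, ft_before):+.3f}")
```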


2021 · Author(s): Yongxuan Tan, Sibylle Rérolle, Thilina Dulantha Lalitharatne, Nejra Van Zalk, Rachael E. Jack, et al.

Abstract: Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. Visual feedback of involuntary pain expressions in response to palpation of an affected area is an important source of information for physicians. However, most existing robotic medical training simulators that capture physical examination behaviours in real time either cannot display facial expressions at all or offer only a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of such simulators: they do not expose medical students to a representative diversity of both pain facial expressions and face identities, which could result in biased practice, and they cannot be used to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that simulates facial expressions of pain in response to palpations, displayed on a range of patient face identities. Our approach models dynamic pain facial expressions using the data-driven psychophysical method of reverse correlation and incorporates the visuo-haptic interactions of users performing palpation on a robotic medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of simulated patients, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change β and activation delay τ. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale. Each participant (n = 16; 4 Asian female, 4 Asian male, 4 White female and 4 White male) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed that a gradual decrease of β and increase of τ from upper-face AUs (around the eyes) to lower-face AUs (around the mouth) was rated as appropriate by all participants. The transient parameter values that generated appropriate pain expressions, as well as the palpation forces and the delays between palpation actions, varied across the gender and ethnicity of participant-simulated patient pairs. These findings suggest that gender and ethnicity biases affect participants' palpation strategies and their perception of the pain expressions displayed on MorphFace. We anticipate that our approach could be utilised to generate physical examination models with diverse patient demographics, reducing erroneous judgments in medical students and providing focused training to address these errors.
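As a rough illustration of the two transient parameters, the sketch below models an AU's activation as a delayed linear ramp; the abstract does not specify the actual profile used with MorphFace, so the functional form and parameter values are our own assumptions.

```python
import numpy as np

# Illustrative activation profile for one facial Action Unit: after a
# palpation event at t = 0, the AU stays at zero for an activation delay
# tau [s], then ramps toward full intensity at rate of change beta [1/s].
# This is a guess at the simplest profile consistent with the two
# parameters, not MorphFace's actual implementation.
def au_intensity(t, beta, tau):
    """AU intensity in [0, 1] at time t [s] after palpation onset."""
    return np.clip(beta * (t - tau), 0.0, 1.0)

t = np.linspace(0.0, 2.0, 9)
upper_face = au_intensity(t, beta=4.0, tau=0.1)   # e.g., around the eyes: fast, early
lower_face = au_intensity(t, beta=1.5, tau=0.5)   # e.g., around the mouth: slower, later
print(np.round(upper_face, 2))
print(np.round(lower_face, 2))
```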


2021 · Vol 15 · Author(s): Shirley Handelzalts, Giulia Ballardini, Chen Avraham, Mattia Pagano, Maura Casadio, et al.

The COVID-19 pandemic has highlighted the need to advance the development and implementation of novel means for home-based telerehabilitation, enabling remote assessment and training for individuals with disabling conditions who need therapy. While somatosensory input is essential for motor function, most telerehabilitation therapies and technologies to date focus on assessing and training motor impairments, while the somatosensory aspect is largely neglected. Integrating tactile devices into home-based rehabilitation practice has the potential to enhance the recovery of sensorimotor impairments and to promote functional gains through practice in an enriched environment with augmented tactile feedback and haptic interactions. In the current review, we outline the clinical approaches for stimulating somatosensation in home-based telerehabilitation and review the existing technologies for conveying mechanical tactile feedback (i.e., vibration, stretch, pressure, and mid-air stimulation). We focus on tactile feedback technologies that can be integrated into home-based practice due to their relatively low cost, compact size, and light weight. The advantages and opportunities, as well as the long-term challenges and gaps, of implementing these technologies in home-based telerehabilitation are discussed.
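As a minimal illustration of augmented tactile feedback in this setting, the sketch below maps a tracked movement error onto the drive amplitude of a wearable vibration motor; the error source, dead zone, and scaling are illustrative assumptions rather than any specific system from the review.

```python
# Minimal sketch of augmented vibrotactile feedback for home telerehabilitation:
# map a tracked movement error onto the drive amplitude of a wearable vibration
# motor. All thresholds and units are illustrative assumptions.
def vibration_command(error_cm, dead_zone_cm=1.0, max_error_cm=10.0):
    """Return a motor amplitude in [0, 1]: silent inside the dead zone,
    then scaling linearly with error up to saturation."""
    if error_cm <= dead_zone_cm:
        return 0.0
    scaled = (error_cm - dead_zone_cm) / (max_error_cm - dead_zone_cm)
    return min(scaled, 1.0)

for err in (0.5, 2.0, 5.0, 12.0):
    print(f"error {err:4.1f} cm -> amplitude {vibration_command(err):.2f}")
```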


2021 · pp. 373-388 · Author(s): Xin Xin, Yiji Wang, Nan Liu, Wenmin Yang, Hang Dong, et al.

Author(s): Monica Liu, Aaron Paul Batista, Sliman J. Bensmaia, Douglas John Weber

Tactile nerve fibers convey information about many features of haptic interactions, including the force and speed of contact as well as the texture and shape of the objects being handled. How we perceive these object features is relatively unaffected by the forces and movements we use when interacting with the object. Since signals related to contact events and object properties are mixed in the responses of tactile fibers, our ability to disentangle these components of tactile experience implies that they are demultiplexed as they propagate along the neuraxis. To understand how texture and contact mechanics are encoded together by tactile fibers, we studied the activity of multiple neurons recorded simultaneously in the cervical dorsal root ganglia (DRG) of two anesthetized rhesus monkeys while textured surfaces were applied to the glabrous skin of the fingers and palm using a handheld probe. A transducer at the tip of the textured probe measured contact forces as tactile stimuli were applied at different locations on the finger pads and palm. We examined how a sample population of DRG neurons encodes force and texture and found that the firing rates of individual neurons are modulated by both. In particular, slowly adapting (SA) neurons were more responsive to force than texture, and rapidly adapting (RA) neurons were more responsive to texture than force. While force could be decoded accurately throughout the entire contact interval, texture signals were most salient during the onset and offset phases of contact.
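The reported division of labor can be illustrated with a toy population readout; the synthetic firing rates and the least-squares decoder below are our own assumptions for illustration, not the study's data or analysis.

```python
import numpy as np

# Toy model consistent with the reported division of labor: slowly adapting
# (SA) rates carry more force information, rapidly adapting (RA) rates more
# texture information. Rates, weights, and the linear decoder are assumptions.
rng = np.random.default_rng(0)
n_trials, n_sa, n_ra = 200, 8, 8
force = rng.uniform(0.1, 1.0, n_trials)                # contact force per trial [N]
texture = rng.integers(0, 2, n_trials).astype(float)   # coarse (1) vs. fine (0)

# SA rates driven mostly by force, RA rates mostly by texture, plus noise.
sa_rates = 40 * force[:, None] + 5 * texture[:, None] + rng.normal(0, 2, (n_trials, n_sa))
ra_rates = 30 * texture[:, None] + 5 * force[:, None] + rng.normal(0, 2, (n_trials, n_ra))
rates = np.hstack([sa_rates, ra_rates])

# Least-squares linear decoder of force from the pooled population.
X = np.hstack([rates, np.ones((n_trials, 1))])
w, *_ = np.linalg.lstsq(X, force, rcond=None)
pred = X @ w
print(f"force-decoding correlation: {np.corrcoef(pred, force)[0, 1]:.3f}")
```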


2020 · Vol 6 (3) · pp. 571-574 · Author(s): Anna Schaufler, Alfredo Illanes, Ivan Maldonado, Axel Boese, Roland Croner, et al.

Abstract: In robot-assisted procedures, the surgeon controls the surgical instruments from a remote console while visually monitoring the procedure through the endoscope. No haptic feedback is available to the surgeon, which impedes the assessment of diseased tissue and the detection of hidden structures beneath the tissue, such as vessels. Only visual cues are available to the surgeon to control the force applied to the tissue by the instruments, which poses a risk of iatrogenic injuries. Additional information on the haptic interactions between the employed instruments and the treated tissue, provided to the surgeon during robotic surgery, could compensate for this deficit. Acoustic emissions (AE) from instrument/tissue interactions, transmitted along the instrument, are a potential source of this information. AE can be recorded by audio sensors that do not have to be integrated into the instruments but can instead be modularly attached to the outside of the instrument's shaft or enclosure. The location of the sensor on a robotic system is essential for the applicability of the concept in real situations: while the signal strength of the acoustic emissions decreases with distance from the point of interaction, an installation close to the patient would require sterilization measures. The aim of this work is to investigate whether it is feasible to install the audio sensor in non-sterile areas far from the patient and still receive useful AE signals. To determine whether signals can be recorded at different potential mounting locations, instrument/tissue interactions with different textures were simulated in an experimental setup. The results showed that meaningful and valuable AE can be recorded in the non-sterile area of a robotic surgical system despite the expected signal losses.
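As a rough illustration of the feasibility question, the sketch below attenuates a synthetic AE burst with distance and checks whether its short-time RMS still rises above a sensor noise floor; the attenuation values, burst shape, and noise level are assumptions, not the paper's measurements.

```python
import numpy as np

# Sketch of the feasibility question: does an acoustic-emission (AE) burst
# transmitted along the instrument still rise above the sensor's noise floor
# at a mounting point far from the interaction? All signal parameters and
# the gain-vs-distance values below are illustrative assumptions.
FS = 44_100                                    # audio sampling rate [Hz]

def short_time_rms(x, win=512):
    """RMS in non-overlapping windows of `win` samples."""
    n = len(x) // win
    return np.sqrt((x[: n * win].reshape(n, win) ** 2).mean(axis=1))

rng = np.random.default_rng(1)
t = np.arange(FS) / FS
burst = np.sin(2 * np.pi * 8000 * t) * np.exp(-((t - 0.5) ** 2) / 1e-4)  # AE-like burst

for dist_cm, atten in [(5, 1.0), (30, 0.3), (60, 0.1)]:   # assumed gain vs. distance
    signal = atten * burst + rng.normal(0, 0.02, len(t))  # sensor noise floor
    rms = short_time_rms(signal)
    snr_db = 20 * np.log10(rms.max() / np.median(rms))
    print(f"{dist_cm:2d} cm from interaction: peak/floor ~ {snr_db:.1f} dB")
```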

