A Study of Recording and Playback of Tactile Information

Author(s):  
Masahiro OHKA
2020 ◽  
Vol 11 ◽  
Author(s):  
Chao Huang ◽  
Qizhuo Wang ◽  
Mingfu Zhao ◽  
Chunyan Chen ◽  
Sinuo Pan ◽  
...  

Minimally invasive surgery (MIS) has become the preferred surgical approach owing to its advantages over conventional open surgery. A major limitation, however, is the lack of tactile perception, which impairs surgeons' ability to distinguish tissues and perform maneuvers. Many studies have equipped industrial robots to perceive various kinds of tactile information, yet only force data are widely used to restore part of the surgeon's sense of touch in MIS. In recent years, inspired by image classification technologies in computer vision, tactile data have been represented as images, where a tactile element is treated as an image pixel. Processing raw data, or features extracted from tactile images, with artificial intelligence (AI) methods, including clustering, support vector machines (SVM), and deep learning, has proven effective in industrial robotic tactile perception tasks. This holds great promise for utilizing more tactile information in MIS. This review aims to identify potential tactile perception methods for MIS by surveying the literature on tactile sensing in MIS and on industrial robotic tactile perception technologies, especially AI methods applied to tactile images.
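The abstract above describes treating a tactile sensor array as an image, with each tactile element as a pixel, and classifying it with methods such as an SVM. A minimal sketch of that idea follows; the 4 × 4 array size, the two synthetic contact patterns, and all names are illustrative assumptions, not details from any cited study.

```python
# Sketch: a tactile frame as an "image" (one taxel = one pixel),
# classified with an SVM, per the review's description.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def tactile_frame(kind):
    """Return one 4x4 tactile 'image' flattened to a feature vector."""
    base = np.zeros((4, 4))
    if kind == "edge":        # pressure ridge along one column
        base[:, 1] = 1.0
    else:                     # blunt point contact in the centre
        base[1:3, 1:3] = 1.0
    return (base + 0.1 * rng.standard_normal((4, 4))).ravel()

# 100 labeled frames, alternating between the two contact types.
X = np.array([tactile_frame(k) for k in ["edge", "point"] * 50])
y = np.array([0, 1] * 50)

clf = SVC(kernel="rbf").fit(X[:80], y[:80])
acc = clf.score(X[80:], y[80:])
print(f"held-out accuracy: {acc:.2f}")
```

The same flattened-frame representation feeds the clustering and deep-learning pipelines the review mentions; only the classifier changes.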


1992 ◽  
Vol 68 (2) ◽  
pp. 518-527 ◽  
Author(s):  
T. P. Pons ◽  
P. E. Garraghty ◽  
M. Mishkin

1. Selective ablations of the hand representations in postcentral cortical areas 3a, 3b, 1, and 2 were made in different combinations to determine each area's contribution to the responsivity and modality properties of neurons in the hand representation in SII.
2. Ablations that left intact only the postcentral areas that process predominantly cutaneous inputs (i.e., areas 3b and 1) yielded SII recording sites responsive to cutaneous stimulation and none driven exclusively by high-intensity or "deep" stimulation. Conversely, ablations that left intact only the postcentral areas that process predominantly deep receptor inputs (i.e., areas 3a and 2) yielded mostly SII recording sites that responded exclusively to deep stimulation.
3. Ablations that left intact only area 3a or only area 2 yielded substantial and roughly equal reductions in the number of deep receptive fields in SII. By contrast, ablations that left intact only area 3b or only area 1 yielded unequal reductions in the number of cutaneous receptive fields in SII: a small reduction when area 3b alone was intact but a somewhat larger one when only area 1 was intact.
4. Finally, when the hand representation in area 3b was ablated, leaving areas 3a, 1, and 2 fully intact, there was again a substantial reduction in the encounter rate of cutaneous receptive fields.
5. The partial ablations often led to unresponsive sites in the SII hand representation. In SII representations other than of the hand, no such unresponsive sites were found and there were no substantial changes in the ratio of cutaneous to deep receptive fields, indicating that the foregoing results were not due to long-lasting postsurgical depression or effects of anesthesia.
6. The findings indicate that modality-specific information is relayed from postcentral cortical areas to SII along parallel channels, with cutaneous inputs transmitted via areas 3b and 1, and deep inputs via areas 3a and 2. Further, area 3b provides the major source of cutaneous input to SII, directly and perhaps also via area 1.
7. The results are in line with accumulating anatomic and electrophysiologic evidence pointing to an evolutionary shift in the organization of the somatosensory system from the general mammalian plan, in which tactile information is processed in parallel in SI and SII, to a new organization in higher primates in which the processing of tactile information proceeds serially from SI to SII. The presumed functional advantages of this evolutionary shift are unknown.


Author(s):  
Wataru Fukui ◽  
Futoshi Kobayashi ◽  
Fumio Kojima ◽  
Hiroyuki Nakamoto ◽  
Tadashi Maeda ◽  
...  

Author(s):  
Atena Fadaei Jouybari ◽  
Matteo Franza ◽  
Oliver Alan Kannape ◽  
Masayuki Hara ◽  
Olaf Blanke

Abstract There is a steadily growing number of mobile communication systems that provide spatially encoded tactile information to the human torso. However, the increased use of such hands-off displays is currently not matched or supported by systematic perceptual characterization of tactile spatial discrimination on the torso. Furthermore, there are currently no data on spatial discrimination of dynamic force stimuli applied to the torso. In the present study, we measured tactile point localization (LOC) and tactile direction discrimination (DIR) on the thoracic spine using two unisex torso-worn tactile vests realized with arrays of 3 × 3 vibrotactile or force feedback actuators. We aimed, first, to evaluate and compare the spatial discrimination of vibrotactile and force stimulations on the thoracic spine and, second, to investigate the relationship between the LOC and DIR results across stimulations. Thirty-four healthy participants performed both tasks with both vests. Accuracies for vibrotactile and force stimulations were 60.7% and 54.6% for the LOC task, and 71.0% and 67.7% for the DIR task, respectively. Performance with the two stimulation types was positively correlated, although accuracies were higher for vibrotactile than for force stimulation across tasks, arguably due to specific properties of vibrotactile stimulation. We observed comparable directional anisotropies in the LOC results for both stimulations; however, anisotropies in the DIR task were only observed with vibrotactile stimulation. We discuss our findings with respect to tactile perception research as well as their implications for the design of high-resolution torso-mounted tactile displays for spatial cueing.


Author(s):  
Yuyeol JUN ◽  
Gang YAN ◽  
Satoshi FUNABASHI ◽  
Alexander SCHMITZ ◽  
Shigeki SUGANO

2021 ◽  
Author(s):  
Anthony Renard ◽  
Evan Harrell ◽  
Brice Bathallier

Abstract Rodents depend on olfaction and touch to meet many of their fundamental needs. The joint significance of these sensory systems is underscored by an intricate coupling between sniffing and whisking. However, the impact of simultaneous olfactory and tactile inputs on sensory representations in the cortex remains elusive. To study these interactions, we recorded large populations of barrel cortex neurons using 2-photon calcium imaging in head-fixed mice during olfactory and tactile stimulation. We find that odors alter barrel cortex activity in at least two ways, first by enhancing whisking, and second by central cross-talk that persists after whisking is abolished by facial nerve sectioning. Odors can either enhance or suppress barrel cortex neuronal responses, and while odor identity can be decoded from population activity, it does not interfere with the tactile representation. Thus, barrel cortex represents olfactory information which, in the absence of learned associations, is coded independently of tactile information.
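The abstract above notes that odor identity can be decoded from barrel cortex population activity. A hedged sketch of that style of analysis follows; the neuron count, tuning model, noise level, and choice of a logistic-regression decoder are invented for illustration and are not the paper's actual methods.

```python
# Sketch: decoding a stimulus label from simulated population
# activity vectors, one vector per trial.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_neurons, n_trials_per_odor = 50, 60

# Each "odor" shifts mean activity across the (simulated) population.
tuning = {0: rng.normal(0, 1, n_neurons),
          1: rng.normal(0, 1, n_neurons)}

# Trial-by-trial responses: tuned mean plus per-trial noise.
X = np.vstack([tuning[o] + rng.normal(0, 1, n_neurons)
               for o in [0, 1] * n_trials_per_odor])
y = np.array([0, 1] * n_trials_per_odor)

dec = LogisticRegression().fit(X[:100], y[:100])
acc = dec.score(X[100:], y[100:])
print(f"decoder accuracy on held-out trials: {acc:.2f}")
```

Cross-validated accuracy well above chance on held-out trials is the usual evidence that the population carries the decoded variable.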


2021 ◽  
Vol 5 (ISS) ◽  
pp. 1-17
Author(s):  
Yosra Rekik ◽  
Edward Lank ◽  
Adnane Guettaf ◽  
Laurent Grisoni

Alongside vision and sound, hardware systems can readily be designed to support various forms of tactile feedback; however, while a significant body of work has explored enriching visual and auditory communication with interactive systems, tactile information has not received the same level of attention. In this work, we explore increasing the expressivity of tactile feedback by allowing the user to dynamically select between several channels of tactile feedback using variations in finger speed. In a controlled experiment, we show that users can learn the dynamics of eyes-free tactile channel selection among different channels and can reliably discriminate between different tactile patterns during multi-channel selection, with accuracy of up to 90% when using two finger speed levels. We discuss the implications of this work for richer, more interactive tactile interfaces.
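The core mechanism described above, selecting a tactile channel from finger speed, can be sketched as a simple thresholding rule. The speed unit, the single cutoff value, and the function name below are illustrative assumptions, not parameters reported by the study.

```python
# Sketch: bucket finger speed into discrete tactile feedback channels.
# With one threshold this reproduces the two-speed-level case the
# abstract reports (two selectable channels).
def select_channel(speed, thresholds=(500.0,)):
    """Return the tactile channel index for a finger speed (assumed px/s)."""
    channel = 0
    for t in sorted(thresholds):
        if speed >= t:
            channel += 1
    return channel

for s in (120.0, 480.0, 900.0):
    print(f"{s:6.1f} px/s -> channel {select_channel(s)}")
```

Adding entries to `thresholds` yields more channels, at the cost of the discrimination accuracy the experiment measures.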

