tactile interfaces
Recently Published Documents


TOTAL DOCUMENTS: 45 (five years: 11)

H-INDEX: 7 (five years: 1)

2021 ◽  
Vol 5 (ISS) ◽  
pp. 1-17
Author(s):  
Yosra Rekik ◽  
Edward Lank ◽  
Adnane Guettaf ◽  
Laurent Grisoni

Alongside vision and sound, hardware systems can be readily designed to support various forms of tactile feedback; however, while a significant body of work has explored enriching visual and auditory communication with interactive systems, tactile information has not received the same level of attention. In this work, we explore increasing the expressivity of tactile feedback by allowing the user to dynamically select between several channels of tactile feedback using variations in finger speed. In a controlled experiment, we show that a user can learn the dynamics of eyes-free tactile channel selection among different channels, and can reliably discriminate between different tactile patterns during multi-channel selection with an accuracy of up to 90% when using two finger speed levels. We discuss the implications of this work for richer, more interactive tactile interfaces.
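The two-level speed selection described above can be sketched as a simple mapping from finger speed to a tactile channel. This is an illustrative sketch only; the threshold value, channel count, and pattern names are assumptions, not taken from the paper.

```python
# Hypothetical sketch: select between two tactile feedback channels
# based on finger speed, as in the two-speed-level condition above.
# SPEED_THRESHOLD_MM_S is an assumed boundary, not the authors' value.

SPEED_THRESHOLD_MM_S = 120.0  # assumed "slow" vs "fast" boundary

def select_channel(finger_speed_mm_s: float) -> int:
    """Return tactile channel 0 (slow) or 1 (fast) from finger speed."""
    return 1 if finger_speed_mm_s >= SPEED_THRESHOLD_MM_S else 0

def pattern_for(channel: int) -> str:
    """Look up the vibration pattern assigned to a channel (illustrative)."""
    patterns = {0: "short-pulse", 1: "long-buzz"}
    return patterns[channel]
```

In practice the finger speed would be estimated from successive touch samples, and the discrimination accuracy reported above depends on how distinguishable the per-channel patterns are.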


2021 ◽  
Author(s):  
Hanbit Jin ◽  
Yunjeong Kim ◽  
Wooseup Youm ◽  
Yulim Min ◽  
Chaehyun Lim ◽  
...  

Abstract: For highly immersive telehaptic applications, skin-integrated, untethered, and highly pixelated transducer devices that can record and generate tactile stimuli are required. Here, we propose a skin-conformable tactile sensor and actuator array with a high spatial resolution of 1.8 mm for realising untethered tactile communication on human skin. The tactile sensors are designed to exhibit ultra-flexibility and bimodal sensitivity to static and dynamic pressure. The actuators are miniaturised to sub-millimetre scale to provide sophisticated, high spatiotemporal resolution tactile feedback over a square-centimetre area of the fingertip, with the capacity to generate vibrotactile feedback under an external load of up to 529 kPa. Short-time Fourier transform analysis showed that our telehaptic system can transmit various types of tactile stimuli, such as the shape of objects and letters, textures of fabrics, and vibration patterns with high fidelity.
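The short-time Fourier transform analysis mentioned above can be sketched with a framed FFT over a vibrotactile signal. This is a minimal stand-in, not the authors' pipeline; the sample rate, carrier frequency, and window parameters are assumptions.

```python
# Minimal STFT sketch for analysing a vibrotactile signal: slice the
# signal into overlapping Hann-windowed frames and take the magnitude
# of the real FFT of each frame. All signal parameters are assumed.

import numpy as np

def stft(signal: np.ndarray, frame_len: int, hop: int) -> np.ndarray:
    """Framed FFT magnitude spectrogram (Hann window)."""
    window = np.hanning(frame_len)
    frames = [
        np.abs(np.fft.rfft(signal[i:i + frame_len] * window))
        for i in range(0, len(signal) - frame_len + 1, hop)
    ]
    return np.array(frames)  # shape: (num_frames, frame_len // 2 + 1)

fs = 1000                          # sample rate in Hz (assumed)
t = np.arange(fs) / fs             # one second of signal
vib = np.sin(2 * np.pi * 80 * t)   # 80 Hz vibrotactile carrier (assumed)

spec = stft(vib, frame_len=256, hop=128)
peak_bin = spec[0].argmax()
peak_hz = peak_bin * fs / 256      # frequency of the strongest component
```

A time-frequency picture like this is what lets one check that a transmitted vibration pattern preserves the spectral content of the recorded stimulus.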


2021 ◽  
Vol 15 ◽  
Author(s):  
Mehmet Ege Cansev ◽  
Daniel Nordheimer ◽  
Elsa Andrea Kirchner ◽  
Philipp Beckerle

Previous research has shown the value of the sense of embodiment, i.e., being able to integrate objects into one's bodily self-representation, and its connection to (assistive) robotics. In particular, tactile interfaces seem essential to integrating assistive robots into one's body model. Beyond functional feedback, such as tactile force sensing, the human sense of touch comprises specialized nerves for affective signals, which transmit positive sensations during slow and low-force tactile stimulations. Since these signals are extremely relevant for body experience as well as social and emotional contacts, but scarcely considered in recent assistive devices, this review provides a requirement analysis to consider affective touch in engineering design. By analyzing quantitative and qualitative information from engineering, cognitive psychology, and neuroscientific research, requirements are gathered and structured. The resulting requirements comprise technical data such as desired motion or force/torque patterns and an evaluation of potential stimulation modalities, as well as their relations to overall user experience, e.g., pleasantness and realism of the sensations. This review systematically considers the very specific characteristics of affective touch and the corresponding parts of the neural system to define design goals and criteria. Based on the analysis, design recommendations for interfaces mediating affective touch are derived. This includes a consideration of biological principles and human perception thresholds, complemented by an analysis of technical possibilities. Finally, we outline which psychological factors can be satisfied by the mediation of affective touch to increase acceptance of assistive devices, and outline demands for further research and development.


2021 ◽  
Vol 26 (1) ◽  
pp. 89-99
Author(s):  
Christophe Lengelé

This article describes my own way to improvise with space using a computer-based tool implemented in SuperCollider. The objective of this spatial performance tool is to offer ergonomic spatio-temporal and spectral control over numerous sound objects in real time, in order to alternate between spatialised polyrhythms and textures. After a brief review of the spatial audio context, the spatial performance tool is summarised and detailed here by focusing on one of the core parameters: the playback speeds, which can act both on rhythm and space and enable, among other things, the spatio-temporal articulation of the performance. As well as discussing the word ‘comprovisation’ and my conception of human–computer improvisation, the possibilities and approach of the tool in terms of improvisation and controllerism are illustrated through the use and combination of different controllers (computer keyboard, tactile interfaces, force touch sensors). Whereas some controllers are more dedicated to the selection and triggering of streams of spatialised sound events, others have their own mappings and ways of acting on some parameters (depending on the temporality of the sounds: playing or future events).


Soft Matter ◽  
2021 ◽  
Author(s):  
Abigail Nolin ◽  
Amanda Licht ◽  
Kelly Pierson ◽  
Chun-Yuan Lo ◽  
Laure V. Kayser ◽  
...  

We control the sense of touch through materials chemistry. To find tactile materials, we developed methods to screen materials and found that humans could distinguish surface monolayers which differed by a single atom substitution.


Computers ◽  
2020 ◽  
Vol 9 (3) ◽  
pp. 75 ◽  
Author(s):  
Ruben Contreras ◽  
Angel Ayala ◽  
Francisco Cruz

Currently, unmanned aerial vehicles, such as drones, are becoming a part of our lives and extend to many areas of society, including the industrialized world. A common alternative for controlling the movements and actions of the drone is through unwired tactile interfaces, for which different remote control devices are used. However, control through such devices is not a natural, human-like communication interface, which some users find difficult to master. In this research, we experimented with a domain-based speech recognition architecture to effectively control an unmanned aerial vehicle such as a drone. The drone control was performed in a more natural, human-like way of communicating the instructions. Moreover, we implemented an algorithm for command interpretation in both Spanish and English, as well as for controlling the movements of the drone in a simulated domestic environment. We conducted experiments involving participants giving voice commands to the drone in both languages in order to compare the effectiveness of each, considering the mother tongue of the participants in the experiment. Additionally, different levels of distortion were applied to the voice commands to test the proposed approach when it encountered noisy input signals. The results obtained showed that the unmanned aerial vehicle was capable of interpreting user voice instructions. Speech-to-action recognition improved for both languages with phoneme matching, in comparison to only using the cloud-based algorithm without domain-based instructions. Using raw audio inputs, the cloud-based approach achieves 74.81% and 97.04% accuracy for English and Spanish instructions, respectively. However, with our phoneme matching approach the results are improved, yielding 93.33% accuracy for English and 100.00% accuracy for Spanish.
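The domain-based matching idea above, mapping a possibly noisy transcription onto a fixed command vocabulary, can be sketched with a similarity search over command strings. This is a simplified stand-in for the authors' phoneme matching: the command list is an assumption, and character-level similarity via `difflib.SequenceMatcher` substitutes for true phoneme comparison.

```python
# Hypothetical sketch of domain-based command interpretation: map a raw
# (possibly misrecognized) transcript to the most similar in-domain
# command. Character similarity stands in for phoneme-level matching;
# the command vocabulary below is assumed, not the authors' set.

from difflib import SequenceMatcher

COMMANDS = ["take off", "land", "move forward", "turn left", "turn right"]

def best_command(transcript: str) -> str:
    """Return the in-domain command most similar to the raw transcript."""
    return max(
        COMMANDS,
        key=lambda cmd: SequenceMatcher(None, transcript.lower(), cmd).ratio(),
    )
```

Constraining recognition to a small in-domain vocabulary is what allows noisy or distorted transcriptions to snap back to a valid command, which is consistent with the accuracy gains reported above.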


2020 ◽  
Author(s):  
Victor Oliveira ◽  
Anderson Maciel

This paper summarizes our study of tactile languages in human–computer interaction. We intended to analyze how the choices made during the design process of tactile vocabularies affect user performance on an interactive task. Therefore, we developed and tested different sets of tactile signals to aid navigation in virtual environments. This led us to fashion a novel approach to vibrotactile prefixation. Through this experiment-driven study, we examined the effects of multisensory stimulation, perception, learning and interpretation of tactile sequences, and masking caused by multiple vibrations in the same locus. The presented results should be useful for other designers to produce usable and expressive tactile interfaces.


2019 ◽  
Vol 12 (2) ◽  
pp. 145-162
Author(s):  
Mihaela Ştir ◽  
Adriana Zaiţ

Abstract: Online sales are increasing at an incredible pace all over the world, and so are the corresponding marketing efforts. One of the main deterrents of online selling is the impossibility of trying or touching products before deciding to buy. Previous studies in offline environments have shown that touching products makes people develop a feeling of ownership, a psychological sense of property that has positive consequences on their intention and decision to buy those products. Similar effects, adapted for online environments, have been less investigated, but the very few existing studies suggest that virtually touching a product through tactile interfaces (smartphone, iPad, tablet, etc.) could be as important for consumer decisions as the content of the site and the product information. Virtual touching could serve as an emotional trigger, leading to feelings of ownership and endowment effects in online marketing. However, defining the concept of “virtual touching” is difficult; even the simple association of “touch” and “virtual” seems oxymoronic. The purpose of the present study, a literature review, is to investigate tactile-based creative online marketing, in order to conceptualize and operationalize the variable “virtual touching” and thus be able to further suggest a research design that would enable us to measure the impact of online “touching” on consumer behaviour. The main analysed constructs related to virtual touching are: endowment effect, psychological ownership, haptic advertising, sensory online marketing, haptic imagery, haptic technology, and reverse electrovibration.


Author(s):  
Nuphar Katzman ◽  
Tal Oron-Gilad

Vibro-tactile interfaces can support users in various aspects and contexts. Despite their inherent advantages, it is important to realize that they are limited in the type and capacity of information they can convey. This study is part of a series of experiments that aim to develop and evaluate a “tactile taxonomy” for dismounted operational environments. The current experiment includes a simulation of an operational mission with a remote Unmanned Ground Vehicle (UGV). During the mission, 20 participants were required to interpret notifications that they received in one (or more) of the following modalities: auditory, visual, and/or tactile. Three specific notification types were chosen based on previous studies, in order to provide an intuitive connection between each notification and its semantic meaning. Response times to notifications, the ability to distinguish between the information types they provided, and operational mission performance metrics were collected. Results indicate that it is possible to use a limited “tactile taxonomy” in a visually loaded and auditorily noisy scene while performing a demanding operational task. The use of the tactile modality alongside other sensory modalities leverages the participants’ ability to perceive and identify the notifications.


2019 ◽  
Vol 11 (1) ◽  
pp. 01-09
Author(s):  
George Kokkonis ◽  
Vasileios Moysiadis ◽  
Theofano Kollatou ◽  
Sotirios Kontogiannis
