Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display

1994 ◽  
Vol 3 (1) ◽  
pp. 73-80 ◽  
Author(s):  
Motoyuki Akamatsu ◽  
Sigeru Sato ◽  
I. Scott MacKenzie

A mouse was modified to add tactile and force display. Tactile feedback, or display, was added via a solenoid driving a small pin protruding through a hole in the mouse button. Force feedback was added via an electromagnet and an iron mouse pad. Both enhancements were embedded in the mouse casing, increasing its weight from 103 to 148 g. In a target selection experiment, the addition of tactile feedback reduced target selection times slightly compared to the condition with no additional feedback. A more pronounced effect was observed on the clicking time, the time to selection once the cursor entered the target: here we observed a statistically significant speed-up of about 12% in the presence of tactile feedback. The modified mouse was also used in a test of virtual texture. The amplitude and frequency of the solenoid pulses were varied according to the movement of the mouse and the underlying virtual texture. Subjects could reliably discriminate between different textures.
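The virtual-texture behaviour described above lends itself to a compact control law. Below is a minimal Python sketch, assuming a simple grating model in which pulse frequency follows mouse speed divided by the texture's spatial period and pulse amplitude follows texture depth; the function name, the grating model, and all constants are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: solenoid pulse frequency and amplitude modulated by
# mouse velocity and the virtual texture under the cursor. Names and the
# grating model are assumptions, not the paper's implementation.

def texture_pulse(velocity_mm_s: float, spatial_period_mm: float,
                  depth: float) -> tuple[float, float]:
    """Return (pulse_frequency_hz, pulse_amplitude) for the tactile pin.

    velocity_mm_s     -- current mouse speed over the virtual surface
    spatial_period_mm -- grating period of the virtual texture under the cursor
    depth             -- texture roughness in [0, 1], scales pulse amplitude
    """
    # A grating traversed at speed v produces bumps at a rate of v / period.
    frequency_hz = velocity_mm_s / max(spatial_period_mm, 1e-6)
    amplitude = depth  # normalized drive level for the solenoid
    return frequency_hz, amplitude

# Example: moving at 80 mm/s over a 2 mm grating of moderate depth
freq, amp = texture_pulse(80.0, 2.0, 0.6)
print(f"pulse at {freq:.0f} Hz, amplitude {amp:.1f}")  # -> pulse at 40 Hz, amplitude 0.6
```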

2021 ◽  
Vol 18 (2) ◽  
pp. 1-13
Author(s):  
Wanjoo Park ◽  
Muhammad Hassan Jamil ◽  
Ruth Ghidey Gebremedhin ◽  
Mohamad Eid

Haptic technologies have become increasingly important in Human-Computer Interaction for improving user experience and performance. With the introduction of tactile feedback on touchscreen devices, commonly known as surface haptics, several applications and interaction paradigms have become a reality. However, the effects of tactile feedback on the preference for 2D images in a visuo-tactile exploration task on touchscreen devices remain largely unknown. In this study, we investigated differences in preference score (the tendency of participants to like/dislike a 2D image based on its visual and tactile properties), reach time, interaction time, and response time under four feedback conditions: no tactile feedback, high-quality tactile information (sharp tactile texture), low-quality tactile information (blurred tactile texture), and incorrect tactile information (mismatched tactile texture). The tactile feedback is rendered as roughness, simulated by modulating the friction between the finger and the surface and derived from the 2D image. Thirty-six participants completed visuo-tactile exploration tasks for a total of 36 trials (3 2D images × 4 tactile textures × 3 repetitions). Results showed that the presence of tactile feedback enhanced users' preference: the tactile feedback conditions were rated significantly higher than the no tactile feedback condition regardless of the quality or correctness of the feedback. This finding is also supported by self-reports, in which 88.89% of participants preferred to experience the 2D image with tactile feedback. Additionally, the presence of tactile feedback resulted in significantly longer interaction and response times compared to the no tactile feedback condition. Furthermore, the quality and correctness of the tactile information significantly affected the preference rating (sharp tactile textures were rated significantly higher than blurred and mismatched tactile textures). All of these findings demonstrate that tactile feedback plays a crucial role in users' preference and thus further motivate the development of surface haptic technologies.
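The roughness rendering described above (friction modulated from the 2D image, with a blurred variant for the low-quality condition) can be illustrated with a short sketch. The gradient-based mapping, the Gaussian blur, and the per-pixel lookup below are assumptions for illustration, not the authors' rendering pipeline.

```python
# Illustrative sketch only: deriving a per-pixel friction command from a 2D
# image for a friction-modulation (surface haptics) display. The gradient
# mapping and blur are assumptions, not the study's exact pipeline.
import numpy as np
from scipy import ndimage

def friction_map(image_gray: np.ndarray, blur_sigma: float = 0.0) -> np.ndarray:
    """Map a grayscale image (H x W, values in [0, 1]) to friction levels.

    blur_sigma > 0 approximates a blurred (low-quality) tactile texture.
    """
    if blur_sigma > 0:
        image_gray = ndimage.gaussian_filter(image_gray, sigma=blur_sigma)
    # Edges and fine detail translate to higher friction (rougher feel).
    gy, gx = np.gradient(image_gray)
    magnitude = np.hypot(gx, gy)
    if magnitude.max() > 0:
        magnitude = magnitude / magnitude.max()
    return magnitude  # per-pixel friction command in [0, 1]

def friction_at(friction: np.ndarray, x_px: int, y_px: int) -> float:
    """Friction command for the current finger position on the touchscreen."""
    h, w = friction.shape
    return float(friction[min(max(y_px, 0), h - 1), min(max(x_px, 0), w - 1)])
```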


Author(s):  
Atena Fadaei Jouybari ◽  
Matteo Franza ◽  
Oliver Alan Kannape ◽  
Masayuki Hara ◽  
Olaf Blanke

There is a steadily growing number of mobile communication systems that provide spatially encoded tactile information to the human torso. However, the increased use of such hands-off displays is currently not matched or supported by systematic perceptual characterization of tactile spatial discrimination on the torso. Furthermore, there are currently no data on spatial discrimination of dynamic force stimuli applied to the torso. In the present study, we measured tactile point localization (LOC) and tactile direction discrimination (DIR) on the thoracic spine using two unisex torso-worn tactile vests realized with arrays of 3 × 3 vibrotactile or force feedback actuators. We aimed, first, to evaluate and compare the spatial discrimination of vibrotactile and force stimulation on the thoracic spine and, second, to investigate the relationship between the LOC and DIR results across stimulation types. Thirty-four healthy participants performed both tasks with both vests. Tactile accuracies for vibrotactile and force stimulation were 60.7% and 54.6% in the LOC task and 71.0% and 67.7% in the DIR task, respectively. Performance with the two stimulation types was positively correlated, although accuracies were higher for vibrotactile than for force stimulation across tasks, arguably due to specific properties of vibrotactile stimulation. We observed comparable directional anisotropies in the LOC results for both stimulation types; however, anisotropies in the DIR task were observed only with vibrotactile stimulation. We discuss our findings with respect to tactile perception research as well as their implications for the design of high-resolution torso-mounted tactile displays for spatial cueing.
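As an aside on how such a vest might be driven, the sketch below shows one simple way a direction cue could be rendered on a 3 × 3 actuator array, by pulsing actuators along the middle row or column in sequence. This is purely illustrative; the study's actual stimulus timing and actuator control are not described here.

```python
# Illustrative only: rendering an up/down/left/right cue on a 3 x 3 actuator
# array by pulsing actuators in sequence. Not the study's stimulus protocol.
import time

def direction_sequence(direction: str) -> list[tuple[int, int]]:
    """Ordered (row, col) actuator indices for a sweep along the middle line."""
    sequences = {
        "up":    [(2, 1), (1, 1), (0, 1)],
        "down":  [(0, 1), (1, 1), (2, 1)],
        "left":  [(1, 2), (1, 1), (1, 0)],
        "right": [(1, 0), (1, 1), (1, 2)],
    }
    if direction not in sequences:
        raise ValueError(f"unknown direction: {direction}")
    return sequences[direction]

def play(direction: str, pulse_s: float = 0.1) -> None:
    """Pulse each actuator in turn; the print stands in for the vest driver."""
    for row, col in direction_sequence(direction):
        print(f"pulse actuator ({row}, {col})")
        time.sleep(pulse_s)
```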


2021 ◽  
Vol 5 (ISS) ◽  
pp. 1-17
Author(s):  
Yosra Rekik ◽  
Edward Lank ◽  
Adnane Guettaf ◽  
Laurent Grisoni

Alongside vision and sound, hardware systems can readily be designed to support various forms of tactile feedback; however, while a significant body of work has explored enriching visual and auditory communication with interactive systems, tactile information has not received the same level of attention. In this work, we explore increasing the expressivity of tactile feedback by allowing the user to dynamically select between several channels of tactile feedback using variations in finger speed. In a controlled experiment, we show that users can learn the dynamics of eyes-free tactile channel selection among different channels and can reliably discriminate between different tactile patterns during multi-channel selection with an accuracy of up to 90% when using two finger speed levels. We discuss the implications of this work for richer, more interactive tactile interfaces.
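A minimal sketch of the selection mechanism described above, assuming two finger-speed levels separated by a single threshold; the threshold value, channel names, and pattern encodings are hypothetical and only illustrate the idea of speed-based channel selection.

```python
# Sketch of finger-speed-based tactile channel selection (two speed levels).
# The threshold and channel contents are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TactileChannel:
    name: str
    pattern: list[float]  # normalized vibration amplitudes over time

SLOW_FAST_THRESHOLD_MM_S = 120.0  # assumed boundary between the two speed levels

def select_channel(finger_speed_mm_s: float,
                   slow_channel: TactileChannel,
                   fast_channel: TactileChannel) -> TactileChannel:
    """Pick the tactile channel to play back from the current finger speed."""
    if finger_speed_mm_s >= SLOW_FAST_THRESHOLD_MM_S:
        return fast_channel
    return slow_channel

# Example usage with two hypothetical channels
notifications = TactileChannel("notifications", [1.0, 0.0, 1.0, 0.0])
navigation = TactileChannel("navigation", [0.3, 0.6, 1.0, 0.6, 0.3])
print(select_channel(60.0, notifications, navigation).name)   # notifications
print(select_channel(200.0, notifications, navigation).name)  # navigation
```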


Author(s):  
Henrik Skovsgaard ◽  
Kari-Jouko Räihä ◽  
Martin Tall

This chapter provides an overview of gaze-based interaction techniques. We will first explore specific techniques intended to make target selection easier and to avoid the Midas touch problem. We will then take a look at techniques that do not require the use of special widgets in the interface but instead manipulate the rendering on the basis of eye gaze to facilitate the selection of small targets. Dwell-based interaction makes use of fixations; recent research has looked into the other option, using saccades as the basis for eye gestures. We will also discuss examples of how eye gaze has been used with other input modalities (blinks and winks, keyboard and mouse, facial gestures, head movements, and speech) to speed up interaction. Finally, we will discuss examples of interaction techniques in the context of a specific area of application: navigating information spaces.
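Dwell-based selection, mentioned above as the classic guard against the Midas touch problem, reduces to a simple timer over fixations. The sketch below assumes a stream of gaze samples; the dwell time and dispersion threshold are illustrative values, not recommendations from the chapter.

```python
# Minimal dwell-time selection over a stream of gaze samples. Thresholds and
# the dispersion test are illustrative assumptions.
DWELL_TIME_S = 0.5     # assumed dwell threshold before a selection triggers
DISPERSION_PX = 30.0   # gaze must stay within this radius to count as a fixation

def dwell_select(gaze_samples):
    """gaze_samples: iterable of (timestamp_s, x_px, y_px) tuples.
    Returns the (x, y) of a completed dwell selection, or None."""
    anchor = None
    start = None
    for t, x, y in gaze_samples:
        if anchor is None or abs(x - anchor[0]) > DISPERSION_PX or abs(y - anchor[1]) > DISPERSION_PX:
            anchor, start = (x, y), t   # gaze moved: restart the dwell timer
            continue
        if t - start >= DWELL_TIME_S:
            return anchor               # dwell completed: trigger selection
    return None
```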


2019 ◽  
Vol 4 (27) ◽  
pp. eaau8892 ◽  
Author(s):  
Edoardo D’Anna ◽  
Giacomo Valle ◽  
Alberto Mazzoni ◽  
Ivo Strauss ◽  
Francesco Iberite ◽  
...  

Current myoelectric prostheses allow transradial amputees to regain voluntary motor control of their artificial limb by exploiting residual muscle function in the forearm. However, the overreliance on visual cues that results from a lack of sensory feedback is a common complaint. Recently, several groups have provided tactile feedback to upper limb amputees using implanted electrodes, surface nerve stimulation, or sensory substitution. These approaches have led to improved function and prosthesis embodiment. Nevertheless, the information provided remains limited to a subset of the rich sensory cues available to healthy individuals. More specifically, proprioception, the sense of limb position and movement, is predominantly absent from current systems. Here, we show that sensory substitution based on intraneural stimulation can deliver position feedback in real time and in conjunction with somatotopic tactile feedback. This approach allowed two transradial amputees to regain high, close-to-natural remapped proprioceptive acuity, with a median joint-angle reproduction precision of 9.1° and a median threshold for the detection of passive movements of 9.5°, comparable with results obtained in healthy participants. The simultaneous delivery of position information and somatotopic tactile feedback allowed both amputees to discriminate the size and compliance of four objects with high levels of performance (75.5%). These results demonstrate that tactile information delivered via somatotopic neural stimulation and position information delivered via sensory substitution can be exploited simultaneously and efficiently by transradial amputees. This study paves the way toward more sophisticated bidirectional bionic limbs conveying richer, multimodal sensations.
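One way to picture the sensory-substitution encoding described above is a simple monotonic map from measured joint angle to stimulation intensity on a non-somatotopic intraneural channel. The linear mapping and every numeric value below are illustrative assumptions, not the study's calibrated encoding.

```python
# Hedged sketch: remapping joint position onto stimulation intensity.
# The linear law and parameter values are assumptions, not the study's encoding.
def position_to_stimulation(joint_angle_deg: float,
                            angle_min_deg: float = 0.0,
                            angle_max_deg: float = 90.0,
                            charge_min_nc: float = 5.0,
                            charge_max_nc: float = 30.0) -> float:
    """Linearly map a measured joint angle to an injected charge per pulse (nC)."""
    span = angle_max_deg - angle_min_deg
    ratio = min(max((joint_angle_deg - angle_min_deg) / span, 0.0), 1.0)
    return charge_min_nc + ratio * (charge_max_nc - charge_min_nc)

# Example: a half-flexed joint maps to the middle of the assumed charge range
print(position_to_stimulation(45.0))  # -> 17.5
```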


Robotica ◽  
2002 ◽  
Vol 20 (2) ◽  
pp. 213-221 ◽  
Author(s):  
Vicente Mut ◽  
José Postigo ◽  
Emanuel Slawiñski ◽  
Benjamin Kuchen

This paper proposes a control structure for the bilateral teleoperation of mobile robots with tactile feedback and visual display of the interaction force. An impedance controller is also implemented on the mobile robot that keeps the linear velocity within a desired fixed range without saturating the actuators. To illustrate the performance of the proposed control structure, experiments on a Pioneer 2 mobile robot teleoperated with a commercial force-feedback joystick are shown.
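The velocity-shaping idea in this abstract can be sketched as a first-order impedance on the commanded linear velocity followed by a hard clamp to the admissible range. The mass, damping, and limit values below are placeholders; the paper's actual controller and gains are not reproduced here.

```python
# Simplified sketch: impedance-shaped linear velocity with clamping so the
# command stays inside the actuator range. Gains and limits are placeholders.
def impedance_velocity(v_command_m_s: float, interaction_force_n: float,
                       v_state_m_s: float, dt_s: float,
                       mass: float = 5.0, damping: float = 10.0,
                       v_max_m_s: float = 0.5) -> float:
    """One integration step of a mass-damper impedance on the linear velocity.

    Model: mass * dv/dt + damping * (v - v_command) = -interaction_force
    """
    dv = (-interaction_force_n - damping * (v_state_m_s - v_command_m_s)) / mass
    v_new = v_state_m_s + dv * dt_s
    # Clamp to the admissible range so downstream actuators never saturate.
    return max(-v_max_m_s, min(v_max_m_s, v_new))

# Example: operator commands 0.8 m/s while pushing against a 2 N obstacle force
v = 0.0
for _ in range(200):                      # 2 s at a 10 ms control period
    v = impedance_velocity(0.8, 2.0, v, 0.01)
print(round(v, 3))                        # settles at the 0.5 m/s limit
```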


2020 ◽  
Vol 34 (10) ◽  
pp. 891-903
Author(s):  
Shu-Han Yu ◽  
Ruey-Meei Wu ◽  
Cheng-Ya Huang

Background: Restricted attentional resources and central processing in patients with Parkinson's disease (PD) may reduce the benefit of visual feedback in a dual task. Objectives: Using brain event-related potentials (ERPs), this study aims to investigate the neural mechanisms of postural visual feedback and suprapostural visual feedback during performance of a posture-motor dual task. Methods: Eighteen patients with PD and 18 healthy controls stood on a mobile platform (postural task) and concurrently executed a manual force-matching task (suprapostural task) while visual feedback of either the platform movement (posture-feedback condition) or the force output (force-feedback condition) was provided. Platform movement, force-matching performance, and ERPs (P1, N1, and P2 waves) were recorded. Results: Both the PD and control groups had superior force accuracy in the force-feedback condition. Decreased postural sway with posture feedback was observed in healthy controls but not in patients with PD. Force feedback led to a greater frontal N1 peak in the PD group but smaller N1 peaks in the control group. In addition, force feedback led to smaller P2 peaks over the frontal and sensorimotor areas in PD patients but greater P2 peaks over the sensorimotor and parietal-occipital areas in healthy controls. However, P1 modulation was present only in healthy controls. Conclusions: Force feedback had a positive effect on force accuracy in both PD patients and healthy individuals; however, the beneficial effect of posture feedback on postural balance was not observed in PD. These findings are the first to suggest that patients with PD may recruit more attentional resources during dual-task preparation to enhance suprapostural accuracy and avoid degrading postural stability under suprapostural visual feedback.


2006 ◽  
Vol 5-6 ◽  
pp. 55-62
Author(s):  
I.A. Jones ◽  
A.A. Becker ◽  
A.T. Glover ◽  
P. Wang ◽  
S.D. Benford ◽  
...  

Boundary element (BE) analysis is well known as a tool for assessing the stiffness and strength of engineering components but, along with finite element (FE) techniques, it is also finding new applications as a means of simulating the behaviour of deformable objects within virtual reality, since it exploits precisely the same kind of surface-only definition used for the visual rendering of three-dimensional solid objects. This paper briefly reviews existing applications of BE and FE within virtual reality and describes recent work on the BE-based simulation of aspects of surgical operations on the brain, making use of commercial hand-held force-feedback interfaces (haptic devices) to measure the positions of the virtual surgical tools and to provide tactile feedback to the user. The paper presents an overview of the project and then concentrates on recent developments, including the incorporation of simulated tumours in the virtual brain.
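To make the haptic-rate requirement concrete, BE models used this way are often condensed offline into a boundary stiffness (or compliance) matrix so that, at run time, the force returned to the device is a small matrix-vector product. The sketch below is a toy illustration of that pattern, with placeholder values; it is not the project's brain model or code.

```python
# Toy illustration: a BE model condensed offline into a boundary stiffness
# matrix, queried at haptic rates. Values are placeholders, not the project's model.
import numpy as np

class CondensedBEModel:
    def __init__(self, stiffness: np.ndarray, node_positions: np.ndarray):
        self.K = stiffness           # (3n x 3n) condensed boundary stiffness
        self.nodes = node_positions  # (n x 3) boundary node coordinates

    def contact_force(self, node_index: int, tool_penetration: np.ndarray) -> np.ndarray:
        """Nodal force at the contacted node for a prescribed penetration
        (other boundary displacements held at zero, a deliberate simplification)."""
        u = np.zeros(self.K.shape[0])
        u[3 * node_index: 3 * node_index + 3] = tool_penetration
        f = self.K @ u
        return f[3 * node_index: 3 * node_index + 3]

# Example with a two-node toy model and a 2 mm tool penetration along -z
model = CondensedBEModel(np.eye(6) * 200.0, np.zeros((2, 3)))
print(model.contact_force(0, np.array([0.0, 0.0, -0.002])))  # -> [ 0.  0. -0.4]
```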


2014 ◽  
Vol 39 (3) ◽  
pp. 204-212 ◽  
Author(s):  
Heidi JB Witteveen ◽  
Hans S Rietman ◽  
Peter H Veltink

Background: User feedback about grasping force and hand aperture is very important for object handling with myoelectric forearm prostheses but is lacking in current prostheses. Vibrotactile feedback increases the performance of healthy subjects in virtual grasping tasks, but no extensive validation with potential users has been performed. Objectives: To investigate the performance of subjects with upper-limb loss in grasping tasks with vibrotactile stimulation providing hand aperture and grasping force feedback. Study design: Cross-over trial. Methods: A total of 10 subjects with upper-limb loss performed virtual grasping tasks while perceiving vibrotactile feedback. Hand aperture feedback was provided through an array of coin motors, and grasping force feedback through either a single miniature stimulator or an array of coin motors. Objects of varying sizes and weights had to be grasped by a virtual hand. Results: The percentages of correctly applied hand apertures and correct grasping force levels were all higher in the vibrotactile feedback condition than in the no-feedback condition. With visual feedback, results were always better than in the vibrotactile feedback condition. Task durations were comparable across all feedback conditions. Conclusion: Vibrotactile grasping force and hand aperture feedback improves the grasping performance of subjects with upper-limb loss. However, whether this is of additional value in daily-life tasks remains to be investigated. Clinical relevance: This study is a first step toward the implementation of vibrotactile sensory feedback for users of myoelectric forearm prostheses. Grasping force feedback is crucial for optimal object handling, and hand aperture feedback is essential for reducing the required visual attention. Grasping performance with feedback is evaluated here for potential users.
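The two encodings described in the abstract (an array of coin motors for hand aperture, discrete intensity levels for grasping force) can be summarized in a couple of mapping functions. The array size, number of force levels, and ranges below are assumptions for illustration, not the study's exact configuration.

```python
# Illustrative mappings: hand aperture -> active coin motor in a linear array,
# grasping force -> discrete vibration level. Parameters are assumptions.
def aperture_to_motor(aperture_mm: float, aperture_max_mm: float = 100.0,
                      n_motors: int = 8) -> int:
    """Index of the coin motor to activate for the current hand aperture."""
    ratio = min(max(aperture_mm / aperture_max_mm, 0.0), 1.0)
    return min(int(ratio * n_motors), n_motors - 1)

def force_to_level(force_n: float, force_max_n: float = 30.0,
                   n_levels: int = 4) -> int:
    """Discrete vibration intensity level encoding the current grasping force."""
    ratio = min(max(force_n / force_max_n, 0.0), 1.0)
    return min(int(ratio * n_levels), n_levels - 1)

# Example: a half-open hand and a light grasp
print(aperture_to_motor(50.0))  # -> 4 (motor 4 of 8)
print(force_to_level(5.0))      # -> 0 (lightest of 4 levels)
```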

