Visual Feedback And Pseudo-Haptic Feedback Improve Manual Lifting Performance

2008 ◽  
Vol 49 (1) ◽  
Author(s):  
Faieza Abdul Aziz ◽  
D. T. Pham ◽  
Shamsuddin Sulaiman ◽  
Napsiah Ismail ◽  
Mohd Khairol Anuar Ariffin ◽  
...  
2019 ◽  
Vol 121 (4) ◽  
pp. 1543-1560 ◽  
Author(s):  
Robert W. Nickl ◽  
M. Mert Ankarali ◽  
Noah J. Cowan

Volitional rhythmic motor behaviors such as limb cycling and locomotion exhibit spatial and timing regularity. Such rhythmic movements are executed in the presence of exogenous visual and nonvisual cues, and previous studies have shown the pivotal role that vision plays in guiding spatial and temporal regulation. However, the influence of nonvisual information conveyed through auditory or touch sensory pathways, and its effect on control, remains poorly understood. To characterize the function of nonvisual feedback in rhythmic arm control, we designed a paddle juggling task in which volunteers bounced a ball off a rigid elastic surface to a target height in virtual reality by moving a physical handle with the right hand. Feedback was delivered at two key phases of movement: visual feedback at ball peaks only and simultaneous audio and haptic feedback at ball-paddle collisions. In contrast to previous work, we limited visual feedback to the minimum required for jugglers to assess spatial accuracy, and we independently perturbed the spatial dimensions and the timing of feedback. By separately perturbing this information, we evoked dissociable effects on spatial accuracy and timing, confirming that juggling, and potentially other rhythmic tasks, involves two complementary processes with distinct dynamics: spatial error correction and feedback timing synchronization. Moreover, we show evidence that audio and haptic feedback provide sufficient information for the brain to control the timing synchronization process by acting as a metronome-like cue that triggers hand movement.

NEW & NOTEWORTHY Vision contains rich information for control of rhythmic arm movements; less is known, however, about the role of nonvisual feedback (touch and sound). Using a virtual ball bouncing task allowing independent real-time manipulation of spatial location and timing of cues, we show their dissociable roles in regulating motor behavior. We confirm that visual feedback is used to correct spatial error and provide new evidence that nonvisual event cues act to reset the timing of arm movements.
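The ball-paddle dynamics of such a juggling task reduce, in one dimension, to gravity plus a partially elastic collision with a moving paddle: the ball's next peak height is set by the paddle's velocity at impact. A minimal sketch (the restitution coefficient is an illustrative assumption, not the study's parameter):

```python
# Simplified 1-D bounce model for a virtual paddle-juggling task.
# Velocities are positive upward; this is a sketch, not the authors' code.

G = 9.81  # gravitational acceleration, m/s^2

def bounce_velocity(v_ball, v_paddle, restitution=0.8):
    """Post-impact ball velocity for a partially elastic collision
    with a moving paddle (relative velocity scaled by restitution)."""
    return v_paddle + restitution * (v_paddle - v_ball)

def peak_height(v_up):
    """Apex height reached by a ball launched upward at v_up (m/s)."""
    return v_up ** 2 / (2 * G)

# A ball falling at 3 m/s struck by a paddle moving up at 1 m/s:
v_after = bounce_velocity(-3.0, 1.0)  # 1 + 0.8 * (1 - (-3)) = 4.2 m/s
apex = peak_height(v_after)           # about 0.9 m
```

Perturbing either the displayed peak (spatial feedback) or the collision cue time (timing feedback) in such a model dissociates the two control processes the abstract describes.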


2005 ◽  
Vol 14 (6) ◽  
pp. 677-696 ◽  
Author(s):  
Christoph W. Borst ◽  
Richard A. Volz

We present a haptic feedback technique that combines feedback from a portable force-feedback glove with feedback from direct contact with rigid passive objects. This approach is a haptic analogue of visual mixed reality, since it can be used to haptically combine real and virtual elements in a single display. We discuss device limitations that motivated this combined approach and summarize technological challenges encountered. We present three experiments to evaluate the approach for interactions with buttons and sliders on a virtual control panel. In our first experiment, this approach resulted in better task performance and better subjective ratings than the use of only a force-feedback glove. In our second experiment, visual feedback was degraded and the combined approach resulted in better performance than the glove-only approach and in better ratings of slider interactions than both glove-only and passive-only approaches. A third experiment allowed subjective comparison of approaches and provided additional evidence that the combined approach provides the best experience.


2021 ◽  
Vol 15 ◽  
Author(s):  
Charles H. Moore ◽  
Sierra F. Corbin ◽  
Riley Mayr ◽  
Kevin Shockley ◽  
Paula L. Silva ◽  
...  

Upper-limb prostheses are subject to high rates of abandonment. Prosthesis abandonment is related to a reduced sense of embodiment, the sense of self-location, agency, and ownership that humans feel in relation to their bodies and body parts. If a prosthesis does not evoke a sense of embodiment, users are less likely to view it as useful and integrated with their bodies. Currently, visual feedback is the only option for most prosthesis users to monitor their augmented activities. However, for activities of daily living, such as grasping actions, haptic feedback is critically important and may improve sense of embodiment. Therefore, we investigated whether converting natural haptic feedback from the prosthetic fingertips into vibrotactile feedback administered to another location on the body allows participants to experience haptic feedback, and if so, how this experience affects embodiment. While we found no differences between our experimental manipulations of feedback type, we found evidence that embodiment was not negatively impacted when switching from natural feedback to proximal vibrotactile feedback. Proximal vibrotactile feedback should be further studied and considered when designing prostheses.
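The remapping studied here, fingertip contact force to a proximally mounted vibrotactile signal, can be sketched as a saturating linear map; the 10 N force ceiling and the normalized amplitude scale below are illustrative assumptions, not the study's hardware parameters:

```python
def force_to_vibration(force_n, f_max=10.0, amp_max=1.0):
    """Map a fingertip contact force (newtons) to a normalized
    vibration amplitude for a tactor worn elsewhere on the body.
    Linear mapping with clipping at zero and at f_max."""
    clipped = max(0.0, min(force_n, f_max))
    return amp_max * clipped / f_max

# Half the maximum grip force drives the tactor at half amplitude:
amp = force_to_vibration(5.0)  # 0.5
```

Real systems often use nonlinear or frequency-coded mappings; the point of the sketch is only that the haptic event at the fingertip is relocated, not lost.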


Author(s):  
Ryan McColl ◽  
Ian Brown ◽  
Cory Seligman ◽  
Fabian Lim ◽  
Amer Alsaraira

This project concerns the application of haptic feedback to a virtual reality laparoscopic surgery simulator. It investigates the hardware required to display haptic forces, and the software required to generate realistic and stable haptic properties. A number of surgery-based studies are undertaken using the developed haptic device. The human sense of touch, or haptic sensory system, is investigated in the context of laparoscopic surgery, where the long laparoscopic instruments reduce haptic sensation. Nonetheless, the sense of touch plays a vital role in navigation, palpation, cutting, tissue manipulation, and pathology detection in surgery. The overall haptic effect has been decomposed into a finite number of haptic attributes. The haptic attributes of mass, friction, stiction, elasticity, and viscosity are individually modeled, validated, and applied to virtual anatomical objects in visual simulations. There are times in surgery when the view from the camera cannot be depended upon. When visual feedback is impeded, haptic feedback must be relied upon more by the surgeon. A realistic simulator should include some sort of visual impedance. Results from a simple tissue holding task suggested the inclusion of haptic feedback in a simulator aids the user when visual feedback is impeded.
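The decomposition into individually modeled haptic attributes (elasticity, viscosity, friction, stiction) can be illustrated with a minimal spring-damper plus Coulomb-friction sketch; the gains and velocity threshold below are illustrative assumptions, not the simulator's calibrated tissue models:

```python
def haptic_force(penetration, velocity, k=200.0, b=2.0):
    """Normal contact force on the instrument tip: elasticity (spring,
    gain k) resists penetration depth, viscosity (damper, gain b)
    resists penetration velocity. No force outside contact."""
    if penetration <= 0.0:
        return 0.0
    return k * penetration + b * velocity

def friction_force(tangential_v, normal_force, mu_s=0.6, mu_k=0.4,
                   v_stick=1e-3):
    """Tangential force: stiction (ramped by mu_s) below a small
    velocity threshold, kinetic Coulomb friction (mu_k) above it,
    always opposing tangential motion."""
    if abs(tangential_v) < v_stick:
        return -mu_s * normal_force * (tangential_v / v_stick)
    sign = 1.0 if tangential_v > 0 else -1.0
    return -sign * mu_k * normal_force

# 1 cm penetration at rest yields a 2 N restoring force:
f_n = haptic_force(0.01, 0.0)          # 2.0 N
f_t = friction_force(1.0, f_n * 5.0)   # kinetic friction opposing motion
```

Summing independently validated attribute forces like these is one standard way to compose a stable overall haptic effect.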


Author(s):  
Bruce H. Thomas

Cartoon animation techniques have previously been used to enhance the illusion of direct manipulation in 2D graphical user interfaces. In particular, animation may be used to convey a feeling of substance to the objects being manipulated by the user. To lay a solid framework for this work, an extensive review of current applications of animation to user interfaces is presented. This chapter goes on to present an expansion of the 2D animation concepts to the domain of 3D interfaces for multimedia and virtual reality. This chapter focuses on the improvement of the legibility of users’ actions in 3D multimedia and virtual reality applications, and details animation effects to support this legibility. In particular, I present animation effects for 3D graphical object manipulation. These effects include a standard set of 3D direct manipulation operations which have been extended to include animated visual feedback to add substance, operation cues for the user and constraint visualisation. The visual feedback effects using 3D warping can substitute for haptic feedback, as in the case of the squashing of an object when pressed against a wall, or the stretching of an object to show frictional forces. Finally, a pinning effect is explored for multiple users manipulating a common object in a collaborative environment.


1999 ◽  
Author(s):  
Wan-Chen Wu ◽  
Cagatay Basdogan ◽  
Mandayam A. Srinivasan

Abstract Human psychophysical experiments were designed and conducted to investigate the effect of 3D perspective visual images on the visual and haptic perception of size and stiffness in multimodal virtual environments (VEs). Virtual slots of varying length and buttons of varying stiffness were displayed to the subjects, who then were asked to discriminate their size and stiffness respectively using visual and/or haptic cues. The results of the size experiments show that under vision alone, farther objects are perceived to be smaller due to perspective cues and the addition of haptic feedback reduces this visual bias. Similarly, the results of the stiffness experiments show that compliant objects that are farther are perceived to be softer when there is only haptic feedback and the addition of visual feedback reduces this haptic bias. Hence, we conclude that our visual and haptic systems compensate for each other such that the sensory information that comes from visual and haptic channels is fused in an optimal manner.
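The "optimal manner" of visual-haptic fusion described here is commonly modeled as maximum-likelihood cue combination, in which each channel's estimate is weighted by its inverse variance (reliability). A minimal sketch of that model, not the authors' analysis code:

```python
def fuse_estimates(mu_v, var_v, mu_h, var_h):
    """Maximum-likelihood fusion of a visual estimate (mu_v, var_v)
    and a haptic estimate (mu_h, var_h) of the same property.
    Returns the fused mean and its (reduced) variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
    mu = w_v * mu_v + (1.0 - w_v) * mu_h
    var = 1.0 / (1.0 / var_v + 1.0 / var_h)
    return mu, var

# Equally reliable cues: the fused estimate splits the difference,
# and its variance is halved relative to either cue alone.
mu, var = fuse_estimates(10.0, 1.0, 12.0, 1.0)  # (11.0, 0.5)
```

Under this model, adding the haptic cue pulls the fused estimate away from the perspective-induced visual bias, matching the compensation the experiments report.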


2014 ◽  
Vol 26 (5) ◽  
pp. 580-591 ◽  
Author(s):  
Robert M. Philbrick ◽  
Mark B. Colton

<div class=""abs_img""><img src=""[disp_template_path]/JRM/abst-image/00260005/06.jpg"" width=""300"" />Haptic and audio 3D feedback</div> Unmanned aerial vehicles (UAVs) have many potential applications in indoor environments. However, limited visual feedback makes it difficult to pilot UAVs in cluttered and enclosed spaces. Haptic feedback combined with visual feedback has been shown to reduce the number of collisions of UAVs in indoor environments, but has generally resulted in an increase in the mental workload of the operator. This paper investigates the potential of combining novel haptic and 3D audio feedback to provide additional information to operators of UAVs to improve performance and reduce workload. Two haptic feedback and two 3D audio feedback algorithms are presented and tested in a simulation-based human subject experiment. Operator workload is quantified using standard measures and a novel application of behavioral entropy. Experimental results indicate that 3D haptic feedback improved UAV pilot performance. Pilot workload was also improved for one of the haptic algorithms in one of the control directions (lateral). The 3D audio feedback algorithms investigated in this study neither improved nor degraded pilot performance. </span>


2009 ◽  
Vol 18 (1) ◽  
pp. 39-53 ◽  
Author(s):  
Anatole Lécuyer

This paper presents a survey of the main results obtained in the field of “pseudo-haptic feedback”: a technique meant to simulate haptic sensations in virtual environments using visual feedback and properties of human visuo-haptic perception. Pseudo-haptic feedback uses vision to distort haptic perception and verges on haptic illusions. Pseudo-haptic feedback has been used to simulate various haptic properties such as the stiffness of a virtual spring, the texture of an image, or the mass of a virtual object. This paper describes several experiments in which these haptic properties were simulated. It assesses the definition and the properties of pseudo-haptic feedback. It also describes several virtual reality applications in which pseudo-haptic feedback has been successfully implemented, such as a virtual environment for vocational training of milling machine operations, or a medical simulator for training in regional anesthesia procedures.
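A common realization of pseudo-haptic stiffness is to modulate the control/display (C/D) ratio: for the same physical hand motion, a stiffer virtual object is drawn moving less, which biases perceived resistance. A minimal sketch under that assumption (the reference stiffness is an illustrative value, not a parameter from the surveyed work):

```python
def displayed_displacement(device_motion, stiffness, k_ref=1.0):
    """Pseudo-haptic stiffness via the control/display ratio: scale
    the on-screen displacement inversely with virtual stiffness, so a
    stiffer spring appears to yield less for the same device motion."""
    return device_motion * (k_ref / stiffness)

# Pressing a virtual spring twice as stiff as the reference shows
# half the on-screen compression for the same hand movement:
shown = displayed_displacement(10.0, stiffness=2.0)  # 5.0
```

The same inverse scaling, applied to cursor speed over an image, is how pseudo-haptic texture (bumps and holes) has been rendered without any force-feedback device.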


2014 ◽  
Vol 233 (3) ◽  
pp. 909-925 ◽  
Author(s):  
Roland Sigrist ◽  
Georg Rauter ◽  
Laura Marchal-Crespo ◽  
Robert Riener ◽  
Peter Wolf

2014 ◽  
Vol 112 (12) ◽  
pp. 3189-3196 ◽  
Author(s):  
Chiara Bozzacchi ◽  
Robert Volcic ◽  
Fulvio Domini

Perceptual estimates of three-dimensional (3D) properties, such as the distance and depth of an object, are often inaccurate. Given the accuracy and ease with which we pick up objects, it may be expected that perceptual distortions do not affect how the brain processes 3D information for reach-to-grasp movements. Nonetheless, empirical results show that grasping accuracy is reduced when visual feedback of the hand is removed. Here we studied whether specific types of training could correct grasping behavior to perform adequately even when any form of feedback is absent. Using a block design paradigm, we recorded the movement kinematics of subjects grasping virtual objects located at different distances in the absence of visual feedback of the hand and haptic feedback of the object, before and after different training blocks with different feedback combinations (vision of the thumb and vision of thumb and index finger, with and without tactile feedback of the object). In the Pretraining block, we found systematic biases of the terminal hand position, the final grip aperture, and the maximum grip aperture like those reported in perceptual tasks. Importantly, the distance at which the object was presented modulated all these biases. In the Posttraining blocks only the hand position was partially adjusted, but final and maximum grip apertures remained unchanged. These findings show that when visual and haptic feedback are absent systematic distortions of 3D estimates affect reach-to-grasp movements in the same way as they affect perceptual estimates. Most importantly, accuracy cannot be learned, even after extensive training with feedback.

