The Effect of Dissociation between Proprioception and Vision on Perception and Grip Force Control in a Stiffness Judgment Task

2018 ◽  
Author(s):  
Stephanie Hu ◽  
Raz Leib ◽  
Ilana Nisky

Abstract Our sensorimotor system estimates stiffness both to form stiffness perception, such as when choosing a ripe fruit, and to generate actions, such as adjusting grip force to avoid slippage of a scalpel during surgery. We examined how temporal manipulation of haptic and visual feedback affects stiffness perception and grip force adjustment during a stiffness discrimination task. We used delayed force feedback and delayed visual feedback to break the natural relation between these modalities while participants tried to choose the harder spring in pairs of springs. We found that visual delay caused participants to slightly overestimate stiffness, whereas force feedback delay had a mixed effect on perception: some participants underestimated and others overestimated stiffness. Interestingly, and in contrast to previous findings without vision, participants increased the magnitude of their applied grip force in all conditions. We propose a model suggesting that this increase resulted from coupling the grip force adjustment to the proprioceptive hand position, the only modality we could not delay. Our findings shed light on how the sensorimotor system combines information from different sensory modalities for perception and action. These results are important for the design of improved teleoperation systems, which suffer from unavoidable delays.
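The proposed coupling can be illustrated with a toy simulation (all parameters and the sinusoidal probing profile are assumptions for illustration, not the paper's model): the spring's load force reaches the hand with a delay, but grip force tracks the undelayed proprioceptive hand position, so grip modulation stays in phase with position rather than with the felt force.

```python
import math

def simulate_probe(k=100.0, delay_s=0.05, dt=0.001, t_total=1.0,
                   gf_base=1.0, gf_gain=0.01):
    """Toy simulation: the hand probes a spring sinusoidally; force
    feedback arrives delayed, but grip force is coupled to the
    undelayed proprioceptive position (illustrative values only)."""
    n = int(t_total / dt)
    d = int(delay_s / dt)
    pos = [0.01 * (1 - math.cos(2 * math.pi * i * dt)) for i in range(n)]  # m
    load = [k * x for x in pos]                       # true spring force (N)
    delayed_load = [0.0] * d + load[:n - d]           # force the hand feels
    grip = [gf_base + gf_gain * k * x for x in pos]   # tied to hand position
    return pos, load, delayed_load, grip
```

In this sketch the grip force peaks together with hand position (and true load), while the delayed force feedback still lags, mirroring the idea that position, not the delayed force signal, drives grip adjustment.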

2012 ◽  
Vol 36 (4) ◽  
pp. 423-429 ◽  
Author(s):  
Erik D Engeberg ◽  
Sanford Meek

Background: Upper limb amputees have no direct sense of the grip force applied by a prosthetic hand; thus, precise control of the applied grip force is difficult for amputees. Since there is little object deformation when rigid objects are grasped, it is difficult for amputees to visually gauge the applied grip force in this situation. Objectives: To determine if the applied grip force from a prosthetic hand can be visually displayed and used to grasp objects more efficaciously. Study Design: Experimental controlled trial. Methods: Force feedback is used in the control algorithm for the prosthetic hand and supplied visually to the user through a bicolor LED experimentally mounted to the thumb. Several experiments are performed by able-bodied test subjects to rate the usefulness of the additional visual feedback when manipulating a clearly visible, brittle object that can break if grasped too firmly. A hybrid force-velocity sliding mode controller is used with and without additional visual force feedback supplied to the operators. Results: Subjective evaluations and success rates from the test subjects indicate a statistically significant reduction in breaking the grasped object when using the prosthesis with the extra visual feedback. Conclusions: The additional visual force feedback can effectively facilitate the manipulation of brittle objects. Clinical relevance: The novel approach of this research is the implementation of a noninvasive, effective and economical technique to visually indicate the grip force applied by a prosthetic hand to upper limb amputees. This technique provides a statistically significant improvement when handling brittle objects.
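A bicolor-LED force display of this kind amounts to a simple threshold mapping from measured grip force to LED state. The sketch below is purely hypothetical: the thresholds and the use of a mixed (orange) state are assumptions, not values reported in the study.

```python
def led_color(grip_force_n, warn_n=4.0, break_n=8.0):
    """Hypothetical grip-force-to-LED mapping (thresholds assumed):
    green = safe grip, orange (both diodes lit) = approaching the
    object's breaking force, red = breakage imminent."""
    if grip_force_n < warn_n:
        return "green"
    if grip_force_n < break_n:
        return "orange"
    return "red"
```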


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Shinya Takamuku ◽  
Hiroaki Gomi

Abstract Estimating forces acting between our hand and objects is essential for dexterous motor control. An earlier study suggested that vision contributes to this estimation by demonstrating changes in grip force pattern caused by delayed visual feedback. However, two possible vision-based force estimation processes, one based on hand position and another based on object motion, were both able to explain the effect. Here, to test each process, we examined how visual feedback of the hand and of the object each contribute to grip force control while moving an object (a mass) connected to the grip by a damped spring. Although the force applied to the hand could be estimated from its displacement, we did not find any improvement from hand feedback. In contrast, we found that visual feedback of object motion significantly improved the synchrony between grip and load forces. Furthermore, when both feedback sources were provided, the improvement was observed only when participants were instructed to direct their attention to the object. Our results suggest that visual feedback of object motion contributes to estimation of dynamic forces involved in our actions by means of inverse dynamics computation, i.e., the estimation of force from motion, and that visual attention directed towards the object facilitates this effect.


2021 ◽  
pp. 1-63
Author(s):  
Jin Lixing ◽  
Duan Xingguang ◽  
Li Changsheng ◽  
Shi Qingxin ◽  
Wen Hao ◽  
...  

Abstract This paper presents a novel parallel architecture with seven active degrees of freedom (DOFs) for general-purpose haptic devices. The prime features of the proposed mechanism are partial decoupling, a large dexterous working area, and fixed actuators. The detailed processes of design, modeling, and optimization are introduced and the performance is simulated. After that, a mechanical prototype is fabricated and tested. Results of the simulations and experiments reveal that the proposed mechanism performs excellently in terms of motion flexibility and force feedback. This paper aims to provide a compelling general-purpose haptic device solution for teleoperation systems facing uncertain missions in complex applications.


2006 ◽  
Vol 95 (2) ◽  
pp. 922-931 ◽  
Author(s):  
David E. Vaillancourt ◽  
Mary A. Mayka ◽  
Daniel M. Corcos

The cerebellum, parietal cortex, and premotor cortex are integral to visuomotor processing. The parameters of visual information that modulate their role in visuomotor control are less clear. From motor psychophysics, the relation between the frequency of visual feedback and force variability has been identified as nonlinear. Thus we hypothesized that visual feedback frequency will differentially modulate the neural activation in the cerebellum, parietal cortex, and premotor cortex related to visuomotor processing. We used functional magnetic resonance imaging at 3 Tesla to examine visually guided grip force control under frequent and infrequent visual feedback conditions. Control conditions with intermittent visual feedback alone and a control force condition without visual feedback were examined. As expected, force variability was reduced in the frequent compared with the infrequent condition. Three novel findings were identified. First, infrequent (0.4 Hz) visual feedback did not result in visuomotor activation in lateral cerebellum (lobule VI/Crus I), whereas frequent (25 Hz) intermittent visual feedback did. This is in contrast to the anterior intermediate cerebellum (lobule V/VI), which was consistently active across all force conditions compared with rest. Second, confirming previous observations, the parietal and premotor cortices were active during grip force with frequent visual feedback. The novel finding was that the parietal and premotor cortex were also active during grip force with infrequent visual feedback. Third, right inferior parietal lobule, dorsal premotor cortex, and ventral premotor cortex had greater activation in the frequent compared with the infrequent grip force condition. These findings demonstrate that the frequency of visual information reduces motor error and differentially modulates the neural activation related to visuomotor processing in the cerebellum, parietal cortex, and premotor cortex.


2019 ◽  
Vol 121 (4) ◽  
pp. 1398-1409 ◽  
Author(s):  
Vonne van Polanen ◽  
Robert Tibold ◽  
Atsuo Nuruki ◽  
Marco Davare

Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arise that need to be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying multisensory integration serially at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements when they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics and further highlight the organization of multisensory signals online for controlling action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at different key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and find out their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
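A standard way to formalize such relative weighting is minimum-variance cue combination, in which each modality is weighted by its reliability (inverse variance). The sketch below shows this textbook model for illustration; it is not necessarily the exact weighting model fitted in the study.

```python
def combine_cues(x_haptic, var_haptic, x_vision, var_vision):
    """Minimum-variance cue combination sketch: weight each cue by its
    inverse variance; the combined estimate is more reliable (lower
    variance) than either cue alone."""
    w_h = (1.0 / var_haptic) / (1.0 / var_haptic + 1.0 / var_vision)
    estimate = w_h * x_haptic + (1.0 - w_h) * x_vision
    combined_var = 1.0 / (1.0 / var_haptic + 1.0 / var_vision)
    return estimate, w_h, combined_var
```

Under this model, delaying one modality effectively changes which signal dominates the online estimate at each sensorimotor event.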


2005 ◽  
Vol 14 (6) ◽  
pp. 677-696 ◽  
Author(s):  
Christoph W. Borst ◽  
Richard A. Volz

We present a haptic feedback technique that combines feedback from a portable force-feedback glove with feedback from direct contact with rigid passive objects. This approach is a haptic analogue of visual mixed reality, since it can be used to haptically combine real and virtual elements in a single display. We discuss device limitations that motivated this combined approach and summarize technological challenges encountered. We present three experiments to evaluate the approach for interactions with buttons and sliders on a virtual control panel. In our first experiment, this approach resulted in better task performance and better subjective ratings than the use of only a force-feedback glove. In our second experiment, visual feedback was degraded and the combined approach resulted in better performance than the glove-only approach and in better ratings of slider interactions than both glove-only and passive-only approaches. A third experiment allowed subjective comparison of approaches and provided additional evidence that the combined approach provides the best experience.


Author(s):  
Andrew Erwin ◽  
Fabrizio Sergi ◽  
Vinay Chawda ◽  
Marcia K. O’Malley

This paper investigates the possibility of implementing force-feedback controllers using measurements of interaction force obtained through force-sensing resistors (FSRs), to improve the performance of robots that physically interact with humans. A custom sensorized handle was developed, with the capability of simultaneously measuring grip force and interaction force during robot-aided rehabilitation therapy. Experiments were performed to assess the suitability of FSRs for implementing force-feedback interaction controllers. In the force-feedback control condition, the force applied during constant-speed motion of a linear 1-DOF haptic interface is reduced by a factor of 6.1 compared with the uncontrolled condition, demonstrating the possibility of improving transparency through force feedback via FSRs.
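The transparency improvement can be illustrated with the idealized behavior of a proportional force-feedback loop (a simplification offered here for intuition, not the paper's controller): if the actuator pushes against the measured interaction force with gain kf, the force the user must supply shrinks by a factor of (1 + kf), so a hypothetical gain of kf = 5.1 would reproduce a ~6.1x reduction.

```python
def felt_force(f_uncontrolled, kf):
    """Idealized transparency sketch: with proportional force feedback
    of gain kf cancelling the measured interaction force, the user's
    required force is f_uncontrolled / (1 + kf). Gain value is an
    assumption for illustration."""
    return f_uncontrolled / (1.0 + kf)
```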


1999 ◽  
Vol 125 (3) ◽  
pp. 281-286 ◽  
Author(s):  
J. D. Connolly ◽  
M. A. Goodale
