Adaptation of hand movements to double-step targets and to distorted visual feedback: Evidence for shared mechanisms

2012, Vol. 31(4), pp. 791-800
Authors: Gerd Schmitz, Otmar Bock, Valentina Grigorova, Steliana Borisova
2011, Vol. 105(2), pp. 846-859
Authors: Lore Thaler, Melvyn A. Goodale

Studies that have investigated how sensory feedback about the moving hand is used to control hand movements have relied on paradigms such as pointing or reaching that require subjects to acquire target locations. In the context of these target-directed tasks, it has been found repeatedly that the human sensory-motor system relies heavily on visual feedback to control the ongoing movement. This finding has been formalized within the framework of statistical optimality, according to which different sources of sensory feedback are combined so as to minimize variance in sensory information during movement control. Importantly, however, many hand movements that people perform every day are not target-directed but based on allocentric (object-centered) visual information; examples of allocentric movements are gesture imitation, drawing, or copying. Here we tested whether visual feedback about the moving hand is used in the same way to control target-directed and allocentric hand movements. The results show that visual feedback is used significantly more to reduce movement scatter in the target-directed than in the allocentric movement task. Furthermore, we found that differences in the use of visual feedback between target-directed and allocentric hand movements cannot be explained by differences in uncertainty about the movement goal. We conclude that the role played by visual feedback in movement control is fundamentally different for target-directed and allocentric movements. As a consequence, the results cast doubt on the idea that models of sensorimotor control developed exclusively from data obtained in target-directed paradigms are also valid in the context of allocentric tasks, such as drawing, copying, or imitative gesturing, that characterize much of human behavior.
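The minimum-variance combination that this framework refers to is usually expressed as inverse-variance weighting of the cues. A minimal sketch, with made-up variances for illustration (not data from the study):

```python
def combine(est_vision, var_vision, est_proprio, var_proprio):
    """Minimum-variance (inverse-variance weighted) combination of a
    visual and a proprioceptive estimate of hand position."""
    w_v = var_proprio / (var_vision + var_proprio)  # weight on vision
    est = w_v * est_vision + (1 - w_v) * est_proprio
    # Combined variance is lower than either cue's variance alone.
    var = (var_vision * var_proprio) / (var_vision + var_proprio)
    return est, var

# Vision more reliable (variance 1.0 vs. 4.0): the combined estimate
# sits closer to the visual estimate, and its variance shrinks.
est, var = combine(10.0, 1.0, 14.0, 4.0)
print(round(est, 2), round(var, 2))
```

With these numbers vision gets a weight of 0.8, which captures the abstract's point that the system leans heavily on the more reliable (typically visual) cue.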


2013, Vol. 109(11), pp. 2680-2690
Authors: Sandra Sülzenbrück, Herbert Heuer

Extending the body with a tool could imply that characteristics of hand movements become characteristics of the movement of the effective part of the tool. Recent research suggests that such distal shifts are subject to boundary conditions. Here we propose the existence of three constraints: a strategy constraint, a constraint of movement characteristics, and a constraint of mode of control. We investigate their validity for the curvature of transverse movements aimed at a target while using a sliding first-order lever. Participants moved the tip of the effort arm of a real or virtual lever to control a cursor representing movements of the tip of the load arm of the lever on a monitor. With this tool, straight transverse hand movements are associated with concave curvature of the path of the tip of the tool. With terminal visual feedback and when targets were presented for the hand, hand paths were slightly concave in the absence of the dynamic transformation of the tool and slightly convex in its presence. When targets were presented for the tip of the lever, both the concave and convex curvatures of the hand paths became stronger. Finally, with continuous visual feedback of the tip of the lever, curvature of hand paths became convex and concave curvature of the paths of the tip of the lever was reduced. In addition, the effect of the dynamic transformation on curvature was attenuated. These findings support the notion that distal shifts are subject to at least the three proposed constraints.
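The geometric reason a straight hand path maps to a curved tool-tip path can be sketched with a simplified model of a lever sliding through a fixed fulcrum: as the hand moves, the effort-arm length changes, so the load arm lengthens or shortens in the opposite direction. The fulcrum position and lever length below are assumptions for illustration, not the parameters of the actual apparatus.

```python
import numpy as np

FULCRUM = np.array([0.0, 0.3])   # assumed fulcrum position (m)
LEVER_LEN = 0.6                  # assumed total lever length (m)

def tool_tip(hand):
    """Load-arm tip position for a given hand (effort-arm tip) position:
    the lever passes through the fulcrum, so the tip lies on the line
    from hand through fulcrum, at the remaining lever length."""
    v = FULCRUM - hand
    d = np.linalg.norm(v)        # current effort-arm length
    return FULCRUM + v / d * (LEVER_LEN - d)

# A straight transverse hand movement...
xs = np.linspace(-0.15, 0.15, 7)
tip_path = [tool_tip(np.array([x, 0.0])) for x in xs]

# ...produces a curved tip path: the midpoint bulges away from the
# chord joining the endpoints.
ys = [p[1] for p in tip_path]
print(round(ys[3] - (ys[0] + ys[-1]) / 2, 3))
```

The nonzero bulge shows that, under this toy model, a perfectly straight hand path cannot produce a straight tool-tip path, which is the mapping the participants had to compensate for.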


2016, Vol. 12(4)
Authors: Łukasz Kwaśniewicz, Wiesława Kuniszyk-Jóźkowiak, Grzegorz M. Wójcik, Jolanta Masiak

Abstract: The paper describes an application that allows a humanoid robot to be used as an assistant and therapist for people who stutter. Auditory and visual feedback have been used in the therapy with a humanoid robot. For this purpose, the common "echo" method was modified: the speaker hears delayed speech sounds uttered by the robot. The sounds of speech coming from an external microphone are captured and delayed by a computer and then, using the User Datagram Protocol (UDP), sent to the robot's system and played through its speakers. This setup eliminates negative acoustic feedback and noise from the external sound field. The effect of the therapy is enhanced by the fact that, in addition to the delayed-feedback effect itself, the speaker has company during the difficult process of speaking. Visual feedback is realized as changes in the robot's hand movements that follow the shape of the speech-signal envelope, together with the possibility of pacing speech with a metronome effect.
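The capture-delay-forward pipeline described above can be sketched as a small delay buffer feeding a UDP socket. The robot address, port, delay value, and chunking are assumptions; the paper does not specify the packet format.

```python
import socket
from collections import deque

class DelayBuffer:
    """Holds captured audio chunks and releases each one only after a
    fixed delay has elapsed, producing the modified 'echo' effect."""

    def __init__(self, delay_s):
        self.delay_s = delay_s
        self._queue = deque()  # (capture_time, chunk) pairs

    def push(self, chunk, now):
        self._queue.append((now, chunk))

    def pop_ready(self, now):
        """Return all chunks whose delay has fully elapsed, in order."""
        out = []
        while self._queue and now - self._queue[0][0] >= self.delay_s:
            out.append(self._queue.popleft()[1])
        return out

# Hypothetical wiring: released chunks are sent to the robot's audio
# system over UDP (address and port are illustrative assumptions).
ROBOT_ADDR = ("192.168.1.10", 9000)

def forward_ready(buffer, now, sock):
    for chunk in buffer.pop_ready(now):
        sock.sendto(chunk, ROBOT_ADDR)
```

UDP fits this use case because a lost packet only drops a few milliseconds of audio, whereas TCP retransmission would add jitter to the carefully controlled delay.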


1987, Vol. 65(2), pp. 181-191
Authors: Howard N. Zelaznik, Brian Hawkins, Lorraine Kisselburgh

2009, Vol. 101(2), pp. 614-623
Authors: Teser Wong, Denise Y. P. Henriques

Motor control relies on multiple sources of information. To estimate the position and motion of the hand, the brain uses both vision and body-position senses (proprioception and kinesthesia) from sensors in the muscles, tendons, joints, and skin. Although performance is better when more than one sensory modality is present, visuomotor adaptation suggests that people rely much more on visual information about the hand to guide their arm movements to targets, even when visual and kinesthetic information about the hand's motion are in conflict. The aim of this study was to test whether adapting hand movements in response to false visual feedback of the hand results in a change, or recalibration, of the kinesthetic sense of hand motion. Such cross-sensory recalibration would ensure on-line consistency between the senses. To test this, we mapped participants' sensitivity to tilted and curved hand paths and then examined whether adapting their hand movements in response to false visual feedback affected their felt sense of hand path. We found that participants could accurately estimate hand-path direction and curvature after adapting to false visual feedback of their hand when reaching to targets. Our results suggest that although vision can override kinesthesia to recalibrate arm motor commands, it does not recalibrate the kinesthetic sense of hand-path geometry.
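The "false visual feedback" used in such adaptation studies is commonly implemented as a visuomotor rotation, where the cursor shows the hand position rotated about the movement start location. A minimal sketch (function name and angle are illustrative, not the study's parameters):

```python
import math

def rotated_cursor(hand_x, hand_y, rotation_deg):
    """Cursor position shown to the participant: the true hand
    position rotated about the start location, so visual and
    kinesthetic information about movement direction conflict."""
    r = math.radians(rotation_deg)
    return (hand_x * math.cos(r) - hand_y * math.sin(r),
            hand_x * math.sin(r) + hand_y * math.cos(r))

# A straight-ahead 10 cm movement displayed with a 30-degree rotation:
cursor = rotated_cursor(0.0, 10.0, 30.0)
```

To bring the cursor back onto the target, participants must learn to aim their hand in the opposite direction, which is the adaptation whose kinesthetic consequences the study probes.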


2008, Vol. 19(1-2), pp. 53-57
Authors: C. Farrer, M. Bouchereau, M. Jeannerod, N. Franck

It has been hypothesized that an internal model is involved in controlling and recognizing one's own actions (action attribution). Attribution results from a comparison between the predicted sensory feedback of the action and its real sensory consequences. The aim of the present study was to distinguish the respective importance of two action parameters (time and direction) in such attribution judgments. We used a device that allows discordance to be introduced between the movements actually performed and the sensory feedback displayed on a computer screen. Participants were asked to judge whether they were viewing (1) their own movements, (2) their own movements modified (spatially or temporally displaced), or (3) those of another agent (i.e., the experimenter). In fact, in all conditions they were only shown their own movements, either unaltered or modified by varying amounts in space or time. Movements were only attributed to another agent when there was a high spatial discordance between participants' hand movements and the sensory feedback. This study is the first to show that the direction of movements is a cardinal feature in action attribution, whereas the temporal properties of movements play a less important role.
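The comparator logic implied by the internal-model account can be caricatured in a few lines. The thresholds, and the asymmetry that lets only spatial discordance trigger other-agent attribution, are illustrative stand-ins for the reported pattern, not measured values from the study.

```python
def attribute_movement(angular_bias_deg, temporal_delay_ms,
                       spatial_thresh_deg=15.0, temporal_thresh_ms=150.0):
    """Toy comparator consistent with the reported pattern: only a
    large spatial discordance between predicted and observed feedback
    drives attribution to another agent; temporal delay alone merely
    makes the movement look like one's own, modified.
    Both thresholds are arbitrary illustrative values."""
    if angular_bias_deg > spatial_thresh_deg:
        return "another agent"
    if angular_bias_deg > 0 or temporal_delay_ms > temporal_thresh_ms:
        return "own movement, modified"
    return "own movement"
```

For example, a large angular bias yields "another agent", while even a long pure delay does not, mirroring the study's asymmetry between direction and time.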


2021
Authors: Anouk J. de Brouwer, Miriam Spering

Abstract: To maintain accurate movements, the motor system needs to deal with errors that can occur due to inherent noise, changes in the body, or disturbances in the environment. Here, we investigated the temporal coordination of rapid corrections of the eye and hand in response to a change in visual target location during the movement. In addition to a 'classic' double-step task in which the target stepped to a new position, participants performed a set of modified double-step tasks in which the change in movement goal was indicated by the appearance of an additional target or by a spatial or symbolic cue. We found that both the absolute correction latencies of the eye and hand and the relative eye-hand correction latencies depended on the visual characteristics of the target change, with increasingly longer latencies in tasks that required more visual and cognitive processing. Typically, the hand started correcting slightly earlier than the eye, especially when the target change was indicated by a symbolic cue and in conditions where visual feedback of the hand position was provided during the reach. Our results indicate that the oculomotor and limb-motor systems can be differentially influenced by the processing requirements of the task and emphasize that temporal eye-hand coordination is flexible rather than rigid.

