Postural effects on arm movement variability are idiosyncratic and feedback-dependent

2020
Author(s): Preyaporn Phataraphruk, Qasim Rahman, Kishor Lakshminarayanan, Mitchell Fruchtman, Christopher A. Buneo

Reaching movements are subject to noise arising during the sensing, planning and execution phases of movement production, which contributes to movement variability. When vision of the moving hand is available, reaching variability appears to be strongly influenced by noise occurring during the specification and/or online updating of movement plans in visual coordinates. In contrast, when vision of the hand is unavailable, variability appears more dependent upon hand movement direction, suggesting a greater influence of execution noise. Given that execution noise acts in part at the muscular level, we hypothesized that reaching variability should depend not only on movement direction but initial arm posture as well. Moreover, given that the effects of execution noise are more apparent when movements are performed without vision of the hand, we reasoned that postural effects would be more evident when visual feedback was withheld. To test these hypotheses, subjects planned memory-guided reaching movements to three frontal plane targets, using either an “adducted” or “abducted” initial arm posture. Movements were then executed with and without hand vision. We found that the effects of initial arm posture on movement variability were idiosyncratic in both visual feedback conditions. In addition, without visual feedback, posture-dependent differences in variability varied with movement extent, growing abruptly larger in magnitude during the terminal phases of movement, and were moderately correlated with differences in mean endpoint positions. The results emphasize the role of factors other than noise (i.e. biomechanics and suboptimal sensorimotor integration) in constraining patterns of movement variability in 3D space.
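Endpoint variability of the kind described above is commonly summarized by the covariance of reach endpoints and its principal axes. A minimal sketch of that computation on synthetic data (the function name and the numbers are illustrative, not from the study):

```python
import numpy as np

def variability_ellipsoid(endpoints):
    """Principal axes of 3D reach-endpoint variability.

    endpoints: (n, 3) array of x/y/z endpoint positions for one
    target/condition. Returns the variances along the principal
    axes (descending) and the corresponding unit axis vectors.
    """
    cov = np.cov(endpoints, rowvar=False)      # 3x3 covariance matrix
    evals, evecs = np.linalg.eigh(cov)         # eigenvalues, ascending
    order = np.argsort(evals)[::-1]            # reorder: largest first
    return evals[order], evecs[:, order]

# Synthetic endpoints with the largest spread along the depth (z) axis
rng = np.random.default_rng(0)
pts = rng.normal(0.0, [2.0, 3.0, 8.0], size=(500, 3))
var, axes = variability_ellipsoid(pts)
print(var)          # variances in descending order (depth dominates)
print(axes[:, 0])   # principal axis, roughly +/- [0, 0, 1]
```

The ratio between the leading and trailing variances is one simple anisotropy measure; isotropic scatter would give near-equal values.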

1995, Vol 73 (1), pp. 347-360
Author(s): J. Gordon, M. F. Ghilardi, C. Ghez

1. This paper introduces a series of studies in which we analyze the impairments in a planar reaching task in human patients with severe proprioceptive deficits resulting from large-fiber sensory neuropathy. We studied three patients, all of whom showed absence of discriminative tactile sensation, position sense, and stretch reflexes in the upper extremities. Muscle strength was normal. We compared the reaching movements of the patients with those of normal control subjects. The purpose of this first paper was to characterize the spatial errors in these patients that result primarily from impairments in the planning and execution of movement rather than in feedback control. This was done by using a task in which visual feedback of errors during movement was prevented. 2. Subjects were instructed to move their hand from given starting positions to different targets on a horizontal digitizing tablet. Hand position and targets were displayed on a computer screen. Subjects could not see their hand, and the screen display of hand position was blanked at the signal to move. Thus visual feedback during movement could not be used to achieve accuracy. Movement paths were displayed as knowledge of results after each trial. 3. Compared with controls, the patients made large spatial errors in both movement direction and extent. Directional errors were evident from movement onset, suggesting that they resulted from improper planning. In addition, patients' hand paths showed large curves and secondary movements after initial stops. 4. The overall control strategy used by patients appeared the same as that used by controls. Hand trajectories were approximately bell shaped, and movement extent was controlled by scaling a trajectory waveform in amplitude and time. However, both control subjects and patients showed systematic errors in movement extent that depended on the direction of hand movement. In control subjects, these systematic dependencies of extent on direction were small, but in patients they produced large and prominent errors. Analysis of the hand trajectories revealed that errors were associated with differences in velocity and acceleration for movements in different directions. In an earlier study, we showed that in subjects with normal sensation the dependence of acceleration and velocity on direction results from a failure to take the inertial properties of the limb into account in programming the initial trajectory. In control subjects, these differences in initial acceleration are partially compensated by direction-dependent variations in movement time. (ABSTRACT TRUNCATED AT 400 WORDS)


2011, Vol 105 (3), pp. 999-1010
Author(s): Natalia Dounskaia, Jacob A. Goble, Wanyue Wang

The role of extrinsic and intrinsic factors in control of arm movement direction remains under debate. We addressed this question by investigating preferences in the selection of movement direction and whether the factors causing these preferences are extrinsic or intrinsic in nature. An unconstrained free-stroke drawing task was used during which participants produced straight strokes on a horizontal table, choosing the direction and the beginning and end of each stroke arbitrarily. The variation of initial arm postures across strokes made it possible to distinguish between extrinsic and intrinsic origins of directional biases. Although participants were encouraged to produce strokes equally in all directions, each participant demonstrated preferences for some directions over others. However, the preferred directions were not consistent across participants, suggesting no directional preferences in extrinsic space. Consistent biases toward certain directions were revealed in intrinsic space representing initial arm postures. Factors contributing to the revealed preferences were analyzed within the optimal control framework. The major bias was explained by a tendency, predicted by the leading joint hypothesis (LJH), to minimize active interference with the interaction torque generated at the elbow by shoulder motion. Some minor biases may represent movements of minimal inertial resistance or maximal kinematic manipulability. These results support a crucial role of intrinsic factors in control of the movement direction of the arm. Based on the LJH interpretation of the major bias, we hypothesize that the dominant tendency was to minimize neural effort for control of arm intersegmental dynamics. Possible organization of neural processes underlying optimal selection of movement direction is discussed.
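A directional-preference analysis of free strokes can be sketched as a histogram of stroke directions plus a resultant-length statistic for overall bias. The bin count and synthetic strokes below are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def direction_preferences(starts, ends, n_bins=16):
    """Histogram of stroke directions (illustrative analysis).

    starts, ends: (n, 2) arrays of stroke start/end points on the
    tablet. Returns bin counts over [0, 2*pi) and the resultant
    length r in [0, 1]: r near 0 means no overall directional bias,
    r near 1 a strong one.
    """
    d = ends - starts
    ang = np.arctan2(d[:, 1], d[:, 0]) % (2 * np.pi)
    counts, _ = np.histogram(ang, bins=n_bins, range=(0, 2 * np.pi))
    r = np.abs(np.mean(np.exp(1j * ang)))      # circular resultant
    return counts, r

# Synthetic strokes biased toward roughly 45 degrees
rng = np.random.default_rng(1)
ang = rng.normal(np.pi / 4, 0.3, size=200)
starts = np.zeros((200, 2))
ends = np.stack([np.cos(ang), np.sin(ang)], axis=1)
counts, r = direction_preferences(starts, ends)
print(counts.argmax(), r)   # peak bin near 45 degrees, r close to 1
```

Comparing such histograms in extrinsic coordinates versus coordinates anchored to the initial arm posture is one way to separate the two kinds of bias the abstract contrasts.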


2010, Vol 103 (6), pp. 3153-3166
Author(s): Vicente Reyes-Puerta, Roland Philipp, Werner Lindner, Klaus-Peter Hoffmann

When reaching for an object, primates usually look at their target before touching it with the hand. This gaze movement prior to the arm movement allows target fixation, which is usually prolonged until the target is reached. In this manner, a stable image of the object is provided on the fovea during the reach, which is crucial for guiding the final part of the hand trajectory by visual feedback. Here we investigated a neural substrate possibly responsible for this behavior. In particular we tested the influence of reaching movements on neurons recorded at the rostral pole of the superior colliculus (rSC), an area classically related to fixation. Most rSC neurons showed a significant increase in their activity during reaching. Moreover, this increase was particularly high when the reaching movements were preceded by corresponding saccades to the targets to be reached, probably revealing a stronger coupling of the oculo-manual neural system during such a natural task. However, none of the parameters tested—including movement kinematics and target location—was found to be closely related to the observed increase in neural activity. Thus the increase in activity during reaching was found to be rather nonspecific except for its dependence on whether the reach was produced in isolation or in combination with a gaze movement. These results identify the rSC as a neural substrate sufficient for gaze anchoring during natural reaching movements, placing its activity at the core of the neural system dedicated to eye-hand coordination.


2010, Vol 104 (5), pp. 2654-2666
Author(s): Gregory A. Apker, Timothy K. Darling, Christopher A. Buneo

Reaching movements are subject to noise in both the planning and execution phases of movement production. The interaction of these noise sources during natural movements is not well understood, despite its importance for understanding movement variability in neurologically intact and impaired individuals. Here we examined the interaction of planning and execution related noise during the production of unconstrained reaching movements. Subjects performed sequences of two movements to targets arranged in three vertical planes separated in depth. The starting position for each sequence was also varied in depth with the target plane; thus required movement sequences were largely contained within the vertical plane of the targets. Each final target in a sequence was approached from two different directions, and these movements were made with or without visual feedback of the moving hand. These combined aspects of the design allowed us to probe the interaction of execution and planning related noise with respect to reach endpoint variability. In agreement with previous studies, we found that reach endpoint distributions were highly anisotropic. The principal axes of movement variability were largely aligned with the depth axis, i.e., the axis along which visual planning related noise would be expected to dominate, and were not generally well aligned with the direction of the movement vector. Our results suggest that visual planning–related noise plays a dominant role in determining anisotropic patterns of endpoint variability in three-dimensional space, with execution noise adding to this variability in a movement direction-dependent manner.


2005, Vol 93 (6), pp. 3200-3213
Author(s): Robert A. Scheidt, Michael A. Conditt, Emanuele L. Secco, Ferdinando A. Mussa-Ivaldi

People tend to make straight and smooth hand movements when reaching for an object. These trajectory features are resistant to perturbation, and both proprioceptive as well as visual feedback may guide the adaptive updating of motor commands enforcing this regularity. How is information from the two senses combined to generate a coherent internal representation of how the arm moves? Here we show that eliminating visual feedback of hand-path deviations from the straight-line reach (constraining visual feedback of motion within a virtual, “visual channel”) prevents compensation of initial direction errors induced by perturbations. Because adaptive reduction in direction errors occurred with proprioception alone, proprioceptive and visual information are not combined in this reaching task using a fixed, linear weighting scheme as reported for static tasks not requiring arm motion. A computer model can explain these findings, assuming that proprioceptive estimates of initial limb posture are used to select motor commands for a desired reach and visual feedback of hand-path errors brings proprioceptive estimates into registration with a visuocentric representation of limb position relative to its target. Simulations demonstrate that initial configuration estimation errors lead to movement direction errors as observed experimentally. Registration improves movement accuracy when veridical visual feedback is provided but is not invoked when hand-path errors are eliminated. However, the visual channel did not exclude adjustment of terminal movement features maximizing hand-path smoothness. Thus visual and proprioceptive feedback may be combined in fundamentally different ways during trajectory control and final position regulation of reaching movements.


Author(s): Xudong Zhang, Don B. Chaffin

This paper presents a new method to empirically investigate the effects of task factors on three-dimensional (3D) dynamic postures during seated reaching movements. The method relies on a statistical model in which the effects of hand location and those of various task factors on dynamic postures are distinguished. Two statistical procedures are incorporated: a regression description of the relationship between the time-varying hand location and postural profiles to compress the movement data, and a series of analyses of variance to test the hypothesized task effects using instantaneous postures at prescribed hand locations as dependent measures. The use of this method is illustrated by an experiment that examines two generic task factors: 1) hand movement direction, and 2) motion completion time. The results suggest that hand movement direction is a significant task factor and should be included as an important attribute when describing or modeling instantaneous postures. It was also found that the time to complete a motion under a self-paced mode was significantly different from that under a motivated mode, but the time difference did not significantly affect instantaneous postures. The concept of an instantaneous posture and its use in dynamic studies of movements are discussed. Some understanding of human postural control, as well as the implications for developing a general dynamic posture prediction model, is also presented.
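The regression-compression step can be sketched as fitting a low-order polynomial to a joint-angle profile over hand-path progress and evaluating it at a prescribed location, yielding one "instantaneous posture" per trial for the subsequent ANOVA. The polynomial order, variable names, and synthetic profiles below are assumptions for illustration:

```python
import numpy as np

def instantaneous_posture(path_progress, joint_angle, at=0.5, deg=3):
    """Posture at a prescribed point along the hand path (sketch).

    path_progress: fraction of hand-path length (0..1) per sample.
    joint_angle:   the corresponding joint angle per sample (degrees).
    A low-order polynomial compresses the profile; evaluating it at a
    fixed 'at' gives an instantaneous posture comparable across trials,
    e.g. as a dependent measure in an analysis of variance.
    """
    coeffs = np.polyfit(path_progress, joint_angle, deg)
    return np.polyval(coeffs, at)

# Two synthetic trials with elbow profiles differing by a constant offset
t = np.linspace(0.0, 1.0, 50)
trial_a = 30 + 40 * t**2          # degrees
trial_b = 30 + 40 * t**2 + 3.0    # offset "condition" effect
pa = instantaneous_posture(t, trial_a)
pb = instantaneous_posture(t, trial_b)
print(pb - pa)   # the 3-degree condition effect recovered at mid-path
```

Because the comparison is made at a fixed hand location, differences between conditions reflect postural redundancy rather than differences in where the hand was.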


2012, Vol 220 (1), pp. 10-15
Author(s): Stefan Ladwig, Christine Sutter, Jochen Müsseler

When using a tool, proximal action effects (e.g., the hand movement on a digitizer tablet) and distal action effects (e.g., the cursor movement on a display) often do not correspond to, or are even in conflict with, each other. In the experiments reported here, we examined the role of proximal and distal action effects in a closed-loop task of sensorimotor control. Different gain factors perturbed the relation between hand movements on the digitizer tablet and cursor movements on a display. In one condition the covert hand movement was held constant while the cursor amplitude on the display was shorter, equal, or longer; in the other condition this relation was reversed. When participants were asked to replicate the hand movement without visual feedback, hand amplitudes varied in accordance with the displayed amplitudes. Adding a second transformation (Experiment 1: 90°-rotation of visual feedback, Experiment 2: 180°-rotation of visual feedback) reduced these aftereffects only when the discrepancy between hand movement and displayed movement was obvious. In conclusion, distal action effects assimilated proximal action effects when the proprioceptive/tactile feedback showed a feature overlap with the visual feedback on the display.
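The gain manipulation and the assimilation aftereffect can be expressed as a small weighting model. This is a sketch under assumed names and an assumed linear mixing rule, not the authors' model:

```python
def cursor_position(hand_mm, gain):
    """Distal feedback: cursor amplitude = gain * hand amplitude."""
    return gain * hand_mm

def predicted_replication(hand_mm, gain, assimilation):
    """Hand amplitude replicated without vision (illustrative model).

    assimilation in [0, 1]: 0 = replication driven purely by the
    proximal (proprioceptive) amplitude, 1 = replication fully driven
    by the displayed (distal) amplitude.
    """
    seen = cursor_position(hand_mm, gain)
    return (1 - assimilation) * hand_mm + assimilation * seen

# A 100 mm hand movement displayed at gain 1.5, with the replication
# drawn halfway toward the distal effect:
rep = predicted_replication(100.0, 1.5, 0.5)
print(rep)   # 125.0: replicated amplitude pulled toward the display
```

An assimilation weight that shrinks when the visuo-proprioceptive discrepancy becomes obvious (as with the added rotations) would reproduce the reported reduction in aftereffects.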


2012, Vol 107 (1), pp. 90-102
Author(s): Gregory A. Apker, Christopher A. Buneo

Reaching movements are subject to noise associated with planning and execution, but precisely how these noise sources interact to determine patterns of endpoint variability in three-dimensional space is not well understood. For frontal plane movements, variability is largest along the depth axis (the axis along which visual planning noise is greatest), with execution noise contributing to this variability along the movement direction. Here we tested whether these noise sources interact in a similar way for movements directed in depth. Subjects performed sequences of two movements from a single starting position to targets that were either both contained within a frontal plane (“frontal sequences”) or arranged so that the first target was within the frontal plane and the second was directed in depth (“depth sequences”). For both sequence types, movements were performed with or without visual feedback of the hand. When visual feedback was available, endpoint distributions for frontal and depth sequences were generally anisotropic, with the principal axes of variability being strongly aligned with the depth axis. Without visual feedback, endpoint distributions for frontal sequences were relatively isotropic and movement direction dependent, while those for depth sequences were similar to those with visual feedback. Overall, the results suggest that in the presence of visual feedback, endpoint variability is dominated by uncertainty associated with planning and updating visually guided movements. In addition, the results suggest that without visual feedback, increased uncertainty in hand position estimation effectively unmasks the effect of execution-related noise, resulting in patterns of endpoint variability that are highly movement direction dependent.
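The alignment claims above (principal axis versus depth axis, or versus movement direction) reduce to an angle between the leading eigenvector of the endpoint covariance and a reference axis. A sketch on synthetic scatter (function name and data are illustrative):

```python
import numpy as np

def axis_alignment_deg(endpoints, reference):
    """Angle (degrees) between the principal axis of endpoint scatter
    and a reference axis, e.g. the depth axis or a movement direction.
    """
    cov = np.cov(endpoints, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    principal = evecs[:, np.argmax(evals)]       # leading eigenvector
    ref = np.asarray(reference, float)
    ref = ref / np.linalg.norm(ref)
    cosang = abs(principal @ ref)                # axes have no sign
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Scatter elongated along depth (z): small angle to the depth axis,
# large angle to a frontal-plane movement direction
rng = np.random.default_rng(2)
pts = rng.normal(0.0, [1.0, 1.0, 5.0], size=(400, 3))
print(axis_alignment_deg(pts, [0, 0, 1]))   # near 0 degrees
print(axis_alignment_deg(pts, [1, 0, 0]))   # near 90 degrees
```

Computing this angle separately for the feedback and no-feedback conditions would quantify the shift from depth-aligned to movement-direction-dependent variability described in the abstract.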


2021, Vol 11 (1)
Author(s): Alyssa Unell, Zachary M. Eisenstat, Ainsley Braun, Abhinav Gandhi, Sharon Gilad-Gutnick, ...

Towards the larger goal of understanding factors relevant for improving visuo-motor control, we investigated the role of visual feedback for modulating the effectiveness of a simple hand-eye training protocol. The regimen comprised a series of curve tracing tasks undertaken over a period of one week by neurologically healthy individuals with their non-dominant hands. Our three subject groups differed in the training they experienced: those who received ‘Persistent’ visual-feedback by seeing their hand and trace evolve in real-time superimposed upon the reference patterns, those who received ‘Non-Persistent’ visual-feedback seeing their hand movement but not the emerging trace, and a ‘Control’ group that underwent no training. Improvements in performance were evaluated along two dimensions: accuracy and steadiness, to assess visuo-motor and motor skills, respectively. We found that persistent feedback leads to a significantly greater improvement in accuracy than non-persistent feedback. Steadiness, on the other hand, benefits from training irrespective of the persistence of feedback. Our results not only demonstrate the feasibility of rapid visuo-motor learning in adulthood, but more specifically, the influence of visual veridicality and a critical role for dynamically emergent visual information.
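Accuracy and steadiness measures for a traced curve can be sketched as, respectively, the mean distance from each trace sample to the reference pattern and the variability of frame-to-frame speed. These particular definitions are assumptions for illustration, not the paper's exact metrics:

```python
import numpy as np

def accuracy_and_steadiness(trace, reference):
    """Illustrative metrics for one curve-tracing trial.

    accuracy:   negative mean distance of each trace sample to its
                nearest reference sample (higher = more accurate).
    steadiness: negative std of frame-to-frame speed (higher = steadier).
    """
    trace = np.asarray(trace, float)
    reference = np.asarray(reference, float)
    # Pairwise distances: (n_trace, n_ref)
    d = np.linalg.norm(trace[:, None, :] - reference[None, :, :], axis=2)
    accuracy = -d.min(axis=1).mean()
    speed = np.linalg.norm(np.diff(trace, axis=0), axis=1)
    steadiness = -speed.std()
    return accuracy, steadiness

# A steady trace that sits on the reference circle scores higher on
# both measures than a jittery one
t = np.linspace(0, 2 * np.pi, 200)
ref = np.stack([np.cos(t), np.sin(t)], axis=1)
rng = np.random.default_rng(3)
good = ref.copy()
bad = ref + rng.normal(0, 0.05, ref.shape)
acc_g, ste_g = accuracy_and_steadiness(good, ref)
acc_b, ste_b = accuracy_and_steadiness(bad, ref)
print(acc_g > acc_b, ste_g > ste_b)   # True True
```

Separating the two measures this way mirrors the abstract's point: accuracy indexes visuo-motor mapping, while steadiness indexes motor output quality independent of the visual target.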

