Reach Planning
Recently Published Documents

Total documents: 36 (last five years: 3)
H-index: 13 (last five years: 2)

2021
Author(s): Christopher Lee Striemer, Carley Borza

Damage to the temporal-parietal cortex in the right hemisphere often leads to spatial neglect – a disorder in which patients are unable to attend to sensory input from their contralesional (left) side. Neglect has been associated with both attentional and premotor deficits. That is, in addition to having difficulty with attending to the left side, patients are often slower to initiate leftward vs. rightward movements (i.e., directional hypokinesia). Previous research has indicated that a brief period of adaptation to rightward-shifting prisms can reduce symptoms of neglect by adjusting the patient’s movements leftward, towards the neglected field. Although prism adaptation has been shown to reduce spatial attention deficits in patients with neglect, very little work has examined the effects of prisms on premotor symptoms. In the current study, we examined this question in healthy individuals, using leftward-shifting prisms to induce a rightward shift in the egocentric reference frame, similar to that seen in neglect patients prior to prism adaptation. Specifically, we examined the speed with which healthy participants initiated leftward and rightward reaches (without visual feedback) prior to and following adaptation to either 17° leftward (n=16) or 17° rightward (n=15) shifting prisms. Our results indicated that, following adaptation, participants were significantly faster to initiate reaches towards targets located in the direction opposite the prism shift. That is, participants were faster to initiate reaches to right targets following leftward prism adaptation, and faster to initiate reaches to left targets following rightward prism adaptation. Overall, these results are consistent with the idea that prism adaptation can influence the speed with which a reach can be planned toward a target in the direction opposite the prism shift, possibly by altering activity in neural circuits involved in reach planning.
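The aftereffect logic behind this manipulation can be illustrated with a toy error-driven model: the prisms displace the visual scene by a fixed angle, each trial's visible pointing error is partially corrected, and removing the prisms leaves a pointing bias in the opposite direction. A minimal sketch (the learning rate and trial count are hypothetical assumptions, not the authors' model):

```python
# Toy error-driven model of prism adaptation (illustrative only; the
# learning rate and trial count are hypothetical assumptions).
PRISM_SHIFT_DEG = 17.0   # rightward-shifting prisms displace vision 17 deg right
LEARN_RATE = 0.2         # fraction of each trial's visible error corrected next trial

bias = 0.0               # internal pointing bias (deg); 0 = veridical pointing
for trial in range(30):  # exposure phase: pointing with the prisms on
    seen_error = PRISM_SHIFT_DEG + bias   # landing error visible to the subject
    bias -= LEARN_RATE * seen_error       # adapt pointing opposite to the error

print(f"bias after exposure: {bias:+.1f} deg")  # converges toward -17 (leftward)
# Prisms removed: the residual leftward bias is the aftereffect that pushes
# movements toward the (previously neglected) left side. Signs flip for
# leftward-shifting prisms, as used to mimic neglect in healthy participants.
print(f"aftereffect without prisms: {bias:+.1f} deg")
```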


2019, Vol. 121 (6), pp. 2392-2400
Author(s): Romy S. Bakker, Luc P. J. Selen, W. Pieter Medendorp

In daily life, we frequently reach toward objects while our body is in motion. We have recently shown that body accelerations influence the decision of which hand to use for the reach, possibly by modulating the body-centered computations of the expected reach costs. However, head orientation relative to the body was not manipulated, and hence it remains unclear whether vestibular signals contribute to these cost calculations in their head-based sensory frame or in a transformed, body-centered reference frame. To test this, subjects performed a preferential reaching task to targets at various directions while they were sinusoidally translated along the lateral body axis, with their head either aligned with the body (straight ahead) or rotated 18° to the left. As a measure of hand preference, we determined the target direction that resulted in equiprobable right/left-hand choices. Results show that head orientation affects this balanced target angle when the body is stationary but does not further modulate hand preference when the body is in motion. Furthermore, reaction and movement times were longer for reaches to the balanced target angle, resembling a competitive selection process, and were modulated by head orientation when the body was stationary. During body translation, reaction and movement times depended on the phase of the motion, but this phase-dependent modulation did not interact with head orientation. We conclude that the brain transforms vestibular signals to body-centered coordinates at the early stage of reach planning, when the decision of hand choice is computed.

NEW & NOTEWORTHY: The brain takes inertial acceleration into account in computing the anticipated biomechanical costs that guide hand selection during whole-body motion. Whereas these costs are defined in a body-centered, muscle-based reference frame, the otoliths detect the inertial acceleration in head-centered coordinates. By systematically manipulating head position relative to the body, we show that the brain transforms otolith signals into body-centered coordinates at an early stage of reach planning, i.e., before the decision of hand choice is computed.
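Geometrically, the transformation the authors argue for is a rotation of the head-centered otolith signal by the head-on-body angle. A minimal sketch of that computation (the variable names and 2D simplification are mine; only the 18° head rotation comes from the task):

```python
import numpy as np

def head_to_body(acc_head_xy, head_yaw_deg):
    """Rotate a head-centered acceleration vector into body coordinates.

    acc_head_xy : (x, y) inertial acceleration sensed by the otoliths,
                  expressed in head-centered coordinates.
    head_yaw_deg: head orientation relative to the body (positive = left).
    """
    t = np.deg2rad(head_yaw_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return rot @ np.asarray(acc_head_xy)

# Lateral body translation with the head rotated 18 deg to the left: the
# otoliths sense the acceleration split across the head's x and y axes, but
# rotating by the head-on-body angle recovers the pure lateral component.
HEAD_YAW = 18.0                               # deg, as in the task
acc_body_true = np.array([1.0, 0.0])          # 1 m/s^2 along the lateral body axis
t = np.deg2rad(-HEAD_YAW)                     # body-to-head rotation
acc_head = np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]]) @ acc_body_true
print(head_to_body(acc_head, HEAD_YAW))       # -> [1., 0.] (body frame recovered)
```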


2019, Vol. 64, pp. 214-219
Author(s): Michael F. Barbaro, Daniel R. Kramer, George Nune, Morgan B. Lee, Terrance Peng, et al.

2018, Vol. 120 (4), pp. 1602-1615
Author(s): Anouk J. de Brouwer, Mohammed Albaghdadi, J. Randall Flanagan, Jason P. Gallivan

Successful motor performance relies on our ability to adapt to changes in the environment by learning novel mappings between motor commands and sensory outcomes. Such adaptation is thought to involve two distinct mechanisms: an implicit, error-based component linked to slow learning and an explicit, strategic component linked to fast learning and savings (i.e., faster relearning). Because behavior, at any given moment, is the resultant combination of these two processes, it has remained a challenge to parcellate their relative contributions to performance. The explicit component of visuomotor rotation (VMR) learning has recently been measured by having participants verbally report the aiming strategy they use to counteract the rotation. However, this procedure has been shown to magnify the explicit component. Here we tested whether task-specific eye movements, a natural component of reach planning but poorly studied in motor learning tasks, can provide a direct readout of the state of the explicit component during VMR learning. We show, by placing targets on a visible ring and including a delay between target presentation and reach onset, that individual differences in gaze patterns during sensorimotor learning are linked to participants’ rates of learning and their expression of savings. Specifically, we find that participants who, during reach planning, naturally fixate an aimpoint rotated away from the target location show faster initial adaptation and readaptation 24 h later. Our results demonstrate that gaze behavior can not only uniquely identify individuals who implement cognitive strategies during learning but also reveal how their implementation is linked to differences in learning.

NEW & NOTEWORTHY: Although it is increasingly well appreciated that sensorimotor learning is driven by two separate components, an error-based process and a strategic process, it has remained a challenge to identify their relative contributions to performance. Here we demonstrate that task-specific eye movements provide a direct readout of explicit strategies during sensorimotor learning in the presence of visual landmarks. We further show that individual differences in gaze behavior are linked to learning rate and savings.
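One common formalization of this fast/slow decomposition is a two-state model in which a slow (implicit-like) and a fast (explicit-like) process both learn from the same reach error, and only their sum is observable in the hand. The sketch below is illustrative only: the retention and learning rates are hypothetical assumptions, and the abstract itself does not fit such a model.

```python
# Two-state model of adaptation: net compensation is the sum of a slow
# (implicit) and a fast (explicit-like) process. Parameters are hypothetical.
A_SLOW, B_SLOW = 0.99, 0.02   # high retention, slow learning
A_FAST, B_FAST = 0.60, 0.30   # low retention, fast learning
ROTATION = 45.0               # deg, imposed visuomotor rotation

x_slow = x_fast = 0.0
for trial in range(80):
    error = ROTATION - (x_slow + x_fast)   # residual reach error on this trial
    x_slow = A_SLOW * x_slow + B_SLOW * error
    x_fast = A_FAST * x_fast + B_FAST * error

print(f"slow: {x_slow:.1f}, fast: {x_fast:.1f}, total: {x_slow + x_fast:.1f}")
# The measured hand angle reflects only the sum of the two states; gaze
# fixations on a rotated aimpoint offer a separate readout of the fast,
# explicit-like component that the hand alone cannot reveal.
```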


2018
Author(s): David C. Cappadocia, Simona Monaco, Ying Chen, J. Douglas Crawford

Effector-specific cortical mechanisms can be difficult to establish using fMRI, in part because low time resolution might temporally conflate different signals related to target representation, motor planning, and motor execution. Here, we used an event-related fMRI protocol and a cue-separation paradigm to temporally separate these three major sensorimotor stages for saccades vs. reaches. In each trial, subjects (N=12) 1) briefly viewed a target 4-7° left or right of midline fixation on a touchscreen, followed by an 8-second delay (effector-independent target memory phase), 2) were instructed by an auditory cue to perform a reach or a saccade, followed by a second 8-second delay (effector-specific planning phase), and finally 3) were prompted to move by reaching to touch, or making a saccade toward, the remembered target (effector-specific execution phase). Our analysis of saccade and reach activation (vs. a non-spatial control task) revealed modest effector-agnostic target memory activity (left AG, bilateral mIPS) followed by independent, effector-specific parietofrontal sites and time courses during the motor components of the task: more medial activity (pIPS, mIPS, M1, and PMd) during both reach planning and execution, and more lateral activity (mIPS, AG, and FEF) only during saccade execution. These motor activations were bilateral, with a left (contralateral) preference for reaches. A conjunction analysis revealed that left mIPS and right AG, PCu, SPOC, FEF/PMv, and LOTC showed activation for both saccades and reaches. Overall, effector-preference contrasts (reach vs. saccade) revealed significantly more parietofrontal activation for reaches than saccades during both planning and execution, with the exception of FEF. Cross-correlation of reach, saccade, and reach-saccade activation through time revealed correlated activation both within and across effectors in each hemisphere, but with a tendency toward higher correlations in the right hemisphere, especially between the eye and hand. These results demonstrate substantially independent but temporally correlated cortical networks for human eye, hand, and eye-hand control that follow explicit spatiotemporal rules for effector-specific timing, medial-lateral distribution, and hemispheric lateralization.
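As a rough illustration of the final analysis step, correlating activation between two regions reduces to a correlation of their trial-locked BOLD time courses. A toy sketch with synthetic data (the TR, response shapes, and noise level are assumptions, not the study's pipeline; zero-lag correlation is shown for simplicity, whereas a full cross-correlation would also scan temporal offsets):

```python
import numpy as np

# Synthetic region-of-interest BOLD time courses for a single trial.
rng = np.random.default_rng(0)
t = np.arange(0, 24, 2.0)                # one trial sampled at an assumed TR of 2 s
plan = np.exp(-(t - 10) ** 2 / 18)       # idealized planning-phase response
move = np.exp(-(t - 18) ** 2 / 18)       # idealized execution-phase response

reach_roi   = plan + move + 0.1 * rng.standard_normal(t.size)
saccade_roi = 0.3 * plan + move + 0.1 * rng.standard_normal(t.size)

# Zero-lag correlation of the two time courses within a trial.
r = np.corrcoef(reach_roi, saccade_roi)[0, 1]
print(f"reach vs. saccade time-course correlation: r = {r:.2f}")
```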


Neuroscience, 2018, Vol. 385, pp. 47-58
Author(s): Benjamin Dufour, François Thénault, Pierre-Michel Bernier

2017
Author(s): Anouk J. de Brouwer, Mohammed Albaghdadi, J. Randall Flanagan, Jason P. Gallivan

Successful motor performance relies on our ability to adapt to changes in the environment by learning novel mappings between motor commands and sensory outcomes. Such adaptation is thought to involve two distinct mechanisms: an implicit, error-based component linked to slow learning and an explicit, strategic component linked to fast learning and savings (i.e., faster relearning). Because behaviour, at any given moment, is the resultant combination of these two processes, it has remained a challenge to parcellate their relative contributions to performance. The explicit component of visuomotor rotation (VMR) learning has recently been measured by having participants verbally report the aiming strategy they use to counteract the rotation. However, this procedure has been shown to magnify the explicit component. Here we tested whether task-specific eye movements, a natural component of reach planning but poorly studied in motor learning tasks, can provide a direct read-out of the state of the explicit component during VMR learning. We show, by placing targets on a visible ring and including a delay between target presentation and reach onset, that individual differences in gaze patterns during sensorimotor adaptation are linked to participants’ rates of learning and can predict the expression of savings. Specifically, we find that participants who, during reach planning, naturally fixate an aimpoint rotated away from the target location show faster initial adaptation and readaptation 24 hours later. Our results demonstrate that gaze behaviour can not only uniquely identify individuals who implement cognitive strategies during learning, but also reveal how their implementation is linked to differences in learning.


2017
Author(s): Artur Pilacinski, Axel Lindner

Goal-directed movements of the hand are often directed straight at the target, e.g., when swatting a fly; but when drawing or avoiding obstacles, hand trajectories can also become quite complex. Studies on movement planning have largely neglected the latter case and the question of whether the same neural machinery plans both straight, saccade-like hand trajectories and complex ones. Using time-resolved fMRI during delayed-response tasks, we examined planning activity in the human superior parietal lobule (SPL) and dorsal premotor cortex (PMd). We show that the recruitment of the two areas in trajectory planning differs significantly: PMd represented both straight and complex hand trajectories, whereas SPL represented only those that led straight to the target. This implies that complex and computationally demanding reach planning is governed by a frontal pathway, while a parietal route could provide an alternative and faster way to put simple plans into action.


2017, Vol. 117 (6), pp. 2262-2268
Author(s): Hanna Gertz, Dimitris Voudouris, Katja Fiehler

Tactile stimuli on moving limbs are typically attenuated during reach planning and execution. This phenomenon has been related to internal forward models that predict the sensory consequences of a movement. Tactile suppression is considered to occur due to a match between the actual and predicted sensory consequences of a movement, which might free capacities to process novel or task-relevant sensory signals. Here, we examined whether and how tactile suppression depends on the relevance of somatosensory information for reaching. Participants reached with their left or right index finger to the unseen index finger of their other hand (body target) or to an unseen pad on a screen (external target). In the body target condition, somatosensory signals from the static hand were available for localizing the reach target. Vibrotactile stimuli were presented on the moving index finger before or during reaching, or in a separate no-movement baseline block, and participants indicated whether they detected a stimulus. As expected, detection thresholds before or during reaching were higher compared with baseline. Tactile suppression was also stronger for reaches to body targets than to external targets, as reflected by higher detection thresholds and lower precision of detectability. Moreover, detection thresholds were higher when reaching with the left than with the right hand. Our results suggest that tactile suppression is modulated by position signals from the target limb that are required to reach successfully to one's own body. Moreover, limb dominance seems to affect tactile suppression, presumably due to differing uncertainty of feedback signals from the moving limb.

NEW & NOTEWORTHY: Tactile suppression on a moving limb has been suggested to release computational resources for processing other relevant sensory events. In the current study, we show that tactile sensitivity on the moving limb decreases more when reaching to body targets than to external targets. This indicates that tactile perception can be modulated by allocating processing capacities to movement-relevant somatosensory information at the target location. Our results contribute to understanding tactile processing and predictive mechanisms in the brain.
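The two dependent measures, detection threshold and precision of detectability, map naturally onto the location and spread of a psychometric function. A minimal sketch of such a fit with made-up data (not the authors' fitting procedure):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Cumulative-Gaussian psychometric function: mu = detection threshold,
# sigma = spread (smaller sigma = higher precision of detectability).
def psychometric(intensity, mu, sigma):
    return norm.cdf(intensity, loc=mu, scale=sigma)

# Hypothetical data: vibration amplitudes (a.u.) and observed hit rates.
amplitudes = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
p_detect   = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 1.00])

(mu, sigma), _ = curve_fit(psychometric, amplitudes, p_detect, p0=[0.6, 0.2])
print(f"threshold = {mu:.2f}, spread = {sigma:.2f}")
# Tactile suppression would show up as a larger mu (and possibly a larger
# sigma) for stimuli delivered during reaching than in the no-movement
# baseline, and larger still for reaches to body targets.
```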


Author(s): Pierre-Michel Bernier, Kevin Whittingstall, Scott T. Grafton
