Afferent connections of cytoarchitectural area 6M and surrounding cortex in the marmoset: putative homologues of the supplementary and pre-supplementary motor areas

2021 · Author(s): Sophia Bakola, Kathleen J Burman, Sylwia Bednarek, Jonathan M Chan, Natalia Jermakov, ...

Cortical projections to the caudomedial frontal cortex were studied using retrograde tracers in marmosets. We tested the hypothesis that cytoarchitectural area 6M includes homologues of the supplementary and pre-supplementary motor areas (SMA and preSMA) of other primates. We found that, irrespective of the injection sites' location within 6M, over half of the labeled neurons were located in motor and premotor areas. Other connections originated in prefrontal area 8b, ventral anterior and posterior cingulate areas, somatosensory areas (3a and 1-2), and areas on the rostral aspect of the dorsal posterior parietal cortex. Although the overall origin of afferents was similar, rostral 6M injection sites received higher percentages of prefrontal afferents and fewer somatosensory afferents than caudal sites, compatible with a differentiation into preSMA and SMA. Injections rostral to 6M (in area 8b) revealed a very different set of connections, with greater emphasis on prefrontal and posterior cingulate afferents and fewer parietal afferents. The connections of 6M were also quantitatively different from those of M1, dorsal premotor areas, and cingulate motor area 24d. These results show that the cortical motor control circuit is conserved in simian primates, indicating that marmosets can be valuable models for studying movement planning and control.

2022 · Vol 73 (1) · pp. 131-158 · Author(s): Richard A. Andersen, Tyson Aflalo, Luke Bashford, David Bjånes, Spencer Kellis

Traditional brain–machine interfaces decode cortical motor commands to control external devices. These commands are the product of higher-level cognitive processes, occurring across a network of brain areas, that integrate sensory information, plan upcoming motor actions, and monitor ongoing movements. We review cognitive signals recently discovered in the human posterior parietal cortex during neuroprosthetic clinical trials. These signals are consistent with small regions of cortex having a diverse role in cognitive aspects of movement control and body monitoring, including sensorimotor integration, planning, trajectory representation, somatosensation, action semantics, learning, and decision making. These variables are encoded within the same population of cells using structured representations that bind related sensory and motor variables, an architecture termed partially mixed selectivity. Diverse cognitive signals provide complementary information to traditional motor commands to enable more natural and intuitive control of external devices.
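The idea of partially mixed selectivity can be illustrated with a toy encoding model: a single unit whose firing rate depends on a sensory variable, a motor variable, and their interaction (the "binding" term). The sketch below is purely illustrative — synthetic data and invented coefficients, not the authors' analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "partially mixed" unit: firing depends on a sensory variable s,
# a motor variable m, and their interaction. Coefficients (5, 3, 2, 4)
# are invented for illustration.
n = 500
s = rng.integers(0, 2, n)          # e.g., cue side (0 = left, 1 = right)
m = rng.integers(0, 2, n)          # e.g., planned movement side
rate = 5 + 3*s + 2*m + 4*(s*m) + rng.normal(0, 0.5, n)

# Linear regression with an interaction term recovers the mixed tuning:
# the unit carries both variables plus a structured sensory-motor binding.
X = np.column_stack([np.ones(n), s, m, s*m])
coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(coef)  # true generating values: 5, 3, 2, 4
```

A purely "selective" unit would show only one nonzero slope; a fully mixed unit would show arbitrary combinations. The recoverable interaction term is what distinguishes the structured, partially mixed case.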


2003 · Vol 12 (4) · pp. 387-410 · Author(s): Douglas A. Reece

We have developed a movement behavior model for soldier agents who populate a virtual battlefield environment. Although many simulations have modeled human movement behavior, none has comprehensively addressed realistic military movement at both the individual and unit levels. To design an appropriate movement behavior model, we found it necessary to derive all of the movement requirements from the military tasks of interest, define a behavior architecture that encompasses all required movement tasks, select appropriate movement planning and control approaches in light of the requirements, and implement the planning and control algorithms with novel enhancements to achieve satisfactory results. The breadth of requirements in this problem domain makes simple behavior architectures inadequate and prevents any single planning approach from easily accomplishing all tasks. In our behavior architecture, a hierarchy of tasks is distributed over unit leaders and unit members. For movement planning, we use an A* search algorithm on a hybrid search space comprising a two-dimensional regular grid and a topological map; the plan produced is a series of waypoints annotated with posture and speed changes. Individuals control movement with reactive steering behaviors. The result is a system that can realistically plan and execute a variety of unit and individual agent movement tasks on a virtual battlefield.
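The waypoint-planning step can be sketched as a plain A* search over a 2D occupancy grid. This is a minimal illustration of the general technique only; the paper's hybrid grid/topological search space and the posture/speed annotations on waypoints are omitted:

```python
import heapq

def a_star_grid(grid, start, goal):
    """A* search on a 2D occupancy grid (0 = passable, 1 = blocked).

    Returns a shortest series of waypoints from start to goal, or None.
    """
    def h(a, b):
        # Manhattan distance: admissible heuristic on a 4-connected grid
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    best_g = {start: 0}
    open_set = [(h(start, goal), 0, start, [start])]
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path  # the plan: a series of grid waypoints
        if g > best_g.get(node, float("inf")):
            continue  # stale entry, a cheaper route was found already
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(
                        open_set,
                        (ng + h((nr, nc), goal), ng, (nr, nc), path + [(nr, nc)]),
                    )
    return None
```

In the paper's system the resulting waypoint list would additionally carry posture and speed changes, and execution between waypoints is handled by the reactive steering behaviors.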


2018 · Author(s): Bartul Mimica, Benjamin A. Dunn, Tuce Tombaz, V.P.T.N.C. Srikanth Bojja, Jonathan R. Whitlock

In order to meet the physical and behavioural demands of their environments, animals constantly update their body posture, but little is known about the neural signals on which this ability depends. To better understand the role of cortex in coordinating natural pose and movement, we tracked the heads and backs of freely foraging rats in 3D while recording simultaneously from posterior parietal cortex (PPC) and frontal motor cortex (M2), areas critical for spatial movement planning and navigation. Single units in both regions were tuned mainly to postural features of the head, back and neck, and much less so to their movement. Representations of the head and back were organized topographically across PPC and M2, and the tuning peaks of the cells were distributed in an efficient manner, with substantially fewer cells encoding postures that occurred more often. Postural signals in both areas were sufficiently robust to allow reconstruction of ongoing behavior with 90% accuracy. Together, these findings demonstrate that both parietal and frontal motor cortices maintain an efficient, organized representation of 3D posture during unrestrained behavior.
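The kind of behavior reconstruction reported here can be illustrated with a toy nearest-centroid decoder mapping population firing rates to posture classes. All numbers below are synthetic; this is not the authors' decoder or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 posture classes, each driving a distinct mean
# firing-rate pattern across 20 units (illustrative values only).
n_units, n_classes, n_trials = 20, 3, 200
class_means = rng.uniform(2, 10, size=(n_classes, n_units))
labels = rng.integers(0, n_classes, size=n_trials)
rates = class_means[labels] + rng.normal(0, 1.0, size=(n_trials, n_units))

# Nearest-centroid decoder: fit centroids on even trials, test on odd.
train, test = np.arange(0, n_trials, 2), np.arange(1, n_trials, 2)
centroids = np.stack([rates[train][labels[train] == k].mean(axis=0)
                      for k in range(n_classes)])
dists = np.linalg.norm(rates[test][:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")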


2020 · Author(s): Sumner L. Norman, David Maresca, Vasileios N. Christopoulos, Whitney S. Griggs, Charlie Demene, ...

Brain-machine interfaces (BMI) are powerful devices for restoring function to people living with paralysis. Leveraging significant advances in neurorecording technology, computational power, and understanding of the underlying neural signals, BMI have enabled severely paralyzed patients to control external devices, such as computers and robotic limbs. However, high-performance BMI currently require highly invasive recording techniques, and are thus only available to niche populations. Here, we show that a minimally invasive neuroimaging approach based on functional ultrasound (fUS) imaging can be used to detect and decode movement intention signals usable for BMI. We trained non-human primates to perform memory-guided movements while using epidural fUS imaging to record changes in cerebral blood volume from the posterior parietal cortex – a brain area important for spatial perception, multisensory integration, and movement planning. Using hemodynamic signals acquired during movement planning, we classified left-cued vs. right-cued movements, establishing the feasibility of ultrasonic BMI. These results demonstrate the ability of fUS-based neural interfaces to take advantage of the excellent spatiotemporal resolution, sensitivity, and field of view of ultrasound without breaching the dura or physically penetrating brain tissue.
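The left- vs. right-cued classification step can be sketched with a leave-one-out cross-validated mean-difference (matched-filter) decoder on per-trial voxel features. The data and decoder below are an illustrative stand-in, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated fUS features: per-trial cerebral-blood-volume change across
# 50 voxels; left- and right-cued trials differ by a spatial signature.
# Purely synthetic numbers for illustration.
n_voxels, n_trials = 50, 120
signature = rng.normal(0, 1, n_voxels)     # voxel pattern separating conditions
y = rng.integers(0, 2, n_trials)           # 0 = left cue, 1 = right cue
X = rng.normal(0, 1, (n_trials, n_voxels)) + np.outer(2*y - 1, signature)

# Leave-one-out cross-validation: refit the class means without trial i,
# then classify trial i by which side of the midpoint its projection falls.
correct = 0
for i in range(n_trials):
    tr = np.delete(np.arange(n_trials), i)
    mu1 = X[tr][y[tr] == 1].mean(axis=0)
    mu0 = X[tr][y[tr] == 0].mean(axis=0)
    w = mu1 - mu0                          # mean-difference (matched-filter) axis
    b = 0.5 * (mu1 + mu0) @ w              # decision threshold at the midpoint
    correct += int((X[i] @ w > b) == bool(y[i]))
print(f"LOO accuracy: {correct / n_trials:.2f}")
```

Leave-one-out scoring matters here because trial counts in such experiments are small; fitting and testing on the same trials would overstate decodability.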


2015 · Vol 114 (1) · pp. 170-183 · Author(s): Hanna Gertz, Katja Fiehler

Previous research on reach planning in humans has implicated a frontoparietal network, including the precuneus (PCu), a putative human homolog of the monkey parietal reach region (PRR), and the dorsal premotor cortex (PMd). Using a pro-/anti-reach task, electrophysiological studies in monkeys have demonstrated that the movement goal rather than the location of the visual cue is encoded in PRR and PMd. However, if only the effector but not the movement goal is specified (underspecified condition), the PRR and PMd have been shown to represent all potential movement goals. In this functional magnetic resonance imaging study, we investigated whether the human PCu and PMd likewise encode the movement goal, and whether these reach-related areas also engage in situations with underspecified compared with specified movement goals. By using a pro-/anti-reach task, we spatially dissociated the location of the visual cue from the location of the movement goal. In the specified conditions, pro- and anti-reaches activated similar parietal and premotor areas. In the PCu contralateral to the moving arm, we found directionally selective activation fixed to the movement goal. In the underspecified conditions, we observed activation in reach-related areas of the posterior parietal cortex, including PCu. However, the activation was substantially weaker in parietal areas and lacking in PMd. Our results suggest that human PCu encodes the movement goal rather than the location of the visual cue when the movement goal is specified, and that it is engaged even in situations when only the visual cue, but not the movement goal, is defined.


eLife · 2021 · Vol 10 · Author(s): Srinivas Chivukula, Carey Y Zhang, Tyson Aflalo, Matiar Jafari, Kelsie Pejsa, ...

In the human posterior parietal cortex (PPC), single units encode high-dimensional information with partially mixed representations that enable small populations of neurons to encode many variables relevant to movement planning, execution, cognition, and perception. Here, we test whether a PPC neuronal population previously demonstrated to encode visual and motor information is similarly engaged in the somatosensory domain. We recorded neurons within the PPC of a human clinical trial participant during actual touch presentation and during a tactile imagery task. Neurons encoded actual touch at short latency with bilateral receptive fields, organized by body part, and covered all tested regions. The tactile imagery task evoked body part-specific responses that shared a neural substrate with actual touch. Our results are the first neuron-level evidence of touch encoding in human PPC and its cognitive engagement during a tactile imagery task, which may reflect semantic processing, attention, sensory anticipation, or imagined touch.


2006 · Vol 96 (3) · pp. 1358-1369 · Author(s): Gerben Rotman, Nikolaus F. Troje, Roland S. Johansson, J. Randall Flanagan

We previously showed that, when observers watch an actor performing a predictable block-stacking task, the coordination between the observer's gaze and the actor's hand is similar to the coordination between the actor's gaze and hand. Both the observer and the actor direct gaze to forthcoming grasp and block landing sites and shift their gaze to the next grasp or landing site at around the time the hand contacts the block or the block contacts the landing site. Here we compare observers' gaze behavior in a block manipulation task when the observers did and when they did not know, in advance, which of two blocks the actor would pick up first. In both cases, observers managed to fixate the target ahead of the actor's hand and showed proactive gaze behavior. However, these target fixations occurred later, relative to the actor's movement, when observers did not know the target block in advance. In perceptual tests, in which observers watched animations of the actor reaching partway to the target and had to guess which block was the target, we found that the time at which observers were able to correctly do so was very similar to the time at which they would make saccades to the target block. Overall, our results indicate that observers use gaze in a fashion that is appropriate for hand movement planning and control. This in turn suggests that they implement representations of the manual actions required in the task and representations that direct task-specific eye movements.

