Sensorimotor cerebral activation during optokinetic nystagmus

Neurology ◽  
1997 ◽  
Vol 49 (5) ◽  
pp. 1370-1377 ◽  
Author(s):  
Stefan F. Bucher ◽  
Marianne Dieterich ◽  
Klaus C. Seelos ◽  
Thomas Brandt

Self-motion or object motion can elicit optokinetic nystagmus (OKN), which is an integral part of dynamic spatial orientation. We used functional MR imaging during horizontal OKN to study cerebral activation patterns in sensory and ocular motor areas in 10 subjects. We found activation bilaterally in the primary visual cortex, the motion-sensitive areas in the occipitotemporal cortex (the middle temporal and medial superior temporal areas), and in areas known to control several types of saccades such as the precentral and posterior median frontal gyrus, the posterior parietal cortex, and the medial part of the superior frontal gyrus (frontal, parietal, and supplementary eye fields). Additionally, we observed cortical activation in the anterior and posterior parts of the insula and in the prefrontal cortex. Bilateral activation of subcortical structures such as the putamen, globus pallidus, caudate nucleus, and the thalamus traced the efferent pathways of OKN down to the brainstem. Functional MRI during OKN revealed a complex cerebral network of sensorimotor cortical and subcortical activation.

2009 ◽  
Vol 1164 (1) ◽  
pp. 236-238 ◽  
Author(s):  
Barry M. Seemungal ◽  
Vincenzo Rizzo ◽  
Michael A. Gresty ◽  
John C. Rothwell ◽  
Adolfo M. Bronstein

2016 ◽  
Vol 116 (4) ◽  
pp. 1885-1899 ◽  
Author(s):  
Tobias Heed ◽  
Frank T. M. Leone ◽  
Ivan Toni ◽  
W. Pieter Medendorp

It has been proposed that the posterior parietal cortex (PPC) is characterized by an effector-specific organization. However, strikingly similar functional MRI (fMRI) activation patterns have been found in the PPC for hand and foot movements. Because the fMRI signal is related to average neuronal activity, similar activation levels may result either from effector-unspecific neurons or from intermingled subsets of effector-specific neurons within a voxel. We distinguished between these possibilities using fMRI repetition suppression (RS). Participants made delayed, goal-directed eye, hand, and foot movements to visual targets. In each trial, the instructed effector was identical to or different from that of the previous trial. RS effects indicated an attenuation of the fMRI signal in repeat trials. The caudal PPC was active during the delay but did not show RS, suggesting that its planning activity was effector independent. Hand- and foot-specific RS effects were evident in the anterior superior parietal lobule (SPL), extending to the premotor cortex, with limb overlap in the anterior SPL. Connectivity analysis suggested information flow from the caudal PPC to limb-specific anterior SPL regions and from the limb-unspecific anterior SPL to limb-specific motor regions. These results underline that both function and effector specificity should be integrated into a concept of PPC action representation, not only on a regional but also on a fine-grained, subvoxel level.
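The logic of the repetition-suppression argument can be illustrated with a toy simulation (all neuron counts, gains, and the adaptation factor are hypothetical, not the study's parameters): two voxel models produce the same mean activation, but only the one built from intermingled effector-specific subsets shows a repeat-versus-switch signal difference.

```python
import numpy as np

n = 200
ADAPT = 0.6  # hypothetical attenuation for a neuron driven on the previous trial

# Model (a): effector-unspecific neurons; every neuron responds to hand and foot.
gain_a_hand = np.full(n, 0.5)
gain_a_foot = np.full(n, 0.5)

# Model (b): intermingled effector-specific subsets within the same voxel.
gain_b_hand = np.r_[np.ones(n // 2), np.zeros(n // 2)]
gain_b_foot = 1.0 - gain_b_hand

def trial_pair(gain_prev, gain_curr):
    """Summed voxel signal on the current trial, given which neurons fired previously."""
    adapted = gain_prev > 0
    return np.sum(gain_curr * np.where(adapted, ADAPT, 1.0))

# Repeat (hand -> hand) vs. switch (foot -> hand) for each model
rs_a = trial_pair(gain_a_hand, gain_a_hand) - trial_pair(gain_a_foot, gain_a_hand)
rs_b = trial_pair(gain_b_hand, gain_b_hand) - trial_pair(gain_b_foot, gain_b_hand)

print(rs_a)  # 0.0: no RS effect, so mean activation alone cannot tell the models apart
print(rs_b)  # negative: repeat trials are attenuated, the signature of effector-specific subsets
```

Both models yield the same baseline activation (100 units for a hand trial), which is why average fMRI signal alone cannot distinguish them; only the adaptation contrast separates them.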


2019 ◽  
Vol 121 (4) ◽  
pp. 1207-1221 ◽  
Author(s):  
Ryo Sasaki ◽  
Dora E. Angelaki ◽  
Gregory C. DeAngelis

Multiple areas of macaque cortex are involved in visual motion processing, but their relative functional roles remain unclear. The medial superior temporal (MST) area is typically divided into lateral (MSTl) and dorsal (MSTd) subdivisions that are thought to be involved in processing object motion and self-motion, respectively. Whereas MSTd has been studied extensively with regard to processing visual and nonvisual self-motion cues, little is known about self-motion signals in MSTl, especially nonvisual signals. Moreover, little is known about how self-motion and object motion signals interact in MSTl and how this differs from interactions in MSTd. We compared the visual and vestibular heading tuning of neurons in MSTl and MSTd using identical stimuli. Our findings reveal that both visual and vestibular heading signals are weaker in MSTl than in MSTd, suggesting that MSTl is less well suited to participate in self-motion perception than MSTd. We also tested neurons in both areas with a variety of combinations of object motion and self-motion. Our findings reveal that vestibular signals improve the separability of coding of heading and object direction in both areas, albeit more strongly in MSTd due to the greater strength of vestibular signals. Based on a marginalization technique, population decoding reveals that heading and object direction can be more effectively dissociated from MSTd responses than MSTl responses. Our findings help to clarify the respective contributions that MSTl and MSTd make to processing of object motion and self-motion, although our conclusions may be somewhat specific to the multipart moving objects that we employed.

NEW & NOTEWORTHY Retinal image motion reflects contributions from both the observer's self-motion and the movement of objects in the environment. The neural mechanisms by which the brain dissociates self-motion and object motion remain unclear.
This study provides the first systematic examination of how the lateral subdivision of area MST (MSTl) contributes to dissociating object motion and self-motion. We also examine, for the first time, how MSTl neurons represent translational self-motion based on both vestibular and visual cues.
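The marginalization-based decoding idea can be sketched with a toy model (the joint tuning curves, neuron count, and firing-rate scale below are hypothetical, not the study's actual analysis): build a likelihood over the joint (heading, object direction) grid from a population response, then sum out the nuisance variable before decoding each quantity.

```python
import numpy as np

rng = np.random.default_rng(2)
headings = np.arange(0, 360, 45)   # candidate heading directions (deg)
obj_dirs = np.arange(0, 360, 45)   # candidate object directions (deg)
n_neurons = 60

# Hypothetical neurons jointly tuned to heading and object direction
pref_h = rng.uniform(0, 360, n_neurons)
pref_o = rng.uniform(0, 360, n_neurons)

def rate(h, o):
    # von Mises-like joint tuning (arbitrary widths/gains for illustration)
    return (np.exp(np.cos(np.radians(h - pref_h)))
            * np.exp(np.cos(np.radians(o - pref_o))))

true_h, true_o = 90.0, 225.0
r_obs = rng.poisson(5.0 * rate(true_h, true_o))   # observed spike counts

# Poisson log-likelihood over the joint (heading, object direction) grid
logL = np.zeros((len(headings), len(obj_dirs)))
for i, h in enumerate(headings):
    for j, o in enumerate(obj_dirs):
        lam = 5.0 * rate(h, o)
        logL[i, j] = np.sum(r_obs * np.log(lam) - lam)

# Marginalize: sum the posterior over the nuisance variable, then decode the other
post = np.exp(logL - logL.max())
h_hat = headings[np.argmax(post.sum(axis=1))]
o_hat = obj_dirs[np.argmax(post.sum(axis=0))]
print(h_hat, o_hat)
```

The separability the abstract describes corresponds, in this sketch, to how concentrated each marginal posterior is around the true value: weaker vestibular modulation would flatten the heading marginal and blur the estimate.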


2017 ◽  
Author(s):  
Eric Avila ◽  
Kaushik J Lakshminarasimhan ◽  
Gregory C DeAngelis ◽  
Dora E Angelaki

Neurons in the macaque posterior parietal cortex are known to encode the direction of self-motion. But do they also encode one’s speed? To test this, we performed neural recordings from area 7a while monkeys were passively translated or rotated at various speeds. Visual stimuli were delivered as optic flow fields and vestibular stimuli were generated by a motion platform. Under both conditions, the responses of a fraction of neurons scaled linearly with self-motion speed, and speed-selective neurons were not localized to specific layers or columns. We analyzed ensembles of simultaneously recorded neurons and found that the precision of speed representation was sufficient to support path integration over modest distances. Our findings describe a multisensory neural code for linear and angular self-motion speed in the posterior parietal cortex of the macaque brain, and suggest a potential role for this representation.
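The link between a linear population speed code and path integration can be sketched as follows (population size, gains, offsets, noise level, and the speed profile are all hypothetical): decode instantaneous speed from noisy linear responses by least squares, then accumulate the estimates over time to recover distance traveled.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, dt, T = 50, 0.1, 100   # hypothetical population, 10 s trajectory at 10 Hz

# Each neuron's firing rate scales linearly with self-motion speed
gains = rng.uniform(0.5, 2.0, n_neurons)     # spikes/s per m/s (hypothetical)
offsets = rng.uniform(5.0, 15.0, n_neurons)  # baseline rates (hypothetical)

speed = 0.2 + 0.1 * np.sin(np.linspace(0, 2 * np.pi, T))  # true speed profile (m/s)

def decode(rates):
    # Least-squares inversion of the linear code: rates ~ offsets + gains * v
    return np.dot(gains, rates - offsets) / np.dot(gains, gains)

est = []
for v in speed:
    rates = offsets + gains * v + rng.normal(0, 0.5, n_neurons)  # noisy responses
    est.append(decode(rates))

# Path integration: accumulate decoded speed over time to estimate distance
true_dist = np.sum(speed) * dt
est_dist = np.sum(np.array(est)) * dt
print(true_dist, est_dist)
```

Because per-sample decoding errors are independent here, the integrated distance error grows only with the square root of trajectory length, which is one way precision can suffice "over modest distances" while degrading for long paths.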


2011 ◽  
Vol 105 (1) ◽  
pp. 60-68 ◽  
Author(s):  
Brian Lee ◽  
Bijan Pesaran ◽  
Richard A. Andersen

Visual signals generated by self-motion are initially represented in retinal coordinates in the early parts of the visual system. Because this information can be used by an observer to navigate through the environment, it must be transformed into body or world coordinates at later stations of the visual-motor pathway. Neurons in the dorsal aspect of the medial superior temporal area (MSTd) are tuned to the focus of expansion (FOE) of the visual image. We performed experiments to determine whether focus tuning curves in area MSTd are represented in eye coordinates or in screen coordinates (which could be head, body, or world-centered in the head-fixed paradigm used). Because MSTd neurons adjust their FOE tuning curves during pursuit eye movements to compensate for changes in pursuit and translation speed that distort the visual image, the coordinate frame was determined while the eyes were stationary (fixed gaze or simulated pursuit conditions) and while the eyes were moving (real pursuit condition). We recorded extracellular responses from 80 MSTd neurons in two rhesus monkeys (Macaca mulatta). We found that the FOE tuning curves of the overwhelming majority of neurons were aligned in an eye-centered coordinate frame in each of the experimental conditions [fixed gaze: 77/80 (96%); real pursuit: 77/80 (96%); simulated pursuit: 74/80 (93%); t-test, P < 0.05]. These results indicate that MSTd neurons represent heading in an eye-centered coordinate frame both when the eyes are stationary and when they are moving. We also found that area MSTd demonstrates significant eye position gain modulation of response fields much like its posterior parietal neighbors.
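The reference-frame test behind this result can be sketched with an idealized neuron (tuning shape, width, and preferred FOE are hypothetical): measure the FOE tuning curve at two fixation positions and ask whether the peak moves with the eyes (eye-centered) or stays put on the screen (screen-centered).

```python
import numpy as np

foe_screen = np.linspace(-30, 30, 121)  # FOE azimuth sampled on the screen (deg)
fixations = [-10.0, 10.0]               # two eye positions (deg)
pref_eye = 5.0                          # preferred FOE in eye coordinates (hypothetical)

def tuning(eye_pos, eye_centered=True):
    # Gaussian tuning over FOE location; an eye-centered neuron cares about
    # the FOE position relative to the fovea, a screen-centered one does not.
    center = eye_pos + pref_eye if eye_centered else pref_eye
    return np.exp(-0.5 * ((foe_screen - center) / 8.0) ** 2)

# Peak of the screen-coordinate tuning curve at each fixation
peaks = [foe_screen[np.argmax(tuning(f, eye_centered=True))] for f in fixations]
shift = peaks[1] - peaks[0]
print(shift)  # 20.0: the peak moves with the eyes by the full 20-deg fixation offset
```

A screen-centered neuron (`eye_centered=False`) would yield a shift of zero; intermediate frames would shift by a fraction of the fixation offset, which is why quantifying the shift per neuron classifies its coordinate frame.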


2009 ◽  
Vol 101 (5) ◽  
pp. 2725-2732 ◽  
Author(s):  
Gregory Hickok ◽  
Kayoko Okada ◽  
John T. Serences

Processing incoming sensory information and transforming this input into appropriate motor responses is a critical and ongoing aspect of our moment-to-moment interaction with the environment. While the neural mechanisms in the posterior parietal cortex (PPC) that support the transformation of sensory inputs into simple eye or limb movements have received a great deal of empirical attention—in part because these processes are easy to study in nonhuman primates—little work has been done on sensory-motor transformations in the domain of speech. Here we used functional magnetic resonance imaging and multivariate analysis techniques to demonstrate that a region of the planum temporale (Spt) shows distinct spatial activation patterns during sensory and motor aspects of a speech task. This result suggests that just as the PPC supports sensorimotor integration for eye and limb movements, area Spt forms part of a sensory-motor integration circuit for the vocal tract.


2020 ◽  
Author(s):  
Yang Zhou ◽  
Krithika Mohan ◽  
David J. Freedman

Categorization is an essential cognitive and perceptual process for recognition and decision making. The posterior parietal cortex (PPC), particularly the lateral intraparietal (LIP) area, has been suggested to transform visual feature encoding into cognitive or abstract category representations. By contrast, areas closer to sensory input, such as the middle temporal (MT) area, encode stimulus features but not more abstract categorical information during categorization tasks. Here, we compare the contributions of PPC subregions in category computation by recording neuronal activity in the medial superior temporal (MST) and LIP areas during a categorization task. MST is a core motion processing area interconnected with MT, and often considered an intermediate processing stage between MT and LIP. Here we show that MST exhibits robust decision-correlated category encoding and working memory encoding similar to LIP, suggesting that MST plays a substantial role in cognitive computation, extending beyond its widely recognized role in visual motion processing.


2020 ◽  
Author(s):  
Andrew S. Alexander ◽  
Janet C. Tung ◽  
G. William Chapman ◽  
Laura E. Shelley ◽  
Michael E. Hasselmo ◽  
...  

Animals engage in a variety of navigational behaviors that require different regimes of behavioral control. In the wild, rats readily switch between foraging and more complex behaviors such as chase, wherein they pursue other rats or small prey. These tasks require vastly different tracking of multiple behaviorally-significant variables including self-motion state. It is unknown whether changes in navigational context flexibly modulate the encoding of these variables. To explore this possibility, we compared self-motion processing in the multisensory posterior parietal cortex while rats performed alternating blocks of free foraging and visual target pursuit. Animals performed the pursuit task and demonstrated predictive processing by anticipating target trajectories and intercepting them. Relative to free exploration, pursuit sessions yielded greater proportions of parietal cortex neurons with reliable sensitivity to self-motion. Multiplicative gain modulation was observed during pursuit which increased the dynamic range of tuning and led to enhanced decoding accuracy of self-motion state. We found that self-motion sensitivity in parietal cortex was history-dependent regardless of behavioral context but that the temporal window of self-motion tracking was extended during target pursuit. Finally, many self-motion sensitive neurons conjunctively tracked the position of the visual target relative to the animal in egocentric coordinates, thus providing a potential coding mechanism for the observed gain changes to self-motion signals. We conclude that posterior parietal cortex dynamically integrates behaviorally-relevant information in response to ongoing task demands.


CNS Spectrums ◽  
2000 ◽  
Vol 5 (7) ◽  
pp. 34-46 ◽  
Author(s):  
Georg Northoff

Karl Ludwig Kahlbaum originally described catatonia as a psychomotor disease that encompassed motor, affective, and behavioral symptoms. In the beginning of the 20th century, catatonia was considered to be the motoric manifestation of schizophrenia; therefore, neuropathologic research mostly focused on neuroanatomic substrates (ie, the basal ganglia underlying the generation of movements). Even though some alterations were found in basal ganglia, the findings in these subcortical structures are not consistent. Recently, there has been a reemergence of interest into researching catatonia. Brain imaging studies have shown major and specific alterations in a right hemispheric neural network that includes the medial and lateral orbitofrontal and posterior parietal cortex. This neural network may be abnormally modulated by altered functional interactions between γ-aminobutyric acid (GABA)-ergic and glutamatergic transmission. This may account for the interrelationship among motor, emotional, and behavioral alterations observed in both clinical phenomenology and the subjective experiences of patients with catatonia. Such functional interrelationships should be explored in further detail in catatonia, which may also serve as a paradigmatic model for the investigation of psychomotor and brain function in general.

