Distinct Representations of Body and Head Motion Are Dynamically Encoded by Purkinje Cell Populations in the Macaque Cerebellum

2021 ◽  
Author(s):  
Omid A Zobeiri ◽  
Kathleen E Cullen

The ability to accurately control our posture and perceive spatial orientation during self-motion requires knowledge of the motion of both the head and body. However, whereas the vestibular sensors and nuclei directly encode head motion, no sensors directly encode body motion. Instead, the integration of vestibular and neck proprioceptive inputs is necessary to transform vestibular information into the body-centric reference frame required for postural control. The anterior vermis of the cerebellum is thought to play a key role in this transformation, yet how its Purkinje cells integrate these inputs or what information they dynamically encode during self-motion remains unknown. Here we recorded the activity of individual anterior vermis Purkinje cells in alert monkeys during passively applied whole-body, body-under-head, and head-on-body rotations. Most neurons dynamically encoded an intermediate representation of self-motion between head and body motion. Notably, these neurons responded to both vestibular and neck proprioceptive stimulation and showed considerable heterogeneity in their response dynamics. Furthermore, their vestibular responses demonstrated tuning in response to changes in head-on-body position. In contrast, a small remaining percentage of neurons sensitive only to vestibular stimulation unambiguously encoded head-in-space motion across conditions. Using a simple population model, we establish that combining responses from 40 Purkinje cells can explain the responses of their target neurons in deep cerebellar nuclei across all self-motion conditions. We propose that the observed heterogeneity in Purkinje cells underlies the cerebellum's capacity to compute the dynamic representation of body motion required to ensure accurate postural control and perceptual stability in our daily lives.
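The population model described above can be caricatured as a weighted combination of heterogeneous unit responses. The sketch below is purely illustrative, not the authors' actual model: the unit count of 40 comes from the abstract, but the signal shapes, the random mixing weights, and the simple averaging readout are all assumptions.

```python
import math
import random

random.seed(0)

# Hypothetical head- and body-in-space velocity traces (1 s at 100 Hz)
T = [t / 100 for t in range(100)]
head = [math.sin(2 * math.pi * 2 * t) for t in T]
body = [0.5 * math.sin(2 * math.pi * 2 * t + 0.6) for t in T]

# Each model Purkinje cell encodes an intermediate mix of head and
# body motion, with a heterogeneous (random) mixing weight.
weights = [random.random() for _ in range(40)]
units = [[w * h + (1 - w) * b for h, b in zip(head, body)]
         for w in weights]

# A simple readout for the target deep-cerebellar-nuclei neuron:
# the average of the 40 unit responses.
prediction = [sum(col) / len(col) for col in zip(*units)]
```

Because the units are linear, the averaged readout is itself an intermediate representation whose effective head-motion weight equals the mean of the unit weights; the heterogeneity across units is what gives a downstream readout the freedom to reweight them into a body-motion estimate.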

2002 ◽  
Vol 88 (1) ◽  
pp. 13-28 ◽  
Author(s):  
Marko Huterer ◽  
Kathleen E. Cullen

For frequencies >10 Hz, the vestibuloocular reflex (VOR) has been primarily investigated during passive rotations of the head on the body in humans. These prior studies suggest that eye movements lag head movements, as predicted by a 7-ms delay in the VOR pathways. However, Minor and colleagues recently applied whole-body rotations of frequencies ≤15 Hz in monkeys and found that eye movements were nearly in phase with head motion across all frequencies. The goal of the present study was to determine whether VOR response dynamics actually differ significantly for whole-body versus head-on-body rotations. To address this question, we evaluated the gain and phase of the VOR induced by high-frequency oscillations of the head on the body in monkeys by directly measuring both head and eye movements using the magnetic search coil technique. A torque motor was used to rotate the heads of three rhesus monkeys over the frequency range 5–25 Hz. Peak head velocity was held constant, first at ±50°/s and then ±100°/s. The VOR was found to be essentially compensatory across all frequencies; gains were near unity (1.1 at 5 Hz vs. 1.2 at 25 Hz), and phase lag increased only slightly with frequency (from 2° at 5 Hz to 11° at 25 Hz, a marked contrast to the 63° lag at 25 Hz predicted by a 7-ms VOR latency). Furthermore, VOR response dynamics were comparable in darkness and when viewing a target and did not vary with peak velocity. Although monkeys offered less resistance to the initial cycles of applied head motion, the gain and phase of the VOR did not vary for early versus late cycles, suggesting that an efference copy of the motor command to the neck musculature did not alter VOR response dynamics. In addition, VOR dynamics were also probed by applying transient head perturbations with much greater accelerations (peak acceleration >15,000°/s²) than have been previously employed.
The VOR latency was between 5 and 6 ms, and mean gain was close to unity for two of the three animals tested. A simple linear model described well the VOR responses elicited by sinusoidal and transient head-on-body rotations. We conclude that the VOR is compensatory over a wide frequency range in monkeys and has similar response dynamics during passive rotation of the head on the body as during passive rotation of the whole body in space.
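The 63° figure quoted above is the standard phase lag of a pure time delay, φ = 360°·f·τ. A quick check (the function name is ours):

```python
def predicted_phase_lag_deg(freq_hz, latency_s=0.007):
    # A pure transmission delay of tau seconds lags the input by
    # 360 * f * tau degrees at frequency f.
    return 360.0 * freq_hz * latency_s

print(predicted_phase_lag_deg(25))  # ~63 degrees at 25 Hz
print(predicted_phase_lag_deg(5))   # ~12.6 degrees at 5 Hz
```

Inverting the same relation, the observed 11° lag at 25 Hz would, taken at face value, correspond to an effective pure delay of only about 1.2 ms, which is why a fixed 7-ms latency overpredicts the measured lag.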


2018 ◽  
Vol 32 (11) ◽  
pp. 961-975 ◽  
Author(s):  
Jessica Battisto ◽  
Katharina V. Echt ◽  
Steven L. Wolf ◽  
Paul Weiss ◽  
Madeleine E. Hackney

1994 ◽  
Vol 6 (2) ◽  
pp. 99-116 ◽  
Author(s):  
M. W. Oram ◽  
D. I. Perrett

Cells have been found in the superior temporal polysensory area (STPa) of the macaque temporal cortex that are selectively responsive to the sight of particular whole body movements (e.g., walking) under normal lighting. These cells typically discriminate the direction of walking and the view of the body (e.g., left profile walking left). We investigated the extent to which these cells are responsive under “biological motion” conditions where the form of the body is defined only by the movement of light patches attached to the points of limb articulation. One-third of the cells (25/72) selective for the form and motion of walking bodies showed sensitivity to the moving light displays. Seven of these cells showed only partial sensitivity to form from motion, in so far as the cells responded more to moving light displays than to moving controls but failed to discriminate body view. These seven cells exhibited directional selectivity. Eighteen cells showed statistical discrimination for both direction of movement and body view under biological motion conditions. Most of these cells showed reduced responses to the impoverished moving light stimuli compared to full light conditions. The 18 cells were thus sensitive to detailed form information (body view) from the pattern of articulating motion. Cellular processing of the global pattern of articulation was indicated by the observations that none of these cells were found sensitive to movement of individual limbs and that jumbling the pattern of moving limbs reduced response magnitude. A further 10 cells were tested for sensitivity to moving light displays of whole body actions other than walking. Of these cells 5/10 showed selectivity for form displayed by biological motion stimuli that paralleled the selectivity under normal lighting conditions. The cell responses thus provide direct evidence for neural mechanisms computing form from nonrigid motion. 
The selectivity of the cells was for body view, specific direction, and specific type of body motion presented by moving light displays and is not predicted by many current computational approaches to the extraction of form from motion.


2017 ◽  
Vol 118 (4) ◽  
pp. 2499-2506 ◽  
Author(s):  
A. Pomante ◽  
L. P. J. Selen ◽  
W. P. Medendorp

The vestibular system provides information for spatial orientation. However, this information is ambiguous: because the otoliths sense the gravitoinertial force, they cannot distinguish gravitational and inertial components. As a consequence, prolonged linear acceleration of the head can be interpreted as tilt, referred to as the somatogravic effect. Previous modeling work suggests that the brain disambiguates the otolith signal according to the rules of Bayesian inference, combining noisy canal cues with the a priori assumption that prolonged linear accelerations are unlikely. Within this modeling framework the noise of the vestibular signals affects the dynamic characteristics of the tilt percept during linear whole-body motion. To test this prediction, we devised a novel paradigm to psychometrically characterize the dynamic visual vertical—as a proxy for the tilt percept—during passive sinusoidal linear motion along the interaural axis (0.33 Hz motion frequency, 1.75 m/s² peak acceleration, 80 cm displacement). While subjects (n = 10) kept fixation on a central body-fixed light, a line was briefly flashed (5 ms) at different phases of the motion, the orientation of which had to be judged relative to gravity. Consistent with the model’s prediction, subjects showed a phase-dependent modulation of the dynamic visual vertical, with a subject-specific phase shift with respect to the imposed acceleration signal. The magnitude of this modulation was smaller than predicted, suggesting a contribution of nonvestibular signals to the dynamic visual vertical. Despite their dampening effect, our findings may point to a link between the noise components in the vestibular system and the characteristics of the dynamic visual vertical. NEW & NOTEWORTHY A fundamental question in neuroscience is how the brain processes vestibular signals to infer the orientation of the body and objects in space.
We show that, under sinusoidal linear motion, systematic error patterns appear in the disambiguation of linear acceleration and spatial orientation. We discuss the dynamics of these illusory percepts in terms of a dynamic Bayesian model that combines uncertainty in the vestibular signals with priors based on the natural statistics of head motion.
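The otolith ambiguity underlying the somatogravic effect can be made concrete with a back-of-the-envelope calculation using the stimulus parameters from the abstract (0.33 Hz, 1.75 m/s² peak). The sketch assumes the acceleration is attributed entirely to tilt (tan θ = a/g), which is an illustrative upper bound, not the authors' Bayesian model; the function name is ours.

```python
import math

G = 9.81        # gravitational acceleration, m/s^2
A_PEAK = 1.75   # peak lateral acceleration from the experiment, m/s^2
F = 0.33        # motion frequency, Hz

def somatogravic_tilt_deg(t):
    # Tilt angle that would explain the instantaneous gravitoinertial
    # force if the linear acceleration were interpreted entirely as tilt.
    a = A_PEAK * math.sin(2 * math.pi * F * t)
    return math.degrees(math.atan2(a, G))

peak_tilt = somatogravic_tilt_deg(1 / (4 * F))  # at peak acceleration
print(round(peak_tilt, 1))  # ~10.1 degrees
```

The model prediction tested in the paper is a dynamic, phase-shifted version of this static bound, and the modulation subjects actually reported was smaller still.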


2002 ◽  
Vol 11 (6) ◽  
pp. 349-355
Author(s):  
Ognyan I. Kolev

Purpose: To further investigate the direction of (I) nystagmus and (II) self-motion perception induced by two stimuli: (a) caloric vestibular stimulation and (b) a sudden halt during vertical axis rotation. Subjects and methods: Twelve normal humans received caloric stimulation at 44°C, 30°C, and 20°C while in a supine position with the head inclined 30° upwards. In a second test they were rotated around the vertical axis with the head randomly placed in two positions, tilted 30° forward or tilted 60° backward, at a constant velocity of 90°/s for 2 minutes and then suddenly stopped. After both tests they were asked to describe their sensations of self-motion. Eye movements were recorded with an infrared video technique. Results: Caloric stimulation evoked only horizontal nystagmus in all subjects and induced a non-uniform, complex perception of angular movements in the frontal and transverse planes (the former dominated) and linear movements along the antero-posterior axis (sinking dominated) of the subject's coordinates. The self-motion was felt with the whole body or with a part of the body. Generally the perception evoked by cold (30°C) and warm (44°C) calorics was similar, although there were some differences. The stronger stimulus (20°C) evoked not only quantitative but also qualitative differences in perception. The abrupt halt of rotation induced self-motion perception and nystagmus only in the plane of rotation. The self-motion was felt with the whole body. Conclusion: There was no difference in the nystagmus evoked by caloric stimulation and a sudden halt of vertical axis rotation (in head positions that stimulate the horizontal canals); however, the two stimuli evoked different perceptions of self-motion.
Calorics provoked the sensation of self-rotation in the frontal plane and of linear motion, neither of which corresponded to the direction of nystagmus, as well as arcing and a reset phenomenon during angular and linear self-motion. Caloric-induced self-motion can be felt predominantly or only with a part of the body, depending on the self-motion intensity. The findings indicate that, unlike the self-motion induced by a sudden halt of vertical axis rotation, caloric-induced self-motion is generated by several mechanisms acting together.


2019 ◽  
Vol 121 (6) ◽  
pp. 2392-2400 ◽  
Author(s):  
Romy S. Bakker ◽  
Luc P. J. Selen ◽  
W. Pieter Medendorp

In daily life, we frequently reach toward objects while our body is in motion. We have recently shown that body accelerations influence the decision of which hand to use for the reach, possibly by modulating the body-centered computations of the expected reach costs. However, head orientation relative to the body was not manipulated, and hence it remains unclear whether vestibular signals contribute in their head-based sensory frame or in a transformed body-centered reference frame to these cost calculations. To test this, subjects performed a preferential reaching task to targets at various directions while they were sinusoidally translated along the lateral body axis, with their head either aligned with the body (straight ahead) or rotated 18° to the left. As a measure of hand preference, we determined the target direction that resulted in equiprobable right/left-hand choices. Results show that head orientation affects this balanced target angle when the body is stationary but does not further modulate hand preference when the body is in motion. Furthermore, reaction and movement times were larger for reaches to the balanced target angle, resembling a competitive selection process, and were modulated by head orientation when the body was stationary. During body translation, reaction and movement times depended on the phase of the motion, but this phase-dependent modulation had no interaction with head orientation. We conclude that the brain transforms vestibular signals to body-centered coordinates at the early stage of reach planning, when the decision of hand choice is computed. NEW & NOTEWORTHY The brain takes inertial acceleration into account in computing the anticipated biomechanical costs that guide hand selection during whole body motion. Whereas these costs are defined in a body-centered, muscle-based reference frame, the otoliths detect the inertial acceleration in head-centered coordinates. 
By systematically manipulating head position relative to the body, we show that the brain transforms otolith signals into body-centered coordinates at an early stage of reach planning, i.e., before the decision of hand choice is computed.


Perception ◽  
1988 ◽  
Vol 17 (1) ◽  
pp. 5-11 ◽  
Author(s):  
Masao Ohmi ◽  
Ian P Howard

It has previously been shown that when a moving and a stationary display are superimposed, illusory self-rotation (circular vection) is induced only when the moving display appears as the background. Three experiments are reported on the extent to which illusory forward self-motion (forward vection) induced by a looming display is inhibited by a superimposed stationary display as a function of the size and location of the stationary display and of the depth between the stationary and looming displays. Results showed that forward vection was controlled by the display that was perceived as the background, and background stationary displays suppressed forward vection by about the same amount whatever their size and eccentricity. Also, the perception of foreground–background properties of competing displays determined which controlled forward vection, and this control was not tied to specific depth cues. The inhibitory effect of a stationary background on forward vection was, however, weaker than that found with circular vection. This difference makes sense because, for forward body motion, the image of a distant scene is virtually stationary whereas, when the body rotates, it is not.


1997 ◽  
Vol 7 (4) ◽  
pp. 347-365
Author(s):  
T. Mergner ◽  
W. Huber ◽  
W. Becker

The article reviews findings and concepts on vestibular-proprioceptive interaction for self-motion perception and postural control in the form of simple descriptive models. It points out that vestibular-neck interaction is only a small part of an extended mechanism of coordinate transformations that links together the different parts of our bodies, so that sensory information arising in one part of the body can be used for perceptual or motor tasks in other parts. Particular emphasis is put on the problems that arise from imperfect signal transduction in the vestibular semicircular canal system at low stimulus frequencies and velocities. A “down-and-up-channeling” principle is also suggested, by which the body support is linked via coordinate transformations to the internal notion of physical space provided by the vestibular system. Furthermore, the following question is addressed: how does the brain use visual input to overcome the vestibular deficiencies, at the risk of visual self-motion illusions? Finally, a conceptual model of postural control is presented in which proprioceptive feedback that links the body to its support surface is merged with a loop for postural stabilization in space.
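The closing conceptual model, in which proprioceptive feedback to the support surface is merged with a space-referenced stabilization loop, can be caricatured as two additive error signals driving a single corrective command. The first-order dynamics, unit gains, and Euler integration below are our simplifying assumptions, not the authors' model.

```python
K_PROP = 1.0   # gain of proprioceptive loop (body relative to support)
K_VEST = 1.0   # gain of vestibular loop (body relative to space)

def simulate_sway(support_tilt, dt=0.01, steps=2000):
    """First-order body-sway sketch: the corrective drive combines a
    support-referenced error and a space-referenced error."""
    x = 0.0  # body angle in space, rad
    for _ in range(steps):
        drive = -K_PROP * (x - support_tilt) - K_VEST * x
        x += dt * drive
    return x

# With both loops active, a tilted support pulls the body only
# partway toward the support angle: the space-referenced loop
# keeps the body closer to the gravitational vertical.
print(round(simulate_sway(0.1), 3))  # settles near 0.05 rad
```

Setting K_VEST to zero in this sketch makes the body follow the support completely, which is the qualitative failure mode the space-referenced loop is meant to prevent.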


Vision ◽  
2019 ◽  
Vol 3 (2) ◽  
pp. 13
Author(s):  
Pearl Guterman ◽  
Robert Allison

When the head is tilted, an objectively vertical line viewed in isolation is typically perceived as tilted. We explored whether this shift also occurs when viewing global motion displays perceived as either object-motion or self-motion. Observers stood and lay left side down while viewing (1) a static line, (2) a random-dot display of 2-D (planar) motion or (3) a random-dot display of 3-D (volumetric) global motion. On each trial, the line orientation or motion direction was tilted from the gravitational vertical and observers indicated whether the tilt was clockwise or counter-clockwise from the perceived vertical. Psychometric functions were fit to the data and shifts in the point of subjective verticality (PSV) were measured. When the whole body was tilted, the perceived tilt of both a static line and the direction of optic flow were biased in the direction of the body tilt, demonstrating the so-called A-effect. However, we found significantly larger shifts for the static line than for volumetric global motion, as well as larger shifts for volumetric displays than for planar displays. The A-effect was larger when the motion was experienced as self-motion than when it was experienced as object-motion, and discrimination thresholds were also more precise in the self-motion conditions. The different magnitudes of the A-effect for the line and motion conditions—and for object- and self-motion—may be due to differences in how idiotropic (body) and vestibular signals are combined, particularly in the case of vection, which occurs despite visual-vestibular conflict.
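The PSV analysis described above can be illustrated with a toy example: generate clockwise-response proportions from a known cumulative-Gaussian psychometric function and read off the 50% point. All numbers here are hypothetical, and real analyses would use a maximum-likelihood fit rather than linear interpolation.

```python
import math

def psychometric(x, psv, sigma):
    # Probability of a "clockwise" judgement as a cumulative Gaussian
    return 0.5 * (1 + math.erf((x - psv) / (sigma * math.sqrt(2))))

# Hypothetical proportions of clockwise responses at tested tilts (deg)
tilts = [-8, -4, 0, 4, 8]
p_cw = [psychometric(x, psv=2.0, sigma=3.0) for x in tilts]

# Estimate the PSV as the 50% crossing, by linear interpolation
pairs = list(zip(tilts, p_cw))
for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
    if p0 <= 0.5 <= p1:
        psv_est = x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
        break

print(round(psv_est, 2))  # recovers the generating PSV of 2.0
```

A nonzero estimated PSV is the shift reported in the study; comparing it across body postures and display types quantifies the A-effect.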


2014 ◽  
Vol 1 (3) ◽  
pp. 140185 ◽  
Author(s):  
Ludwig Wallmeier ◽  
Lutz Wiegrebe

The ability of blind humans to navigate complex environments through echolocation has received rapidly increasing scientific interest. However, technical limitations have precluded a formal quantification of the interplay between echolocation and self-motion. Here, we use a novel virtual echo-acoustic space technique to formally quantify the influence of self-motion on echo-acoustic orientation. We show that both the vestibular and proprioceptive components of self-motion contribute significantly to successful echo-acoustic orientation in humans: specifically, our results show that vestibular input induced by whole-body self-motion resolves orientation-dependent biases in echo-acoustic cues. Fast head motions, relative to the body, provide additional proprioceptive cues which allow subjects to effectively assess echo-acoustic space referenced against the body orientation. These psychophysical findings clearly demonstrate that human echolocation is well suited to drive precise locomotor adjustments. Our data shed new light on the sensory–motor interactions, and on possible optimization strategies underlying echolocation in humans.

