Non-visual self-motion information influences the perception of object motion while walking

2011 · Vol 11 (11) · pp. 898-898 · Author(s): M. Parade, J. S. Matthis, B. R. Fajen

2010 · Vol 8 (6) · pp. 1155-1155 · Author(s): J. Saunders, F. Durgin

2017 · Vol 17 (10) · pp. 211 · Author(s): Jonathan Matthis, Karl Muller, Kathryn Bonnen, Mary Hayhoe

2019 · Vol 19 (10) · pp. 294a · Author(s): Scott T. Steinmetz, Oliver W. Layton, N. Andrew Browning, Nathaniel V. Powell, Brett R. Fajen

2018 · Vol 115 (7) · pp. E1637-E1646 · Author(s): Tale L. Bjerknes, Nenitha C. Dagslott, Edvard I. Moser, May-Britt Moser

Place cells in the hippocampus and grid cells in the medial entorhinal cortex rely on self-motion information and path integration for spatially confined firing. Place cells can be observed in young rats as soon as they leave their nest, at around 2.5 wk of postnatal life. In contrast, the regularly spaced firing of grid cells develops only after weaning, during the fourth week. In the present study, we sought to determine whether place cells are able to integrate self-motion information before maturation of the grid-cell system. Place cells were recorded on a 200-cm linear track while preweaning, postweaning, and adult rats ran on successive trials from a start wall to a box at the end of the track. The position of the start wall was altered in the middle of the trial sequence. When recordings were made in complete darkness, place cells maintained fields at a fixed distance from the start wall regardless of the age of the animal. When the lights were on, place fields were determined primarily by external landmarks, except at the very beginning of the track. This shift was observed in both young and adult animals. The results suggest that preweaning rats are able to calculate distances based on self-motion information before the grid-cell system has fully matured.
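
To make the path-integration idea concrete, here is a minimal sketch (not the authors' analysis) of how a 1-D distance-from-start estimate can be maintained by integrating self-motion signals, with a Gaussian place field anchored at a fixed distance from the start wall. The speed profile, sampling rate, and field parameters (field_center, field_width) are illustrative assumptions.

```python
import numpy as np

def integrate_path(speed, dt):
    """Accumulate distance from the start wall by integrating self-motion
    (speed) samples over time -- the core of 1-D path integration."""
    return np.cumsum(speed) * dt

def place_field_activity(distance_from_start, field_center, field_width):
    """Gaussian firing-rate profile anchored at a fixed distance from the start wall."""
    return np.exp(-0.5 * ((distance_from_start - field_center) / field_width) ** 2)

# Example: a rat running at roughly 30 cm/s with noisy self-motion samples
# (all values hypothetical).
dt = 0.01                                  # s
speed = 30 + 2 * np.random.randn(500)      # cm/s, 5 s of running
d = integrate_path(speed, dt)              # estimated distance from the start wall (cm)
rate = place_field_activity(d, field_center=80.0, field_width=10.0)
```

If the start wall is moved, the integrated distance (and hence the field) moves with it, which is the darkness result the abstract describes.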


2003 · Vol 90 (2) · pp. 723-730 · Author(s): Kai V. Thilo, Andreas Kleinschmidt, Michael A. Gresty

In a previous functional neuroimaging study we found that early visual areas deactivated when a rotating optical flow stimulus elicited the illusion of self-motion (vection) compared with when it was perceived as a moving object. Here, we investigated whether electrical cortical responses to an independent central visual probe stimulus change as a function of whether optical flow stimulation in the periphery induces the illusion of self-motion or not. Visual-evoked potentials (VEPs) were obtained in response to pattern-reversals in the central visual field in the presence of a constant peripheral large-field optokinetic stimulus that rotated around the naso-occipital axis and induced intermittent sensations of vection. As a control, VEPs were also recorded during a stationary peripheral stimulus and did not differ from those obtained during optokinetic stimulation. The VEPs during constant peripheral stimulation were then divided into two groups according to the time spans during which subjects reported object motion or self-motion. The N70 VEP component showed a significant amplitude reduction when the peripheral stimulus led subjects to experience self-motion compared with when it was perceived as object motion. This finding supplements and corroborates our recent evidence from functional neuroimaging that early visual cortex deactivates when a visual flow stimulus elicits the illusion of self-motion compared with when the same sensory input is interpreted as object motion. This dampened responsiveness might reflect a redistribution of sensory and attentional resources when the monitoring of self-motion relies on sustained, veridical processing of optic flow and could be compromised by other sources of visual input.
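
As an illustration of the epoch-sorting step described above, here is a hedged sketch, using entirely placeholder data, of splitting pattern-reversal epochs by the reported percept and comparing mean amplitude in an N70 window. The sampling rate, window bounds, and array shapes are assumptions, not the study's recording parameters.

```python
import numpy as np

# Placeholder data: one pattern-reversal epoch per row (n_epochs x n_samples),
# sampled at 500 Hz, epoch starting at the reversal; percept labels come from
# the subject's reports ('self' for vection, 'object' for object motion).
fs = 500
epochs = 2.0 * np.random.randn(200, 256)           # fake EEG epochs (microvolts)
percept = np.random.choice(['self', 'object'], 200)

# N70: mean amplitude in a 60-80 ms window after the reversal.
win = slice(int(0.060 * fs), int(0.080 * fs))

def n70_amplitude(x):
    """Mean amplitude of the averaged VEP within the 60-80 ms window."""
    return x.mean(axis=0)[win].mean()

amp_self = n70_amplitude(epochs[percept == 'self'])
amp_object = n70_amplitude(epochs[percept == 'object'])
print(f"N70 during vection: {amp_self:.2f} uV, during object motion: {amp_object:.2f} uV")
```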


2012 · Vol 108 (6) · pp. 1685-1694 · Author(s): Lionel Bringoux, Jean-Claude Lepecq, Frédéric Danion

Accurate control of grip force during object manipulation is necessary to prevent the object from slipping, especially to compensate for the gravitational and inertial forces that arise from hand/object motion. The goal of the current study was to assess whether the control of grip force was influenced by visually induced self-motion (i.e., vection), which would normally be accompanied by changes in object load. The main task involved holding a 400-g object between the thumb and the index finger while seated in an immersive virtual environment that simulated the vertical motion of an elevator across floors. Different visual motions were tested, including oscillatory (0.21 Hz) and constant-speed displacements of the virtual scene. Different arm-loading conditions were also tested: with or without the hand-held object, and with or without oscillatory arm motion (0.9 Hz). At the perceptual level, participants' ratings showed that both oscillatory and constant-speed motion of the elevator rapidly induced a long-lasting sensation of self-motion. At the sensorimotor level, the compellingness of vection altered arm movement control: spectral analyses revealed that arm motion was entrained by the oscillatory motion of the elevator. However, we found no evidence that the grip force used to hold the object was affected by the visual motion. Specifically, spectral analyses revealed no component in grip force that mirrored the virtual change in object load associated with the oscillatory motion of the elevator, so the grip-to-load force coupling remained unaffected. Altogether, our findings show that the neural mechanisms underlying vection interfere with arm movement control but not with the delicate modulation of grip force. More generally, these results provide evidence that the strength of the coupling between perception and the sensorimotor system can be modulated depending on the effector.
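
A minimal sketch of the kind of frequency-domain check described above: estimate the power of the arm and grip-force signals at the 0.21 Hz scene frequency and compare them. The signals, sampling rate, and Welch parameters below are made-up placeholders, not the study's data or analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

# Placeholder recordings: arm vertical position (m) and grip force (N),
# sampled at 100 Hz during the 0.21 Hz oscillating-elevator condition.
fs = 100
t = np.arange(0, 120, 1 / fs)
arm = 0.01 * np.sin(2 * np.pi * 0.21 * t) + 0.002 * np.random.randn(t.size)  # entrained
grip = 8.0 + 0.3 * np.random.randn(t.size)   # no visually driven 0.21 Hz component

def power_at(signal, freq, fs):
    """Welch power spectral density evaluated at the bin closest to `freq`."""
    f, pxx = welch(signal, fs=fs, nperseg=4096)
    return pxx[np.argmin(np.abs(f - freq))]

print("arm power at 0.21 Hz: ", power_at(arm, 0.21, fs))
print("grip power at 0.21 Hz:", power_at(grip, 0.21, fs))
```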


2019 · Vol 116 (18) · pp. 9060-9065 · Author(s): Kalpana Dokka, Hyeshin Park, Michael Jansen, Gregory C. DeAngelis, Dora E. Angelaki

The brain infers our spatial orientation and properties of the world from ambiguous and noisy sensory cues. Judging self-motion (heading) in the presence of independently moving objects poses a challenging inference problem because the image motion of an object could be attributed to movement of the object, self-motion, or some combination of the two. We test whether perception of heading and object motion follows predictions of a normative causal inference framework. In a dual-report task, subjects indicated whether an object appeared stationary or moving in the virtual world, while simultaneously judging their heading. Consistent with causal inference predictions, the proportion of object stationarity reports, as well as the accuracy and precision of heading judgments, depended on the speed of object motion. Critically, biases in perceived heading declined when the object was perceived to be moving in the world. Our findings suggest that the brain interprets object motion and self-motion using a causal inference framework.
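
The causal-inference logic can be sketched with a toy 1-D Bayesian model (an assumption-laden illustration, not the authors' fitted model): compare the likelihood that the object's image motion is fully explained by self-motion (object stationary in the world) against the likelihood that an independent object velocity also contributes, and report the posterior probability of stationarity. All noise parameters and priors below are arbitrary.

```python
import numpy as np
from scipy.stats import norm

def posterior_stationary(v_img, v_self, sigma_img=1.0, sigma_vest=1.0,
                         sigma_obj_prior=5.0, p_stationary=0.5):
    """Posterior probability that the object is stationary in the world.

    Stationary hypothesis: image motion is entirely due to self-motion, so the
    residual v_img - (-v_self) is pure sensory noise. Moving hypothesis: an
    independent object velocity, drawn from a zero-mean prior, adds variance.
    """
    residual = v_img - (-v_self)
    like_stat = norm.pdf(residual, scale=np.hypot(sigma_img, sigma_vest))
    like_move = norm.pdf(residual,
                         scale=np.hypot(np.hypot(sigma_img, sigma_vest), sigma_obj_prior))
    num = like_stat * p_stationary
    return num / (num + like_move * (1 - p_stationary))

# Slow residual motion: consistent with self-motion alone -> likely judged stationary.
print(posterior_stationary(v_img=-2.1, v_self=2.0))
# Large residual motion: better explained by a moving object in the world.
print(posterior_stationary(v_img=4.0, v_self=2.0))
```

In such a model, faster object motion lowers the posterior probability of stationarity, and heading estimates that discount the object's image motion only when it is attributed to self-motion reproduce the speed-dependent biases the abstract reports.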


1983 · Vol 27 (12) · pp. 996-1000 · Author(s): Dean H. Owen, Lawrence J. Hettinger, Shirley B. Tobias, Lawrence Wolpert, Rik Warren

Several methods are presented for breaking the linkages among global optical flow and texture variables, in order to assess their usefulness in experiments requiring observers to distinguish changes in the speed or heading of simulated self-motion from events representing constant speed or level flight. Results are presented from a series of studies testing sensitivity to flow acceleration or deceleration, flow-pattern expansion variables, and the distribution of optical texture density. Theoretical implications for determining the metrics of visual self-motion information are discussed, along with the practical relevance for pilot and flight-simulator evaluation and for low-level, high-speed flight.
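
As one concrete example of "breaking a linkage", the sketch below contrasts two standard global optical variables that normally covary during simulated flight: global optical flow rate (speed scaled by altitude, in eyeheights per second) and edge rate (ground-texture edges crossed per second). The numbers are arbitrary; the point is only that a simulation can change ground-texture spacing along the flight path so that one variable signals an acceleration while the other stays constant.

```python
def global_optical_flow_rate(ground_speed, altitude):
    """Flow rate in eyeheights per second: forward speed scaled by altitude."""
    return ground_speed / altitude

def edge_rate(ground_speed, texture_spacing):
    """Ground-texture edges crossed per second: speed scaled by texel spacing."""
    return ground_speed / texture_spacing

# Linked case: speeding up raises both variables together.
print(global_optical_flow_rate(60.0, 10.0), edge_rate(60.0, 5.0))     # 6.0, 12.0
# Decoupled case: double the speed but also double the texture spacing,
# so edge rate stays at 12.0 while flow rate alone signals the acceleration.
print(global_optical_flow_rate(120.0, 10.0), edge_rate(120.0, 10.0))  # 12.0, 12.0
```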

