Dissociation of Self-Motion and Object Motion by Linear Population Decoding That Approximates Marginalization

2017 · Vol 37 (46) · pp. 11204-11219
Author(s): Ryo Sasaki, Dora E. Angelaki, Gregory C. DeAngelis

2019 · Vol 19 (10) · pp. 294a
Author(s): Scott T. Steinmetz, Oliver W. Layton, N. Andrew Browning, Nathaniel V. Powell, Brett R. Fajen

2003 · Vol 90 (2) · pp. 723-730
Author(s): Kai V. Thilo, Andreas Kleinschmidt, Michael A. Gresty

In a previous functional neuroimaging study we found that early visual areas deactivated when a rotating optical flow stimulus elicited the illusion of self-motion (vection) compared with when it was perceived as a moving object. Here, we investigated whether electrical cortical responses to an independent central visual probe stimulus change as a function of whether optical flow stimulation in the periphery induces the illusion of self-motion or not. Visual-evoked potentials (VEPs) were obtained in response to pattern reversals in the central visual field in the presence of a constant peripheral large-field optokinetic stimulus that rotated around the naso-occipital axis and induced intermittent sensations of vection. As a control, VEPs were also recorded during a stationary peripheral stimulus; these showed no difference from those obtained during optokinetic stimulation. The VEPs recorded during constant peripheral stimulation were then divided into two groups according to the time spans during which subjects reported object motion or self-motion, respectively. The N70 VEP component showed a significant amplitude reduction when the peripheral stimulus induced an experience of self-motion compared with when it was perceived as object motion. This finding supplements and corroborates our recent evidence from functional neuroimaging that early visual cortex deactivates when a visual flow stimulus elicits the illusion of self-motion compared with when the same sensory input is interpreted as object motion. This dampened responsiveness might reflect a redistribution of sensory and attentional resources when the monitoring of self-motion relies on sustained and veridical processing of optic flow that may be compromised by other sources of visual input.
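The core analysis described above (sorting VEP epochs by the concurrent perceptual report and comparing N70 amplitude between the two groups) can be illustrated with a minimal sketch. This is not the authors' code; the sampling rate, N70 search window, simulated epoch arrays, and the use of an independent-samples t-test are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' analysis): compare N70 amplitude of
# pattern-reversal VEPs recorded while the peripheral stimulus was perceived
# as self-motion vs. object motion. Assumes baseline-corrected epochs of
# shape (n_trials, n_samples) at a known sampling rate.
import numpy as np
from scipy import stats

FS = 1000                      # sampling rate in Hz (assumed)
N70_WINDOW = (0.060, 0.080)    # search window around 70 ms post-reversal (assumed)

def n70_amplitude(epochs, fs=FS, window=N70_WINDOW):
    """Most negative deflection within the N70 window, per trial."""
    start, stop = int(window[0] * fs), int(window[1] * fs)
    return epochs[:, start:stop].min(axis=1)

# epochs_self, epochs_object: VEP epochs sorted by the subject's concurrent
# report of self-motion (vection) or object motion (hypothetical data here).
rng = np.random.default_rng(0)
epochs_self = rng.normal(0, 2, (120, 300))
epochs_object = rng.normal(0, 2, (120, 300))

amp_self = n70_amplitude(epochs_self)
amp_object = n70_amplitude(epochs_object)
t, p = stats.ttest_ind(amp_self, amp_object)
print(f"N70 self-motion: {amp_self.mean():.2f}, "
      f"object motion: {amp_object.mean():.2f} (t={t:.2f}, p={p:.3f})")
```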


2012 · Vol 108 (6) · pp. 1685-1694
Author(s): Lionel Bringoux, Jean-Claude Lepecq, Frédéric Danion

Accurate control of grip force during object manipulation is necessary to prevent the object from slipping, especially to compensate for the gravitational and inertial forces resulting from hand/object motion. The goal of the current study was to assess whether the control of grip force was influenced by visually induced self-motion (i.e., vection), which would normally be accompanied by changes in object load. The main task involved holding a 400-g object between the thumb and the index finger while seated within a virtual immersive environment that simulated the vertical motion of an elevator across floors. Different visual motions were tested, including oscillatory (0.21 Hz) and constant-speed displacements of the virtual scene. Different arm-loading conditions were also tested: with or without the hand-held object and with or without oscillatory arm motion (0.9 Hz). At the perceptual level, participants' ratings showed that both oscillatory and constant-speed motion of the elevator rapidly induced a long-lasting sensation of self-motion. At the sensorimotor level, compelling vection altered arm movement control: spectral analyses revealed that arm motion was entrained by the oscillatory motion of the elevator. However, we found no evidence that the grip force used to hold the object was visually affected. Specifically, spectral analyses revealed no component in grip force that mirrored the virtual change in object load associated with the oscillatory motion of the elevator, so the grip-to-load force coupling remained unaffected. Altogether, our findings show that the neural mechanisms underlying vection interfere with arm movement control but not with the delicate modulation of grip force. More generally, these results provide evidence that the strength of the coupling between the perceptual and sensorimotor systems can be modulated depending on the effector.
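As a rough illustration of the spectral analyses mentioned above, the sketch below estimates power at the elevator frequency (0.21 Hz) and the voluntary arm frequency (0.9 Hz) in arm-position and grip-force signals. It is not the authors' analysis; the sampling rate, recording duration, Welch parameters, and the simulated signals are placeholders, constructed so that arm position (but not grip force) carries a component at the visual frequency.

```python
# Illustrative sketch (not the study's analysis code): check whether grip
# force or arm position contains a spectral component at the visual
# (elevator) frequency. All signals below are simulated placeholders.
import numpy as np
from scipy import signal

FS = 100.0            # sampling rate in Hz (assumed)
F_ELEVATOR = 0.21     # visual oscillation frequency reported in the study
F_ARM = 0.9           # voluntary arm oscillation frequency

t = np.arange(0, 120, 1 / FS)                       # 2-minute recording (assumed)
arm_pos = np.sin(2 * np.pi * F_ARM * t) + 0.2 * np.sin(2 * np.pi * F_ELEVATOR * t)
grip_force = 8 + 2 * np.sin(2 * np.pi * F_ARM * t) + 0.05 * np.random.randn(t.size)

def power_at(freqs, pxx, f_target):
    """Spectral power at the frequency bin closest to f_target."""
    return pxx[np.argmin(np.abs(freqs - f_target))]

for name, x in [("arm position", arm_pos), ("grip force", grip_force)]:
    f, pxx = signal.welch(x, fs=FS, nperseg=4096)
    print(f"{name}: power at {F_ELEVATOR} Hz = {power_at(f, pxx, F_ELEVATOR):.4f}, "
          f"at {F_ARM} Hz = {power_at(f, pxx, F_ARM):.4f}")
```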


2019 · Vol 116 (18) · pp. 9060-9065
Author(s): Kalpana Dokka, Hyeshin Park, Michael Jansen, Gregory C. DeAngelis, Dora E. Angelaki

The brain infers our spatial orientation and properties of the world from ambiguous and noisy sensory cues. Judging self-motion (heading) in the presence of independently moving objects poses a challenging inference problem because the image motion of an object could be attributed to movement of the object, self-motion, or some combination of the two. We test whether perception of heading and object motion follows predictions of a normative causal inference framework. In a dual-report task, subjects indicated whether an object appeared stationary or moving in the virtual world, while simultaneously judging their heading. Consistent with causal inference predictions, the proportion of object stationarity reports, as well as the accuracy and precision of heading judgments, depended on the speed of object motion. Critically, biases in perceived heading declined when the object was perceived to be moving in the world. Our findings suggest that the brain interprets object motion and self-motion using a causal inference framework.
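A minimal sketch of the kind of causal inference computation referenced above follows. It is not the model fitted in the study; the Gaussian likelihoods, the prior on object velocity, the prior probability of stationarity, and all parameter values are assumptions used only to show how the posterior probability that the object is stationary falls as its world-relative speed increases.

```python
# Illustrative sketch (not the authors' model): Bayesian causal inference over
# whether an object's image motion is due to self-motion alone (object
# stationary in the world) or to self-motion plus independent object motion.
import numpy as np
from scipy.stats import norm

SIGMA_SENSE = 1.0     # sensory noise on the object's retinal velocity (assumed, deg/s)
SIGMA_OBJ = 4.0       # prior spread of independent object velocities (assumed, deg/s)
P_STATIONARY = 0.6    # prior probability that the object is stationary (assumed)

def p_stationary_given_motion(retinal_vel, self_motion_component):
    """Posterior probability that the object is stationary in the world.

    If stationary, its retinal velocity should match the component predicted
    from self-motion; if moving, an extra world-relative velocity is added,
    which is marginalized out under a zero-mean Gaussian prior.
    """
    residual = retinal_vel - self_motion_component
    like_stationary = norm.pdf(residual, 0.0, SIGMA_SENSE)
    like_moving = norm.pdf(residual, 0.0, np.hypot(SIGMA_SENSE, SIGMA_OBJ))
    post = like_stationary * P_STATIONARY
    return post / (post + like_moving * (1 - P_STATIONARY))

# Faster world-relative object motion -> fewer "stationary" attributions,
# qualitatively matching the speed dependence described above.
for obj_speed in [0.0, 1.0, 2.0, 4.0, 8.0]:
    retinal = -3.0 + obj_speed          # self-motion component plus object motion
    print(obj_speed, round(p_stationary_given_motion(retinal, -3.0), 3))
```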


2003 · Vol 14 (4) · pp. 340-346
Author(s): Mark Wexler

Although visual input is egocentric, at least some visual perceptions and representations are allocentric, that is, independent of the observer's vantage point or motion. Three experiments investigated the visual perception of three-dimensional object motion during voluntary and involuntary motion in human subjects. The results show that the motor command contributes to the objective perception of space: Observers are more likely to apply, consciously and unconsciously, spatial criteria relative to an allocentric frame of reference when they are executing voluntary head movements than while they are undergoing similar involuntary displacements (which lead to a more egocentric bias). Furthermore, details of the motor command are crucial to spatial vision, as allocentric bias decreases or disappears when self-motion and motor command do not match.


PLoS ONE · 2013 · Vol 8 (2) · pp. e55446
Author(s): Brett R. Fajen, Jonathan S. Matthis

2011 · Vol 11 (11) · pp. 898-898
Author(s): M. Parade, J. S. Matthis, B. R. Fajen

2012 · Vol 12 (9) · pp. 246-246
Author(s): J.-J. Yan, B. Lorv, H. Li, H.-J. Sun
