Self-regulating neural mechanisms for self-motion estimation from optic flow

2020 ◽  
Vol 20 (11) ◽  
pp. 1212
Author(s):  
Scott Steinmetz ◽  
Oliver Layton ◽  
Nathaniel Powell ◽  
Brett Fajen

2018 ◽
Vol 31 (7) ◽  
pp. 645-674 ◽  
Author(s):  
Maria Gallagher ◽  
Elisa Raffaella Ferrè

Abstract In the past decade, there has been a rapid advance in Virtual Reality (VR) technology. Key to the user’s VR experience are multimodal interactions involving all senses. The human brain must integrate real-time visual, auditory, vestibular and proprioceptive inputs to produce the compelling and captivating feeling of immersion in a VR environment. A serious problem with VR is that users may develop symptoms similar to motion sickness, a malady called cybersickness. At present, the underlying cause of cybersickness is not fully understood. Cybersickness may be due to a discrepancy between the sensory signals which provide information about the body’s orientation and motion: in many VR applications, optic flow elicits an illusory sensation of self-motion, telling users that they are moving in a particular direction with a particular acceleration. However, since users are not actually moving, their proprioceptive and vestibular organs provide no corresponding cues of self-motion. These conflicting signals may lead to sensory discrepancies and eventually cybersickness. Here we review the current literature to develop a conceptual scheme for understanding the neural mechanisms of cybersickness. We discuss an approach to cybersickness based on sensory cue integration, focusing on the dynamic re-weighting of visual and vestibular signals for self-motion.
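The re-weighting idea in this abstract can be made concrete with the standard Gaussian cue-combination model. The sketch below is a minimal illustration of reliability-weighted integration of visual and vestibular self-motion estimates, assuming inverse-variance weighting; the function names and numbers are hypothetical, not drawn from the review itself.

```python
import numpy as np

def reliability_weights(var_visual, var_vestibular):
    # Each cue's weight is its reliability (inverse variance) divided by the
    # total reliability: the standard Gaussian cue-combination rule.
    r_vis, r_vest = 1.0 / var_visual, 1.0 / var_vestibular
    w_vis = r_vis / (r_vis + r_vest)
    return w_vis, 1.0 - w_vis

def fused_self_motion(est_vis, est_vest, var_vis, var_vest):
    # Reliability-weighted average of the two self-motion estimates.
    w_vis, w_vest = reliability_weights(var_vis, var_vest)
    return w_vis * est_vis + w_vest * est_vest

# Optic flow signals 1.0 m/s forward; the vestibular system signals 0 (the
# user is seated). Degrading the visual signal (larger variance) dynamically
# shifts the fused estimate toward the vestibular cue, shrinking the conflict.
for var_vis in (0.1, 1.0, 10.0):
    print(var_vis, fused_self_motion(1.0, 0.0, var_vis, 1.0))
```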


Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8217
Author(s):  
Oliver W. Layton

Most algorithms for steering, obstacle avoidance, and moving object detection rely on accurate self-motion estimation, a problem animals solve in real time as they navigate through diverse environments. One biological solution leverages optic flow, the changing pattern of motion experienced on the eye during self-motion. Here I present ARTFLOW, a biologically inspired neural network that learns patterns in optic flow to encode the observer’s self-motion. The network combines the fuzzy ART unsupervised learning algorithm with a hierarchical architecture based on the primate visual system. This design affords fast, local feature learning across parallel modules in each network layer. Simulations show that the network is capable of learning stable patterns from optic flow simulating self-motion through environments of varying complexity with only one epoch of training. ARTFLOW trains substantially faster and yields self-motion estimates that are far more accurate than a comparable network that relies on Hebbian learning. I show how ARTFLOW serves as a generative model to predict the optic flow that corresponds to neural activations distributed across the network.
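For readers unfamiliar with the fuzzy ART algorithm that ARTFLOW builds on, the sketch below shows its core loop on generic inputs: complement coding, the choice function, the vigilance test, and the fast weight update. It is a minimal single-epoch illustration under assumed hyperparameters, not the ARTFLOW implementation itself.

```python
import numpy as np

def fuzzy_art_train(inputs, rho=0.75, alpha=0.001, beta=1.0):
    # inputs: rows scaled to [0, 1]. Complement coding appends (1 - x) so each
    # category weight describes a bounded box in input space.
    I = np.hstack([inputs, 1.0 - inputs])
    categories = []                                   # learned weight vectors
    for x in I:
        # Choice function: match strength normalized by category size.
        scores = [np.minimum(x, w).sum() / (alpha + w.sum()) for w in categories]
        for j in np.argsort(scores)[::-1]:
            w = categories[j]
            # Vigilance test: resonate only if the match is close enough.
            if np.minimum(x, w).sum() / x.sum() >= rho:
                categories[j] = beta * np.minimum(x, w) + (1 - beta) * w
                break
        else:
            categories.append(x.copy())               # no resonance: new category
    return categories

rng = np.random.default_rng(0)
templates = fuzzy_art_train(rng.random((500, 8)))
print(len(templates), "categories learned in one epoch")
```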


2010 ◽  
Vol 103 (4) ◽  
pp. 1865-1873 ◽  
Author(s):  
Tao Zhang ◽  
Kenneth H. Britten

The ventral intraparietal area (VIP) of the macaque monkey is thought to be involved in judging heading direction based on optic flow. We recorded neuronal discharges in VIP while monkeys were performing a two-alternative, forced-choice heading discrimination task to relate quantitatively the activity of VIP neurons to monkeys' perceptual choices. Most VIP neurons were responsive to simulated heading stimuli and were tuned such that their responses changed across a range of forward trajectories. Using receiver operating characteristic (ROC) analysis, we found that most VIP neurons were less sensitive to small heading changes than was the monkey, although a minority of neurons were equally sensitive. Pursuit eye movements modestly yet significantly increased both neuronal and behavioral thresholds by approximately the same amount. Our results support the view that VIP activity is involved in self-motion judgments.
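The ROC analysis referred to here is the standard ideal-observer comparison of a neuron's spike-count distributions across two headings. A minimal sketch, using simulated Poisson spike counts rather than the recorded VIP data:

```python
import numpy as np

def roc_area(counts_a, counts_b):
    # Probability that a random spike count from condition B exceeds one from
    # condition A, counting ties as half: the area under the ROC curve.
    a, b = np.asarray(counts_a), np.asarray(counts_b)
    return (b[:, None] > a[None, :]).mean() + 0.5 * (b[:, None] == a[None, :]).mean()

rng = np.random.default_rng(1)
straight = rng.poisson(20, 200)     # simulated counts, straight-ahead heading
offset = rng.poisson(24, 200)       # simulated counts, small heading offset
print(roc_area(straight, offset))   # 0.5 = chance, 1.0 = perfect discrimination
```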


i-Perception ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 204166952110557
Author(s):  
Diederick C. Niehorster

The concept of optic flow, a global pattern of visual motion that is both caused by and signals self-motion, is canonically ascribed to James Gibson's 1950 book “The Perception of the Visual World.” There have, however, been several other developments of this concept, chiefly by Gwilym Grindley and Edward Calvert. Based on rarely referenced scientific literature and archival research, this article describes the development of the concept of optic flow by the aforementioned authors and several others. The article furthermore presents the available evidence for interactions between these authors, focusing on whether parts of Gibson's proposal were derived from the work of Grindley or Calvert. While Grindley's work may have made Gibson aware of the geometrical facts of optic flow, Gibson's work is not derivative of Grindley's. It is furthermore shown that Gibson only learned of Calvert's work in 1956, almost a decade after Gibson first published his proposal. In conclusion, the development of the concept of optic flow presents an intriguing example of convergent thought in the progress of science.


2021 ◽  
Vol 118 (32) ◽  
pp. e2106235118
Author(s):  
Reuben Rideaux ◽  
Katherine R. Storrs ◽  
Guido Maiello ◽  
Andrew E. Welchman

Sitting in a static railway carriage can produce illusory self-motion if the train on an adjoining track moves off. While our visual system registers motion, vestibular signals indicate that we are stationary. The brain is faced with a difficult challenge: is there a single cause of sensations (I am moving) or two causes (I am static, another train is moving)? If a single cause, integrating signals produces a more precise estimate of self-motion, but if not, one cue should be ignored. In many cases, this process of causal inference works without error, but how does the brain achieve it? Electrophysiological recordings show that the macaque medial superior temporal area contains many neurons that encode combinations of vestibular and visual motion cues. Some respond best to vestibular and visual motion in the same direction (“congruent” neurons), while others prefer opposing directions (“opposite” neurons). Congruent neurons could underlie cue integration, but the function of opposite neurons remains a puzzle. Here, we seek to explain this computational arrangement by training a neural network model to solve causal inference for motion estimation. Like biological systems, the model develops congruent and opposite units and recapitulates known behavioral and neurophysiological observations. We show that all units (both congruent and opposite) contribute to motion estimation. Importantly, however, it is the balance between their activity that distinguishes whether visual and vestibular cues should be integrated or separated. This explains the computational purpose of puzzling neural representations and shows how a relatively simple feedforward network can solve causal inference.
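The causal-inference computation the network is trained to solve has a compact normative Bayesian form: weigh the evidence for one shared cause against two independent causes, then model-average the resulting estimates. The grid-based sketch below illustrates that computation with assumed noise and prior parameters; it is not the trained network from the paper.

```python
import numpy as np

def norm_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

def causal_inference(x_vis, x_vest, sig_vis=1.0, sig_vest=1.0,
                     sig_prior=5.0, p_common=0.5):
    s = np.linspace(-40.0, 40.0, 8001)          # candidate self-motion values
    prior = norm_pdf(s, 0.0, sig_prior)
    lv = norm_pdf(x_vis, s, sig_vis)            # visual likelihood
    lb = norm_pdf(x_vest, s, sig_vest)          # vestibular likelihood
    # Evidence for one shared cause (C=1) vs. two independent causes (C=2).
    ev1 = np.trapz(prior * lv * lb, s)
    ev2 = np.trapz(prior * lv, s) * np.trapz(prior * lb, s)
    pc1 = ev1 * p_common / (ev1 * p_common + ev2 * (1.0 - p_common))
    # Self-motion estimate under each causal structure, then model-averaged.
    s_c1 = np.trapz(s * prior * lv * lb, s) / ev1
    s_c2 = np.trapz(s * prior * lb, s) / np.trapz(prior * lb, s)
    return pc1, pc1 * s_c1 + (1.0 - pc1) * s_c2

print(causal_inference(1.0, 0.5))   # small conflict: integrate the cues
print(causal_inference(8.0, 0.0))   # large conflict: discount the visual cue
```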


2017 ◽  
Vol 17 (10) ◽  
pp. 211
Author(s):  
Jonathan Matthis ◽  
Karl Muller ◽  
Kathryn Bonnen ◽  
Mary Hayhoe

2020 ◽  
Vol 117 (27) ◽  
pp. 16065-16071 ◽  
Author(s):  
Yuli Wu ◽  
Kepu Chen ◽  
Yuting Ye ◽  
Tao Zhang ◽  
Wen Zhou

Human navigation relies on inputs to our paired eyes and ears. Although we also have two nasal passages, there has been little empirical indication that internostril differences yield directionality in human olfaction without involving the trigeminal system. By using optic flow that captures the pattern of apparent motion of surface elements in a visual scene, we demonstrate through formal psychophysical testing that a moderate binaral concentration disparity of a nontrigeminal odorant consistently biases recipients’ perceived direction of self-motion toward the higher-concentration side, even though they cannot verbalize which nostril smells the stronger odor. We further show that the effect depends on the internostril ratio of odor concentrations rather than the numeric difference in concentration between the two nostrils. Taken together, our findings provide behavioral evidence that humans smell in stereo and subconsciously utilize stereo olfactory cues in spatial navigation.
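The ratio (rather than difference) dependence can be illustrated with a simple Weber-like toy model. This is a hypothetical reading of the reported result with an arbitrary gain parameter k, not the authors' model:

```python
import numpy as np

def heading_bias(c_left, c_right, k=5.0):
    # Bias of perceived self-motion direction toward the higher-concentration
    # nostril, growing with the log concentration *ratio*. Positive = leftward.
    return k * np.log(c_left / c_right)

# Same ratio at different absolute levels -> same predicted bias:
print(heading_bias(2.0, 1.0), heading_bias(20.0, 10.0))
# Same numeric difference but different ratios -> different predicted bias:
print(heading_bias(2.0, 1.0), heading_bias(11.0, 10.0))
```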


1998 ◽  
Vol 79 (3) ◽  
pp. 1461-1480 ◽  
Author(s):  
Markus Lappe ◽  
Martin Pekel ◽  
Klaus-Peter Hoffmann

Lappe, Markus, Martin Pekel, and Klaus-Peter Hoffmann. Optokinetic eye movements elicited by radial optic flow in the macaque monkey. J. Neurophysiol. 79: 1461–1480, 1998. We recorded spontaneous eye movements elicited by radial optic flow in three macaque monkeys using the scleral search coil technique. Computer-generated stimuli simulated forward or backward motion of the monkey with respect to a number of small illuminated dots arranged on a virtual ground plane. We wanted to see whether optokinetic eye movements are induced by radial optic flow stimuli that simulate self-movement, quantify their parameters, and consider their effects on the processing of optic flow. A regular pattern of interchanging fast and slow eye movements with a frequency of 2 Hz was observed. When we shifted the horizontal position of the focus of expansion (FOE) during simulated forward motion (expansional optic flow), median horizontal eye position also shifted in the same direction but only by a smaller amount; for simulated backward motion (contractional optic flow), median eye position shifted in the opposite direction. We relate this to a change in Schlagfeld (beating field) typically observed in optokinetic nystagmus. Direction and speed of slow phase eye movements were compared with the local flow field motion in gaze direction (the foveal flow). Eye movement direction matched the foveal motion well. Small systematic deviations could be attributed to an integration of the global motion pattern. Eye speed on average did not match foveal stimulus speed, as the median gain was only ∼0.5–0.6. The gain was always lower for expanding than for contracting stimuli. We analyzed the time course of the eye movement immediately after each saccade. We found remarkable differences in the initial development of gain and directional following for expansion and contraction. For expansion, directional following and gain were initially poor and strongly influenced by the ongoing eye movement before the saccade. This was not the case for contraction. These differences also can be linked to properties of the optokinetic system. We conclude that optokinetic eye movements can be elicited by radial optic flow fields simulating self-motion. These eye movements are linked to the parafoveal flow field, i.e., the motion in the direction of gaze. In the retinal projection of the optic flow, such eye movements superimpose retinal slip. This results in complex retinal motion patterns, especially because the gain of the eye movement is small and variable. This observation has special relevance for mechanisms that determine self-motion from retinal flow fields. It is necessary to consider the influence of eye movements in optic flow analysis, but our results suggest that direction and speed of an eye movement should be treated differently.
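The gain and direction comparisons described here amount to comparing slow-phase eye velocity against the optic-flow vector in the direction of gaze. A minimal sketch under a standard pinhole model of translational flow; the geometry, gaze point, and velocities are made-up illustrations, not the recorded data:

```python
import numpy as np

def translational_flow(x, y, Z, T):
    # Optic-flow vector at normalized image position (x, y) for a scene point
    # at depth Z during pure observer translation T = (Tx, Ty, Tz).
    Tx, Ty, Tz = T
    return np.array([(-Tx + x * Tz) / Z, (-Ty + y * Tz) / Z])

def slow_phase_metrics(eye_vel, gaze_xy, Z_gaze, T):
    # Gain = eye speed / foveal flow speed; direction error = angle between
    # the slow-phase eye velocity and the flow vector in gaze direction.
    f = translational_flow(gaze_xy[0], gaze_xy[1], Z_gaze, T)
    gain = np.linalg.norm(eye_vel) / np.linalg.norm(f)
    cos = np.dot(eye_vel, f) / (np.linalg.norm(eye_vel) * np.linalg.norm(f))
    return gain, np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Simulated forward motion at 1 m/s, gaze on a ground-plane dot 4 m away.
print(slow_phase_metrics(np.array([0.03, -0.04]), (0.2, -0.3), 4.0, (0.0, 0.0, 1.0)))
```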


2019 ◽  
Vol 32 (3) ◽  
pp. 165-178 ◽  
Author(s):  
Mathieu Koppen ◽  
Arjan C. ter Horst ◽  
W. Pieter Medendorp

Abstract When walking or driving, it is of the utmost importance to continuously track the spatial relationship between objects in the environment and the moving body in order to prevent collisions. Although this process of spatial updating occurs naturally, it involves the processing of a myriad of noisy and ambiguous sensory signals. Here, using a psychometric approach, we investigated the integration of visual optic flow and vestibular cues in spatially updating a remembered target position during a linear displacement of the body. Participants were seated on a linear sled, immersed in a stereoscopic virtual reality environment. They had to remember the position of a target, briefly presented before a sideward translation of the body involving supra-threshold vestibular cues and whole-field optic flow that provided slightly discrepant motion information. After the motion, participants indicated in a forced-choice response whether the location of a brief visual probe was left or right of the remembered target position. Our results show that in a spatial updating task involving passive linear self-motion humans integrate optic flow and vestibular self-displacement information according to a weighted-averaging process with, on average across subjects, about four times as much weight assigned to the visual as to the vestibular contribution (i.e., 79% visual weight). We discuss our findings with respect to previous literature on the effect of optic flow on spatial updating performance.
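The weighted-averaging model the authors report fits in one line: the perceived self-displacement is w_vis times the visually signaled displacement plus (1 − w_vis) times the vestibular one, with w_vis ≈ 0.79 on average. A minimal sketch; the discrepancy values are illustrative:

```python
import numpy as np

def updated_target_estimate(target, disp_vis, disp_vest, w_vis=0.79):
    # Perceived self-displacement is the weighted average of the displacements
    # signaled by optic flow and by the vestibular system; the remembered
    # target shifts by the same amount in the opposite direction.
    perceived = w_vis * disp_vis + (1.0 - w_vis) * disp_vest
    return target - perceived

# Discrepant cues: optic flow signals 0.30 m of sideward displacement, the
# vestibular cue 0.24 m. The updated estimate sits far closer to the visual value.
print(updated_target_estimate(0.0, 0.30, 0.24))
```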

