Common causation and offset effects in human visual-inertial heading direction integration

2020 ◽  
Vol 123 (4) ◽  
pp. 1369-1379
Author(s):  
Raul Rodriguez ◽  
Benjamin T. Crane

Movement direction can be determined from a combination of visual and inertial cues. Visual motion (optic flow) can represent self-motion through a fixed environment or environmental motion relative to an observer. Simultaneous visual and inertial heading cues present the question of whether the cues have a common cause (i.e., should be integrated) or whether they should be considered independent. This was studied in eight healthy human subjects who experienced 12 visual and inertial headings in the horizontal plane, spaced at 30° increments. The headings were estimated in two unisensory and six multisensory trial blocks. Each unisensory block included 72 stimulus presentations, while each multisensory block included 144 stimulus presentations covering every possible combination of visual and inertial headings in random order. After each multisensory stimulus, subjects reported their perception of visual and inertial headings as congruous (i.e., having common causation) or not. In the multisensory trial blocks, subjects also reported visual or inertial heading direction (3 trial blocks for each). For aligned visual-inertial headings, the rate of reported common causation was higher in cardinal than in noncardinal directions. When visual and inertial stimuli were separated by 30°, the rate of reported common causation remained >50%, but it decreased to 15% or less for separations of ≥90°. The inertial heading was biased toward the visual heading by 11–20° for separations of 30–120°. Thus there was sensory integration even in conditions without reported common causation. The visual heading was minimally influenced by inertial direction. When trials with common causation perception were compared with those without, inertial heading perception had a stronger bias toward the visual stimulus direction. NEW & NOTEWORTHY Optic flow ambiguously represents self-motion or environmental motion. 
When these are in different directions, it is uncertain whether they are integrated into a common percept. This study addresses that issue by determining whether subjects perceive the two modalities as consistent and by measuring the perceived direction of each to quantify the degree of influence. The visual stimulus can significantly influence the inertial percept even when the two are perceived as inconsistent.
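The bias of the inertial percept toward the visual heading reported above is the signature of reliability-weighted cue combination. A minimal sketch in Python; the noise values are illustrative assumptions, not the study's fitted parameters:

```python
import math

def integrate_headings(visual_deg, inertial_deg, sigma_visual, sigma_inertial):
    """Maximum-likelihood fusion of two heading cues.

    Each cue is weighted by its inverse variance, so the fused estimate is
    pulled toward the more reliable cue. A linear average is used, which is
    adequate when the two headings are not far apart on the circle.
    """
    w_v = 1.0 / sigma_visual ** 2
    w_i = 1.0 / sigma_inertial ** 2
    fused = (w_v * visual_deg + w_i * inertial_deg) / (w_v + w_i)
    fused_sigma = math.sqrt(1.0 / (w_v + w_i))  # fused estimate is more precise
    return fused, fused_sigma

# Visual heading offset 30 deg from the inertial one; visual cue twice as precise
fused, sigma = integrate_headings(30.0, 0.0, sigma_visual=5.0, sigma_inertial=10.0)
# fused is pulled most of the way (about 24 deg) toward the visual cue
```

Full integration of this kind predicts a bias that grows with offset, whereas the abstract reports a bias that peaks and then shrinks for large separations, which is what causal-inference models (partial integration) produce.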

2010 ◽  
Vol 103 (4) ◽  
pp. 1865-1873 ◽  
Author(s):  
Tao Zhang ◽  
Kenneth H. Britten

The ventral intraparietal area (VIP) of the macaque monkey is thought to be involved in judging heading direction based on optic flow. We recorded neuronal discharges in VIP while monkeys were performing a two-alternative, forced-choice heading discrimination task to relate quantitatively the activity of VIP neurons to monkeys' perceptual choices. Most VIP neurons were responsive to simulated heading stimuli and were tuned such that their responses changed across a range of forward trajectories. Using receiver operating characteristic (ROC) analysis, we found that most VIP neurons were less sensitive to small heading changes than was the monkey, although a minority of neurons were equally sensitive. Pursuit eye movements modestly yet significantly increased both neuronal and behavioral thresholds by approximately the same amount. Our results support the view that VIP activity is involved in self-motion judgments.
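The ROC analysis used to compare neuronal and behavioral sensitivity reduces to the area under the ROC curve between spike-count distributions for two headings, i.e., the probability that an ideal observer would classify a single trial correctly. A minimal sketch with made-up spike counts (not data from the study):

```python
def roc_area(resp_a, resp_b):
    """Area under the ROC curve comparing two response distributions.

    Equivalent to the probability that a random draw from resp_b exceeds one
    from resp_a (ties count half) -- the standard neurometric measure of how
    well an ideal observer could discriminate the two stimuli from
    single-trial firing rates.
    """
    score = 0.0
    for a in resp_a:
        for b in resp_b:
            if b > a:
                score += 1.0
            elif b == a:
                score += 0.5
    return score / (len(resp_a) * len(resp_b))

# Hypothetical spike counts for two nearby headings
left = [4, 5, 6, 5, 4]
right = [6, 7, 8, 7, 6]
auc = roc_area(left, right)  # well above 0.5 -> the neuron discriminates
```

Sweeping heading separation and finding where the AUC crosses a criterion (e.g., 0.75) yields the neurometric threshold that is compared with the monkey's psychometric threshold.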


i-Perception ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 204166952110557
Author(s):  
Diederick C. Niehorster

The concept of optic flow, a global pattern of visual motion that is both caused by and signals self-motion, is canonically ascribed to James Gibson's 1950 book “The Perception of the Visual World.” There have, however, been several other developments of this concept, chiefly by Gwilym Grindley and Edward Calvert. Based on rarely referenced scientific literature and archival research, this article describes the development of the concept of optic flow by the aforementioned authors and several others. The article furthermore presents the available evidence for interactions between these authors, focusing on whether parts of Gibson's proposal were derived from the work of Grindley or Calvert. While Grindley's work may have made Gibson aware of the geometrical facts of optic flow, Gibson's work is not derivative of Grindley's. It is furthermore shown that Gibson only learned of Calvert's work in 1956, almost a decade after Gibson first published his proposal. In conclusion, the development of the concept of optic flow presents an intriguing example of convergent thought in the progress of science.


2021 ◽  
Vol 118 (32) ◽  
pp. e2106235118
Author(s):  
Reuben Rideaux ◽  
Katherine R. Storrs ◽  
Guido Maiello ◽  
Andrew E. Welchman

Sitting in a static railway carriage can produce illusory self-motion if the train on an adjoining track moves off. While our visual system registers motion, vestibular signals indicate that we are stationary. The brain is faced with a difficult challenge: is there a single cause of sensations (I am moving) or two causes (I am static, another train is moving)? If a single cause, integrating signals produces a more precise estimate of self-motion, but if not, one cue should be ignored. In many cases, this process of causal inference works without error, but how does the brain achieve it? Electrophysiological recordings show that the macaque medial superior temporal area contains many neurons that encode combinations of vestibular and visual motion cues. Some respond best to vestibular and visual motion in the same direction (“congruent” neurons), while others prefer opposing directions (“opposite” neurons). Congruent neurons could underlie cue integration, but the function of opposite neurons remains a puzzle. Here, we seek to explain this computational arrangement by training a neural network model to solve causal inference for motion estimation. Like biological systems, the model develops congruent and opposite units and recapitulates known behavioral and neurophysiological observations. We show that all units (both congruent and opposite) contribute to motion estimation. Importantly, however, it is the balance between their activity that distinguishes whether visual and vestibular cues should be integrated or separated. This explains the computational purpose of puzzling neural representations and shows how a relatively simple feedforward network can solve causal inference.
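The causal-inference computation the network is trained to solve can be written down directly as a Bayesian ideal observer (in the style of Körding et al. 2007): given noisy visual and vestibular heading measurements, compute the posterior probability that they share a common cause. A sketch with Gaussian likelihoods and a Gaussian prior over heading; the noise parameters below are illustrative assumptions:

```python
import math

def _gauss(x, mu, var):
    """Gaussian probability density with mean mu and variance var."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def p_common(x1, x2, s1, s2, sp, prior=0.5, mu_p=0.0):
    """Posterior probability that two noisy cues arose from one source.

    x1, x2: measured headings; s1, s2: sensory noise SDs; sp: SD of the
    Gaussian prior over heading; prior: prior probability of a common cause.
    """
    v1, v2, vp = s1 ** 2, s2 ** 2, sp ** 2
    # Likelihood of the cue pair if both arose from one shared source
    # (the source marginalized out analytically):
    denom = v1 * v2 + v1 * vp + v2 * vp
    like_c1 = math.exp(-0.5 * ((x1 - x2) ** 2 * vp
                               + (x1 - mu_p) ** 2 * v2
                               + (x2 - mu_p) ** 2 * v1) / denom) \
              / (2 * math.pi * math.sqrt(denom))
    # Likelihood if each cue had its own independent source:
    like_c2 = _gauss(x1, mu_p, v1 + vp) * _gauss(x2, mu_p, v2 + vp)
    return prior * like_c1 / (prior * like_c1 + (1 - prior) * like_c2)

# Matched cues favor a common cause; widely separated cues favor two causes
p_same = p_common(0.0, 0.0, 5.0, 5.0, 30.0)
p_diff = p_common(0.0, 90.0, 5.0, 5.0, 30.0)
```

Integrating when this posterior is high and segregating when it is low reproduces the "integrate near alignment, separate at large conflict" behavior the trained network exhibits.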


1998 ◽  
Vol 79 (3) ◽  
pp. 1461-1480 ◽  
Author(s):  
Markus Lappe ◽  
Martin Pekel ◽  
Klaus-Peter Hoffmann

Lappe, Markus, Martin Pekel, and Klaus-Peter Hoffmann. Optokinetic eye movements elicited by radial optic flow in the macaque monkey. J. Neurophysiol. 79: 1461–1480, 1998. We recorded spontaneous eye movements elicited by radial optic flow in three macaque monkeys using the scleral search coil technique. Computer-generated stimuli simulated forward or backward motion of the monkey with respect to a number of small illuminated dots arranged on a virtual ground plane. We wanted to see whether optokinetic eye movements are induced by radial optic flow stimuli that simulate self-movement, quantify their parameters, and consider their effects on the processing of optic flow. A regular pattern of interchanging fast and slow eye movements with a frequency of 2 Hz was observed. When we shifted the horizontal position of the focus of expansion (FOE) during simulated forward motion (expansional optic flow), median horizontal eye position also shifted in the same direction but only by a smaller amount; for simulated backward motion (contractional optic flow), median eye position shifted in the opposite direction. We relate this to a change in Schlagfeld typically observed in optokinetic nystagmus. Direction and speed of slow phase eye movements were compared with the local flow field motion in gaze direction (the foveal flow). Eye movement direction matched well the foveal motion. Small systematic deviations could be attributed to an integration of the global motion pattern. Eye speed on average did not match foveal stimulus speed, as the median gain was only ∼0.5–0.6. The gain was always lower for expanding than for contracting stimuli. We analyzed the time course of the eye movement immediately after each saccade. We found remarkable differences in the initial development of gain and directional following for expansion and contraction. For expansion, directional following and gain were initially poor and strongly influenced by the ongoing eye movement before the saccade. 
This was not the case for contraction. These differences also can be linked to properties of the optokinetic system. We conclude that optokinetic eye movements can be elicited by radial optic flow fields simulating self-motion. These eye movements are linked to the parafoveal flow field, i.e., the motion in the direction of gaze. In the retinal projection of the optic flow, such eye movements superimpose retinal slip. This results in complex retinal motion patterns, especially because the gain of the eye movement is small and variable. This observation has special relevance for mechanisms that determine self-motion from retinal flow fields. It is necessary to consider the influence of eye movements in optic flow analysis, but our results suggest that direction and speed of an eye movement should be treated differently.


2016 ◽  
Author(s):  
Kit D. Longden ◽  
Martina Wicklein ◽  
Benjamin J. Hardcastle ◽  
Stephen J. Huston ◽  
Holger G. Krapp

Summary: Many animals use the visual motion generated by travelling in a line, the translatory optic flow, to successfully navigate obstacles: near objects appear larger and to move more quickly than distant ones. Flies are experts at navigating cluttered environments, and while their visual processing of rotatory optic flow is understood in exquisite detail, how they process translatory optic flow remains a mystery. Here, we present novel cell types that have motion receptive fields matched to translational self-motion, the vertical translation (VT) cells. One of these, the VT1 cell, encodes forward-sideslip self-motion and fires action potentials in clusters of spikes (spike bursts). We show that the spike burst coding is size- and speed-tuned and is selectively modulated by parallax motion, the relative motion experienced during translation. These properties are spatially organized, so that the cell is most excited by clutter rather than by isolated objects. When the fly is presented with a simulation of flying past an elevated object, the spike burst activity is modulated by the height of the object, while the single spike rate is unaffected. When the moving object alone is experienced, the cell is only weakly driven. Meanwhile, the VT2–3 cells have motion receptive fields matched to the lift axis. In conjunction with previously described horizontal cells, the VT cells have the properties required for the fly to successfully navigate clutter and to encode its movements along near-cardinal axes of thrust, lift, and forward sideslip.


Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 61-61
Author(s):  
A Grigo ◽  
M Lappe

We investigated the influence of stereoscopic vision on the perception of optic flow fields in psychophysical experiments based on the effect of an illusory transformation found by Duffy and Wurtz (1993 Vision Research 33 1481–1490). Human subjects are not able to determine the centre of an expanding optic flow field correctly if the expansion is transparently superimposed on a unidirectional motion pattern. Its location is rather perceived shifted in the direction of the translational movement. Duffy and Wurtz proposed that this illusory shift is caused by the visual system taking the presented flow pattern as a flow field composed of linear self-motion and an eye rotation. As a consequence, the centre of the expansional movement is determined by compensating for the simulated eye rotation, like determining one's direction of heading (Lappe and Rauschecker, 1994 Vision Research 35 1619–1631). In our experiments we examined the dependence of the illusory transformation on differences in depth between the superimposed movements. We presented the expansional and translational stimuli with different relative binocular disparities. In the case of zero disparity, we could confirm the results of Duffy and Wurtz. For uncrossed disparities (ie translation behind expansion) we found a small and nonsignificant decrease of the illusory shift. In contrast, there was a strong decrease up to 80% in the case of crossed disparity (ie translation in front of expansion). These findings confirm the assumption that the motion pattern is interpreted as a self-motion flow field: only in the unrealistic case of a large rotational component present in front of an expansion are the superimposed movements interpreted separately by the visual system.


2021 ◽  
Author(s):  
Miriam Henning ◽  
Giordano Ramos-Traslosheros ◽  
Burak Gür ◽  
Marion Silies

Nervous systems allocate computational resources to match stimulus statistics. However, the physical information that needs to be processed depends on the animal's own behavior. For example, visual motion patterns induced by self-motion provide essential information for navigation. How behavioral constraints affect neural processing is not known. Here we show that, at the population level, local direction-selective T4/T5 neurons in Drosophila represent optic flow fields generated by self-motion, reminiscent of the population code in retinal ganglion cells in vertebrates. Whereas in vertebrates four different cell types encode different optic flow fields, the four uniformly tuned T4/T5 subtypes described previously represent a local snapshot. As a population, six T4/T5 subtypes encode different axes of self-motion. This representation might serve to efficiently encode more complex flow fields generated during flight. Thus, a population code for optic flow appears to be a general coding principle of visual systems, one matched to the animal's individual ethological constraints.


Author(s):  
Raul Rodriguez ◽  
Benjamin Thomas Crane

Heading direction is perceived based on visual and inertial cues. The current study examined the effect of their relative timing on the ability of offset visual headings to influence inertial perception. Seven healthy human subjects experienced 2 s of translation along a heading of 0°, ±35°, ±70°, ±105°, or ±140°. These inertial headings were paired with 2-s-duration visual headings presented at relative offsets of 0°, ±30°, ±60°, ±90°, or ±120°. The visual stimuli were also presented at 17 temporal delays ranging from −500 ms (visual lead) to 2,000 ms (visual delay) relative to the inertial stimulus. After each stimulus, subjects reported the direction of the inertial stimulus using a dial. The bias of the inertial heading toward the visual heading was robust for temporal offsets within ±250 ms; across subjects it was 8.0 ± 0.5° with a 30° offset, 12.2 ± 0.5° with a 60° offset, 11.7 ± 0.6° with a 90° offset, and 9.8 ± 0.7° with a 120° offset (mean bias toward visual ± SE). The mean bias was much diminished with temporal misalignments of ±500 ms, and there was no longer any visual influence on the inertial heading when the visual stimulus was delayed by 1,000 ms or more. Although the amount of bias varied between subjects, the effect of delay was similar.
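The delay dependence described above (strong bias within ±250 ms, little at ±500 ms, none by 1,000 ms) can be summarized as a temporal window of integration. A toy model in Python in which the visual pull falls off as a Gaussian of the visual-inertial delay; the gain and window width are illustrative values, not parameters fitted in the study:

```python
import math

def visual_bias(offset_deg, delay_ms, peak_gain=0.4, tau_ms=400.0):
    """Toy temporal-window model of visual capture of inertial heading.

    offset_deg: visual-inertial heading offset; delay_ms: visual delay
    relative to the inertial stimulus. peak_gain and tau_ms are assumed,
    illustrative values.
    """
    window = math.exp(-0.5 * (delay_ms / tau_ms) ** 2)  # Gaussian falloff
    return peak_gain * offset_deg * window

# Near-synchronous cues produce a strong pull; a 1,000 ms delay almost none
b_sync = visual_bias(30.0, 0.0)
b_late = visual_bias(30.0, 1000.0)
```

A real fit would also need an asymmetric window, since the abstract reports that visual delays abolish the bias while moderate visual leads do not.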


2012 ◽  
Vol 108 (3) ◽  
pp. 794-801 ◽  
Author(s):  
Velia Cardin ◽  
Lara Hemsworth ◽  
Andrew T. Smith

The extraction of optic flow cues is fundamental for successful locomotion. During forward motion, the focus of expansion (FoE), in conjunction with knowledge of eye position, indicates the direction in which the individual is heading. Therefore, it is expected that cortical brain regions that are involved in the estimation of heading will be sensitive to this feature. To characterize cortical sensitivity to the location of the FoE or, more generally, the center of flow (CoF) during visually simulated self-motion, we carried out a functional MRI (fMRI) adaptation experiment in several human visual cortical areas that are thought to be sensitive to optic flow parameters, namely, V3A, V6, MT/V5, and MST. In each trial, two optic flow patterns were sequentially presented, with the CoF located in either the same or different positions. With an adaptation design, an area sensitive to heading direction should respond more strongly to a pair of stimuli with different CoFs than to stimuli with the same CoF. Our results show such release from adaptation in areas MT/V5 and MST, and to a lesser extent V3A, suggesting the involvement of these areas in the processing of heading direction. The effect could not be explained either by differences in local motion or by attention capture. It was not observed to a significant extent in area V6 or in control area V1. The different patterns of responses observed in MST and V6, areas that are both involved in the processing of egomotion in macaques and humans, suggest distinct roles in the processing of visual cues for self-motion.
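The adaptation logic in this design reduces to a simple index: release from adaptation means a larger response to stimulus pairs with different centers of flow than to pairs with the same center. A sketch with hypothetical response amplitudes (not data from the study):

```python
def adaptation_index(resp_different, resp_same):
    """Release-from-adaptation index for an fMRI adaptation design.

    Positive values mean the region responds more to pairs of flow patterns
    with different centers of flow (CoF) than to pairs with the same CoF,
    i.e., its response carries information about CoF position.
    """
    return (resp_different - resp_same) / (resp_different + resp_same)

# Hypothetical BOLD amplitudes (arbitrary units), chosen for illustration
mst_like = adaptation_index(1.2, 0.8)  # clear release -> CoF-sensitive
v1_like = adaptation_index(1.0, 1.0)   # no release -> not CoF-sensitive
```

In the study's terms, MT/V5 and MST (and to a lesser extent V3A) show a positive index, while V1 and V6 do not.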


Perception ◽  
10.1068/p5846 ◽  
2007 ◽  
Vol 36 (10) ◽  
pp. 1477-1485 ◽  
Author(s):  
Nadia Bolognini ◽  
Fabrizio Leo ◽  
Claudia Passamonti ◽  
Barry E Stein ◽  
Elisabetta Làdavas

Multisensory integration is a powerful mechanism for maximizing sensitivity to sensory events. We examined its effects on auditory localization in healthy human subjects. The specific objective was to test whether the relative intensity and location of a seemingly irrelevant visual stimulus would influence auditory localization in accordance with the inverse effectiveness and spatial rules of multisensory integration that have been developed from neurophysiological studies with animals [Stein and Meredith, 1993 The Merging of the Senses (Cambridge, MA: MIT Press)]. Subjects were asked to localize a sound in conditions in which a neutral visual stimulus was either above threshold (supra-threshold) or at threshold. In both cases the spatial disparity of the visual and auditory stimuli was systematically varied. The results reveal that stimulus salience is a critical factor in determining the effect of a neutral visual cue on auditory localization. Visual bias and, hence, perceptual translocation of the auditory stimulus appeared when the visual stimulus was supra-threshold, regardless of its location. However, this was not the case when the visual stimulus was at threshold. In this case, the influence of the visual cue was apparent only when the two cues were spatially coincident, and it resulted in an enhancement of stimulus localization. These data suggest that the brain uses multiple strategies to integrate multisensory information.
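The inverse-effectiveness rule invoked above is conventionally quantified as percent multisensory enhancement: the gain of the combined response over the best unisensory response (the Stein & Meredith convention). A sketch with hypothetical localization scores, chosen only to illustrate the rule:

```python
def multisensory_enhancement(combined, best_unisensory):
    """Percent multisensory enhancement (Stein & Meredith convention):
    how much the combined-cue response exceeds the best unisensory response."""
    return 100.0 * (combined - best_unisensory) / best_unisensory

# Hypothetical localization accuracies (% correct), illustrative only.
# Inverse effectiveness: a weak (at-threshold) cue yields proportionally
# larger enhancement than a strong (supra-threshold) one.
weak_cue = multisensory_enhancement(combined=60.0, best_unisensory=40.0)
strong_cue = multisensory_enhancement(combined=95.0, best_unisensory=90.0)
```

The same index applied per spatial disparity captures the spatial rule: enhancement for the at-threshold stimulus appears only when the cues are coincident.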

