Neural activity underlying the detection of an object movement by an observer during forward self-motion: Dynamic decoding and temporal evolution of directional cortical connectivity

2020, Vol. 195, p. 101824. Authors: N. Kozhemiako, A.S. Nunes, A. Samal, K.D. Rana, F.J. Calabro, et al.
2015, Vol. 36 (11), pp. 4714-4729. Authors: Kiyohide Usami, Riki Matsumoto, Katsuya Kobayashi, Takefumi Hitomi, Akihiro Shimotake, et al.

2009, Vol. 65, p. S104. Authors: Yu Shimizu, Hiroshi Imamizu, Masaaki Sato, Mitsuo Kawato

2016, Vol. 115 (1), pp. 286-300. Authors: Oliver W. Layton, Brett R. Fajen

Many forms of locomotion rely on the ability to accurately perceive one's direction of locomotion (i.e., heading) based on optic flow. Although accurate in rigid environments, heading judgments may be biased when independently moving objects are present. The aim of this study was to systematically investigate the conditions in which moving objects influence heading perception, with a focus on the temporal dynamics and the mechanisms underlying this bias. Subjects viewed stimuli simulating linear self-motion in the presence of a moving object and judged their direction of heading. Experiments 1 and 2 revealed that heading perception is biased when the object crosses or almost crosses the observer's future path toward the end of the trial, but not when the object crosses earlier in the trial. Nonetheless, heading perception is not based entirely on the instantaneous optic flow toward the end of the trial. This was demonstrated in Experiment 3 by varying the portion of the earlier part of the trial leading up to the last frame that was presented to subjects. When the stimulus duration was long enough to include the part of the trial before the moving object crossed the observer's path, heading judgments were less biased. The findings suggest that heading perception is affected by the temporal evolution of optic flow. The time course of dorsal medial superior temporal area (MSTd) neuron responses may play a crucial role in perceiving heading in the presence of moving objects, a property not captured by many existing models.
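The heading-from-optic-flow idea in this abstract can be illustrated with a toy sketch (not the authors' model): under pure forward translation toward a frontoparallel surface, image flow is radial from the focus of expansion (FOE), v = k(p − foe), so the FOE — and hence heading — can be recovered by a per-axis linear fit. All quantities here (`true_foe`, `k`, the noise level) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: forward translation toward a frontoparallel wall
# produces purely radial flow v = k * (p - foe) at each image point p.
true_foe = np.array([2.0, -1.0])   # heading (FOE) in image coordinates
k = 0.5                            # expansion rate

pts = rng.uniform(-10, 10, size=(200, 2))      # sampled image locations
flow = k * (pts - true_foe)                    # ideal radial flow field
flow += rng.normal(0, 0.05, size=flow.shape)   # measurement noise

# Per axis, v = k*p - k*foe is a line; the FOE is -intercept / slope.
ax, bx = np.polyfit(pts[:, 0], flow[:, 0], 1)
ay, by = np.polyfit(pts[:, 1], flow[:, 1], 1)
foe_est = np.array([-bx / ax, -by / ay])
```

With a rigid scene the fit recovers the FOE almost exactly; an independently moving object that crosses the future path perturbs the flow samples and biases the estimate, which is the situation the experiments probe.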


eLife, 2014, Vol. 3. Authors: Yong Gu, Dora E. Angelaki, Gregory C. DeAngelis

Trial-by-trial covariations between neural activity and perceptual decisions (quantified by choice probability, CP) have been used to probe the contribution of sensory neurons to perceptual decisions. CPs are thought to be determined both by selective decoding of neural activity and by the structure of correlated noise among neurons, but the respective roles of these factors in creating CPs have been controversial. We used biologically constrained simulations to explore this issue, taking advantage of a peculiar pattern of CPs exhibited by multisensory neurons in area MSTd that represent self-motion. Although models that relied on correlated noise or selective decoding could both account for the peculiar pattern of CPs, predictions of the selective decoding model were substantially more consistent with various features of the neural and behavioral data. While correlated noise is essential for observing CPs, our findings suggest that selective decoding of neuronal signals also plays an important role.
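Choice probability, as used in this literature, is the area under the ROC curve comparing a neuron's trial-by-trial responses grouped by the subject's choice; CP = 0.5 indicates no choice-related signal. A minimal sketch (the spike counts and rates below are invented, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def choice_probability(resp_pref, resp_null):
    """ROC area comparing responses on trials where the subject chose the
    neuron's preferred vs. non-preferred alternative. Computed as the
    normalized Mann-Whitney U statistic, with ties counted as 0.5."""
    resp_pref = np.asarray(resp_pref, float)
    resp_null = np.asarray(resp_null, float)
    greater = (resp_pref[:, None] > resp_null[None, :]).sum()
    ties = (resp_pref[:, None] == resp_null[None, :]).sum()
    return (greater + 0.5 * ties) / (len(resp_pref) * len(resp_null))

# Hypothetical spike counts: slightly elevated on preferred-choice trials.
pref = rng.poisson(22, size=300)
null = rng.poisson(20, size=300)
cp = choice_probability(pref, null)   # modestly above 0.5
```

Whether such above-chance CPs arise from correlated noise or from selective decoding of the neuron's output is exactly the question the simulations in this study address.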


2006, Vol. 17 (6), pp. 1350-1363. Authors: Alexandra Battaglia-Mayer, Massimo Mascaro, Roberto Caminiti

2011, Vol. 278 (1719), pp. 2840-2847. Authors: F. J. Calabro, S. Soto-Faraco, L. M. Vaina

In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations.
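The flow-parsing mechanism described here — estimate the self-motion component of retinal flow and subtract it, leaving object motion as a residual — can be sketched as follows. The scene layout mirrors the experiment (nine static distractors plus one moving target), but the numbers, noise level, and the assumption of a known self-motion flow are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical scene: ten objects viewed during forward translation.
# Self-motion contributes radial flow k*(p - foe) at each object; the
# target (index 3) additionally moves on its own.
foe = np.array([0.0, 0.0])
k = 0.4
pts = rng.uniform(-8, 8, size=(10, 2))          # image positions
retinal = k * (pts - foe)                       # self-motion component
obj_motion = np.zeros_like(retinal)
obj_motion[3] = [0.0, 1.5]                      # independent target motion
retinal = retinal + obj_motion
retinal += rng.normal(0, 0.02, size=retinal.shape)

# Flow parsing: subtract the estimated self-motion flow and flag the
# object with the largest residual as the independently moving one.
residual = retinal - k * (pts - foe)
target = int(np.argmax(np.linalg.norm(residual, axis=1)))
```

In the study, a congruent moving sound effectively sharpens this residual signal for the target object, which is why detection improved most for observers who were weakest on the visual-only task.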


2021, p. 147489. Authors: Lucia M. Vaina, Finnegan J. Calabro, Abhisek Samal, Kunjan D. Rana, Fahimeh Mamashli, et al.

Authors: Kobina G. Mensah-Brown, James Lim, Dennis Jgamadze, Guo-li Ming, Hongjun Song, et al.
