Author response: Neuronal variability and tuning are balanced to optimize naturalistic self-motion coding in primate vestibular pathways

eLife ◽  
2018 ◽  
Vol 7 ◽  
Author(s):  
Diana E Mitchell ◽  
Annie Kwan ◽  
Jerome Carriot ◽  
Maurice J Chacron ◽  
Kathleen E Cullen

It is commonly assumed that the brain’s neural coding strategies are adapted to the statistics of natural stimuli. Specifically, to maximize information transmission, a sensory neuron’s tuning function should effectively oppose the decaying stimulus spectral power, such that the neural response is temporally decorrelated (i.e. ‘whitened’). However, theory predicts that the structure of neuronal variability also plays an essential role in determining how coding is optimized. Here, we provide experimental evidence supporting this view by recording from neurons in early vestibular pathways during naturalistic self-motion. We found that central vestibular neurons displayed temporally whitened responses that could not be explained by their tuning alone. Rather, computational modeling and analysis revealed that neuronal variability and tuning were matched to effectively complement natural stimulus statistics, thereby achieving temporal decorrelation and optimizing information transmission. Taken together, our findings reveal a novel strategy by which neural variability contributes to optimized processing of naturalistic stimuli.
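The whitening idea described above can be illustrated numerically: if natural stimulus spectral power decays with frequency, a tuning gain that grows to oppose that decay yields a flat (temporally decorrelated) response spectrum. The sketch below is purely illustrative, not the authors' model; the frequency band, the 1/f power law, and the gain profile are all hypothetical choices made for the example.

```python
import numpy as np

# Hypothetical frequency band for naturalistic self-motion (Hz).
freqs = np.linspace(1.0, 20.0, 50)

# Assumed decaying stimulus spectral power (1/f, for illustration only).
stimulus_power = 1.0 / freqs

# A tuning gain that rises with frequency so that |gain|^2 opposes the decay.
tuning_gain = np.sqrt(freqs)

# Response power = |gain|^2 * stimulus power; here it is constant across
# frequencies, i.e. the response is "whitened" (temporally decorrelated).
response_power = tuning_gain**2 * stimulus_power
assert np.allclose(response_power, response_power[0])
```

Note that the paper's point is precisely that tuning alone did not account for the observed whitening; the structure of neuronal variability also had to complement the stimulus statistics, which this toy gain-only calculation does not capture.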


Author(s):  
Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world, and of our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or was instead externally generated. Understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.


2020 ◽  
Author(s):  
Isabelle Mackrous ◽  
Jérome Carriot ◽  
Kathleen E Cullen ◽  
Maurice J Chacron

2020 ◽  
Author(s):  
Orly Halperin ◽  
Simon Israeli-Korn ◽  
Sol Yakubovich ◽  
Sharon Hassin-Baer ◽  
Adam Zaidel

2021 ◽  
Vol 73 (1) ◽  
Author(s):  
Jean-Paul Noel ◽  
Dora E. Angelaki

Navigating by path integration requires continuously estimating one's self-motion. This estimate may be derived from visual velocity and/or vestibular acceleration signals. Importantly, these senses in isolation are ill-equipped to provide accurate estimates, and thus visuo-vestibular integration is an imperative. After a summary of the visual and vestibular pathways involved, the crux of this review focuses on the human and theoretical approaches that have outlined a normative account of cue combination in behavior and neurons, as well as on the systems neuroscience efforts that are searching for its neural implementation. We then highlight a contemporary frontier in our state of knowledge: understanding how velocity cues with time-varying reliabilities are integrated into an evolving position estimate over prolonged time periods. Further, we discuss how the brain builds internal models inferring when cues ought to be integrated versus segregated—a process of causal inference. Lastly, we suggest that the study of spatial navigation has not yet addressed its initial condition: self-location. Expected final online publication date for the Annual Review of Psychology, Volume 73 is January 2022. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
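The "normative account of cue combination" referenced above is commonly formalized as reliability-weighted (inverse-variance) averaging: each cue contributes in proportion to its reliability, and the combined estimate is more reliable than either cue alone. A minimal sketch, assuming Gaussian cues with known variances (the function name and the example numbers are hypothetical):

```python
def combine_cues(mu_vis, var_vis, mu_vest, var_vest):
    """Maximum-likelihood combination of two independent Gaussian cues.

    Each cue is weighted by its reliability (inverse variance); the
    combined variance is smaller than either single-cue variance.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_vest)
    mu = w_vis * mu_vis + (1.0 - w_vis) * mu_vest
    var = 1.0 / (1.0 / var_vis + 1.0 / var_vest)
    return mu, var

# Equal reliabilities -> simple average of the two means,
# and the combined variance is half the single-cue variance.
mu, var = combine_cues(mu_vis=10.0, var_vis=4.0, mu_vest=14.0, var_vest=4.0)
# mu == 12.0, var == 2.0
```

The review's "contemporary frontier" is exactly where this static formula falls short: when cue reliabilities vary over time and estimates must be integrated into an evolving position over prolonged periods, or when causal inference dictates segregating rather than combining the cues.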


2018 ◽  
Vol 120 (4) ◽  
pp. 2091-2106 ◽  
Author(s):  
Malcolm G. Campbell ◽  
Lisa M. Giocomo

The sensory signals generated by self-motion are complex and multimodal, but the ability to integrate these signals into a unified self-motion percept to guide navigation is essential for animal survival. Here, we summarize classic and recent work on self-motion coding in the visual and entorhinal cortices of the rodent brain. We compare motion processing in rodent and primate visual cortices, highlighting the strengths of classic primate work in establishing causal links between neural activity and perception, and discuss the integration of motor and visual signals in rodent visual cortex. We then turn to the medial entorhinal cortex (MEC), where calculations using self-motion to update position estimates are thought to occur. We focus on several key sources of self-motion information to MEC: the medial septum, which provides locomotor speed information; visual cortex, whose input has been increasingly recognized as essential to both position and speed-tuned MEC cells; and the head direction system, which is a major source of directional information for self-motion estimates. These inputs create a large and diverse group of self-motion codes in MEC, and great interest remains in how these self-motion codes might be integrated by MEC grid cells to estimate position. However, which signals are used in these calculations and the mechanisms by which they are integrated remain controversial. We end by proposing future experiments that could further our understanding of the interactions between MEC cells that code for self-motion and position and clarify the relationship between the activity of these cells and spatial perception.
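The core computation attributed to MEC above, using self-motion to update a position estimate, is path integration (dead reckoning). The following is a bare-bones sketch under simplifying assumptions (noiseless speed and head-direction signals, discrete time steps); it is not a model of MEC circuitry, and the function name and step values are hypothetical:

```python
import math

def path_integrate(steps, dt=0.1):
    """Integrate (speed, heading_in_radians) pairs into a 2-D position.

    Mirrors the abstract's inputs: a locomotor speed signal and a head
    direction signal, combined to update a position estimate each step.
    """
    x = y = 0.0
    for speed, heading in steps:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y

# Walk east, then north, at 1 unit/s for 10 steps of 0.1 s each.
x, y = path_integrate([(1.0, 0.0)] * 10 + [(1.0, math.pi / 2)] * 10)
# x ≈ 1.0, y ≈ 1.0
```

Because each update compounds on the last, even small errors in the speed or direction signals accumulate over time, which is one reason the visual inputs to MEC discussed above are thought to be essential for correcting drift in position estimates.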

