Linking sensory neurons to visually guided behavior: Relating MST activity to steering in a virtual environment

2013 ◽  
Vol 30 (5-6) ◽  
pp. 315-330 ◽  
Author(s):  
SETH W. EGGER ◽  
KENNETH H. BRITTEN

Abstract: Many complex behaviors rely on guidance from sensations. To perform these behaviors, the motor system must decode information relevant to the task from the sensory system. However, identifying the neurons responsible for encoding the appropriate sensory information remains a difficult problem for neurophysiologists. A key step toward identifying candidate systems is finding neurons or groups of neurons capable of representing the stimuli adequately to support behavior. A traditional approach involves quantitatively measuring the performance of single neurons and comparing this to the performance of the animal. One of the strongest pieces of evidence that a neuronal population is involved in a behavioral task is that its signals are sufficient to support the behavior. Numerous experiments using perceptual decision tasks show that visual cortical neurons in many areas have this property. However, most visually guided behaviors are not categorical but continuous and dynamic. In this article, we review the concept of sufficiency and the tools used to measure neural and behavioral performance. We show how concepts from information theory can be used to measure the ongoing performance of both neurons and animal behavior. Finally, we apply these tools to dorsal medial superior temporal (MSTd) neurons and demonstrate that these neurons can represent stimuli important to navigation to a distant goal. We find that MSTd neurons represent ongoing steering error in a virtual-reality steering task. Although most individual neurons were insufficient to support the behavior, some very nearly matched the animal’s estimation performance. These results are consistent with many results from perceptual experiments and in line with the predictions of Mountcastle’s “lower envelope principle.”
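As a concrete illustration of the information-theoretic approach described above, the sketch below estimates mutual information between a neuron's binned spike counts and the ongoing steering error, and likewise between the animal's steering output and the same error, so the two performances can be compared. This is a minimal sketch under assumed data and binning, not the authors' analysis code; the array names and synthetic data are illustrative.

```python
# Minimal sketch (not the authors' code): plug-in mutual-information estimates
# comparing a neuron's representation of steering error to the animal's behavior.
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Hypothetical per-time-bin data for one MSTd neuron and the animal's steering:
rng = np.random.default_rng(0)
steering_error = rng.normal(size=5000)                      # deg, signed heading error
spike_counts = rng.poisson(np.exp(0.5 * steering_error))    # spikes per bin
steering_rate = steering_error + rng.normal(scale=1.0, size=5000)  # deg/s

print("neuron vs. error:  ", mutual_information(spike_counts, steering_error))
print("behavior vs. error:", mutual_information(steering_rate, steering_error))
```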

2019 ◽  
Author(s):  
Subhodh Kotekal ◽  
Jason N. MacLean

Abstract: To develop a complete description of sensory encoding, it is necessary to account for trial-to-trial variability in cortical neurons. Using a generalized linear model with terms corresponding to the visual stimulus, mouse running speed, and experimentally measured neuronal correlations, we modeled the short-term dynamics of L2/3 murine visual cortical neurons to evaluate the relative importance of each factor to neuronal variability within single trials. We find that single-trial predictions improve most when conditioning on the experimentally measured local correlations, in comparison to predictions based on the stimulus or running speed. Specifically, accurate predictions are driven by positively co-varying and synchronously active functional groups of neurons. Including functional groups in the model enhances decoding accuracy of sensory information compared to a model that assumes neuronal independence. Functional groups, in encoding and decoding frameworks, provide an operational definition of Hebbian assemblies in which local correlations largely explain neuronal responses on individual trials.
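A minimal sketch of the modeling approach described above follows: a Poisson generalized linear model predicting one neuron's single-trial spike counts from stimulus terms, running speed, and coupling terms from simultaneously recorded neurons, with goodness of fit compared across predictor sets. The synthetic data, variable names, and regularization are assumptions for illustration, not the authors' implementation.

```python
# Minimal GLM sketch (assumed, not the authors' code): compare predictor sets
# for a single neuron's trial-to-trial spike counts.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
n_bins, n_neighbors = 2000, 20
stimulus = rng.normal(size=(n_bins, 5))                     # basis-projected stimulus features
run_speed = rng.gamma(2.0, 2.0, size=(n_bins, 1))           # cm/s
neighbors = rng.poisson(1.0, size=(n_bins, n_neighbors))    # other neurons' spike counts
rate = np.exp(0.3 * stimulus[:, 0] + 0.05 * run_speed[:, 0]
              + 0.1 * neighbors[:, :3].sum(axis=1) - 1.0)
y = rng.poisson(rate)                                       # target neuron's spike counts

def fit_and_score(X, y):
    # In-sample fraction of Poisson deviance explained (cross-validate in practice).
    return PoissonRegressor(alpha=1e-3, max_iter=500).fit(X, y).score(X, y)

print("stimulus only:      ", fit_and_score(stimulus, y))
print("stimulus + speed:   ", fit_and_score(np.hstack([stimulus, run_speed]), y))
print("stimulus + coupling:", fit_and_score(np.hstack([stimulus, neighbors]), y))
```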


2021 ◽  
Author(s):  
Yiyi Yu ◽  
Jeffrey N. Stirman ◽  
Christopher R. Dorsett ◽  
Spencer L. Smith

Mice have a constellation of higher visual areas, but their functional specializations are unclear. Here, we used a data-driven approach to examine neuronal representations of complex visual stimuli across mouse higher visual areas, measured using large field-of-view two-photon calcium imaging. Using specialized stimuli, we found higher-fidelity representations of texture in area LM, compared to area AL. Complementarily, we found higher-fidelity representations of motion in area AL, compared to area LM. We also observed this segregation of information in response to naturalistic videos. Finally, we explored how popular models of visual cortical neurons could produce the segregated representations of texture and motion we observed. These selective representations could aid in behaviors such as visually guided navigation.
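One simple way to quantify the "fidelity" of a representation is decoding accuracy; the hedged sketch below trains a linear decoder on population responses from each area and compares how well stimulus identity can be read out. The arrays and class structure are synthetic stand-ins, not the authors' data or pipeline.

```python
# Minimal decoding sketch (an assumption, not the authors' pipeline): compare
# stimulus-identity readout from two areas' population responses.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_cells = 400, 150
labels = rng.integers(0, 4, size=n_trials)       # 4 hypothetical texture classes

def synthetic_area(signal_strength):
    """Population responses with class-dependent means plus noise."""
    class_means = rng.normal(size=(4, n_cells))
    return signal_strength * class_means[labels] + rng.normal(size=(n_trials, n_cells))

responses_LM = synthetic_area(0.6)               # toy: stronger texture signal
responses_AL = synthetic_area(0.3)               # toy: weaker texture signal

decoder = LogisticRegression(max_iter=1000)
for name, X in [("LM", responses_LM), ("AL", responses_AL)]:
    acc = cross_val_score(decoder, X, labels, cv=5).mean()
    print(f"{name}: texture decoding accuracy = {acc:.2f}")
```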


2021 ◽  
Author(s):  
João D. Semedo ◽  
Anna I. Jasper ◽  
Amin Zandvakili ◽  
Amir Aschner ◽  
Christian K. Machens ◽  
...  

Abstract: Brain function relies on the coordination of activity across multiple, recurrently connected, brain areas. For instance, sensory information encoded in early sensory areas is relayed to, and further processed by, higher cortical areas and then fed back. However, the way in which feedforward and feedback signaling interact with one another is incompletely understood. Here we investigate this question by leveraging simultaneous neuronal population recordings in early and midlevel visual areas (V1-V2 and V1-V4). Using a dimensionality reduction approach, we find that population interactions are feedforward-dominated shortly after stimulus onset and feedback-dominated during spontaneous activity. The population activity patterns most correlated across areas were distinct during feedforward- and feedback-dominated periods. These results suggest that feedforward and feedback signaling rely on separate “channels”, such that feedback signaling does not directly affect activity that is fed forward.
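The dimensionality reduction approach can be illustrated with a hedged sketch: find the low-dimensional activity patterns in one area that are most correlated with patterns in the other area, here using canonical correlation analysis as one common choice (the authors' exact method may differ). The V1 and V2 arrays below are synthetic placeholders.

```python
# Minimal inter-area interaction sketch (assumed, not the authors' exact method):
# canonical correlation analysis between two simultaneously recorded populations.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(3)
n_bins, n_v1, n_v2, n_shared = 3000, 60, 40, 3
shared = rng.normal(size=(n_bins, n_shared))                 # latent shared signal
v1 = shared @ rng.normal(size=(n_shared, n_v1)) + rng.normal(size=(n_bins, n_v1))
v2 = shared @ rng.normal(size=(n_shared, n_v2)) + rng.normal(size=(n_bins, n_v2))

cca = CCA(n_components=5).fit(v1, v2)
u, v = cca.transform(v1, v2)                                 # paired canonical variates
corrs = [np.corrcoef(u[:, k], v[:, k])[0, 1] for k in range(5)]
print("canonical correlations:", np.round(corrs, 2))
# Comparing such correlations (or predictive performance) in windows after stimulus
# onset vs. during spontaneous activity is one way to ask whether interactions are
# feedforward- or feedback-dominated.
```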


2020 ◽  
Vol 132 (6) ◽  
pp. 2000-2007 ◽  
Author(s):  
Soroush Niketeghad ◽  
Abirami Muralidharan ◽  
Uday Patel ◽  
Jessy D. Dorn ◽  
Laura Bonelli ◽  
...  

Stimulation of primary visual cortices has the potential to restore some degree of vision to blind individuals. Developing safe and reliable visual cortical prostheses requires assessment of the long-term stability, feasibility, and safety of generating stimulation-evoked perceptions. A NeuroPace responsive neurostimulation system was implanted in a blind individual with an 8-year history of bare light perception, and stimulation-evoked phosphenes were evaluated over 19 months (41 test sessions). Electrical stimulation was delivered via two four-contact subdural electrode strips implanted over the right medial occipital cortex. Current and charge thresholds for eliciting visual perception (phosphenes) were measured, as were the shape, size, location, and intensity of the phosphenes. Adverse events were also assessed. Stimulation of all contacts resulted in phosphene perception. Phosphenes appeared completely or partially in the left hemifield. Stimulation of the electrodes below the calcarine sulcus elicited phosphenes in the superior hemifield and vice versa. Changing the stimulation parameters of frequency, pulse width, and burst duration affected current thresholds for eliciting phosphenes, and increasing the amplitude or frequency of stimulation resulted in brighter perceptions. While stimulation thresholds decreased by an average of 5% to 12% over the 19 months, spatial mapping of phosphenes remained consistent over time. Although no serious adverse events were observed, the subject experienced mild headaches and dizziness in three instances, symptoms that did not persist for more than a few hours and for which no clinical intervention was required. Using an off-the-shelf neurostimulator, the authors were able to reliably generate phosphenes in different areas of the visual field over 19 months with no serious adverse events, providing preliminary proof of feasibility and safety to proceed with visual epicortical prosthetic clinical trials. Moreover, they systematically explored the relationship between stimulation parameters and phosphene thresholds and showed that perception thresholds relate directly to the excitation thresholds of the underlying primary visual cortex (V1) neuronal population.
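For readers unfamiliar with how current thresholds, pulse width, and electrode geometry combine, the sketch below computes the standard derived quantities, charge per phase and charge density, for purely illustrative parameter values; these numbers are not the study's measurements.

```python
# Minimal sketch (illustrative values only, not the study's data): charge per phase
# and charge density for a biphasic pulse train, the quantities typically tracked
# alongside current thresholds.
current_ma = 2.0            # hypothetical threshold current amplitude, mA
pulse_width_us = 200.0      # hypothetical pulse width per phase, microseconds
electrode_area_mm2 = 12.5   # hypothetical contact area of a subdural strip electrode

charge_per_phase_uc = current_ma * (pulse_width_us / 1000.0)          # mA * ms = uC
charge_density = charge_per_phase_uc / (electrode_area_mm2 / 100.0)   # uC per cm^2

print(f"charge per phase: {charge_per_phase_uc:.2f} uC")
print(f"charge density:   {charge_density:.1f} uC/cm^2")
# Lower current thresholds at longer pulse widths or higher frequencies (as reported
# above) translate into different charge-per-phase requirements, which is why both
# current and charge thresholds are measured.
```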


2020 ◽  
Author(s):  
Lukas Klimmasch ◽  
Johann Schneider ◽  
Alexander Lelais ◽  
Bertram E. Shi ◽  
Jochen Triesch

Abstract: The development of binocular vision is an active learning process comprising the development of disparity-tuned neurons in visual cortex and the establishment of precise vergence control of the eyes. We present a computational model for the learning and self-calibration of active binocular vision based on the Active Efficient Coding framework, an extension of classic efficient coding ideas to active perception. Under normal rearing conditions, the model develops disparity-tuned neurons and precise vergence control, allowing it to correctly interpret random dot stereograms. Under altered rearing conditions modeled after neurophysiological experiments, the model qualitatively reproduces key experimental findings on changes in binocularity and disparity tuning. Furthermore, the model makes testable predictions regarding how altered rearing conditions impede the learning of precise vergence control. Finally, the model predicts a surprising new effect: impaired vergence control affects the statistics of orientation tuning in visual cortical neurons.
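A hedged sketch in the spirit of the Active Efficient Coding framework is given below: binocular patches are sparsely encoded, and the vergence command that minimizes reconstruction error (i.e., maximizes coding efficiency) is selected. Instead of learning the dictionary, the sketch hand-builds zero-disparity binocular atoms as a stand-in for the bases such a model develops under normal rearing; it is not the authors' implementation.

```python
# Minimal Active-Efficient-Coding-style sketch (an assumption, not the authors' code):
# pick the vergence command whose resulting binocular patch is encoded most efficiently.
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(4)
patch_len, true_disparity = 16, 3
scene = rng.normal(size=200)                     # 1-D "scene" for simplicity

def binocular_patch(vergence):
    """Left and right patches concatenated, for a candidate vergence (pixel shift)."""
    left = scene[50:50 + patch_len]
    shift = true_disparity - vergence            # residual misalignment after vergence
    right = scene[50 + shift:50 + shift + patch_len]
    return np.concatenate([left, right])[None, :]

mono = rng.normal(size=(40, patch_len))
dictionary = np.hstack([mono, mono])             # zero-disparity binocular atoms
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
coder = SparseCoder(dictionary=dictionary, transform_algorithm="omp",
                    transform_n_nonzero_coefs=8)

def coding_error(vergence):
    x = binocular_patch(vergence)
    reconstruction = coder.transform(x) @ dictionary
    return float(np.sum((x - reconstruction) ** 2))

errors = {v: coding_error(v) for v in range(-5, 6)}
print("selected vergence:", min(errors, key=errors.get))   # should match true_disparity
```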


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Behrad Noudoost ◽  
Kelsey Lynne Clark ◽  
Tirin Moore

Visually guided behavior relies on the integration of sensory input and information held in working memory (WM). Yet it remains unclear how this is accomplished at the level of neural circuits. We studied the direct visual cortical inputs to neurons within a visuomotor area of prefrontal cortex in behaving monkeys. We show that the efficacy of visual input to prefrontal cortex is gated by information held in WM. Surprisingly, visual input to prefrontal neurons was found to target those with both visual and motor properties, rather than preferentially targeting other visual neurons. Furthermore, activity evoked from visual cortex was larger in magnitude, more synchronous, and more rapid, when monkeys remembered locations that matched the location of visual input. These results indicate that WM directly influences the circuitry that transforms visual input into visually guided behavior.
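The comparison of evoked activity across working-memory conditions can be illustrated with a hedged sketch: for each condition, estimate the response magnitude and latency from the trial-averaged rate after input onset. The spike-count arrays, windows, and condition labels below are synthetic assumptions, not the authors' data or analysis.

```python
# Minimal sketch (assumed, not the authors' analysis): compare evoked-response
# magnitude and latency between two working-memory conditions.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 200, 5)                                    # ms, 5-ms bins after input onset

def evoked(peak, latency_ms, n_trials=100):
    rate = 5 + peak * np.exp(-0.5 * ((t - latency_ms) / 15) ** 2)    # Hz
    return rng.poisson(rate * 0.005, size=(n_trials, t.size))        # counts per 5-ms bin

conditions = {"WM at input location": evoked(peak=60, latency_ms=40),
              "WM elsewhere":         evoked(peak=30, latency_ms=55)}

for name, counts in conditions.items():
    psth = counts.mean(axis=0) / 0.005                      # back to spikes/s
    baseline = psth[t < 20].mean()
    magnitude = psth.max() - baseline
    latency = t[np.argmax(psth > baseline + 0.5 * magnitude)]   # half-max crossing
    print(f"{name}: magnitude = {magnitude:.1f} sp/s, latency = {latency} ms")
```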


2017 ◽  
Vol 51 (5) ◽  
pp. 103-115 ◽  
Author(s):  
Kevin Nelson ◽  
Kamran Mohseni

Abstract: This paper presents a sensory system that is biologically inspired by the lateral line sensory system found in fish. This artificial lateral line system provides sensory information to be used in vehicle control algorithms, both to reduce model complexity and to measure hydrodynamic disturbances. The system presented in this paper is a modular implementation that can fit around a vehicle without requiring modifications to the hull. The design and manufacturing processes are presented in detail along with considerations for sensor placement and port spacing. An algorithm for calculating the hydrodynamic forces acting on the surface of a vehicle is derived and experimentally validated. An underwater motion capture system and strain sensors are used to calculate a reference hydrodynamic force that compares favorably with the hydrodynamic force calculated by the lateral line system.
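A minimal sketch of the force-estimation idea, under assumptions not taken from the paper, is to multiply each pressure port's gauge reading by the surface area it represents and the local inward surface normal, then sum the contributions to obtain the net hydrodynamic force.

```python
# Minimal sketch (assumed geometry and pressures, not the paper's derivation):
# net hydrodynamic force from discrete pressure-port readings on a hull.
import numpy as np

# Hypothetical ports around a circular hull cross-section (2-D for simplicity)
n_ports = 8
angles = np.linspace(0, 2 * np.pi, n_ports, endpoint=False)
normals = -np.stack([np.cos(angles), np.sin(angles)], axis=1)   # inward unit normals
port_area = 0.002                                               # m^2 represented per port

# Hypothetical gauge pressures (Pa), e.g. from a lateral flow striking one side
pressures = np.array([120.0, 60.0, -10.0, -40.0, -55.0, -35.0, 5.0, 70.0])

# Each port's pressure pushes inward on the hull over its patch of area.
force = (pressures[:, None] * normals * port_area).sum(axis=0)  # N
print(f"estimated hydrodynamic force: Fx = {force[0]:.3f} N, Fy = {force[1]:.3f} N")
```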


2008 ◽  
Vol 100 (3) ◽  
pp. 1476-1487 ◽  
Author(s):  
Bin Zhang ◽  
Earl L. Smith ◽  
Yuzo M. Chino

Vision of newborn infants is limited by immaturities in their visual brain. In adult primates, the transient onset discharges of visual cortical neurons are thought to be intimately involved with capturing the rapid succession of brief images in visual scenes. Here we sought to determine the responsiveness and quality of transient responses in individual neurons of the primary visual cortex (V1) and visual area 2 (V2) of infant monkeys. We show that the transient component of neuronal firing to 640-ms stationary gratings was as robust and as reliable as in adults only 2 wk after birth, whereas the sustained component was more sluggish in infants than in adults. Thus the cortical circuitry supporting onset transient responses is functionally mature near birth, and our findings predict that neonates, known for their “impoverished vision,” are capable of initiating relatively mature fixating eye movements and of detecting simple objects far better than traditionally thought.
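A hedged sketch of separating the transient from the sustained response component follows: average the firing rate in an early window after grating onset and in a later plateau window, each relative to the pre-stimulus baseline. The PSTH, windows, and index below are illustrative assumptions, not the authors' analysis.

```python
# Minimal sketch (assumed, not the authors' analysis): transient vs. sustained
# components of a response to a 640-ms stationary grating, from a synthetic PSTH.
import numpy as np

rng = np.random.default_rng(6)
dt = 0.01                                                    # 10-ms bins
t = np.arange(-0.2, 0.84, dt)                                # s, grating on at t = 0 for 640 ms
rate = (10 + 80 * np.exp(-np.clip(t, 0, None) / 0.05) * (t >= 0)
        + 15 * ((t >= 0) & (t < 0.64)))                      # Hz
counts = rng.poisson(rate * dt, size=(60, t.size))           # 60 trials
psth = counts.mean(axis=0) / dt

baseline = psth[t < 0].mean()
transient = psth[(t >= 0.02) & (t < 0.12)].mean() - baseline   # early onset window
sustained = psth[(t >= 0.24) & (t < 0.64)].mean() - baseline   # later plateau window
print(f"transient: {transient:.1f} sp/s, sustained: {sustained:.1f} sp/s, "
      f"transient index: {transient / (transient + sustained):.2f}")
```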

