Behavioral state tunes mouse vision to ethological features through pupil dilation

2021 ◽ Author(s): Katrin Franke, Konstantin F. Willeke, Kayla Ponder, Mario Galdamez, Taliah Muhammad, ...

Across animal species, sensory processing dynamically adapts to behavioral context. In the mammalian visual system, sensory neural responses and behavioral performance increase during an active behavioral state characterized by locomotor activity and pupil dilation, whereas the preferred stimuli of individual neurons typically remain unchanged. Here, we address how behavioral states modulate stimulus selectivity in the context of colored natural scenes, using a combination of large-scale population imaging, behavior, pharmacology, and deep neural network modeling. In the visual cortex of awake mice, we identified a consistent shift of individual neurons' color preferences towards ultraviolet stimuli during active behavioral periods that was particularly pronounced in the upper visual field. We found that this spectral shift in neural tuning is mediated by pupil dilation, resulting in a dynamic switch from rod- to cone-driven visual responses at constant ambient light levels. We further showed that the shift selectively enhances the discriminability of ultraviolet objects and facilitates the detection of ethological stimuli, such as aerial predators against the ultraviolet background of the twilight sky. Our results suggest a novel functional role for pupil dilation during active behavioral states as a bottom-up mechanism that, together with top-down neuromodulatory mechanisms, dynamically tunes visual representations to different behavioral demands.
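
As a rough illustration of the analysis the abstract describes, the sketch below quantifies each neuron's color preference with a spectral contrast index and compares it between pupil-defined behavioral states. The index definition, variable names, and median pupil split are illustrative assumptions, not the authors' exact procedure.

```python
# Hypothetical sketch: per-neuron spectral contrast index,
# SC = (R_UV - R_G) / (R_UV + R_G), compared across behavioral states.
import numpy as np

def spectral_contrast(r_uv, r_green):
    """SC in [-1, 1]; positive values indicate a UV preference."""
    return (r_uv - r_green) / (r_uv + r_green + 1e-9)

def state_dependent_shift(r_uv, r_green, pupil):
    """r_uv, r_green: (n_neurons, n_trials) mean responses to UV and green
    stimuli; pupil: (n_trials,) pupil diameter, a proxy for behavioral state."""
    active = pupil > np.median(pupil)       # dilated pupil ~ active state
    sc_active = spectral_contrast(r_uv[:, active].mean(1),
                                  r_green[:, active].mean(1))
    sc_quiet = spectral_contrast(r_uv[:, ~active].mean(1),
                                 r_green[:, ~active].mean(1))
    return sc_active - sc_quiet             # positive: UV shift when active
```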

2021 ◽ Author(s): Colin Conwell, David Mayo, Boris Katz, Michael A. Buice, George A. Alvarez, ...

How well do deep neural networks fare as models of mouse visual cortex? Most research to date suggests results far more mixed than those produced in the modeling of primate visual cortex. Here, we perform a large-scale benchmarking of dozens of deep neural network models in mouse visual cortex, with multiple methods of comparison and multiple modes of verification. Using the Allen Brain Observatory's 2-photon calcium-imaging dataset of activity in over 59,000 rodent visual cortical neurons recorded in response to natural scenes, we replicate previous findings and resolve previous discrepancies, ultimately demonstrating that modern neural networks can in fact be used to explain activity in mouse visual cortex to a more reasonable degree than previously suggested. Using our benchmark as an atlas, we offer preliminary answers to overarching questions about levels of analysis (e.g., do models that better predict the representations of individual neurons also predict representational geometry across neural populations?); questions about the properties of models that best predict the visual system overall (e.g., does training task or architecture matter more for augmenting predictive power?); and questions about the mapping between biological and artificial representations (e.g., are there differences in the kinds of deep feature spaces that predict neurons from primary versus posteromedial visual cortex?). Along the way, we introduce a novel, highly optimized neural regression method that achieves SOTA scores (with gains of up to 34%) on the publicly available benchmarks of primate BrainScore. Simultaneously, we benchmark a number of models (including vision transformers, MLP-Mixers, normalization-free networks, and Taskonomy encoders) outside the traditional circuit of convolutional object recognition. Taken together, our results provide a reference point for future ventures in the deep neural network modeling of mouse visual cortex, hinting at novel combinations of method, architecture, and task to more fully characterize the computational motifs of visual representation in a species so indispensable to neuroscience.
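
The authors' optimized regression method is not spelled out in the abstract, so the sketch below shows only the standard baseline such benchmarks build on: cross-validated ridge regression from deep-network features to per-neuron responses, scored by explained variance. Function and variable names are assumptions.

```python
# Baseline model-to-brain mapping: ridge regression from network activations
# to recorded responses, with cross-validated explained variance per neuron.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_predict

def neural_predictivity(features, responses, alphas=(0.1, 1.0, 10.0, 100.0)):
    """features: (n_stimuli, n_features) network activations;
    responses: (n_stimuli, n_neurons) trial-averaged neural responses."""
    scores = []
    for n in range(responses.shape[1]):
        pred = cross_val_predict(RidgeCV(alphas=alphas),
                                 features, responses[:, n], cv=5)
        resid_var = np.var(responses[:, n] - pred)
        scores.append(1.0 - resid_var / np.var(responses[:, n]))
    return np.array(scores)   # per-neuron cross-validated explained variance
```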


2020 ◽ Vol 128 (10-11) ◽ pp. 2665-2683 ◽ Author(s): Grigorios G. Chrysos, Jean Kossaifi, Stefanos Zafeiriou

Abstract Conditional image generation lies at the heart of computer vision, and conditional generative adversarial networks (cGAN) have recently become the method of choice for this task, owing to their superior performance. The focus so far has largely been on performance improvement, with little effort devoted to making cGANs more robust to noise. However, the regression (of the generator) might lead to arbitrarily large errors in the output, which makes cGANs unreliable for real-world applications. In this work, we introduce a novel conditional GAN model, called RoCGAN, which leverages structure in the target space of the model to address this issue. Specifically, we augment the generator with an unsupervised pathway, which encourages the outputs of the generator to span the target manifold, even in the presence of intense noise. We prove that RoCGAN shares theoretical properties similar to those of the original GAN, and we establish the merits of our model on both synthetic and real data. We perform a thorough experimental validation on large-scale datasets of natural scenes and faces and observe that our model outperforms existing cGAN architectures by a large margin. We also empirically demonstrate the performance of our approach in the face of two types of noise (adversarial and Bernoulli).
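
The following PyTorch sketch illustrates the core RoCGAN idea as the abstract states it: the generator's regression pathway shares a decoder with an unsupervised autoencoder pathway over clean targets, nudging generator outputs onto the target manifold. Layer shapes and module names are assumptions, not the authors' architecture.

```python
# Illustrative two-pathway generator: regression and autoencoder branches
# share one decoder; training would combine an adversarial loss with
# reconstruction losses on both pathway outputs.
import torch
import torch.nn as nn

class RoCGANGenerator(nn.Module):
    def __init__(self, ch=64):
        super().__init__()
        def encoder():
            return nn.Sequential(
                nn.Conv2d(3, ch, 4, 2, 1), nn.ReLU(),
                nn.Conv2d(ch, ch * 2, 4, 2, 1), nn.ReLU())
        self.reg_encoder = encoder()   # encodes the corrupted input
        self.ae_encoder = encoder()    # encodes clean targets (training only)
        self.decoder = nn.Sequential(  # shared decoder couples the pathways
            nn.ConvTranspose2d(ch * 2, ch, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(ch, 3, 4, 2, 1), nn.Tanh())

    def forward(self, x, target=None):
        y_reg = self.decoder(self.reg_encoder(x))
        y_ae = self.decoder(self.ae_encoder(target)) if target is not None else None
        return y_reg, y_ae
```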


2019 ◽ Vol 42 (7) ◽ pp. 1358-1374 ◽ Author(s): Tim Chen, CYJ Chen

This paper is concerned with the stability analysis and synthesis of model-based fuzzy controllers for a nonlinear large-scale system. In evolved fuzzy NN (neural network) modeling, an NN model and an LDI (linear differential inclusion) representation are established for arbitrary nonlinear dynamics. The evolved bat algorithm (EBA) is first incorporated into the search for solutions satisfying the stability conditions; it rapidly finds the optimal solution and raises control performance. The representation is constructed by exploiting sector nonlinearity, which converts the nonlinear model into a multiple-rule-base linear model. A new sufficient condition guaranteeing asymptotic stability is derived via a Lyapunov function and expressed in terms of linear matrix inequalities (LMIs). Subsequently, based on this criterion and a decentralized control scheme, an evolved model-based fuzzy H∞ controller set is synthesized to stabilize the nonlinear large-scale system. Finally, a numerical example with simulations is given to illustrate the results.
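
The abstract's stability criterion follows the standard LDI/Takagi-Sugeno pattern; the generic form of that LMI test is sketched below. This is the textbook condition, which may differ in detail from the paper's specific criterion.

```latex
% LDI / T-S fuzzy model: a convex blend of r linear subsystems
\dot{x}(t) = \sum_{i=1}^{r} h_i(z(t))\, A_i\, x(t),
\qquad h_i(z) \ge 0, \quad \sum_{i=1}^{r} h_i(z) = 1.
% With the quadratic Lyapunov function V(x) = x^{\top} P x, asymptotic
% stability is guaranteed if a common P = P^{\top} \succ 0 satisfies
A_i^{\top} P + P A_i \prec 0, \qquad i = 1, \dots, r.
```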


eLife ◽ 2016 ◽ Vol 5 ◽ Author(s): Janelle MP Pakan, Scott C Lowe, Evelyn Dylda, Sander W Keemink, Stephen P Currie, ...

Cortical responses to sensory stimuli are modulated by behavioral state. In the primary visual cortex (V1), visual responses of pyramidal neurons increase during locomotion. This response gain was suggested to be mediated through inhibitory neurons, resulting in the disinhibition of pyramidal neurons. Using in vivo two-photon calcium imaging in layers 2/3 and 4 of mouse V1, we reveal that locomotion increases the activity of vasoactive intestinal peptide (VIP)-, somatostatin (SST)-, and parvalbumin (PV)-positive interneurons during visual stimulation, challenging the disinhibition model. In darkness, while most VIP and PV neurons remained locomotion responsive, SST and excitatory neurons were largely non-responsive. Context-dependent locomotion responses were found in each cell type, with the highest proportion among SST neurons. These findings establish that modulation of neuronal activity by locomotion is context-dependent and contest the generality of a disinhibitory circuit for gain control of sensory responses by behavioral state.
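
A minimal sketch of the kind of per-cell measure used to compare cell classes across contexts: a locomotion modulation index computed separately for visual stimulation and darkness. The threshold, variable names, and index form are illustrative assumptions.

```python
# Per-cell locomotion modulation index from calcium imaging data.
import numpy as np

def locomotion_modulation_index(dff, speed, run_thresh=1.0):
    """dff: (n_cells, n_frames) fluorescence traces; speed: (n_frames,) cm/s."""
    running = speed > run_thresh
    r_run = dff[:, running].mean(1)
    r_still = dff[:, ~running].mean(1)
    return (r_run - r_still) / (r_run + r_still + 1e-9)

# Context dependence: compute the index separately per condition, e.g.
#   lmi_stim = locomotion_modulation_index(dff[:, stim], speed[stim])
#   lmi_dark = locomotion_modulation_index(dff[:, dark], speed[dark])
# where stim and dark are boolean frame masks for the two contexts.
```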


2020 ◽ Author(s): Yongrong Qiu, Zhijian Zhao, David Klindt, Magdalena Kautzky, Klaudia P. Szatko, ...

Summary Pressures for survival drive sensory circuit adaptation to a species' habitat, making it essential to statistically characterise natural scenes. Mice, a prominent visual system model, are dichromatic with enhanced sensitivity to green and UV. Their visual environment, however, is rarely considered. Here, we built a UV-green camera to record footage from mouse habitats. We found chromatic contrast to diverge greatly in the upper but not the lower visual field, an environmental difference that may underlie the species' superior colour discrimination in the upper visual field. Moreover, training an autoencoder on upper but not lower visual field scenes was sufficient for the emergence of colour-opponent filters. Furthermore, the upper visual field was biased towards dark UV contrasts, paralleled by more light-offset-sensitive cells in the ventral retina. Finally, footage recorded at twilight suggests that UV promotes aerial predator detection. Our findings support the idea that natural scene statistics shaped early visual processing in evolution.
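
To make the autoencoder result concrete, here is a minimal sketch: a one-layer convolutional autoencoder trained on 2-channel (UV, green) patches, with an opponency score for the learned encoder filters. Architecture, kernel size, and the opponency measure are assumptions for illustration.

```python
# Tiny convolutional autoencoder over UV-green patches, plus a per-filter
# colour-opponency score (negative = UV and green weights of opposite sign).
import torch
import torch.nn as nn

class ChromaticAE(nn.Module):
    def __init__(self, n_filters=16, k=9):
        super().__init__()
        self.enc = nn.Conv2d(2, n_filters, k, padding=k // 2)
        self.dec = nn.ConvTranspose2d(n_filters, 2, k, padding=k // 2)

    def forward(self, x):                     # x: (batch, 2, H, W)
        return self.dec(torch.relu(self.enc(x)))

def opponency_scores(model):
    """Correlation between each filter's UV and green kernels."""
    w = model.enc.weight.detach()             # (n_filters, 2, k, k)
    uv, gr = w[:, 0].flatten(1), w[:, 1].flatten(1)
    uv = uv - uv.mean(1, keepdim=True)
    gr = gr - gr.mean(1, keepdim=True)
    return (uv * gr).sum(1) / (uv.norm(dim=1) * gr.norm(dim=1) + 1e-9)
```

Training with a plain reconstruction loss on upper-field patches and checking for strongly negative scores would mirror the comparison the abstract reports.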


2019 ◽ Author(s): Omid Talakoub, Patricia Sayegh, Thilo Womelsdorf, Wolf Zinke, Pascal Fries, ...

Abstract Wireless recordings in macaque neocortex and hippocampus showed stronger theta oscillations during early-stage sleep than during alert volitional movement, including walking. In contrast, hippocampal beta and gamma oscillations were prominent during walking and other active behaviors. These relations between hippocampal rhythms and behavioral states in the primate differ markedly from those observed in rodents. Primate neocortex showed changes in spectral content across behavioral states similar to those in the hippocampus.
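
A sketch of the underlying band-power comparison: Welch spectra of LFP segments summarized in theta, beta, and gamma bands for each behavioral state. Band edges and the sampling rate are assumptions.

```python
# Band power per behavioral state from a local field potential segment.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "beta": (12, 30), "gamma": (30, 80)}  # Hz, assumed

def band_power(lfp, fs=1000.0):
    """lfp: 1-D array, one behavioral state; returns mean power per band."""
    f, pxx = welch(lfp, fs=fs, nperseg=int(2 * fs))
    return {name: pxx[(f >= lo) & (f < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# e.g. compare band_power(lfp_sleep) with band_power(lfp_walking): the
# abstract's finding corresponds to a larger theta value during sleep.
```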


2020 ◽ Author(s): Ruidong Chen, Vikram Gadagkar, Andrea C. Roeser, Pavel A. Puzerey, Jesse H. Goldberg

Abstract Movement-related neuronal discharge in the ventral tegmental area (VTA) and ventral pallidum (VP) is inconsistently observed across studies. One possibility is that some neurons are movement-related and others are not. Another possibility is that the precise behavioral conditions matter: a single neuron can be movement-related under certain behavioral states but not others. We recorded single VTA and VP neurons in birds transitioning between singing and non-singing states, while monitoring body movement with microdrive-mounted accelerometers. Many VP and VTA neurons exhibited body-movement-locked activity exclusively when the bird was not singing. During singing, VP and VTA neurons could switch off their tuning to body movement and instead become precisely time-locked to specific song syllables. These changes in neuronal tuning occurred rapidly at state boundaries. Our findings show that movement-related activity in limbic circuits can be gated by behavioral context.
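
The state-gating result suggests an analysis like the sketch below: correlating each neuron's firing rate with accelerometer-measured body movement separately within singing and non-singing epochs. Bin sizes and variable names are illustrative assumptions.

```python
# Movement locking of a neuron, split by behavioral state.
import numpy as np

def movement_locking(rate, accel, singing):
    """rate: (n_bins,) binned firing rate; accel: (n_bins,) accelerometer RMS;
    singing: (n_bins,) boolean mask marking singing bins."""
    def corr(mask):
        return float(np.corrcoef(rate[mask], accel[mask])[0, 1])
    return {"singing": corr(singing), "non_singing": corr(~singing)}

# A neuron gated as described would show strong movement locking only
# outside song, e.g. {'singing': 0.02, 'non_singing': 0.61}.
```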


2019 ◽ Vol 87 (3) ◽ pp. 576-580 ◽ Author(s): Joanna J. Parga, Sharon Lewin, Juanita Lewis, Diana Montoya-Williams, Abeer Alwan, ...

Abstract Background: To characterize the acoustic features of an infant's cry, use machine learning to provide an objective measurement of behavioral state in a cry translator, and apply the cry-translation algorithm to colic, hypothesizing that these cries sound painful. Methods: 1000 cries were assessed in a mobile app (ChatterBaby™). A cry-translation algorithm was trained by evaluating >6000 acoustic features to predict whether an infant's cry was due to pain (vaccinations, ear piercings), fussiness, or hunger. The algorithm was then used to predict the behavioral state of infants with reported colic. Results: The cry-translation algorithm was 90.7% accurate at identifying pain cries and achieved 71.5% accuracy in discriminating cries of fussiness, hunger, and pain. The ChatterBaby cry-translation algorithm overwhelmingly predicted that colic cries were most likely from pain, compared to fussy and hungry states. Colic cries had an average pain rating of 73%, significantly greater than the pain measurements found for fussiness and hunger (p < 0.001, 2-sample t test). Colic cries outranked pain cries on measures of acoustic intensity, including energy, length of voiced periods, and fundamental frequency/pitch, while fussy and hungry cries showed reduced intensity measures compared to pain and colic. Conclusions: Acoustic features of cries are consistent across a diverse infant population and can be utilized as objective markers of pain, hunger, and fussiness. The ChatterBaby algorithm detected significant acoustic similarities between colic and painful cries, suggesting that they may share a neuronal pathway.
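
As a toy version of the pipeline the abstract outlines, the sketch below extracts a few simple acoustic features and fits an off-the-shelf classifier. The real system evaluated >6000 features; everything here (feature set, model, names) is an illustrative assumption.

```python
# Toy cry-state classifier: crude acoustic features + random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cry_features(audio, fs=16000):
    """audio: 1-D waveform of one cry; returns [energy, pitch, voiced_frac]."""
    energy = float(np.mean(audio ** 2))
    # crude pitch proxy: dominant frequency of the magnitude spectrum
    spec = np.abs(np.fft.rfft(audio))
    pitch = float(np.fft.rfftfreq(len(audio), 1 / fs)[np.argmax(spec)])
    voiced_frac = float(np.mean(np.abs(audio) > 0.1 * np.abs(audio).max()))
    return [energy, pitch, voiced_frac]

# X: feature vectors for labeled cries; y: labels in {"pain", "fussy", "hungry"}
# clf = RandomForestClassifier(n_estimators=200).fit(X, y)
# clf.predict([cry_features(colic_audio)])  # e.g. -> array(['pain'], ...)
```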

