A Bayesian perspective on the ring attractor for heading-direction tracking in the Drosophila central complex

2021
Author(s):  
Anna Kutschireiter ◽  
Melanie A Basnak ◽  
Rachel I Wilson ◽  
Jan Drugowitsch

Efficient navigation requires animals to track their position, velocity and heading direction (HD). Bayesian inference provides a principled framework for estimating these quantities from unreliable sensory observations, yet little is known about how and where Bayesian algorithms could be implemented in the brain's neural networks. Here, we propose a class of recurrent neural networks that track both a dynamic HD estimate and its associated uncertainty. They do so according to a circular Kalman filter, a statistically optimal algorithm for circular estimation. Our network generalizes standard ring attractor models by encoding uncertainty in the amplitude of a bump of neural activity. More generally, we show that near-Bayesian integration is inherent in ring attractor networks, as long as their connectivity strength allows them to sufficiently deviate from the attractor state. Furthermore, we identify the basic network motifs required to implement Bayesian inference and show that these motifs are present in the Drosophila HD system connectome. Overall, our work demonstrates that the Drosophila HD system can in principle implement a dynamic Bayesian inference algorithm in a biologically plausible manner, consistent with recent findings suggesting that ring-attractor dynamics underlie the Drosophila HD system.
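The estimator described here can be caricatured in a few lines. The following is a minimal sketch (assuming NumPy) of one circular Kalman filter step in which the heading estimate is a mean angle plus a certainty weight, the quantity the paper maps onto bump amplitude; the exponential certainty decay and the vector-addition update are simplifications of the full algorithm, and all names are illustrative.

```python
import numpy as np

def circular_kalman_step(mu, kappa, omega, dt, sigma_w, obs=None, kappa_obs=0.0):
    """One predict/update step of a (simplified) circular Kalman filter.

    mu, kappa : current heading estimate (rad) and its certainty
    omega     : angular-velocity input (rad/s)
    sigma_w   : angular-velocity noise intensity
    obs       : noisy heading observation (rad), or None if absent
    kappa_obs : reliability of that observation
    """
    # prediction: rotate the estimate, let certainty decay under angular diffusion
    mu = (mu + omega * dt) % (2 * np.pi)
    kappa = kappa * np.exp(-0.5 * sigma_w**2 * dt)   # more uncertainty -> smaller "bump"

    # update: add the prior and observation as vectors on the circle
    if obs is not None:
        vec = kappa * np.exp(1j * mu) + kappa_obs * np.exp(1j * obs)
        mu, kappa = np.angle(vec) % (2 * np.pi), np.abs(vec)
    return mu, kappa
```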

2021
Vol 11 (1)
Author(s):  
Tom Struck ◽  
Javed Lindner ◽  
Arne Hollmann ◽  
Floyd Schauer ◽  
Andreas Schmidbauer ◽  
...  

Establishing low-error and fast detection methods for qubit readout is crucial for efficient quantum error correction. Here, we test neural networks to classify a collection of single-shot spin detection events, which are the readout signal of our qubit measurements. This readout signal contains a stochastic peak, for which a Bayesian inference filter including Gaussian noise is theoretically optimal. Hence, we benchmark our neural networks, trained by various strategies, against this latter algorithm. Training the network with 10^6 experimentally recorded single-shot readout traces does not improve the post-processing performance. A network trained on synthetically generated measurement traces performs similarly to the Bayesian inference filter in terms of detection error and post-processing speed. This neural network turns out to be more robust to fluctuations in the signal offset, length and delay, as well as in the signal-to-noise ratio. Notably, we find an increase of 7% in the visibility of the Rabi oscillation when we employ a network trained on synthetic readout traces combined with the measured signal noise of our setup. Our contribution thus represents an example of the beneficial role that software and hardware implementations of neural networks may play in scalable spin qubit processor architectures.
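The "train on synthetic traces" strategy is easy to sketch. The toy example below (assuming NumPy and scikit-learn) generates traces with an optional current blip buried in Gaussian noise, fits a small classifier, and scores it on fresh synthetic data; the trace model, blip amplitude and network size are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def synthetic_trace(has_blip, n=200, noise=0.3):
    """Toy single-shot readout trace: flat baseline plus an optional
    blip of random onset and duration, buried in Gaussian noise."""
    trace = rng.normal(0.0, noise, n)
    if has_blip:
        start = rng.integers(10, n // 2)
        length = rng.integers(10, 60)
        trace[start:start + length] += 1.0        # the stochastic peak
    return trace

# training set generated purely from the synthetic model
labels = rng.integers(0, 2, 5000)
traces = np.stack([synthetic_trace(bool(y)) for y in labels])

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300)
clf.fit(traces, labels)

# evaluate on a fresh synthetic "measurement" set
test_labels = rng.integers(0, 2, 1000)
test_traces = np.stack([synthetic_trace(bool(y)) for y in test_labels])
print("detection accuracy:", clf.score(test_traces, test_labels))
```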


eLife
2020
Vol 9
Author(s):  
Kevin A Bolding ◽  
Shivathmihai Nagappan ◽  
Bao-Xia Han ◽  
Fan Wang ◽  
Kevin M Franks

Pattern completion, or the ability to retrieve stable neural activity patterns from noisy or partial cues, is a fundamental feature of memory. Theoretical studies indicate that recurrently connected auto-associative or discrete attractor networks can perform this process. Although pattern completion and attractor dynamics have been observed in various recurrent neural circuits, the role recurrent circuitry plays in implementing these processes remains unclear. In recordings from head-fixed mice, we found that odor responses in the olfactory bulb degrade under ketamine/xylazine anesthesia while responses immediately downstream, in piriform cortex, remain robust. Recurrent connections are required to stabilize cortical odor representations across states. Moreover, piriform odor representations exhibit attractor dynamics, both within and across trials, and these are also abolished when recurrent circuitry is eliminated. Together, these findings provide converging evidence that recurrently connected piriform populations stabilize sensory representations in response to degraded inputs, consistent with an auto-associative function for piriform cortex supported by recurrent circuitry.
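Pattern completion by a recurrent auto-associative network can be illustrated with a textbook Hopfield-style toy; the sketch below (assuming NumPy) is not a model of piriform cortex, only of the operation: store a few random patterns with a Hebbian rule, corrupt one, and let the recurrent dynamics restore it.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_patterns = 200, 5

# store a few binary "odor" patterns with a Hebbian (auto-associative) rule
patterns = rng.choice([-1, 1], size=(n_patterns, n_cells))
W = (patterns.T @ patterns) / n_cells
np.fill_diagonal(W, 0.0)

def complete(cue, steps=20):
    """Iterate the recurrent dynamics until the degraded cue settles
    into the nearest stored pattern (a discrete attractor)."""
    state = cue.copy().astype(float)
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# degrade one stored pattern by flipping 30% of its entries
cue = patterns[0].copy()
flip = rng.choice(n_cells, size=int(0.3 * n_cells), replace=False)
cue[flip] *= -1

recovered = complete(cue)
print("overlap with stored pattern:", (recovered == patterns[0]).mean())
```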


2011
Vol 74 (17)
pp. 2716-2724
Author(s):  
Louiza Dehyadegary ◽  
Seyyed Ali Seyyedsalehi ◽  
Isar Nejadgholi

Author(s):  
Pavel Kikin ◽  
Alexey Kolesnikov ◽  
Alexey Portnov ◽  
Denis Grischenko

The state of ecological systems, along with their general characteristics, is almost always described by indicators that vary in space and time, which substantially complicates the construction of mathematical models for predicting the state of such systems. One way to simplify and automate the construction of such models is to use machine learning methods. The article compares traditional and neural-network-based algorithms and machine learning methods for predicting spatio-temporal series representing ecosystem data. The analysis and comparison covered the following algorithms and methods: logistic regression, random forest, gradient boosting on decision trees, SARIMAX, and long short-term memory (LSTM) and gated recurrent unit (GRU) neural networks. For the study, data sets were selected that have both spatial and temporal components: mosquito abundance counts, the number of dengue infections, the physical condition of tropical grove trees, and the water level in a river. The article discusses the preliminary data-processing steps required by each algorithm. Kolmogorov complexity was also calculated as one of the parameters that can help formalize the choice of the most suitable algorithm when constructing mathematical models of spatio-temporal data for the data sets used. Based on the results of the analysis, recommendations are given on the application of particular methods and specific technical solutions, depending on the characteristics of the data set that describes a particular ecosystem.
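For the tree-based baselines, the usual preprocessing step is to flatten the temporal component into lag features; the recurrent models (LSTM, GRU) instead consume such windows directly as sequences. A minimal sketch (assuming scikit-learn, with a synthetic seasonal series standing in for an ecological indicator such as a water level):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(2)

# synthetic stand-in for an ecological series: seasonal cycle + trend + noise
t = np.arange(600)
series = 10 + 0.01 * t + 5 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 1, t.size)

def make_lag_features(y, n_lags=8):
    """Turn a univariate series into (lag-window, next-value) pairs."""
    X = np.stack([y[i:i + n_lags] for i in range(len(y) - n_lags)])
    return X, y[n_lags:]

X, y = make_lag_features(series)
split = int(0.8 * len(X))      # simple temporal train/test split

for model in (RandomForestRegressor(n_estimators=200),
              GradientBoostingRegressor()):
    model.fit(X[:split], y[:split])
    print(type(model).__name__, "test R^2:",
          round(model.score(X[split:], y[split:]), 3))
```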


2019
Author(s):  
Ioannis Pisokas ◽  
Stanley Heinze ◽  
Barbara Webb

Recent studies of the central complex in the brain of the fruit fly have identified neurons whose activity tracks the animal's heading direction. These neurons are part of a neuronal circuit with dynamics resembling those of a ring attractor. Other insects have a homologous circuit sharing a generally similar topographic structure but with significant structural and connectivity differences. We model the connectivity patterns of two insect species to investigate the effect of these differences on the dynamics of the circuit. We show that the circuit found in locusts can also operate as a ring attractor, and we identify differences that enable the fruit fly circuit to respond faster to heading changes while making the locust circuit more tolerant to noise. Our findings demonstrate that subtle differences in neuronal projection patterns can have a significant effect on circuit performance and emphasise the need for a comparative approach in neuroscience.
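The generic circuit motif these models share can be caricatured in a few lines. The sketch below (assuming NumPy) is a textbook rate-based ring network with cosine connectivity, not the connectome-constrained fly or locust circuits compared in the paper; it only illustrates the two behaviours at issue, namely that a bump of activity persists once the cue is removed and re-locks onto a shifted heading cue.

```python
import numpy as np

n = 32                                    # heading cells around the ring
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)

# local excitation + global inhibition: the generic ring-attractor motif
J0, J1 = -2.0, 6.0
W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / n

def run(r, cue=None, steps=400, dt=0.05):
    """Relax the rate dynamics dr/dt = -r + tanh(W r + cue input)."""
    inp = 0.5 * np.cos(theta - cue) if cue is not None else 0.0
    for _ in range(steps):
        r = r + dt * (-r + np.tanh(W @ r + inp))
    return r

r = run(0.01 * np.random.default_rng(3).standard_normal(n), cue=0.0)
r = run(r)                                # cue removed: bump persists (attractor)
print("bump position without cue:", theta[np.argmax(r)])
r = run(r, cue=np.pi / 2)                 # new heading cue: bump shifts
print("bump position after cue shift:", theta[np.argmax(r)])
```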


Author(s):  
Yunpeng Chen ◽  
Xiaojie Jin ◽  
Bingyi Kang ◽  
Jiashi Feng ◽  
Shuicheng Yan

The residual unit and its variations are widely used in building very deep neural networks to alleviate optimization difficulty. In this work, we revisit the standard residual function as well as several of its successful variants and propose a unified framework based on tensor Block Term Decomposition (BTD) to explain these apparently different residual functions from the tensor decomposition view. With the BTD framework, we further propose a novel basic network architecture, named the Collective Residual Unit (CRU). CRU further enhances the parameter efficiency of deep residual neural networks by sharing core factors derived from collective tensor factorization over the involved residual units. It enables efficient knowledge sharing across multiple residual units, reduces the number of model parameters, lowers the risk of over-fitting, and provides better generalization ability. Extensive experimental results show that our proposed CRU network brings outstanding parameter efficiency: it achieves classification performance comparable to ResNet-200 while using a model size as small as that of ResNet-50 on the ImageNet-1k and Places365-Standard benchmark datasets.
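As a rough illustration of the parameter-sharing idea only (not the authors' exact BTD formulation), the sketch below, assuming PyTorch, reuses a single "core" convolution across several residual units while keeping per-unit 1x1 factors private; the class and layer names are invented for the example.

```python
import torch
import torch.nn as nn

class CollectiveResidualStage(nn.Module):
    """Schematic: several residual units sharing one 'core' convolution,
    standing in for the shared BTD core factor."""

    def __init__(self, channels, n_units=3):
        super().__init__()
        # the shared core factor (here: one 3x3 conv reused by every unit)
        self.shared_core = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        # per-unit input/output 1x1 factors remain private
        self.units = nn.ModuleList(
            nn.ModuleDict({
                "reduce": nn.Conv2d(channels, channels, 1, bias=False),
                "expand": nn.Conv2d(channels, channels, 1, bias=False),
                "bn": nn.BatchNorm2d(channels),
            })
            for _ in range(n_units)
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        for u in self.units:
            residual = u["expand"](self.shared_core(u["reduce"](x)))
            x = self.relu(u["bn"](residual) + x)   # standard residual addition
        return x

stage = CollectiveResidualStage(channels=64)
out = stage(torch.randn(2, 64, 32, 32))
print(out.shape)   # torch.Size([2, 64, 32, 32])
```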


2014
pp. 30-34
Author(s):  
Vladimir Golovko

This paper discusses a neural network approach for computing the Lyapunov spectrum of an unknown dynamical system from a one-dimensional time series. The approach is based on reconstructing the attractor dynamics and applying a multilayer perceptron (MLP) to forecast the next state of the dynamical system from the previous one. It allows the Lyapunov spectrum of an unknown dynamical system to be evaluated accurately and efficiently using only a single observable. The results of experiments are discussed.
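A stripped-down sketch of the idea (assuming NumPy and scikit-learn) for a one-dimensional map, where only the largest exponent exists: fit an MLP to forecast the next observation, then average the log of the learned map's local derivatives along the trajectory. In higher dimensions the full spectrum would instead be estimated from products of the learned map's Jacobians with repeated QR decompositions; the hyperparameters below are illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# one observable from an "unknown" dynamical system (here: logistic map, r = 4)
x = np.empty(3000)
x[0] = 0.3
for i in range(1, x.size):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# train an MLP to forecast the next state from the current one
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
mlp.fit(x[:-1].reshape(-1, 1), x[1:])

# largest Lyapunov exponent from the learned map's local derivatives
eps = 1e-4
pts = x[500:2500]
deriv = (mlp.predict((pts + eps).reshape(-1, 1))
         - mlp.predict((pts - eps).reshape(-1, 1))) / (2 * eps)
lyap = np.mean(np.log(np.abs(deriv) + 1e-12))
print(f"estimated largest exponent: {lyap:.3f}  (logistic map: ln 2 ≈ 0.693)")
```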


2019
Vol 10 (36)
pp. 8438-8446
Author(s):  
Seongok Ryu ◽  
Yongchan Kwon ◽  
Woo Youn Kim

Deep neural networks have been increasingly used in various chemical fields. Here, we show that Bayesian inference enables more reliable prediction with quantitative uncertainty analysis.
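One common, lightweight approximation to Bayesian inference in neural networks is Monte Carlo dropout: keep dropout active at prediction time and treat repeated stochastic forward passes as samples from the predictive distribution. The sketch below (assuming PyTorch) illustrates only this mechanism; the network is untrained and the 16 "descriptor" features per molecule are purely hypothetical.

```python
import torch
import torch.nn as nn

# tiny regressor with dropout layers that stay active at prediction time
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)

def predict_with_uncertainty(model, x, n_samples=100):
    """Repeated stochastic forward passes give a predictive mean and spread."""
    model.train()                      # keep dropout masks active
    with torch.no_grad():
        draws = torch.stack([model(x) for _ in range(n_samples)])
    return draws.mean(dim=0), draws.std(dim=0)

x = torch.randn(8, 16)                 # e.g. 8 molecules, 16 descriptor features
mean, std = predict_with_uncertainty(model, x)
print(mean.squeeze(), std.squeeze())   # prediction and its uncertainty per molecule
```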

