Does the Zebra Finch Mating Song Circuit Use Spike Times Efficiently?

2021 ◽  
Author(s):  
Wilten Nicola ◽  
Claudia Clopath ◽  
Thomas Robert Newton

Precise and reliable spike times are thought to subserve multiple possible functions, including improving the accuracy of encoding stimuli or behaviours relative to other coding schemes. Indeed, repeating sequences of spikes with sub-millisecond precision exist in nature, such as the synfire chain of spikes in area HVC of the zebra-finch mating-song circuit. Here, we analyzed what impact precise and reliable spikes have on encoding accuracy, for both the zebra-finch circuit and more generic neural circuits, using computational modelling. Our results show that neural circuits can use precisely timed spikes to encode signals with a higher-order accuracy than a conventional rate code. Circuits with precisely timed and reliably emitted spikes increase their encoding accuracy linearly with network size, which is the hallmark signature of an efficient code; this qualitatively differs from circuits that employ a rate code, whose encoding accuracy increases only with the square root of network size. However, this improved scaling depends on the spikes becoming more accurate and more reliable as the network grows. Finally, we discuss how to test this scaling relationship in the zebra-finch mating-song circuit using both neural recordings and song-spectrogram-based measurements, taking advantage of the natural fluctuation in HVC network size due to neurogenesis. The zebra-finch mating-song circuit may represent the most likely candidate system for the use of spike-timing-based, efficient coding strategies in nature.
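The contrast between the two scaling regimes can be illustrated with a toy decoder, entirely separate from the spiking-network models analyzed in the paper. The Python sketch below assumes a scalar signal x in [0, 1] encoded by N binary units: the "rate" code draws each unit's spike independently, whereas the "timing" code is caricatured as a perfectly coordinated population whose readout k/N is limited only by the 1/N quantization step.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [16, 64, 256, 1024, 4096]
xs = rng.uniform(0.2, 0.8, size=500)   # signals to encode, one per trial

for N in sizes:
    # "Rate" code: each of N units fires independently with probability x in the
    # coding window; the decoder averages spike counts, so the error ~ 1/sqrt(N).
    spikes = rng.random((xs.size, N)) < xs[:, None]
    rate_err = np.abs(spikes.mean(axis=1) - xs).mean()

    # Caricature of a precisely timed, reliable code: spikes are coordinated so
    # that exactly round(x * N) units fire; the decoder reads k / N, so the error
    # is set by the 1/N quantization step rather than by averaging independent noise.
    timing_err = np.abs(np.round(xs * N) / N - xs).mean()

    print(f"N={N:5d}   rate-code error {rate_err:.5f}   timing-code error {timing_err:.5f}")
```

The independent code's error falls off roughly as 1/sqrt(N), while the coordinated code's error falls off roughly as 1/N, mirroring the square-root versus linear scaling of encoding accuracy described above.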

2018 ◽  
Author(s):  
Dhruva V. Raman ◽  
Timothy O’Leary

How does the size of a neural circuit influence its learning performance? Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. Larger brains tend to be found in species with higher cognitive function and learning ability. Similarly, adding connections and units to artificial neural networks can allow them to solve more complex tasks. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Beneath this optimal size, our analysis shows how adding apparently redundant neurons and connections can make tasks more learnable. Therefore, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that overall brain size may be constrained by the need to learn efficiently with unreliable synapses, and may explain why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.
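The cost half of this trade-off, noise from unreliable synapses accumulating with network size, can be sketched with a noisy online gradient (LMS) learner on a linear task. This is only an illustrative toy with arbitrary parameters, not the authors' analysis, and it does not reproduce the learnability benefit of redundant connections described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def steady_state_error(n_syn, noise_sd=0.01, steps=40_000):
    """Online gradient descent (LMS) on a linear task y* = w_star . x, where every
    synaptic update also carries independent noise of fixed size. Returns the mean
    squared output error over the last quarter of training."""
    lr = 0.4 / n_syn                                  # step size scaled for stability
    w_star = rng.normal(size=n_syn) / np.sqrt(n_syn)  # target weights
    w = np.zeros(n_syn)
    errs = []
    for t in range(steps):
        x = rng.normal(size=n_syn)
        err = (w - w_star) @ x                        # output error on this sample
        w -= lr * err * x                             # gradient step on 0.5 * err**2
        w += noise_sd * rng.normal(size=n_syn)        # unavoidable synaptic update noise
        if t >= 3 * steps // 4:
            errs.append(err ** 2)
    return float(np.mean(errs))

for n in (8, 32, 128, 512):
    print(f"{n:4d} plastic synapses -> steady-state task error {steady_state_error(n):.3f}")
```

With the per-synapse noise held fixed, the steady-state task error grows with the number of plastic synapses, consistent with the claim that beyond some size additional connections impede learning.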


2019 ◽  
Vol 116 (21) ◽  
pp. 10537-10546 ◽  
Author(s):  
Dhruva Venkita Raman ◽  
Adriana Perez Rotondo ◽  
Timothy O’Leary

How does the size of a neural circuit influence its learning performance? Larger brains tend to be found in species with higher cognitive function and learning ability. Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. We show how adding apparently redundant neurons and connections to a network can make a task more learnable. Consequently, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that the size of brain circuits may be constrained by the need to learn efficiently with unreliable synapses and provides a hypothesis for why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.


2018 ◽  
Author(s):  
Mark D. Humphries ◽  
Jose Obeso ◽  
Jakob Kisbye Dreyer

Movement disorders arise from the complex interplay of multiple changes to neural circuits. Successful treatments for these disorders could interact with these complex changes in myriad ways, and as a consequence their mechanisms of action and their amelioration of symptoms are incompletely understood. Using Parkinson’s disease as a case study, we review here how computational models are a crucial tool for taming this complexity, across causative mechanisms, consequent neural dynamics, and treatments. For mechanisms, we review models that capture the effects of losing dopamine on basal ganglia function; for dynamics, we discuss models that have transformed our understanding of how beta-band (15–30 Hz) oscillations arise in the parkinsonian basal ganglia. For treatments, we touch on the breadth of computational modelling work trying to understand the therapeutic actions of deep brain stimulation. Collectively, models from across all levels of description are providing a compelling account of the causes, symptoms, and treatments for Parkinson’s disease.


2013 ◽  
Vol 25 (11) ◽  
pp. 2833-2857 ◽  
Author(s):  
John H. C. Palmer ◽  
Pulin Gong

Spike-timing-dependent plasticity (STDP) is an important form of synaptic dynamics capable of shaping the complex spatiotemporal activity of neural circuits. In this study, we examine the effects of STDP on the spatiotemporal patterns of a spatially extended, two-dimensional spiking neural circuit. We show that STDP can promote the formation of multiple localized spiking wave patterns or multiple spike-timing sequences in a broad parameter space of the neural circuit. Furthermore, we illustrate that the formation of these dynamic patterns is due to the interaction between the dynamics of ongoing patterns in the neural circuit and STDP. We analyze this interaction by developing a simple model that captures its essential dynamics, which give rise to symmetry breaking. This occurs in a fundamentally self-organizing manner, without fine-tuning of the system parameters. Moreover, we find that STDP provides a synaptic mechanism to learn the paths taken by spiking waves and to modulate the dynamics of their interactions, enabling them to be regulated. This regulation mechanism has error-correcting properties. Our results therefore highlight the important roles played by STDP in facilitating the formation and regulation of spiking wave patterns that may have crucial functional roles in brain information processing.
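For readers unfamiliar with the plasticity rule itself, the sketch below gives a generic pair-based STDP window of the standard exponential form; the amplitudes and time constants are placeholders, and the two-dimensional spiking circuit studied in the paper is not reproduced here.

```python
import numpy as np

# Assumed, illustrative STDP parameters (not the values used in the paper).
A_plus, A_minus = 0.010, 0.012      # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0    # decay time constants of the window (ms)

def stdp_dw(dt):
    """Weight change for a single pre/post spike pair separated by dt = t_post - t_pre
    (ms): potentiation when the presynaptic spike precedes the postsynaptic one,
    depression otherwise."""
    if dt > 0:
        return A_plus * np.exp(-dt / tau_plus)
    return -A_minus * np.exp(dt / tau_minus)

# A pre-before-post pairing at +5 ms potentiates; post-before-pre at -5 ms depresses.
print(stdp_dw(5.0), stdp_dw(-5.0))
```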


2018 ◽  
Vol 89 (11) ◽  
pp. 1181-1188 ◽  
Author(s):  
Mark D Humphries ◽  
Jose Angel Obeso ◽  
Jakob Kisbye Dreyer

Movement disorders arise from the complex interplay of multiple changes to neural circuits. Successful treatments for these disorders could interact with these complex changes in myriad ways, and as a consequence their mechanisms of action and their amelioration of symptoms are incompletely understood. Using Parkinson’s disease as a case study, we review here how computational models are a crucial tool for taming this complexity, across causative mechanisms, consequent neural dynamics and treatments. For mechanisms, we review models that capture the effects of losing dopamine on basal ganglia function; for dynamics, we discuss models that have transformed our understanding of how beta-band (15–30 Hz) oscillations arise in the parkinsonian basal ganglia. For treatments, we touch on the breadth of computational modelling work trying to understand the therapeutic actions of deep brain stimulation. Collectively, models from across all levels of description are providing a compelling account of the causes, symptoms and treatments for Parkinson’s disease.
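As a concrete illustration of the kind of reduced model such reviews cover for beta-band dynamics, the sketch below implements a toy delayed STN-GPe firing-rate loop. The parameters are illustrative guesses rather than values from any fitted model, but with strong reciprocal coupling (as expected after dopamine loss) and a realistic loop delay, the simulated population rate settles into an oscillation in or near the 15–30 Hz beta band.

```python
import numpy as np

dt = 0.1                       # time step (ms)
T = 2000.0                     # simulated time (ms)
delay = 6.0                    # assumed conduction/synaptic delay in each direction (ms)
d = int(delay / dt)
tau_stn, tau_gpe = 6.0, 14.0   # assumed population time constants (ms)
w_gs, w_sg = 5.0, 5.0          # GPe->STN inhibition and STN->GPe excitation ("dopamine-depleted")

steps = int(T / dt)
stn = np.full(steps, 10.0)     # population firing rates (spikes/s)
gpe = np.full(steps, 30.0)

def f(x):                      # piecewise-linear population activation
    return np.clip(x, 0.0, 300.0)

for t in range(d, steps - 1):
    stn_drive = 20.0 - w_gs * gpe[t - d]   # constant cortical drive minus delayed pallidal inhibition
    gpe_drive = w_sg * stn[t - d] - 10.0   # delayed subthalamic excitation minus striatal inhibition
    stn[t + 1] = stn[t] + dt / tau_stn * (-stn[t] + f(stn_drive))
    gpe[t + 1] = gpe[t] + dt / tau_gpe * (-gpe[t] + f(gpe_drive))

# Dominant frequency of the steady-state STN rate.
x = stn[steps // 2:] - stn[steps // 2:].mean()
freqs = np.fft.rfftfreq(x.size, d=dt / 1000.0)
print(f"peak frequency ~ {freqs[np.abs(np.fft.rfft(x)).argmax()]:.1f} Hz")
```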


2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Kwan Tung Li ◽  
Junhao Liang ◽  
Changsong Zhou

Gamma oscillations in neural circuits are believed to be associated with effective learning in the brain, but the underlying mechanism is unclear. This paper studies how spike-timing-dependent plasticity (STDP), a typical learning mechanism, interacts with gamma oscillations in neural circuits to shape network dynamics and network structure formation. We study an excitatory-inhibitory (E-I) integrate-and-fire neuronal network with triplet STDP, heterosynaptic plasticity, and transmitter-induced plasticity. Our results show that the effects of plasticity differ across synchronization levels. We find that gamma oscillations benefit synaptic potentiation among stimulated neurons by forming a particular network structure in which the summed excitatory input synaptic strength to a neuron is correlated with its summed inhibitory input synaptic strength. The circuit maintains E-I balanced input on average, although this balance is temporarily broken during learning-induced oscillations. Our study reveals a potential mechanism for the benefits of gamma oscillation on learning in biological neural circuits.
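To make the triplet-STDP bookkeeping concrete, below is a minimal time-stepped implementation of the trace-based triplet rule of Pfister and Gerstner (2006), on which triplet-STDP models are commonly built. It omits the heterosynaptic and transmitter-induced terms and the E-I network used in the paper, and the parameter values are in the ballpark of published fits rather than taken from this study.

```python
import numpy as np

dt = 1.0                              # time step (ms)
tau_plus, tau_x = 16.8, 101.0         # presynaptic trace time constants (ms)
tau_minus, tau_y = 33.7, 125.0        # postsynaptic trace time constants (ms)
A2p, A3p = 5e-10, 6.2e-3              # pair / triplet potentiation amplitudes
A2m, A3m = 7e-3, 2.3e-4               # pair / triplet depression amplitudes

def run_triplet_stdp(pre_spikes, post_spikes, w=0.5):
    """pre_spikes, post_spikes: boolean arrays, one entry per dt bin."""
    r1 = r2 = o1 = o2 = 0.0
    for pre, post in zip(pre_spikes, post_spikes):
        if pre:    # depression gated by the fast postsynaptic trace o1
            w -= o1 * (A2m + A3m * r2)
        if post:   # potentiation gated by the fast presynaptic trace r1
            w += r1 * (A2p + A3p * o2)
        # increment traces on spikes (after using their pre-spike values), then decay
        r1 += pre;  r2 += pre
        o1 += post; o2 += post
        r1 -= dt * r1 / tau_plus;  r2 -= dt * r2 / tau_x
        o1 -= dt * o1 / tau_minus; o2 -= dt * o2 / tau_y
    return w

# Example: 100 pre-before-post pairings at +5 ms, repeated at 20 Hz, net-potentiate
# the synapse (the returned weight ends above its initial value of 0.5).
n_bins = 5000
pre = np.zeros(n_bins, dtype=bool);  pre[::50] = True
post = np.zeros(n_bins, dtype=bool); post[5::50] = True
print(run_triplet_stdp(pre, post))
```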


2014 ◽  
Vol 15 (Suppl 1) ◽  
pp. P90
Author(s):  
Bolun Chen ◽  
Jan R Engelbrecht ◽  
Renato Mirollo
