Synaptic Learning
Recently Published Documents

TOTAL DOCUMENTS: 102 (five years: 37)
H-INDEX: 19 (five years: 3)

Author(s): Mohit Kumar Gautam, Sanjay Kumar, Shaibal Mukherjee

Abstract: Here, we report the fabrication of a Y2O3-based memristive crossbar array, along with an analytical model to evaluate the performance of such a memristive array system and to understand its forgetting and retention behavior in neuromorphic computation. The developed analytical model can simulate a highly dense memristive-crossbar-array-based neural network of biological synapses. These biological synapses control the communication efficiency between neurons and can implement the learning capability of the neurons. During electrical stimulation of the memristive devices, a memory transition is exhibited as the number of applied voltage pulses increases, which is analogous to real human brain functionality. Further, to capture the forgetting and retention behavior of the memristive devices, a modified window function equation is proposed by incorporating two novel internal state variables in the form of a forgetting rate and retention. The obtained results confirm that the effect of variation in electrical stimuli on forgetting and retention is similar to that in the biological brain. Therefore, the developed analytical memristive model can further be utilized in memristive systems to develop real-world applications in neuromorphic domains.
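As an illustration of the kind of state equation the abstract describes, the sketch below combines a standard Joglekar-style window function with relaxation toward a retention level at a forgetting rate. The functional form, parameter names, and values are assumptions for illustration, not the equation published in the paper.

```python
import numpy as np

def window(x, p=2):
    """Joglekar-style window function keeping the state x in [0, 1]."""
    return 1.0 - (2.0 * x - 1.0) ** (2 * p)

def step(x, v, dt, k=1e-3, forgetting_rate=0.05, retention=0.2, p=2):
    """One Euler step of a hypothetical memristor state update.

    During a voltage pulse the state grows as k * v * window(x); between
    pulses it decays toward the `retention` level at `forgetting_rate`.
    Both extra terms are assumed stand-ins for the paper's two internal
    state variables, not its published equation.
    """
    drive = k * v * window(x, p)                 # pulse-driven potentiation
    decay = -forgetting_rate * (x - retention)   # spontaneous forgetting
    return float(np.clip(x + dt * (drive + decay), 0.0, 1.0))

# Apply a train of identical pulses, then let the device relax (forget).
x, dt, trace = 0.1, 1e-2, []
for t in range(2000):
    v = 1.0 if (t < 500 and t % 10 < 5) else 0.0   # pulse train, then rest
    x = step(x, v, dt)
    trace.append(x)
```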


2021, Vol. 118 (51), pp. e2111821118
Author(s): Yuhan Helena Liu, Stephen Smith, Stefan Mihalas, Eric Shea-Brown, Uygar Sümbül

Brains learn tasks via experience-driven differential adjustment of their myriad individual synaptic connections, but the mechanisms that target appropriate adjustment to particular connections remain deeply enigmatic. While Hebbian synaptic plasticity, synaptic eligibility traces, and top-down feedback signals surely contribute to solving this synaptic credit-assignment problem, alone, they appear to be insufficient. Inspired by new genetic perspectives on neuronal signaling architectures, here, we present a normative theory for synaptic learning, where we predict that neurons communicate their contribution to the learning outcome to nearby neurons via cell-type–specific local neuromodulation. Computational tests suggest that neuron-type diversity and neuron-type–specific local neuromodulation may be critical pieces of the biological credit-assignment puzzle. They also suggest algorithms for improved artificial neural network learning efficiency.
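The abstract's central idea, per-synapse eligibility traces gated by a locally broadcast, cell-type-specific modulatory signal, can be sketched as a three-factor learning rule. Everything below (variable names, the tanh readout, the form of the type-wise signal) is an illustrative assumption rather than the paper's normative derivation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post, n_types = 20, 10, 3

W = rng.normal(0, 0.1, (n_post, n_pre))       # synaptic weights
elig = np.zeros_like(W)                       # per-synapse eligibility traces
cell_type = rng.integers(0, n_types, n_post)  # assumed cell-type label per neuron

def update(W, elig, pre, post, type_signal, lr=1e-2, decay=0.9):
    """Three-factor update: a Hebbian eligibility trace gated by a local,
    cell-type-specific modulatory signal (an assumed stand-in for the
    paper's local neuromodulation)."""
    elig = decay * elig + np.outer(post, pre)      # Hebbian trace
    modulator = type_signal[cell_type][:, None]    # one scalar per cell type
    W = W + lr * modulator * elig                  # gated weight change
    return W, elig

pre = rng.random(n_pre)
post = np.tanh(W @ pre)
type_signal = rng.normal(0, 1, n_types)            # e.g. a type-wise credit estimate
W, elig = update(W, elig, pre, post, type_signal)
```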


2021
Author(s): Mike Gilbert

Abstract: This paper presents a model of learning by the cerebellar circuit. In the traditional and dominant learning model, training teaches finely graded parallel fibre synaptic weights, which modify transmission to Purkinje cells and to interneurons that inhibit Purkinje cells. Following training, input in a learned pattern drives a training-modified response. The function is that the naive response to input rates is displaced by a learned one, trained under external supervision. In the proposed model, there is no weight-controlled, graduated balance of excitation and inhibition of Purkinje cells. Instead, the balance has two functional states (a switch) at the synaptic, whole-cell and microzone levels. The paper is in two parts. The first is a detailed physiological argument for the synaptic learning function. The second uses the function in a computational simulation of pattern memory. Against expectation, this generates a predictable outcome from input chaos (real-world variables). Training always forces synaptic weights away from the middle and towards the limits of the range, causing them to polarise, so that transmission is either robust or blocked. All conditions teach the same outcome, such that all learned patterns receive the same, rather than a bespoke, effect on transmission. In this model, the function of learning is gating: it selects patterns that merely trigger output, rather than modifying output. The outcome is memory-operated gate activation acting on a two-state balance of weight-controlled transmission. Group activity of parallel fibres simultaneously carries a second code in collective rates, which varies independently of the pattern code. A two-state response to the pattern code allows faithful, graduated control of Purkinje cell firing by the rate code, at gated times.
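A minimal sketch of the claimed polarisation effect, in which training pushes parallel fibre weights toward the ends of their range so that transmission becomes effectively two-state, is given below. The update rule and parameters are assumptions for illustration and do not reproduce the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_synapses = 1000
w = rng.uniform(0.4, 0.6, n_synapses)      # naive weights start mid-range

def train_step(w, active, rate=0.1):
    """Push weights of synapses active in the trained pattern toward 1
    and the rest toward 0, producing a polarised (two-state) distribution."""
    w = np.where(active, w + rate * (1.0 - w), w - rate * w)
    return np.clip(w, 0.0, 1.0)

pattern = rng.random(n_synapses) < 0.05    # sparse learned pattern
for _ in range(50):
    w = train_step(w, pattern)

# After training, almost every weight sits near 0 or 1: transmission is
# either robust or blocked, i.e. a gate rather than a graded response.
print(np.histogram(w, bins=[0.0, 0.1, 0.9, 1.0])[0])
```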


2021, Vol. 15
Author(s): Rounak Chatterjee, Janet L. Paluh, Souradeep Chowdhury, Soham Mondal, Arnab Raha, ...

Synaptic function and experience-dependent plasticity across multiple synapses depend on the types of neurons interacting as well as on the intricate mechanisms that operate at the molecular level of the synapse. Understanding the complexity of information processing in synaptic networks will rely in part on effective computational models, and such models should also evaluate disruptions to synaptic function by multiple mechanisms. By co-developing algorithms alongside hardware, real-time analysis metrics can be prioritized together with biological complexity. The hippocampus is implicated in autism spectrum disorders (ASD), and within this region glutamatergic neurons constitute 90% of the neurons integral to the functioning of neuronal networks. Here we generate a computational model referred to as ASD interrogator (ASDint) and corresponding hardware to enable in silicon analysis of multiple ASD mechanisms affecting glutamatergic neuron synapses. The hardware architecture, Synaptic Neuronal Circuit (SyNC), is a novel GPU accelerator or neural net that extends discovery by acting as a biologically realistic neuron synapse in real time. Co-developed ASDint and SyNC expand spiking neural network models of plasticity to comparative analysis of retrograde messengers. The SyNC model is realized in an ASIC architecture, which enables computation of increasingly complex scenarios without sacrificing the area efficiency of the model. Here we apply the ASDint model to analyse neuronal circuitry dysfunctions associated with ASD synaptopathies and their effects on the synaptic learning parameter, and we demonstrate SyNC on an ideal ASDint scenario. Our work highlights the value of secondary pathways in evaluating complex ASD synaptopathy mechanisms. By comparing the degree of variation in the synaptic learning parameter to the response obtained from simulations of the ideal scenario, we determine the potency and timing of the effect of a particular evaluated mechanism. Hence, simulations of such scenarios in even a small neuronal network now allow us to identify the relative impacts of changed parameters and their effects on synaptic function. Based on this, we can estimate the minimum fraction of neurons exhibiting a particular dysfunction scenario required to lead to complete failure of a neural network to coordinate pre-synaptic and post-synaptic outputs.
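The comparison described here, measuring how far a dysfunction scenario pulls the synaptic learning parameter away from an ideal baseline, could be prototyped along the following lines. All names, the toy plasticity rule, and the retrograde-messenger gain are assumptions for illustration; they do not reproduce ASDint or SyNC.

```python
import numpy as np

def simulate_learning(retrograde_gain=1.0, steps=200, lr=0.05, target=1.0):
    """Toy synapse whose learning parameter converges toward `target`;
    `retrograde_gain` scales a secondary (retrograde-messenger-like)
    pathway that modulates the update, used here as an assumed
    dysfunction knob."""
    w = 0.0
    trace = np.empty(steps)
    for t in range(steps):
        w += lr * retrograde_gain * (target - w)
        trace[t] = w
    return trace

ideal = simulate_learning(retrograde_gain=1.0)     # ideal scenario
impaired = simulate_learning(retrograde_gain=0.3)  # hypothetical dysfunction

# Degree and timing of the deviation from the ideal scenario.
deviation = ideal - impaired
print("max deviation:", deviation.max(), "at step", int(deviation.argmax()))
```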


2021
Author(s): James B Priestley, John C Bowler, Sebi V Rolotti, Stefano Fusi, Attila Losonczy

Neurons in the hippocampus exhibit striking selectivity for specific combinations of sensory features, forming representations which are thought to subserve episodic memory. Even during a completely novel experience, ensembles of hippocampal "place cells" are rapidly configured such that the population sparsely encodes visited locations, stabilizing within minutes of the first exposure to a new environment. What cellular mechanisms enable this fast encoding of experience? Here we leverage virtual reality and large-scale neural recordings to dissect the effects of novelty and experience on the dynamics of place field formation. We show that the place fields of many CA1 neurons transiently shift locations and modulate the amplitude of their activity immediately after place field formation, consistent with rapid plasticity mechanisms driven by plateau potentials and somatic burst spiking. These motifs were particularly enriched during initial exploration of a novel context and decayed with experience. Our data suggest that novelty modulates the effective learning rate in CA1, favoring burst-driven field formation to support fast synaptic updating during new experience.
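A minimal sketch of plateau-driven place field formation of the kind the abstract invokes is shown below, using a behavioral-timescale-plasticity-style rule in which inputs active within a long window around a plateau event are potentiated. The kernel shape and all parameters are assumptions, not fits to the recorded data.

```python
import numpy as np

n_inputs, track_len = 100, 100
centers = np.linspace(0, track_len, n_inputs)   # spatially tuned input neurons
w = np.zeros(n_inputs)

def input_activity(pos, width=5.0):
    """Gaussian spatial tuning of each input around its place field centre."""
    return np.exp(-0.5 * ((pos - centers) / width) ** 2)

def plateau_update(w, positions, plateau_step, tau=10.0, lr=0.5):
    """Potentiate inputs active near the time of a dendritic plateau,
    weighted by temporal proximity (an assumed BTSP-like kernel;
    tau is in time-step units, standing in for a seconds-long window)."""
    for t, pos in enumerate(positions):
        kernel = np.exp(-abs(t - plateau_step) / tau)
        w = w + lr * kernel * input_activity(pos)
    return w

positions = np.linspace(0, track_len, 200)          # one traversal of the track
w = plateau_update(w, positions, plateau_step=100)  # plateau at mid-track
# The resulting weights form a place field centred near the plateau location.
```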


eLife, 2021, Vol. 10
Author(s): Eric Torsten Reifenstein, Ikhwan Bin Khalid, Richard Kempter

Remembering the temporal order of a sequence of events is a task easily performed by humans in everyday life, but the underlying neuronal mechanisms are unclear. This problem is particularly intriguing as human behavior often proceeds on a time scale of seconds, which is in stark contrast to the much faster millisecond time-scale of neuronal processing in our brains. One long-held hypothesis in sequence learning suggests that a particular temporal fine-structure of neuronal activity - termed 'phase precession' - enables the compression of slow behavioral sequences down to the fast time scale of the induction of synaptic plasticity. Using mathematical analysis and computer simulations, we find that - for short enough synaptic learning windows - phase precession can improve temporal-order learning tremendously and that the asymmetric part of the synaptic learning window is essential for temporal-order learning. To test these predictions, we suggest experiments that selectively alter phase precession or the learning window and evaluate memory of temporal order.
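A minimal sketch of the compression argument: with an asymmetric learning window, two events separated by a second of behaviour produce almost no weight change, but once phase precession compresses the interval into the millisecond range, the window strengthens the forward (correct-order) association and weakens the reverse one. The exponential window and the compression factor below are illustrative assumptions.

```python
import numpy as np

def learning_window(dt, a_plus=1.0, a_minus=0.5, tau=20e-3):
    """Asymmetric STDP-like window: potentiation when the presynaptic
    spike precedes the postsynaptic one (dt > 0), depression otherwise."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

# Two events separated by 1 s of behaviour; phase precession compresses
# the interval into the ~10 ms range where the window is effective.
behavioural_dt = 1.0
compression = 100.0                            # assumed compression factor
compressed_dt = behavioural_dt / compression

dw_uncompressed = learning_window(behavioural_dt)   # essentially zero
dw_forward = learning_window(compressed_dt)         # A before B: strengthened
dw_backward = learning_window(-compressed_dt)       # B before A: weakened
print(dw_uncompressed, dw_forward, dw_backward)
```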


2021, Vol. 12 (14), pp. 3600-3606
Author(s): Hengjie Zhang, Chuantong Cheng, Beiju Huang, Huan Zhang, Run Chen, ...

2021, Vol. 118 (10), pp. 103502
Author(s): Ya Lin, Jilin Liu, Jiajuan Shi, Tao Zeng, Xuanyu Shan, ...

2021, Vol. 11, pp. 1100-1110
Author(s): Trishala R. Desai, Tukaram D. Dongale, Swapnil R. Patil, Arpita Pandey Tiwari, Pankaj K. Pawar, ...
