Hebbian Learning Rule Restraining Catastrophic Forgetting in Pulse Neural Network

2003 ◽ Vol 123 (6) ◽ pp. 1124-1133 ◽ Author(s): Makoto Motoki, Tomoki Hamagami, Seiichi Koakutsu, Hironori Hirata

2005 ◽ Vol 151 (3) ◽ pp. 50-60 ◽ Author(s): Makoto Motoki, Tomoki Hamagami, Seiichi Koakutsu, Hironori Hirata

2011 ◽ Vol 225-226 ◽ pp. 479-482 ◽ Author(s): Min Xia, Ying Cao Zhang, Xiao Ling Ye

A nonlinear function constitution and dynamic synapses are proposed for the Hopfield neural network to suppress spurious states. The model of the dynamical connection weights and the updating scheme for the neuron states are given. The nonlinear function constitution improves on the conventional Hebbian learning rule, which uses a linear outer-product method. Simulation results show that both the nonlinear function constitution and the dynamic synapses effectively increase error tolerance; furthermore, an associative memory built with the new method both enlarges the basins of attraction and increases the storage capacity.
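
For reference, the baseline that the nonlinear constitution improves on is the linear outer-product (Hebbian) rule of the standard Hopfield associative memory. The sketch below is a minimal NumPy illustration of that baseline only; the variable names are hypothetical, and the paper's nonlinear constitution and dynamic synapses are not reproduced here.

import numpy as np

def hebbian_weights(patterns):
    # patterns: (P, N) array of bipolar (+1/-1) memory patterns
    n = patterns.shape[1]
    W = patterns.T @ patterns / n        # linear outer-product (Hebbian) rule
    np.fill_diagonal(W, 0.0)             # no self-connections
    return W

def recall(W, state, steps=20):
    # synchronous sign updates until the state stops changing
    for _ in range(steps):
        new_state = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# toy usage: store two random patterns, then recover one from a noisy cue
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
W = hebbian_weights(patterns)
cue = patterns[0].copy()
flipped = rng.choice(64, size=8, replace=False)
cue[flipped] *= -1                       # corrupt 8 of the 64 bits
print(np.array_equal(recall(W, cue), patterns[0]))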


2019 ◽ Vol 6 (4) ◽ pp. 181098 ◽ Author(s): Le Zhao, Jie Xu, Xiantao Shang, Xue Li, Qiang Li, ...

Non-volatile memristors are promising for future hardware-based neurocomputing applications because they can emulate biological synaptic functions. Various material strategies have been studied in pursuit of better device performance, such as lower energy cost and better biological plausibility. In this work, we show a novel design for a non-volatile memristor based on a CoO/Nb:SrTiO3 heterojunction. We found that the memristor intrinsically exhibits resistive switching behaviour, which can be ascribed to the migration of oxygen vacancies and to charge trapping and detrapping at the heterojunction interface. The carrier trapping/detrapping level can be finely adjusted by regulating the voltage amplitude, so gradual conductance modulation can be realized with suitable voltage pulse stimulation. Spike-timing-dependent plasticity (STDP), an important Hebbian learning rule, has also been implemented in the device. Our results indicate the possibility of achieving artificial synapses with the CoO/Nb:SrTiO3 heterojunction. Compared with filamentary synaptic devices, our device has the potential to reduce energy consumption, enable large-scale neuromorphic systems and operate more reliably, since no structural distortion occurs.
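
As a companion to the STDP behaviour described above, the following is a minimal sketch of the standard pair-based exponential STDP window; the amplitudes and time constants are illustrative assumptions, not values extracted from the device.

import numpy as np

# Pair-based exponential STDP: the weight change depends on the timing
# difference dt = t_post - t_pre. Parameter values are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair separated by dt_ms."""
    if dt_ms > 0:     # pre fires before post -> potentiation
        return A_PLUS * np.exp(-dt_ms / TAU_PLUS)
    elif dt_ms < 0:   # post fires before pre -> depression
        return -A_MINUS * np.exp(dt_ms / TAU_MINUS)
    return 0.0

# example: the characteristic antisymmetric STDP window
for dt in (-40, -10, -1, 1, 10, 40):
    print(dt, round(stdp_dw(dt), 5))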


1989 ◽ Vol 03 (07) ◽ pp. 555-560 ◽ Author(s): M. V. Tsodyks

We consider the Hopfield model with the simplest form of the Hebbian learning rule, in which only simultaneous activity of the pre- and post-synaptic neurons leads to modification of the synapse. An extra inhibition proportional to the total network activity is needed. Both symmetric non-diluted and asymmetric diluted networks are considered. The model performs well at an extremely low level of activity, p < K^(-1/2), where K is the mean number of synapses per neuron.
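
A minimal sketch of the ingredients the abstract names: sparse 0/1 patterns, a Hebbian rule that potentiates a synapse only when both neurons are active in a stored pattern, and a global inhibition proportional to the total activity. The network size, activity level, inhibition strength and threshold below are illustrative assumptions, not the paper's parameters.

import numpy as np

rng = np.random.default_rng(1)
N, P, p = 1000, 30, 0.02                      # neurons, patterns, activity level

eta = (rng.random((P, N)) < p).astype(float)  # sparse 0/1 memory patterns

# Hebbian rule: a synapse grows only when pre- and post-synaptic neurons
# are simultaneously active in a stored pattern.
J = eta.T @ eta
np.fill_diagonal(J, 0.0)

def update(S, inhibition=0.3, theta=2.0):
    # Local field minus a global inhibition proportional to the total
    # network activity, minus a firing threshold theta.
    h = J @ S - inhibition * S.sum() - theta
    return (h > 0).astype(float)

# retrieval test: start from a degraded version of pattern 0
S = eta[0] * (rng.random(N) < 0.8)            # drop about 20% of active units
for _ in range(10):
    S = update(S)
overlap = (S @ eta[0]) / eta[0].sum()
print(round(float(overlap), 3))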


1996 ◽ Vol 8 (3) ◽ pp. 545-566 ◽ Author(s): Christopher W. Lee, Bruno A. Olshausen

An intrinsic limitation of linear Hebbian networks is that they can learn only from the linear pairwise correlations within an input stream. To explore what higher forms of structure could be learned with a nonlinear Hebbian network, we constructed a model network containing a simple form of nonlinearity and applied it to the problem of learning to detect the disparities present in random-dot stereograms. The network consists of three layers, with nonlinear sigmoidal activation functions in the second-layer units. The nonlinearities allow the second layer to transform the pixel-based representation in the input layer into a new representation based on coupled pairs of left-right inputs. The third layer then clusters patterns occurring on the second-layer outputs according to their disparity via a standard competitive learning rule. Analysis of the network dynamics shows that the second-layer units' nonlinearities interact with the Hebbian learning rule to expand the region over which pairs of left-right inputs are stable. The learning rule is neurobiologically inspired and plausible, and the model may shed light on how the nervous system learns to use coincidence detection in general.
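
To make the two learning mechanisms concrete, the following is a minimal sketch of a Hebbian update applied to sigmoidal hidden units followed by a winner-take-all competitive update in an output layer; the layer sizes, learning rates and weight normalization are assumptions for illustration, not the paper's architecture.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 16, 8, 4
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output
eta1, eta2 = 0.01, 0.05

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x):
    global W1, W2
    h = sigmoid(W1 @ x)                    # nonlinear second-layer activity
    # Hebbian update: strengthen weights in proportion to pre * post,
    # then renormalize each row to keep the weights bounded.
    W1 += eta1 * np.outer(h, x)
    W1 /= np.linalg.norm(W1, axis=1, keepdims=True)
    # Competitive (winner-take-all) learning in the third layer: only the
    # most active output unit moves toward the current hidden pattern.
    winner = np.argmax(W2 @ h)
    W2[winner] += eta2 * (h - W2[winner])

# toy usage on random binary input vectors
for _ in range(200):
    train_step(rng.integers(0, 2, size=n_in).astype(float))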

