Efficient implementation of synaptic learning rules for neuromorphic computing based on plasma-treated ZnO nanowire memristors
2019 · Vol 53 (5) · pp. 055303
Author(s): Jiandong Wan, Wenbiao Qiu, Yunfeng Lai, Peijie Lin, Qiao Zheng, ...
2020
Author(s): Ning Yue, Lai Yunfeng, Wan Jiandong, Cheng Shuying, Zheng Qiao, ...

1992 · Vol 4 (5) · pp. 691-702
Author(s): Ralph Linsker

A network that develops to maximize the mutual information between its output and the signal portion of its input (which is admixed with noise) is useful for extracting salient input features, and may provide a model for aspects of biological neural network function. I describe a local synaptic learning rule that performs stochastic gradient ascent in this information-theoretic quantity, for the case in which the input-output mapping is linear and the input signal and noise are multivariate Gaussian. Feedforward connection strengths are modified by a Hebbian rule during a "learning" phase in which examples of input signal plus noise are presented to the network, and by an anti-Hebbian rule during an "unlearning" phase in which examples of noise alone are presented. Each recurrent lateral connection has two values of connection strength, one for each phase; these values are updated by an anti-Hebbian rule.
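The two-phase feedforward update described in the abstract can be sketched as follows. This is a minimal illustration, not the full algorithm: the recurrent lateral connections (with their two per-phase strengths) are omitted, and the learning rate, dimensions, noise scale, and weight normalization are hypothetical choices, since the abstract does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 4
eta = 0.01                                       # learning rate (assumed value)
W = rng.normal(scale=0.1, size=(n_out, n_in))    # feedforward connection strengths

for step in range(1000):
    # "learning" phase: present signal plus noise, apply a Hebbian update
    signal = rng.normal(size=n_in)
    noise = rng.normal(scale=0.5, size=n_in)
    x = signal + noise
    y = W @ x                                    # linear input-output mapping
    W += eta * np.outer(y, x)                    # Hebbian: dW proportional to y x^T

    # "unlearning" phase: present noise alone, apply an anti-Hebbian update
    x_noise = rng.normal(scale=0.5, size=n_in)
    y_noise = W @ x_noise
    W -= eta * np.outer(y_noise, x_noise)        # anti-Hebbian: dW proportional to -y x^T

    # crude norm bound to keep the sketch stable (not part of the stated rule)
    W /= max(1.0, np.linalg.norm(W))
```

Averaged over examples, the Hebbian term pushes the weights toward directions with high signal-plus-noise variance while the anti-Hebbian term subtracts the noise-only contribution, which is the intuition behind ascending the signal/noise information-theoretic objective.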

