Hopfield neural network in magnetic textures with intrinsic Hebbian learning

2021 ◽ Vol 104 (18) ◽ Author(s): Weichao Yu, Jiang Xiao, Gerrit E. W. Bauer

2021 ◽ Vol 15 ◽ Author(s): Corentin Delacour, Aida Todri-Sanial

The Oscillatory Neural Network (ONN) is an emerging neuromorphic architecture in which oscillators represent neurons and information is encoded in the oscillators' phase relations. In an ONN, oscillators are coupled through electrical elements that define the network's weights, enabling massively parallel computation. Because the weights determine the network's functionality, mapping weights onto coupling elements plays a crucial role in ONN performance. In this work, we investigate relaxation oscillators based on VO2, and we propose a methodology for mapping Hebbian coefficients to ONN coupling resistances that enables large-scale ONN design. We develop an analytical framework that maps weight coefficients to coupling-resistor values and use it to analyze ONN performance. We report an ONN of 60 fully connected oscillators that performs pattern recognition as a Hopfield neural network.
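A minimal sketch of the weight-to-resistance mapping idea, assuming bipolar patterns and the standard Hebbian outer-product rule; the inverse-linear map, the resistor range, and all function names are illustrative assumptions, not the paper's exact methodology (weight sign is represented separately, e.g. by in-phase vs. anti-phase coupling):

```python
import numpy as np

def hebbian_weights(patterns):
    """Standard Hebbian outer-product rule W = (1/N) sum_p x_p x_p^T,
    with self-coupling removed."""
    pats = np.asarray(patterns, dtype=float)       # shape (P, N), entries +/-1
    w = pats.T @ pats / pats.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def weights_to_resistances(w, r_min=10e3, r_max=1e6):
    """Map |w_ij| to a coupling resistance: larger magnitude -> higher
    conductance -> lower resistance. The inverse-linear map and the
    10 kOhm .. 1 MOhm range are illustrative assumptions."""
    g_min, g_max = 1.0 / r_max, 1.0 / r_min
    mag = np.abs(w)
    g = g_min + (g_max - g_min) * mag / mag.max()  # conductance grows with |w|
    resistances = 1.0 / g
    signs = np.sign(w)  # +1: in-phase coupling, -1: anti-phase coupling
    return resistances, signs

# Example at the paper's scale: 60 fully connected oscillators, 3 stored patterns.
rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 60))
r, s = weights_to_resistances(hebbian_weights(patterns))
```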


2001 ◽ Vol 6 (2) ◽ pp. 129-136 ◽ Author(s): Jiyang Dong, Shenchu Xu, Zhenxiang Chen, Boxi Wu

The discrete Hopfield neural network (DHNN) is studied by performing permutation operations on the synaptic weight matrix. The set of patterns that can be stored with the Hebbian learning algorithm without loss of memories is studied, and a condition is proposed that ensures all patterns in this storable set have basins of attraction of the same size. The permutation symmetries of the network are then studied in connection with the stored pattern set. A storable pattern set satisfying the condition is constructed by requiring its invariance under a point group.
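As an illustration of the permutation-symmetry statement, the sketch below (with names of my choosing) stores a pattern together with all of its cyclic shifts, the simplest case of a point-group-invariant set; the resulting Hebbian weight matrix is circulant, so the cyclic relabeling P satisfies P W P^T = W:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian outer-product rule with zero diagonal."""
    pats = np.asarray(patterns, dtype=float)
    w = pats.T @ pats / pats.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def is_permutation_symmetry(w, perm):
    """True if relabeling neurons by `perm` leaves the weights invariant,
    i.e. P W P^T == W for the corresponding permutation matrix P."""
    p = np.eye(w.shape[0])[perm]
    return np.allclose(p @ w @ p.T, w)

rng = np.random.default_rng(0)
base = rng.choice([-1, 1], size=8)
stored = [np.roll(base, k) for k in range(8)]  # set closed under cyclic shifts
w = hebbian_weights(stored)
cyclic = np.roll(np.arange(8), 1)              # the generating rotation
print(is_permutation_symmetry(w, cyclic))      # True
```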


2011 ◽ Vol 225-226 ◽ pp. 479-482 ◽ Author(s): Min Xia, Ying Cao Zhang, Xiao Ling Ye

A nonlinear function construction and dynamic synapses are proposed to suppress spurious states in the Hopfield neural network. The model of the dynamical connection weights and the update scheme for the neuron states are given. The nonlinear function construction improves on the conventional Hebbian learning rule, which is a linear outer-product method. Simulation results show that both the nonlinear function construction and the dynamic synapses effectively increase error tolerance; moreover, with the new method the network's associative memory both enlarges the basins of attraction and increases the storage capacity.
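A minimal sketch contrasting the conventional linear outer-product rule with a nonlinear modification of it; taking the sign function as the nonlinearity (the classic clipped-Hebbian choice) is an illustrative assumption, not the specific function construction or dynamic-synapse scheme of the paper:

```python
import numpy as np

def outer_product_weights(patterns, f=None):
    """Conventional Hebbian rule W = (1/N) sum_p x_p x_p^T; if f is given,
    it is applied elementwise as a nonlinear modification of that rule."""
    pats = np.asarray(patterns, dtype=float)
    w = pats.T @ pats / pats.shape[1]
    if f is not None:
        w = f(w)
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, probe, sweeps=20, seed=0):
    """Asynchronous Hopfield updates starting from a noisy probe state."""
    rng = np.random.default_rng(seed)
    s = probe.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(2)
patterns = rng.choice([-1, 1], size=(5, 100))
noisy = patterns[0] * rng.choice([1, -1], size=100, p=[0.85, 0.15])
w_lin = outer_product_weights(patterns)           # linear outer-product rule
w_non = outer_product_weights(patterns, np.sign)  # clipped (nonlinear) variant
for w in (w_lin, w_non):
    print(np.mean(recall(w, noisy) == patterns[0]))  # fraction of bits recovered
```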


2009 ◽ Vol 29 (4) ◽ pp. 1028-1031 ◽ Author(s): Wei-xin GAO, Xiang-yang MU, Nan TANG, Hong-liang YAN

2006 ◽ Vol 13B (3) ◽ pp. 323-328 ◽ Author(s): Yukhuu Ankhbayar, Suk-Hyung Hwang, Young-Sup Hwang
