Gradient descent learning rule for complex-valued associative memories with large constant terms

2016 ◽  
Vol 11 (3) ◽  
pp. 357-363 ◽  
Author(s):  
Masaki Kobayashi


2018 ◽
Vol 8 (3) ◽  
pp. 237-249 ◽  
Author(s):  
Teijiro Isokawa ◽  
Hiroki Yamamoto ◽  
Haruhiko Nishimura ◽  
Takayuki Yumoto ◽  
Naotake Kamiura ◽  
...  

Abstract: In this paper, we investigate the stability of patterns embedded as associative memories in a complex-valued Hopfield neural network, in which the neuron states are encoded as phase values on the unit circle of the complex plane. As learning schemes for embedding patterns in the network, the projection rule and the iterative learning rule are formally extended to the complex-valued case. The retrieval of patterns embedded by the iterative learning rule is demonstrated, and the stability of the embedded patterns is quantitatively investigated.
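
The abstract does not reproduce the formulas, so as a rough illustration here is a minimal Python/NumPy sketch of two ingredients it names: phase-encoded neuron states and projection-rule storage for a complex-valued Hopfield network. The function names and the synchronous update are illustrative assumptions; the iterative learning rule studied in the paper is not shown.

```python
import numpy as np

def phase_quantize(z, K):
    """Snap each complex value to the nearest of K phase states on the unit circle."""
    q = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(1j * 2 * np.pi * q / K)

def projection_weights(X):
    """Projection-rule weights W = X (X^H X)^{-1} X^H for pattern matrix X (N x P).

    Requires linearly independent patterns; each stored pattern is then an
    exact fixed point of the linear map: W @ x == x."""
    G = X.conj().T @ X                      # P x P Gram matrix
    return X @ np.linalg.inv(G) @ X.conj().T

def recall(W, x, K, steps=100):
    """Synchronous recall: repeatedly apply W and re-quantize the phases."""
    for _ in range(steps):
        x_new = phase_quantize(W @ x, K)
        if np.allclose(x_new, x):           # reached a fixed point
            break
        x = x_new
    return x
```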


2017 ◽  
Vol 7 (4) ◽  
pp. 257-264 ◽  
Author(s):  
Toshifumi Minemoto ◽  
Teijiro Isokawa ◽  
Haruhiko Nishimura ◽  
Nobuyuki Matsui

Abstract: The Hebbian learning rule is well known as a memory-storing scheme for associative memory models. This scheme is simple and fast; however, its performance degrades when the memory patterns are not mutually orthogonal. Pseudo-orthogonalization is a decorrelating method for memory patterns that uses XNOR masking between the memory patterns and randomly generated patterns. By combining this method with the Hebbian learning rule, the storage capacity of an associative memory for non-orthogonal patterns is improved without high computational cost. The memory patterns can also be retrieved by a simulated annealing method using an external stimulus pattern. By utilizing complex numbers and quaternions, the pseudo-orthogonalization can be extended to complex-valued and quaternionic Hopfield neural networks. In this paper, the extended pseudo-orthogonalization methods for associative memories based on complex numbers and quaternions are examined from the viewpoint of correlations in the memory patterns. We show that the method has more stable recall performance on highly correlated memory patterns than the conventional real-valued method.
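
The masking step is easiest to state in the real-valued (±1) case, where XNOR is simply elementwise multiplication; the Python sketch below shows that case only, not the complex or quaternionic extensions the paper examines. The masking probability, the correlated-pattern construction, and the function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def xnor_mask(patterns, mask_prob=0.5):
    """Partially XNOR-mask +-1 patterns with independent random patterns.

    In +-1 encoding XNOR(a, b) = a * b, so masking an element is a random
    sign flip. Masking decorrelates the stored patterns at the cost of
    moving them away from the originals."""
    flips = np.where(rng.random(patterns.shape) < mask_prob,
                     rng.choice([-1, 1], size=patterns.shape), 1)
    return patterns * flips

def hebb_weights(patterns):
    """Hebbian (outer-product) storage with zero self-connections."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

# Highly correlated patterns: each agrees with a common base pattern about
# 80% of the time, which cripples the plain Hebb rule.
P, N = 10, 200
base = rng.choice([-1, 1], size=N)
patterns = np.where(rng.random((P, N)) < 0.8, base, -base)
W = hebb_weights(xnor_mask(patterns))   # store the decorrelated versions
```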


2008 ◽  
Vol 18 (02) ◽  
pp. 147-156 ◽  
Author(s):  
MASAKI KOBAYASHI

HAM (Hopfield Associative Memory) and BAM (Bidirectional Associative Memory) are representative neural-network associative memories. The storage capacity of the commonly used Hebb rule is extremely low. To improve it, several learning methods have been introduced, for example pseudo-inverse matrix learning and gradient descent learning. Oh introduced the pseudo-relaxation learning algorithm to HAM and BAM. To accelerate it, Hattori proposed quick learning. Noest proposed CAM (Complex-valued Associative Memory), a complex-valued HAM. The storage capacity of CAM under the Hebb rule is also extremely low. Pseudo-inverse matrix learning and gradient descent learning have already been generalized to CAM. In this paper, we apply the pseudo-relaxation learning algorithm to CAM in order to improve its capacity.
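
The abstract does not spell out the pseudo-relaxation algorithm; as a rough flavor of the idea, the Python sketch below applies a relaxation-style (Agmon-Motzkin) update to the real-valued ±1 HAM case, not the complex-valued CAM formulation the paper develops. The margin, relaxation factor, stopping rule, and the decision to leave self-connections unconstrained are all simplifying assumptions.

```python
import numpy as np

def pseudo_relaxation_train(patterns, margin=1.0, lam=1.0, epochs=100):
    """Relaxation-style training for a +-1 Hopfield associative memory.

    Pattern x is stable at neuron j when x[j] * (W[j] @ x) >= margin.
    Each violated inequality triggers a relaxation step that moves row
    W[j] toward the constraint hyperplane; lam in (0, 2) sets the step."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for x in patterns:
            s = W @ x                     # net inputs under current weights
            for j in range(N):
                if x[j] * s[j] < margin:
                    stable = False
                    # Relaxation step toward a @ W[j] >= margin with
                    # a = x[j] * x, using ||a||^2 = N for +-1 patterns.
                    W[j] += lam * (margin - x[j] * s[j]) * x[j] * x / N
        if stable:                        # all stability constraints met
            break
    return W
```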


1992 ◽  
Vol 4 (6) ◽  
pp. 946-957 ◽  
Author(s):  
Marcus Frean

The thermal perceptron is a simple extension of Rosenblatt's perceptron learning rule for training individual linear threshold units. It finds stable weights for nonseparable problems as well as separable ones. Experiments indicate that, provided a good initial setting for the temperature parameter T0 has been found, the thermal perceptron outperforms the Pocket algorithm and methods based on gradient descent. The learning rule stabilizes the weights (learns) over a fixed training period; for separable problems it finds separating weights much more quickly than the usual rules.
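
As a concrete reading of the rule described above, here is a short Python sketch of a thermal perceptron: the usual perceptron correction is attenuated by exp(-|phi|/T), where phi is the net input, and T is annealed from T0 toward zero over the fixed training period. The linear annealing schedule and hyperparameters are illustrative assumptions, not the paper's exact experimental settings.

```python
import numpy as np

def thermal_perceptron(X, y, T0=1.0, epochs=1000, seed=0):
    """Thermal perceptron for a single +-1 linear threshold unit.

    X: (n_samples, n_features) inputs (include a bias column), y: +-1
    targets. The classic correction y*x is attenuated by exp(-|phi|/T),
    phi = w @ x, so deep misclassifications barely move the weights; as
    T anneals to zero the weights settle even on nonseparable data."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for epoch in range(epochs):
        T = T0 * (1 - epoch / epochs)          # linear annealing schedule
        for i in rng.permutation(len(X)):
            phi = w @ X[i]
            if y[i] * phi <= 0:                # misclassified example
                w += y[i] * X[i] * np.exp(-abs(phi) / max(T, 1e-12))
    return w
```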

