Hyperbolic-Valued Hopfield Neural Networks in Synchronous Mode

2020 · Vol 32 (9) · pp. 1685-1696 · Author(s): Masaki Kobayashi

For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If the networks converged in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions in synchronous mode for hyperbolic Hopfield neural networks (HHNNs) rather than CHNNs; HHNNs also provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule are shown to converge in synchronous mode. Through computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains high noise tolerance.
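
For reference, synchronous recall updates every neuron at once from the same previous state vector. A minimal sketch in the familiar complex-valued setting (the paper's hyperbolic algebra would replace the complex arithmetic; `csign`, `K`, and `max_iter` are illustrative names, not from the paper):

```python
import numpy as np

def csign(z, K):
    """Quantize each complex value to the nearest of K points on the unit circle."""
    k = np.round(np.mod(np.angle(z), 2 * np.pi) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * k / K)

def synchronous_recall(W, x, K, max_iter=100):
    """Update all neurons simultaneously until a fixed point is reached."""
    for _ in range(max_iter):
        x_next = csign(W @ x, K)
        if np.allclose(x_next, x):
            break
        x = x_next
    return x
```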

2020 · Vol 32 (11) · pp. 2237-2248 · Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory, but its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks; since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with a bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
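
The paper's bicomplex projection rule itself is not reproduced here, but the decomposition it rests on is the standard idempotent splitting with e± = (1 ± ij)/2, under which a bicomplex number is equivalent to a pair of ordinary complex numbers and multiplication acts componentwise. A sketch, with component conventions that are my own:

```python
def bicomplex_split(a, b, c, d):
    """Split a bicomplex number a + b*i + c*j + d*ij (i^2 = j^2 = -1, ij = ji)
    into its two complex idempotent components; bicomplex products then
    reduce to componentwise complex products."""
    return complex(a + d, b - c), complex(a - d, b + c)

def bicomplex_join(z1, z2):
    """Recombine the idempotent components into (a, b, c, d)."""
    a = (z1.real + z2.real) / 2
    b = (z1.imag + z2.imag) / 2
    c = (z2.imag - z1.imag) / 2
    d = (z1.real - z2.real) / 2
    return a, b, c, d
```

Splitting a bicomplex network along e± in this way yields two independent complex-valued networks, which is presumably the sense in which the decomposition reduces the weight count.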


2021 · pp. 1-19 · Author(s): Masaki Kobayashi

Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. Both CHNNs and QHNNs have weak noise tolerance due to the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance, but they have another disadvantage: self-feedback, a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended, and the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves the noise tolerance by reducing self-feedback, and computer simulations confirm this improvement.
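
The extended stability conditions are specific to the paper, but the self-feedback they target is simply the diagonal of the weight matrix: a neuron whose own state feeds its next update can reinforce a noisy component instead of letting the other neurons correct it. A hypothetical illustration of the crudest remedy, zeroing the diagonal outright (the paper instead reduces self-feedback within the projection rule):

```python
import numpy as np

def remove_self_feedback(W):
    """Zero the diagonal so no neuron's update depends on its own state."""
    W = W.copy()
    np.fill_diagonal(W, 0)
    return W
```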


2021 · pp. 1-11 · Author(s): Masaki Kobayashi

Hopfield neural networks have been extended using hypercomplex numbers. The algebra of bicomplex numbers, also referred to as commutative quaternions, is a number system of dimension 4. Since its multiplication is commutative, many notions and theories of linear algebra, such as the determinant, are available, unlike for quaternions. A bicomplex-valued Hopfield neural network (BHNN) has been proposed as a multistate neural associative memory; however, the existing stability conditions have been insufficient for the projection rule. In this work, the stability conditions are extended and applied to improve the projection rule. Computer simulations suggest improved noise tolerance.
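
As a concrete illustration of the commutativity, here is a sketch of bicomplex multiplication on components (a, b, c, d) ~ a + b·i + c·j + d·ij, with i² = j² = -1 and ij = ji; the product formula is symmetric in its two arguments, so pq = qp:

```python
import numpy as np

def bicomplex_mul(p, q):
    """Multiply bicomplex numbers given as component arrays (a, b, c, d)."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return np.array([
        a1*a2 - b1*b2 - c1*c2 + d1*d2,  # real part
        a1*b2 + b1*a2 - c1*d2 - d1*c2,  # i part
        a1*c2 + c1*a2 - b1*d2 - d1*b2,  # j part
        a1*d2 + d1*a2 + b1*c2 + c1*b2,  # ij part
    ])

p = np.array([1.0, 2.0, 3.0, 4.0])
q = np.array([0.5, -1.0, 2.0, 0.0])
assert np.allclose(bicomplex_mul(p, q), bicomplex_mul(q, p))  # pq == qp
```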


2021 · pp. 1-15 · Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of a CHNN. Dual connections (DCs) were introduced to QHNNs to improve the noise tolerance; they take advantage of the noncommutativity of quaternions and consist of two weights between each pair of neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half of that for CHNNs and equals that of conventional QHNNs. This small storage capacity is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and it is proved by stochastic analysis that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
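
The stochastic capacity analysis is the paper's contribution and is not reproduced here; the Hebbian rule itself is the classical outer-product construction. A complex-valued sketch (the dual-connection quaternionic version would keep two weights per neuron pair):

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule for a complex-valued Hopfield net: W = (1/N) sum_p x_p x_p^H,
    with the self-feedback diagonal removed."""
    N = patterns[0].shape[0]
    W = sum(np.outer(x, np.conj(x)) for x in patterns) / N
    np.fill_diagonal(W, 0)
    return W
```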


2009 · pp. 236-255 · Author(s): Donq-Liang Lee

New design methods for complex-valued multistate Hopfield associative memories (CVHAMs) are presented. The author of this chapter shows that the well-known projection rule can be generalized to the complex domain, so that the weight matrix of the CVHAM can be designed by the generalized inverse technique. The stability of the presented CVHAM is analyzed by an energy function approach, which shows that in synchronous update mode a CVHAM is guaranteed to converge to a fixed point from any given initial state. Moreover, the projection geometry of the generalized projection rule is discussed. To enhance the recall capability, a strategy for eliminating spurious memories is reported. Next, a generalized intraconnected bidirectional associative memory (GIBAM) is introduced; a GIBAM is a complex generalization of the intraconnected BAM (IBAM). Lee shows that the design of the GIBAM can also be accomplished by the generalized inverse technique. Finally, the validity and performance of the introduced methods are investigated by computer simulation.
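
A minimal sketch of the generalized inverse technique in the complex domain: with the stored patterns as the columns of X, the weight matrix W = X X⁺ (Moore-Penrose pseudoinverse) projects onto their span, so each stored pattern is an exact fixed point of the linear part of the update.

```python
import numpy as np

def projection_weights(patterns):
    """Generalized inverse design: W = X X^+ with stored patterns as columns.
    Each stored pattern x then satisfies W @ x == x (up to rounding), so it
    survives a synchronous update unchanged."""
    X = np.column_stack(patterns)
    return X @ np.linalg.pinv(X)
```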


2017 · Vol 2017 · pp. 1-6 · Author(s): Masaki Kobayashi

Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, so CHNNs are suitable for storing multilevel data, such as gray-scale images. However, CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. Through computer simulations, we show that the proposed recall algorithm not only accelerates recall but also improves noise tolerance.
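
For context, the multistate representation used for gray-scale storage maps K gray levels to the K-th roots of unity on the complex unit circle; a sketch of the encoding and decoding (function names are illustrative):

```python
import numpy as np

def encode_gray(levels, K):
    """Map integer gray levels 0..K-1 to K-th roots of unity (neuron states)."""
    return np.exp(2j * np.pi * np.asarray(levels) / K)

def decode_gray(states, K):
    """Map unit-circle states back to the nearest gray level."""
    return np.mod(np.round(np.angle(states) * K / (2 * np.pi)), K).astype(int)
```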


2004 · Vol 14 (05) · pp. 337-345 · Author(s): Zhigang Zeng, De-Shuang Huang, Zengfu Wang

This paper presents new theoretical results on the global exponential stability of cellular neural networks with time-varying delays. The stability conditions depend on the external inputs, connection weights, and delays of the cellular neural networks. Using these results, global exponential stability of cellular neural networks can be derived, and an estimate for the location of the equilibrium point can also be obtained. Finally, the simulation results demonstrate the validity and feasibility of the proposed approach.
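
The stability conditions themselves are the paper's result; as a companion, here is a sketch of simulating such a delayed network by Euler integration, using the standard piecewise-linear CNN activation and a constant delay in place of the paper's time-varying one (all parameter names are mine):

```python
import numpy as np

def simulate_delayed_cnn(A, B, u, tau, T=20.0, dt=0.01):
    """Euler integration of dx/dt = -x + A f(x(t)) + B f(x(t - tau)) + u,
    with f(s) = (|s + 1| - |s - 1|) / 2, the standard CNN activation."""
    f = lambda s: (np.abs(s + 1) - np.abs(s - 1)) / 2
    steps, lag = int(T / dt), int(tau / dt)
    x = np.zeros((steps + 1, len(u)))  # zero initial history
    for t in range(steps):
        delayed = x[t - lag] if t >= lag else x[0]
        x[t + 1] = x[t] + dt * (-x[t] + A @ f(x[t]) + B @ f(delayed) + u)
    return x
```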

