Complex-valued Hopfield neural networks with real weights in synchronous mode

2021 ◽ Vol 423 ◽ pp. 535-540
Author(s): Masaki Kobayashi

2020 ◽ Vol 32 (11) ◽ pp. 2237-2248
Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require a large amount of memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of the CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
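The multistate activation and the projection rule that these abstracts rely on can be sketched in a few lines. This is a generic illustration of the standard phasor formulation of a CHNN, not the bicomplex decomposition of the paper; the function names are ours.

```python
import numpy as np

def multistate_activation(z, K):
    """Map a complex value to the nearest of the K states
    exp(2*pi*i*k/K) on the unit circle (standard phasor activation)."""
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)

def projection_rule(patterns):
    """Projection (pseudo-inverse) learning rule:
    W = X (X^H X)^{-1} X^H, where the columns of X are the stored patterns."""
    X = np.column_stack(patterns)
    return X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T
```

Under the projection rule every stored pattern is an exact fixed point of the pre-activation update, since W x_p = x_p; reducing the number of weight parameters, as the bicomplex projection rule does, shrinks the matrix W that must be stored.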


2020 ◽ Vol 32 (9) ◽ pp. 1685-1696
Author(s): Masaki Kobayashi

For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If they were to converge in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode instead of CHNNs. HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. By computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.
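To make the synchronous/asynchronous distinction concrete, here is a minimal sketch of both recall modes (shown with a real-valued sign activation for brevity; the update scheme is the standard Hopfield one, the function names are ours):

```python
import numpy as np

def recall_synchronous(W, x, activation, max_iters=100):
    """All neurons update in parallel each step; recall is fast,
    but convergence requires the stability conditions discussed above."""
    for _ in range(max_iters):
        y = activation(W @ x)
        if np.array_equal(y, x):   # reached a fixed point
            break
        x = y
    return x

def recall_asynchronous(W, x, activation, max_sweeps=100):
    """Neurons update one at a time; convergence conditions are milder,
    but recall cannot exploit parallel processing."""
    x = x.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):
            s = activation(W[i] @ x)
            if s != x[i]:
                x[i] = s
                changed = True
        if not changed:
            break
    return x
```

In synchronous mode one matrix-vector product updates every neuron at once, which is why convergence guarantees in that mode translate directly into faster recall.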


2021 ◽ pp. 1-15
Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of a CHNN. Dual connections (DCs) are introduced to QHNNs to improve the noise tolerance. The DCs take advantage of the noncommutativity of quaternions and consist of two weights between neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and stochastic analysis proves that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
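The Hebbian rule mentioned above can be sketched as follows; this is the generic complex-valued form, not the quaternionic dual-connection variant analysed in the paper, and the function name is ours:

```python
import numpy as np

def hebbian_rule(patterns):
    """Complex Hebbian learning: W = (1/n) * sum_p x_p x_p^H.
    Each stored pattern adds its outer product to the weight matrix."""
    n = len(patterns[0])
    return sum(np.outer(x, x.conj()) for x in patterns) / n
```

With a single stored pattern of unit-modulus entries, W x = x exactly; crosstalk between multiple stored patterns is what limits the storage capacity that the stochastic analysis estimates.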


2021 ◽ pp. 1-19
Author(s): Masaki Kobayashi

Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. CHNNs and QHNNs have weak noise tolerance due to the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance. However, KHNNs have another disadvantage, self-feedback, a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended. Moreover, the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves the noise tolerance by reducing self-feedback. Computer simulations support that the proposed projection rule improves the noise tolerance of KHNNs.
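The paper's modified projection rule is not reproduced here, but the role of self-feedback can be illustrated generically: self-feedback lives on the diagonal of the weight matrix, so reducing it means shrinking or zeroing that diagonal. A minimal sketch (our own helper name):

```python
import numpy as np

def remove_self_feedback(W):
    """Zero the diagonal of a weight matrix, eliminating the
    self-feedback term w_ii * x_i from each neuron's input."""
    W = W.copy()
    np.fill_diagonal(W, 0)
    return W
```

Zeroing the diagonal outright is the crudest option; the abstract's modified projection rule instead builds the reduction into the learning rule via the extended stability conditions.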


Author(s): K. Anitha ◽ R. Dhanalakshmi ◽ K. Naresh ◽ D. Rukmani Devi

Neural networks play a significant role in data classification. The complex-valued Hopfield neural network (CHNN) is widely used in various fields, including image classification. Though the CHNN has proven its credibility in classification tasks, it has a few issues: the activation function of a complex-valued neuron maps onto the unit circle in the complex plane, which affects the resolution factor, flexibility, and compatibility to changes during adaptation in retrieval systems. The proposed work demonstrates a content-based image retrieval (CBIR) system with hyperbolic Hopfield neural networks (HHNNs), an analogue of the CHNN, for classifying images. The activation function of a hyperbolic neuron is not cyclic in the hyperbolic plane. The images are mathematically represented and indexed using six basic features. The proposed HHNN classifier is trained, tested, and evaluated through extensive experiments considering individual features and four combined features for indexing. The obtained results show that the HHNN guides the retrieval process, enhances system performance, and minimizes the cost of implementing a neural network classifier-based image retrieval system.


2018 ◽ Vol 2018 ◽ pp. 1-5
Author(s): Masaki Kobayashi

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by two-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
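Jankowski et al.'s analysis is probabilistic, but the quantity it estimates is easy to probe empirically: store P random phasor patterns with the Hebbian rule and check whether they survive one synchronous update. A minimal sketch for a generic CHNN (not the twin-multistate quaternion model; the helper names are ours):

```python
import numpy as np

def multistate_activation(z, K):
    """Quantize each complex value to the nearest of K unit-circle states."""
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)

def one_step_recall_rate(n, P, K, trials=20, seed=0):
    """Fraction of trials in which a stored pattern is a fixed point of
    one synchronous update under Hebbian learning -- a crude empirical
    proxy for whether P patterns fit in an n-neuron network."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(trials):
        # P random K-state phasor patterns, one per row
        X = np.exp(2j * np.pi * rng.integers(0, K, size=(P, n)) / K)
        W = X.T @ X.conj() / n                    # Hebbian rule
        ok += np.allclose(multistate_activation(W @ X[0], K), X[0])
    return ok / trials
```

With P = 1 there is no crosstalk and the rate is exactly 1; it drops as P grows, which is the storage-capacity effect the normal-distribution approximation of the crosstalk terms quantifies analytically.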


Author(s): Zengyun Wang ◽ Jinde Cao ◽ Zhenyuan Guo ◽ Lihong Huang

Some dynamical behaviours of discontinuous complex-valued Hopfield neural networks are discussed in this paper. First, we introduce a method to construct complex-valued set-valued mappings and give some basic definitions for discontinuous complex-valued differential equations. In addition, the Leray–Schauder alternative theorem is used to analyse the existence of equilibria of the networks. Lastly, we present the dynamical behaviours, including global stability and convergence in measure, for discontinuous complex-valued neural networks (CVNNs) via differential inclusions. The main contribution of this paper is that we extend previous studies on continuous CVNNs to discontinuous ones. Several simulations are given to substantiate the correctness of the proposed results.

