Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules

2017 · Vol 2017 · pp. 1-6
Author(s): Masaki Kobayashi

Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multiple states, so CHNNs can store multilevel data, such as gray-scale images. However, CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and escaping local minima. In the present work, we propose a new recall algorithm that eliminates these local minima. Computer simulations show that the proposed recall algorithm not only accelerates recall but also improves noise tolerance.
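To make the recall setting concrete, here is a minimal NumPy sketch of standard CHNN asynchronous recall with a K-state activation function; the paper's local-minima-eliminating algorithm itself is not reproduced, and the names `csign` and `recall_async` are illustrative.

```python
import numpy as np

K = 8  # assumed number of states per neuron (e.g., gray levels)

def csign(u):
    """Quantize a complex potential to the nearest of the K unit-circle
    states exp(2j*pi*k/K) -- the usual multistate activation."""
    k = np.round(np.mod(np.angle(u), 2 * np.pi) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * k / K)

def recall_async(W, x, max_sweeps=100):
    """Asynchronous recall: update one neuron at a time until no
    neuron changes state, i.e., a local minimum of the energy."""
    x = x.copy()
    rng = np.random.default_rng()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(x)):
            new = csign(W[i] @ x)
            if new != x[i]:
                x[i], changed = new, True
        if not changed:
            break
    return x
```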

2020 · Vol 32 (11) · pp. 2237-2248
Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks to reduce these requirements, but their architectures are much more complicated than that of a CHNN and should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, obtained by decomposing bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
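For reference, the projection rule that the bicomplex variant builds on is the standard pseudo-inverse rule; below is a minimal sketch assuming the stored patterns are linearly independent. The bicomplex decomposition itself is not shown.

```python
import numpy as np

def projection_rule(X):
    """X: complex array of shape (N, P), one stored pattern per column.
    Returns W = X (X^H X)^{-1} X^H, which makes every stored pattern a
    fixed point when the columns are linearly independent."""
    G = X.conj().T @ X                       # P x P Gram matrix
    return X @ np.linalg.inv(G) @ X.conj().T
```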


2021 · pp. 1-15
Author(s): Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of a CHNN. Dual connections (DCs) were introduced to QHNNs to improve noise tolerance. DCs take advantage of the noncommutativity of quaternions and consist of two weights between neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and stochastic analysis proves that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
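The storage-capacity comparison rests on the Hebbian rule; a minimal Monte Carlo sketch for the baseline complex-valued (CHNN) case is given below, with illustrative sizes. The quaternionic dual-connection variant analyzed in the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 100, 4  # illustrative network size and number of states

def hebb(X):
    """Hebbian weights W = (1/N) X X^H with self-connections removed."""
    W = X @ X.conj().T / X.shape[0]
    np.fill_diagonal(W, 0)
    return W

def csign(u):
    k = np.round(np.mod(np.angle(u), 2 * np.pi) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * k / K)

# Count how many of P stored random patterns remain fixed points.
for P in (5, 10, 20):
    X = np.exp(2j * np.pi * rng.integers(0, K, (N, P)) / K)
    W = hebb(X)
    stable = sum(np.allclose(csign(W @ X[:, p]), X[:, p]) for p in range(P))
    print(f"P={P}: {stable}/{P} stored patterns are fixed points")
```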


2018 · Vol 2018 · pp. 1-5
Author(s): Masaki Kobayashi

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important issue for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by two-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
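In the same spirit, the crosstalk term of the complex-valued Hebbian rule can be sampled empirically and compared with a two-dimensional normal; this sketch covers only the CHNN case of Jankowski et al., not the twin-multistate quaternion model, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, K, trials = 200, 20, 8, 2000

samples = []
for _ in range(trials):
    X = np.exp(2j * np.pi * rng.integers(0, K, (N, P)) / K)
    h = (X @ (X.conj().T @ X[:, 0]))[0] / N  # Hebbian local field at neuron 0
    samples.append(h - X[0, 0])              # field minus signal = crosstalk
crosstalk = np.array(samples)
print("mean:", crosstalk.mean())
print("std (real, imag):", crosstalk.real.std(), crosstalk.imag.std())
```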


2013 · Vol 2013 · pp. 1-9
Author(s): Xia Huang, Zhen Wang, Yuxia Li

A fractional-order two-neuron Hopfield neural network with delay is proposed based on the classic Hopfield neural network, and the complex dynamical behaviors of this network are investigated. A great variety of dynamical phenomena, including single-periodic, multiple-periodic, and chaotic motions, are found to exist. The existence of chaotic attractors is further verified by bifurcation diagrams and phase portraits.
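A rough sense of such simulations can be had from a Grünwald-Letnikov discretization; the sketch below assumes an order-q fractional derivative, a tanh activation, and a coupling delay tau, with weight values that are purely illustrative (they are not the paper's parameters).

```python
import numpy as np

q, h, tau, T = 0.95, 0.01, 1.0, 40.0     # assumed order, step, delay, horizon
W = np.array([[2.0, -1.2], [1.8, 1.7]])  # illustrative weights
n, d = int(T / h), int(tau / h)

# Grunwald-Letnikov coefficients: c_0 = 1, c_j = (1 - (1+q)/j) c_{j-1}
c = np.ones(n + 1)
for j in range(1, n + 1):
    c[j] = (1 - (1 + q) / j) * c[j - 1]

x = np.zeros((n + 1, 2))
x[: d + 1] = 0.1                         # constant initial history
for k in range(d + 1, n + 1):
    # D^q x = -x(t) + W tanh(x(t - tau)), explicit GL step:
    f = -x[k - 1] + np.tanh(x[k - 1 - d]) @ W.T
    x[k] = h**q * f - (c[1:k + 1, None] * x[k - 1::-1]).sum(axis=0)
# Trajectories in x feed the phase portraits and bifurcation diagrams.
```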


2018 · Vol 29 (08) · pp. 1850076
Author(s): Jiandu Liu, Bokui Chen, Dengcheng Yan, Lei Wang

Calculating the exact number of fixed points and attractors of an arbitrary Hopfield neural network is an NP-hard problem. In this paper, we first calculate the average number of fixed points in such networks as a function of network size and neuron threshold, using a statistical method that has been applied to counting the average number of metastable states in spin-glass systems. The same method is then extended to study the average number of attractors. The calculated results agree qualitatively with numerical simulations, and the discrepancies between them are well explained.
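For small networks the count can be obtained exactly by enumeration, which clarifies what the statistical method estimates at scales where enumeration is infeasible; a brute-force sketch with illustrative parameters:

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
N, theta = 10, 0.0                 # illustrative size and neuron threshold
J = rng.normal(size=(N, N))
J = (J + J.T) / 2                  # symmetric couplings
np.fill_diagonal(J, 0)

count = 0
for s in itertools.product((-1, 1), repeat=N):
    s = np.array(s)
    if np.all(s * (J @ s - theta) >= 0):   # no neuron wants to flip
        count += 1
print("exact number of fixed points:", count)  # cost grows as 2^N
```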


2013 · Vol 2013 · pp. 1-10
Author(s): Yanxia Sun, Zenghui Wang, Barend Jacobus van Wyk

A new neural-network-based optimization algorithm is proposed. The presented model is a discrete-time, continuous-state Hopfield neural network whose states are updated synchronously. The proposed algorithm combines the advantages of traditional particle swarm optimization (PSO), chaos, and Hopfield neural networks: particles learn from their own experience and the experiences of surrounding particles, their search behavior is ergodic, and convergence of the swarm is guaranteed. The effectiveness of the proposed approach is demonstrated through simulations on typical optimization problems.
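One way to picture the combination is the sketch below: plain PSO velocity updates in which logistic-map iterates replace the uniform random numbers, giving the ergodic search behavior mentioned above. The Hopfield-network state formulation and convergence guarantee of the paper are not reproduced; the objective and all constants are illustrative.

```python
import numpy as np

def sphere(x):                         # illustrative test objective
    return np.sum(x**2, axis=1)

n_particles, dim, iters = 20, 5, 200
rng = np.random.default_rng(3)
x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
z = rng.uniform(0.1, 0.9, (n_particles, dim))   # chaotic state

pbest, pval = x.copy(), sphere(x)
g = pbest[np.argmin(pval)].copy()

for _ in range(iters):
    z = 4.0 * z * (1.0 - z)            # logistic map (fully chaotic at r=4)
    z2 = 4.0 * z * (1.0 - z)           # a second chaotic draw
    v = 0.7 * v + 1.5 * z * (pbest - x) + 1.5 * z2 * (g - x)
    x = x + v
    f = sphere(x)
    better = f < pval
    pbest[better], pval[better] = x[better], f[better]
    g = pbest[np.argmin(pval)].copy()
print("best objective value:", pval.min())
```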


2021 · pp. 1-19
Author(s): Masaki Kobayashi

Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. CHNNs and QHNNs have weak noise tolerance due to the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving the rotational invariance. However, KHNNs have another disadvantage: self-feedback, a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended, and the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves noise tolerance by reducing self-feedback. Computer simulations support that the proposed projection rule improves the noise tolerance of KHNNs.
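The kind of simulation referred to can be sketched as follows: store patterns with a projection rule, suppress self-feedback by zeroing the weight-matrix diagonal, corrupt a stored pattern, and measure recall. Complex weights stand in for the Klein algebra, and the diagonal-zeroing step is only a simple proxy for the paper's modified rule; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P, K, flips = 50, 5, 4, 10          # illustrative sizes and noise level

def csign(u):
    k = np.round(np.mod(np.angle(u), 2 * np.pi) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * k / K)

X = np.exp(2j * np.pi * rng.integers(0, K, (N, P)) / K)
W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T   # projection rule
np.fill_diagonal(W, 0)                               # suppress self-feedback

success = 0
for _ in range(100):
    x = X[:, 0].copy()
    idx = rng.choice(N, size=flips, replace=False)   # corrupt some neurons
    x[idx] = np.exp(2j * np.pi * rng.integers(0, K, flips) / K)
    for _ in range(20):                              # asynchronous sweeps
        for i in rng.permutation(N):
            x[i] = csign(W[i] @ x)
    success += np.allclose(x, X[:, 0])
print("recall rate:", success / 100)
```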


2020 · Vol 32 (9) · pp. 1685-1696
Author(s): Masaki Kobayashi

For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If the networks converged in synchronous mode, recall could be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions in synchronous mode for hyperbolic Hopfield neural networks (HHNNs) instead of CHNNs. HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. Computer simulations show that the projection rule for HHNNs in synchronous mode maintains high noise tolerance.
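The distinction between the two modes is just the update schedule; here is a minimal skeleton, with the hyperbolic weights and activation left abstract.

```python
import numpy as np

def recall_sync(W, x, act, max_steps=100):
    """Synchronous mode: every neuron updates at once per step, so the
    whole step is one matrix-vector product -- parallelizable, but it
    only terminates if the stability conditions guarantee convergence."""
    for _ in range(max_steps):
        new = act(W @ x)
        if np.array_equal(new, x):    # reached a fixed point
            return new
        x = new
    return x                          # may be cycling if stability fails
```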


Author(s): K. Anitha, R. Dhanalakshmi, K. Naresh, D. Rukmani Devi

Neural networks play a significant role in data classification. The complex-valued Hopfield neural network (CHNN) is widely used in various fields, including image classification. Though the CHNN has proven its value in classification tasks, it has a few issues: the activation function of a complex-valued neuron maps onto the unit circle in the complex plane, which limits resolution, flexibility, and adaptability during adaptation in retrieval systems. The proposed work demonstrates a content-based image retrieval (CBIR) system using the hyperbolic Hopfield neural network (HHNN), an analogue of the CHNN, for classifying images. The activation function of a hyperbolic neuron is not cyclic in the hyperbolic plane. Images are mathematically represented and indexed using six basic features. The proposed HHNN classifier is trained, tested, and evaluated through extensive experiments considering individual features and four combined features for indexing. The results show that the HHNN guides the retrieval process, enhances system performance, and minimizes the cost of implementing a neural-network-classifier-based image retrieval system.
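A toy version of the indexing-and-retrieval step looks like the sketch below; the six features used here (per-channel mean and standard deviation) are illustrative stand-ins, since the abstract does not specify them, and the HHNN classification stage is omitted.

```python
import numpy as np

def features(img):
    """img: (H, W, 3) RGB array -> 6 features: per-channel mean and std."""
    return np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])

def retrieve(query, index, top_k=5):
    """Rank indexed images by Euclidean distance in feature space."""
    d = np.linalg.norm(index - features(query), axis=1)
    return np.argsort(d)[:top_k]

# usage sketch: index = np.stack([features(im) for im in images])
#               hits  = retrieve(query_image, index)
```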


2021 · pp. 1-11
Author(s): Masaki Kobayashi

Hopfield neural networks have been extended using hypercomplex numbers. The algebra of bicomplex numbers, also referred to as commutative quaternions, is a four-dimensional number system. Since its multiplication is commutative, many notions and theories of linear algebra, such as the determinant, are available, unlike for quaternions. A bicomplex-valued Hopfield neural network (BHNN) has been proposed as a multistate neural associative memory. However, the existing stability conditions have been insufficient for the projection rule. In this work, the stability conditions are extended and applied to improve the projection rule. Computer simulations suggest improved noise tolerance.
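The commutativity in question is easy to verify directly; below is a small sketch of bicomplex multiplication in the basis {1, i, j, ij}, with i^2 = j^2 = -1 and ij = ji.

```python
import numpy as np

def bicomplex_mul(a, b):
    """a, b: length-4 arrays (x1, x2, x3, x4) encoding
    x1 + x2*i + x3*j + x4*ij. Returns the product a * b."""
    a1, a2, a3, a4 = a
    b1, b2, b3, b4 = b
    return np.array([
        a1*b1 - a2*b2 - a3*b3 + a4*b4,   # note (ij)^2 = +1
        a1*b2 + a2*b1 - a3*b4 - a4*b3,
        a1*b3 + a3*b1 - a2*b4 - a4*b2,
        a1*b4 + a4*b1 + a2*b3 + a3*b2,
    ])

# The formula is symmetric in a and b, so multiplication commutes --
# the property that makes determinants and projection rules available.
```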

