Some Remarks on the Stability of Discrete-Time Complex-Valued Multistate Hopfield Neural Networks

Author(s):  
Fidelis Zanetti de Castro ◽  
Marcos Eduardo Valle
2021 ◽  
pp. 1-19
Author(s):  
Masaki Kobayashi

Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. Both CHNNs and QHNNs have weak noise tolerance due to their inherent rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving the rotational invariance, but they suffer from another disadvantage, self-feedback, which is a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended, and the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves noise tolerance by reducing self-feedback. Computer simulations confirm that the proposed projection rule improves the noise tolerance of KHNNs.
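
The role of self-feedback in a projection rule can be illustrated with an ordinary complex-valued network (the Klein algebra of the paper is more involved, so this is only a minimal sketch under our own naming): the projection-rule weight matrix is W = X (X^H X)^{-1} X^H for stored patterns X, and its diagonal entries are the self-feedback terms that the abstract identifies as harmful.

```python
import numpy as np

def projection_weights(X, remove_self_feedback=True):
    """Projection-rule weights W = X (X^H X)^{-1} X^H for the stored
    patterns (columns of X). Zeroing the diagonal removes self-feedback,
    the factor the abstract identifies as degrading noise tolerance."""
    W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T
    if remove_self_feedback:
        np.fill_diagonal(W, 0)  # suppress each neuron's own contribution
    return W
```

With the diagonal kept, W is a projector onto the span of the stored patterns, so every stored pattern is reproduced exactly; removing the diagonal trades this exactness for better basins of attraction.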


2020 ◽  
Vol 32 (9) ◽  
pp. 1685-1696
Author(s):  
Masaki Kobayashi

For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If they were to converge in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode instead of CHNNs. HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. By computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.
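
The distinction between update modes can be sketched for a generic multistate network (not the paper's hyperbolic algebra; function names and the quantizing activation below are our own illustration). In synchronous mode all neurons are updated in parallel from the same previous state, which is what makes recall fast but convergence harder to guarantee:

```python
import numpy as np

def csign(u, K):
    """Quantize each complex potential to the nearest of K unit phasors
    (the standard multistate activation function)."""
    phase = np.round(np.angle(u) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * phase / K)

def recall_synchronous(W, x, K, max_iters=100):
    """Update all neurons in parallel until a fixed point is reached.
    The max_iters guard matters: the abstract notes that CHNNs with a
    projection rule need not converge in this mode."""
    for _ in range(max_iters):
        x_new = csign(W @ x, K)
        if np.allclose(x_new, x):
            return x_new
        x = x_new
    return x
```

In asynchronous mode one neuron is updated at a time, which gives the classical energy-descent convergence argument but loses the parallel speed-up mentioned in the abstract.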


2014 ◽  
Vol 69 (1-2) ◽  
pp. 70-80 ◽  
Author(s):  
Mathiyalagan Kalidass ◽  
Hongye Su ◽  
Sakthivel Rathinasamy

This paper presents a robust analysis approach to the stochastic stability of uncertain Markovian jumping discrete-time neural networks (MJDNNs) with a time delay in the leakage term. By choosing an appropriate Lyapunov functional and using the free-weighting-matrix technique, a set of delay-dependent stability criteria is derived. The stability results depend not only on the upper bounds of the time delays but also on their lower bounds. The obtained stability criteria are established in terms of linear matrix inequalities (LMIs), which can be solved efficiently by standard numerical packages. Finally, illustrative numerical examples with simulation results demonstrate the applicability of the obtained results. It is shown that, even when there is no leakage delay, the obtained results are less restrictive than those in some recent works.
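
The simplest instance of such a Lyapunov-based criterion, stripped of delays, jumps, and uncertainty, is worth seeing concretely (a sketch under our own naming, not the paper's full LMI): x[k+1] = A x[k] is globally asymptotically stable iff the discrete Lyapunov equation P - AᵀPA = I has a positive definite solution P, and that feasibility test is what LMI solvers generalize.

```python
import numpy as np

def discrete_lyapunov_stable(A, tol=1e-9):
    """Check stability of x[k+1] = A x[k] by solving the discrete
    Lyapunov equation P - A.T @ P @ A = I via vectorization
    (vec of the equation gives (I - kron(A.T, A.T)) vec(P) = vec(I))
    and testing P > 0."""
    n = A.shape[0]
    M = np.eye(n * n) - np.kron(A.T, A.T)
    P = np.linalg.solve(M, np.eye(n).ravel()).reshape(n, n)
    # P is symmetric for this symmetric right-hand side; symmetrize
    # against round-off before the eigenvalue test.
    return bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > tol))
```

The delay-dependent criteria of the paper replace this single matrix inequality with a block LMI in several unknown matrices, which is why a numerical LMI package is needed rather than a direct linear solve.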


2020 ◽  
Vol 122 ◽  
pp. 54-67 ◽  
Author(s):  
Fidelis Zanetti de Castro ◽  
Marcos Eduardo Valle

2019 ◽  
Vol 2019 ◽  
pp. 1-13
Author(s):  
YaJun Li ◽  
Quanxin Zhu

This paper is concerned with the stability problem of a class of discrete-time stochastic fuzzy neural networks with mixed delays. New Lyapunov–Krasovskii functionals are proposed and free-weighting matrices are introduced. Novel sufficient conditions for the stability of discrete-time stochastic fuzzy neural networks with mixed delays are established in terms of linear matrix inequalities (LMIs). Finally, numerical examples illustrate the effectiveness and benefits of the proposed method.
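
A deterministic skeleton of the networks being analyzed helps fix ideas (the stochastic and fuzzy terms are omitted, and the names below are ours): a discrete-time network with a discrete delay d couples the current state to a past one through the activation function.

```python
import numpy as np

def simulate_delayed_nn(A, B, d, x0, steps):
    """Iterate x[k+1] = A f(x[k]) + B f(x[k-d]) with f = tanh and a
    constant initial history, a minimal deterministic instance of the
    delayed discrete-time networks studied."""
    xs = [np.asarray(x0, dtype=float)] * (d + 1)  # history x[-d..0]
    for _ in range(steps):
        xs.append(A @ np.tanh(xs[-1]) + B @ np.tanh(xs[-1 - d]))
    return np.array(xs)
```

When the LMI conditions hold, every such trajectory converges to the equilibrium regardless of the initial history; a small contractive example (spectral norms of A and B summing below one) shows the expected decay.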


2010 ◽  
Vol 88 (12) ◽  
pp. 885-898 ◽  
Author(s):  
R. Raja ◽  
R. Sakthivel ◽  
S. Marshal Anthoni

This paper investigates the stability of a class of discrete-time stochastic neural networks with mixed time delays and impulsive effects. By constructing a new Lyapunov–Krasovskii functional and combining it with the linear matrix inequality (LMI) approach, a novel set of sufficient conditions is derived that ensures the global asymptotic stability of the equilibrium point of the addressed discrete-time neural networks. The result is then extended to the robust stability of uncertain discrete-time stochastic neural networks with impulsive effects. An important feature of this paper is that the stability of the equilibrium point is proved under mild conditions on the activation functions, which are not required to be differentiable or strictly monotonic. In addition, two numerical examples show that the proposed method is effective while being less conservative.


2020 ◽  
Vol 32 (11) ◽  
pp. 2237-2248
Author(s):  
Masaki Kobayashi

A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory, but its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks, whose architectures are much more complicated than that of a CHNN and should therefore be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is derived from the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations show that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
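
The decomposition behind this projection rule rests on a standard algebraic fact: via its idempotent elements, the bicomplex algebra splits into two independent complex components, so a bicomplex-valued network decomposes into two CHNNs. A minimal sketch of that representation (the class and its names are ours, for illustration only):

```python
class Bicomplex:
    """Bicomplex number stored in its idempotent representation (z1, z2).
    In this basis both addition and multiplication act componentwise,
    which is what lets a bicomplex-valued network split into two
    independent complex-valued networks."""
    def __init__(self, z1, z2):
        self.z1, self.z2 = complex(z1), complex(z2)

    def __add__(self, other):
        return Bicomplex(self.z1 + other.z1, self.z2 + other.z2)

    def __mul__(self, other):
        return Bicomplex(self.z1 * other.z1, self.z2 * other.z2)
```

Because multiplication never mixes the two components, weights and states can be handled as two ordinary complex arrays, which is the source of the parameter reduction described above.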

