Dissipativity analysis of Markovian Switched Neural Networks using Extended Reciprocally Convex Matrix Inequality

2021 ◽  
Vol 1850 (1) ◽  
pp. 012078
Author(s):  
K Maheswari ◽  
S N Shivapriya
2020 ◽  
Vol 25 (2) ◽  
Author(s):  
Rajarathinam Vadivel ◽  
M. Syed Ali ◽  
Faris Alzahrani ◽  
Jinde Cao ◽  
Young Hoon Joo

This paper addresses the problem of synchronization for decentralized event-triggered uncertain switched neural networks with two additive time-varying delays. A decentralized event-triggered scheme is employed to determine the time instants of communication from the sensors to the central controller, using only the limited information available locally at each sensor. In addition, a class of switched neural networks is analyzed by means of the Lyapunov–Krasovskii functional method combined with a linear matrix inequality (LMI) technique and an average dwell time approach. Sufficient conditions are derived to guarantee the exponential stability of the neural networks under consideration in the presence of admissible parametric uncertainties. Numerical examples are provided to illustrate the effectiveness of the obtained results.
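For intuition, a common relative-threshold rule of the kind used in decentralized event-triggered schemes is sketched below in Python; the function name, the threshold parameter sigma_i, and the quadratic form of the condition are illustrative assumptions, not the exact triggering rule of the paper.

```python
import numpy as np

def decentralized_event_trigger(x_i, x_i_last_sent, sigma_i=0.1):
    """Relative-threshold event check for sensor node i (illustrative sketch).

    x_i           : current local state measurement of node i
    x_i_last_sent : value transmitted at the last event instant
    sigma_i       : triggering threshold, a design parameter (assumed here)

    Returns True when the local measurement error has grown large enough,
    relative to the current state, that a new transmission to the central
    controller should be released.
    """
    e_i = x_i - x_i_last_sent                        # local measurement error
    return float(e_i @ e_i) > sigma_i * float(x_i @ x_i)

# Usage: each sensor evaluates the rule with its own data only.
x_i = np.array([0.8, -0.3])
x_i_last_sent = np.array([0.5, -0.1])
if decentralized_event_trigger(x_i, x_i_last_sent):
    x_i_last_sent = x_i.copy()                       # transmit and hold the new value
```

Because each node compares only its own error against its own state, no global information is needed to decide when to communicate, which is what makes the scheme decentralized.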


2018 ◽  
Vol 450 ◽  
pp. 169-181 ◽  
Author(s):  
Wen-Juan Lin ◽  
Yong He ◽  
Chuan-Ke Zhang ◽  
Fei Long ◽  
Min Wu

2019 ◽  
Vol 42 (2) ◽  
pp. 330-336
Author(s):  
Dongbing Tong ◽  
Qiaoyu Chen ◽  
Wuneng Zhou ◽  
Yuhua Xu

This paper proposes the [Formula: see text]-matrix method to achieve state estimation for Markov switched neural networks with Lévy noise; the method is quite distinct from the linear matrix inequality technique. In addition, based on Lyapunov stability theory, sufficient conditions for exponential stability are derived for the delayed neural networks, and an adaptive update law is obtained. An example verifies the state estimation conditions and confirms the effectiveness of the results.
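As a rough illustration of the kind of estimator such results are built around, a minimal adaptive Luenberger-type observer is sketched below, written for the simplifying full-observation case y(t) = x(t) and with a diagonal adaptive gain; this generic form and the particular adaptive law are assumptions for exposition, not the construction used in the paper.

```latex
\begin{aligned}
\dot{\hat{x}}(t) &= -C\hat{x}(t) + A f\bigl(\hat{x}(t)\bigr)
                   + B f\bigl(\hat{x}(t-\tau(t))\bigr)
                   + K(t)\bigl(y(t)-\hat{x}(t)\bigr), \\
K(t) &= \operatorname{diag}\bigl(k_1(t),\dots,k_n(t)\bigr), \qquad
\dot{k}_i(t) = \gamma_i \bigl(y_i(t)-\hat{x}_i(t)\bigr)^{2}, \quad \gamma_i > 0 .
\end{aligned}
```

Here the gains k_i(t) grow only while the output error is nonzero, so no prior knowledge of suitable gain magnitudes is required; the Lyapunov argument then has to show that the gains stay bounded while the estimation error decays.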


2010 ◽  
Vol 2010 ◽  
pp. 1-14 ◽  
Author(s):  
Choon Ki Ahn

A new robust training law, called the input/output-to-state stable training law (IOSSTL), is proposed for dynamic neural networks with external disturbance. Based on a linear matrix inequality (LMI) formulation, the IOSSTL not only guarantees exponential stability but also reduces the effect of an external disturbance. It is shown that the IOSSTL can be obtained by solving an LMI, which can be done easily with standard numerical packages. Numerical examples are presented to demonstrate the validity of the proposed IOSSTL.
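As a minimal, self-contained sketch of what "solving the LMI with a standard numerical package" looks like in practice, the snippet below checks feasibility of the textbook Lyapunov inequality A'P + PA < 0, P > 0 with CVXPY and the SCS solver; this toy inequality and the choice of tools are stand-ins, not the specific IOSS conditions or the MATLAB toolboxes used in the original work.

```python
import cvxpy as cp
import numpy as np

# Example (assumed) system matrix; any Hurwitz A makes the LMI below feasible.
A = np.array([[-2.0,  1.0],
              [ 0.0, -3.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)   # the Lyapunov matrix to be found
eps = 1e-6                                # small margin to enforce strict inequalities

constraints = [
    P >> eps * np.eye(n),                 # P is positive definite
    A.T @ P + P @ A << -eps * np.eye(n),  # Lyapunov LMI: A'P + PA negative definite
]

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)

print(prob.status)   # 'optimal' here means the LMI is feasible
print(P.value)       # a certificate matrix P
```

If the solver reports feasibility, the returned P certifies exponential stability of the toy system; infeasibility of the strict version indicates that no quadratic Lyapunov certificate of this form exists.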


2010 ◽  
Vol 2010 ◽  
pp. 1-19 ◽  
Author(s):  
Qiankun Song ◽  
Jinde Cao

The problems of global dissipativity and global exponential dissipativity are investigated for uncertain discrete-time neural networks with time-varying delays and general activation functions. By constructing appropriate Lyapunov-Krasovskii functionals and employing the linear matrix inequality technique, several new delay-dependent criteria for checking the global dissipativity and global exponential dissipativity of the addressed neural networks are established in terms of linear matrix inequalities (LMIs), which can be checked numerically using the LMI toolbox in MATLAB. Illustrative examples are given to show the effectiveness of the proposed criteria. It is noteworthy that, because neither model transformation nor free-weighting matrices are employed to deal with cross terms in the derivation of the dissipativity criteria, the obtained results are less conservative and more computationally efficient.
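For context, the underlying notions are standard; a typical discrete-time delayed neural network model is sketched below in generic form (the exact uncertain model and activation assumptions of the paper may differ).

```latex
x(k+1) = C x(k) + A f\bigl(x(k)\bigr) + B f\bigl(x(k-\tau(k))\bigr) + u,
\qquad 0 \le \tau(k) \le \bar{\tau}.
```

The network is called globally dissipative if there is a compact set S in R^n that every trajectory eventually enters and never leaves, and globally exponentially dissipative if trajectories approach such a set at an exponential rate; LMI criteria of this kind certify the existence of S and typically yield an explicit estimate of it.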


2018 ◽  
Vol 275 ◽  
pp. 488-498 ◽  
Author(s):  
Xiaoqing Li ◽  
Kun She ◽  
Shouming Zhong ◽  
Jun Cheng ◽  
Kaibo Shi ◽  
...  

2012 ◽  
Vol 2012 ◽  
pp. 1-8 ◽  
Author(s):  
Yangfan Wang ◽  
Linshan Wang

This paper studies the global exponential robust stability of high-order Hopfield neural networks with time-varying delays. By employing a new Lyapunov-Krasovskii functional and linear matrix inequalities, some criteria for the global exponential robust stability of the high-order neural networks are established; these criteria are easily verifiable and have a wider range of applicability.
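For reference, the standard second-order (high-order) Hopfield model with time-varying delays has the generic form below; the robust versions studied in such papers usually allow the coefficients to vary within given bounds, which is an assumption here rather than the paper's exact model.

```latex
\dot{x}_i(t) = -c_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t-\tau_j(t))\bigr)
  + \sum_{j=1}^{n}\sum_{l=1}^{n} b_{ijl}\, g_j\bigl(x_j(t-\tau_j(t))\bigr)\, g_l\bigl(x_l(t-\tau_l(t))\bigr)
  + I_i, \qquad i = 1,\dots,n.
```

The second-order interaction terms b_{ijl} are what distinguish the high-order model from the classical Hopfield network and are the main source of additional difficulty in the stability analysis.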


2017 ◽  
Vol 10 (02) ◽  
pp. 1750027 ◽  
Author(s):  
Wei Zhang ◽  
Chuandong Li ◽  
Tingwen Huang

In this paper, the stability and periodicity of memristor-based neural networks with time-varying delays are studied. Based on differential inclusion theory and by constructing a proper Lyapunov functional, some sufficient conditions in terms of linear matrix inequalities are obtained for the global exponential stability and the existence of periodic solutions of memristor-based neural networks. Finally, two illustrative examples are given to demonstrate the results.
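The memristive feature enters through state-dependent connection weights, and differential inclusion theory is what makes the resulting discontinuous right-hand side tractable; a generic sketch of the model and its Filippov relaxation is given below (the precise switching rule and weight bounds are assumptions for illustration).

```latex
\begin{aligned}
\dot{x}_i(t) &= -c_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\bigl(x_i(t)\bigr)\, f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij}\bigl(x_i(t)\bigr)\, f_j\bigl(x_j(t-\tau_j(t))\bigr) + I_i, \\
\dot{x}_i(t) &\in -c_i x_i(t)
  + \sum_{j=1}^{n} \operatorname{co}\bigl[\underline{a}_{ij}, \overline{a}_{ij}\bigr]\, f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} \operatorname{co}\bigl[\underline{b}_{ij}, \overline{b}_{ij}\bigr]\, f_j\bigl(x_j(t-\tau_j(t))\bigr) + I_i.
\end{aligned}
```

Here each memristive weight a_{ij}(.) (respectively b_{ij}(.)) switches between two values depending on a threshold condition on the state, and co[.,.] denotes the convex hull used in the Filippov set-valued map.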

