Generalized Neural Networks
Recently Published Documents


TOTAL DOCUMENTS: 73 (FIVE YEARS: 24)

H-INDEX: 21 (FIVE YEARS: 4)

Author(s): Grienggrai Rajchakit, Ramalingam Sriraman, Rajendran Samidurai

Abstract This article discusses the dissipativity analysis of stochastic generalized neural network (NN) models with Markovian jump parameters and time-varying delays. In practical applications, most systems are subject to stochastic perturbations; accordingly, this study considers a class of stochastic NN models. To address this problem, we first construct an appropriate Lyapunov–Krasovskii functional that incorporates more system information. Then, by employing effective integral inequalities, we derive several dissipativity and stability criteria in the form of linear matrix inequalities that can be checked with the MATLAB LMI toolbox. Finally, we present numerical examples to validate the usefulness of the results.
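Criteria of this kind ultimately reduce to an LMI feasibility test carried out numerically. As an illustration only (not the LMI derived in this article), the sketch below checks the classical delay-independent Lyapunov–Krasovskii stability LMI for a linear system with a constant delay, x'(t) = A x(t) + Ad x(t − τ); the matrices A and Ad and the use of Python/CVXPY in place of the MATLAB LMI toolbox are assumptions made for this example.

# Minimal sketch (assumed example, not the article's LMI): delay-independent
# stability of x'(t) = A x(t) + Ad x(t - tau) via the classical
# Lyapunov-Krasovskii condition  [A'P + PA + Q, P*Ad; Ad'*P, -Q] < 0,  P, Q > 0.
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])      # example system matrix (assumed values)
Ad = np.array([[0.2, 0.1],
               [-0.1, 0.2]])     # example delayed-state matrix (assumed values)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict inequalities numerically

# Block LMI assembled from the Lyapunov-Krasovskii derivative bound
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print("LMI feasible (stability certified):", prob.status == cp.OPTIMAL)

The richer LMIs in the article add Markovian jump modes and delay-dependent terms, but the numerical feasibility check follows the same pattern.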


2021, Vol. 2021 (1)
Author(s): Sunisa Luemsai, Thongchai Botmart, Wajaree Weera

Abstract The problem of asymptotic stability and extended dissipativity analysis for generalized neural networks with interval discrete and distributed time-varying delays is investigated. Based on a suitable Lyapunov–Krasovskii functional (LKF), an improved Wirtinger single integral inequality, a novel triple integral inequality, and a convex combination technique, new asymptotic stability and extended dissipativity criteria are obtained for the generalized neural networks with interval discrete and distributed time-varying delays. By the same methods, less conservative asymptotic stability criteria are obtained for a special case of the generalized neural networks. The new asymptotic stability and extended dissipativity criteria are expressed in terms of linear matrix inequalities (LMIs), solvable with the MATLAB LMI toolbox, that cover $H_{\infty}$, $L_{2}$–$L_{\infty}$, passivity, and dissipativity performance by setting parameters in the general performance index. Finally, numerical examples show that the obtained criteria are less conservative than existing results in the literature; further examples illustrate the asymptotic stability and extended dissipativity performance of the generalized neural networks, including a special case.
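The "general performance index" mentioned above is commonly written with four weighting matrices $\Psi_{1}$–$\Psi_{4}$; the formulation below is the standard one in the extended dissipativity literature and is given only as an illustration (the exact matrices and side conditions used in this paper may differ). With output $z(t)$ and disturbance $\omega(t)$, the supply rate is $J(t) = z^{\top}(t)\Psi_{1}z(t) + 2z^{\top}(t)\Psi_{2}\omega(t) + \omega^{\top}(t)\Psi_{3}\omega(t)$, and the system is extended dissipative if, for some scalar $\rho$ and all $T \ge 0$, $\int_{0}^{T} J(t)\,dt \ \ge\ \sup_{0 \le t \le T} z^{\top}(t)\Psi_{4}z(t) + \rho$. Typical choices of $(\Psi_{1},\Psi_{2},\Psi_{3},\Psi_{4})$ recover the special cases: $H_{\infty}$: $(-I,\,0,\,\gamma^{2}I,\,0)$; $L_{2}$–$L_{\infty}$: $(0,\,0,\,\gamma^{2}I,\,I)$; passivity: $(0,\,I,\,\gamma I,\,0)$; $(Q,S,R)$-dissipativity: $(Q,\,S,\,R-\alpha I,\,0)$.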


Symmetry, 2020, Vol. 12 (6), pp. 1035
Author(s): Usa Humphries, Grienggrai Rajchakit, Ramalingam Sriraman, Pramet Kaewmesri, Pharunyou Chanthorn, ...

The main focus of this research is a comprehensive analysis of robust dissipativity issues pertaining to a class of uncertain stochastic generalized neural network (USGNN) models with time-varying delays and Markovian jumping parameters (MJPs). In real-world environments, most practical systems are subject to uncertainties; as a result, we take norm-bounded parameter uncertainties as well as stochastic disturbances into consideration. To address the task, we formulate an appropriate Lyapunov–Krasovskii functional (LKF) and, through the use of effective integral inequalities, derive simplified linear matrix inequality (LMI)-based sufficient conditions. We validate the feasibility of the solutions through numerical examples using MATLAB. The simulation results are analyzed and discussed, and they confirm the feasibility and effectiveness of the obtained theoretical findings.
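Norm-bounded uncertainties of the kind mentioned above are usually modeled as ΔA = M F(t) N with F(t)ᵀF(t) ≤ I and absorbed into the LMI through a scaling variable. The sketch below is a minimal, purely illustrative robust-stability version of that step (not the LMI derived in this paper); the example matrices A, M, N and the use of Python/CVXPY are assumptions.

# Minimal sketch (assumed example, not this paper's LMI): quadratic stability
# of x'(t) = (A + M F(t) N) x(t) with norm-bounded uncertainty F'F <= I,
# via the standard scaled LMI obtained from Petersen's lemma and a Schur
# complement:  [A'P + PA + eps*N'N, P*M; M'*P, -eps*I] < 0,  P > 0, eps > 0.
import numpy as np
import cvxpy as cp

A = np.array([[-3.0, 1.0],
              [0.0, -2.0]])      # nominal system matrix (assumed values)
M = np.array([[0.2], [0.1]])     # uncertainty input direction (assumed values)
N = np.array([[0.5, 0.3]])       # uncertainty output direction (assumed values)
n = A.shape[0]
m = M.shape[1]

P = cp.Variable((n, n), symmetric=True)
eps = cp.Variable(nonneg=True)   # scaling that absorbs the unknown F(t)
margin = 1e-6

lmi = cp.bmat([[A.T @ P + P @ A + eps * (N.T @ N), P @ M],
               [M.T @ P,                           -eps * np.eye(m)]])

constraints = [P >> margin * np.eye(n),
               eps >= margin,
               lmi << -margin * np.eye(n + m)]

prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print("Robust stability LMI feasible:", prob.status == cp.OPTIMAL)

The USGNN conditions in the paper additionally account for stochastic disturbances, delays, and Markovian jumping, but the uncertainty terms enter the LMIs through the same scaling mechanism.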

