Further Result for Globally Asymptotic Stability of a Class of Memristor-Based Recurrent Neural Networks with Time-Varying Delays

2016 ◽ Vol 9 (2) ◽ pp. 19-30
Author(s): Jing Liu, Fang Qiu, Liguo Huang

2013 ◽ Vol 380-384 ◽ pp. 2030-2033
Author(s): Zhen Cai Li, Yang Wang

This paper considers the problem of global asymptotic stability of recurrent neural networks with time-varying delays. A linear matrix inequality (LMI) technique and the Lyapunov functional method are employed, combined with tools from nonsmooth analysis. Several new sufficient conditions and criteria are proposed that ensure the uniqueness and global asymptotic stability of the equilibrium point of the delayed recurrent neural networks. Simulation examples are presented to demonstrate the effectiveness and feasibility of the results.
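For context, the abstract does not state the network model explicitly; a common formulation in this line of work (an assumption here, not taken from the paper) is the delayed recurrent neural network

\dot{x}(t) = -C x(t) + A f(x(t)) + B f(x(t - \tau(t))) + J, \qquad 0 \le \tau(t) \le \bar{\tau}, \quad \dot{\tau}(t) \le \mu < 1,

together with a Lyapunov-Krasovskii functional of the form

V(x_t) = x^{\top}(t) P x(t) + \int_{t-\tau(t)}^{t} x^{\top}(s) Q x(s)\, ds, \qquad P \succ 0, \ Q \succ 0,

where C is a positive diagonal self-feedback matrix, A and B are the connection and delayed-connection weight matrices, and f is the (possibly nonsmooth) activation vector. Requiring \dot{V}(x_t) < 0 along trajectories yields a linear matrix inequality in P and Q whose feasibility certifies the uniqueness and global asymptotic stability of the equilibrium point.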


Author(s):  
Jun Wang

Asymptotic properties of recurrent neural networks for optimization are analyzed. Specifically, the asymptotic stability of recurrent neural networks with monotonically time-varying penalty parameters is proven, and sufficient conditions for the feasibility and optimality of the solutions generated by these networks are characterized. A design methodology for recurrent neural networks that solve optimization problems is discussed, and their operating characteristics are presented using illustrative examples.
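As a concrete illustration of this class of networks (a minimal sketch under generic assumptions, not the specific network analyzed in the paper), the dynamics can be taken as gradient descent on a penalized energy E(x, s(t)) = f(x) + s(t) p(x), where p penalizes constraint violation and the penalty parameter s(t) increases monotonically in time. The Python snippet below simulates such a network by Euler integration for a hypothetical problem, minimizing ||x - b||^2 subject to x >= 0:

import numpy as np

# Sketch of a penalty-based recurrent network for optimization (hypothetical
# problem data; not the paper's exact model): minimize ||x - b||^2 s.t. x >= 0,
# with energy E(x, s) = ||x - b||^2 + s * sum(max(-x, 0)^2).
b = np.array([1.0, -2.0, 0.5])

def grad_energy(x, s):
    grad_f = 2.0 * (x - b)                      # gradient of the objective
    grad_pen = -2.0 * np.maximum(-x, 0.0)       # gradient of the penalty term
    return grad_f + s * grad_pen

x = np.zeros(3)
dt = 1e-3
for k in range(20000):
    s = 1.0 + 0.01 * k                          # monotonically increasing penalty
    x = x - dt * grad_energy(x, s)              # Euler step of dx/dt = -grad_x E(x, s(t))

print("equilibrium estimate:", np.round(x, 3))  # roughly [1., -0.01, 0.5]; x[1] -> 0 as s grows

As s(t) grows, the trajectory is driven toward the feasible set and the network state approaches the constrained minimizer, which mirrors the feasibility and optimality conditions characterized in the abstract.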

