Control of Recurrent Neural Networks Using Differential Minimax Game: The Deterministic Case

Author(s):  
Ziqian Liu ◽  
Nirwan Ansari

This paper presents a theoretical design showing how a minimax equilibrium of a differential game is achieved in a class of large-scale nonlinear dynamic systems, namely recurrent neural networks. To realize the equilibrium, we regard the vector of external inputs as one player and the vector of internal noises (or disturbances, or modeling errors) as the opposing player. The purpose of this study is to construct a nonlinear H∞ optimal control for deterministic noisy recurrent neural networks that achieves optimality-oriented stabilization and attenuates noise to a prescribed level with stability margins. A numerical example demonstrates the effectiveness of the proposed approach.
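As a sketch of the formulation described above (using conventional H∞ notation, not necessarily the paper's own symbols), the two players enter additively driven recurrent-network dynamics, and the minimax equilibrium is characterized by a Hamilton-Jacobi-Isaacs equation:

```latex
% Hopfield-type RNN dynamics with control u (one player) and disturbance w (the opponent):
%   \dot{x} = -Ax + W\sigma(x) + Bu + Dw
% Differential-game cost (u minimizes, w maximizes):
%   J(u,w) = \int_0^\infty \big( q(x) + u^\top R u - \gamma^2 \|w\|^2 \big)\,dt
% The value function V(x) solves the Hamilton-Jacobi-Isaacs equation
%   0 = \min_u \max_w \Big[ q(x) + u^\top R u - \gamma^2 \|w\|^2
%         + V_x^\top \big( -Ax + W\sigma(x) + Bu + Dw \big) \Big],
% whose saddle-point strategies are
%   u^* = -\tfrac{1}{2} R^{-1} B^\top V_x, \qquad
%   w^* = \tfrac{1}{2\gamma^2} D^\top V_x .
```

With these strategies substituted back, the closed loop attenuates the worst-case disturbance to the prescribed level γ, which is the stabilization-with-attenuation property the abstract refers to.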

Author(s):  
Ziqian Liu

This chapter presents a theoretical design showing how global robust control is achieved in a class of noisy recurrent neural networks, which are a promising model of biological motor-sensor systems. The approach is developed using the differential minimax game, inverse optimality, the Lyapunov technique, and the Hamilton-Jacobi-Isaacs equation. To apply the theory of differential games to neural networks, we regard the vector of external inputs as one player and the vector of internal noises (or disturbances, or modeling errors) as the opposing player. The proposed design achieves global inverse optimality with respect to a meaningful cost functional, global disturbance attenuation, and global asymptotic stability in the absence of disturbances. Finally, numerical examples demonstrate the effectiveness of the proposed design.
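The "global disturbance attenuation" property mentioned above is conventionally stated as an L2-gain bound (a sketch in standard notation, not the chapter's own symbols):

```latex
% For a penalized output z, disturbance w, attenuation level \gamma > 0,
% and a storage function V \ge 0, disturbance attenuation means
%   \int_0^T \|z(t)\|^2 \, dt \;\le\; \gamma^2 \int_0^T \|w(t)\|^2 \, dt + 2V(x(0)),
%   \quad \forall\, T \ge 0 .
% Setting w \equiv 0 recovers the dissipation inequality
%   \dot{V} \le -\|z\|^2 ,
% from which global asymptotic stability in the disturbance-free case follows.
```

This shows why the two claims in the abstract are linked: the same Lyapunov/storage function that certifies the gain bound also certifies stability when no disturbance is present.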


2019 ◽  
Vol 2019 (1) ◽  
Author(s):  
M. Iswarya ◽  
R. Raja ◽  
G. Rajchakit ◽  
J. Cao ◽  
J. Alzabut ◽  
...  

Abstract: In this work, the exponential stability problem of impulsive recurrent neural networks is investigated; discrete time delay, continuously distributed delay, and stochastic noise are taken into consideration simultaneously. To guarantee exponential stability of the considered recurrent neural networks, two distinct types of sufficient conditions are derived from the Lyapunov functional and the coefficients of the given system. In addition, to construct a Lyapunov function for a large-scale system, a novel graph-theoretic approach is adopted, which combines the Lyapunov functional with graph theory. In this approach, a global Lyapunov functional is constructed that reflects the topological structure of the given system. A numerical example and simulation figures show the effectiveness of the proposed work.
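As a minimal numerical sketch of the kind of system studied above (not the paper's example: the parameters a, b, tau, c and the impulse schedule are illustrative assumptions, chosen so that the decay rate dominates the delayed feedback, as such sufficient conditions typically require):

```python
import numpy as np

# Euler simulation of a one-neuron impulsive delayed RNN:
#   x'(t) = -a*x(t) + b*tanh(x(t - tau)),  with impulses x(t_k+) = c*x(t_k-)
a, b, tau, c = 2.0, 0.5, 1.0, 0.8   # decay rate, delayed-feedback gain, delay, impulse factor
dt, T = 0.01, 20.0
steps = int(T / dt)
delay_steps = int(tau / dt)

# Constant initial history x(s) = 1 on [-tau, 0], stored at the front of the buffer.
hist = np.ones(delay_steps + steps + 1)
for k in range(steps):
    i = delay_steps + k                       # buffer index of the current time t
    x, x_delay = hist[i], hist[i - delay_steps]
    x_next = x + dt * (-a * x + b * np.tanh(x_delay))
    if (k + 1) % int(5.0 / dt) == 0:          # impulse every 5 time units
        x_next *= c
    hist[i + 1] = x_next

print(abs(hist[-1]))  # trajectory decays toward zero since a > |b| and |c| <= 1
```

Because a = 2 exceeds the feedback gain |b| = 0.5 and the impulses are contractive (|c| ≤ 1), the state decays exponentially despite the delay, which is the qualitative behavior the derived sufficient conditions guarantee.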

