A new stability condition for discrete time linear threshold recurrent neural networks

Author(s): Wei Zhou, Jacek M. Zurada
2010, Vol. 22 (8), pp. 2137-2160

This letter discusses the competitive layer model (CLM) for a class of discrete-time recurrent neural networks with linear threshold (LT) neurons. It first addresses the boundedness, global attractivity, and complete stability of the networks. Two theorems are then presented giving conditions under which the networks possess the CLM property. We also analyze the network dynamics, which perform column winner-take-all behavior and grouping selection among different layers. Furthermore, we propose a novel synchronous CLM iteration method that has similar performance and storage allocation but faster convergence compared with the earlier asynchronous CLM iteration method (Wersing, Steil, & Ritter, 2001). Examples and simulation results illustrate the developed theory, the comparison between the two CLM iteration methods, and an application to image segmentation.
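The abstract does not spell out the update rule itself, but the ingredients it names (LT neurons, a column winner-take-all competition across layers, and lateral grouping interactions within each layer) follow the energy-based CLM formulation of Wersing, Steil, and Ritter (2001). The sketch below is a hypothetical synchronous iteration in that spirit, not the letter's exact method; the layer count, the lateral interaction matrix f, the coupling strength J, the afferent input h, and the step size are all illustrative assumptions.

import numpy as np

def synchronous_clm_step(x, f, J=1.0, h=1.0, step=0.1):
    # x: (L, N) activities, x[a, r] = neuron r in layer a (L layers, N columns)
    # f: (N, N) symmetric lateral interaction matrix shared by all layers
    # Column sums drive the vertical winner-take-all competition across layers.
    col_sum = x.sum(axis=0, keepdims=True)   # shape (1, N)
    # Lateral (within-layer) interactions implement grouping selection.
    lateral = x @ f.T                        # shape (L, N)
    # Projected gradient step on a CLM-style energy; the linear-threshold
    # nonlinearity max(0, .) keeps all activities nonnegative.
    drive = J * (h - col_sum) + lateral
    return np.maximum(0.0, x + step * drive)

# Illustrative usage: 3 layers, 5 columns, nearest-neighbour grouping weights.
rng = np.random.default_rng(0)
f = 0.2 * np.eye(5) + 0.1 * (np.eye(5, k=1) + np.eye(5, k=-1))
x = rng.uniform(0.0, 1.0, size=(3, 5))
for _ in range(200):
    x = synchronous_clm_step(x, f)
# After convergence, each column should be dominated by a single layer.
print(x.round(2))

A stability condition of the kind studied in the letter would constrain J relative to the lateral weights so that activities remain bounded and exactly one layer survives per column; the small lateral weights above are chosen with that in mind.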

