Finite-Time Convergent Recurrent Neural Network With a Hard-Limiting Activation Function for Constrained Optimization With Piecewise-Linear Objective Functions

2011 ◽  
Vol 22 (4) ◽  
pp. 601-613 ◽  
Author(s):  
Qingshan Liu ◽  
Jun Wang

Author(s):
George Mourgias-Alexandris ◽  
George Dabos ◽  
Nikolaos Passalis ◽  
Anastasios Tefas ◽  
Angelina Totovic ◽  
...  

1999 ◽  
Vol 11 (5) ◽  
pp. 1069-1077 ◽  
Author(s):  
Danilo P. Mandic ◽  
Jonathon A. Chambers

A relationship is established between the learning rate η of the learning algorithm and the slope β of the nonlinear activation function for a class of recurrent neural networks (RNNs) trained by the real-time recurrent learning algorithm. It is shown that an arbitrary RNN can be obtained from a referent RNN by imposing deterministic rules on its weights and learning rate. Such relationships reduce the number of degrees of freedom in the nonlinear optimization task of finding the optimal RNN parameters.
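As a minimal numerical sketch of this kind of relationship, the snippet below checks, on a single static sigmoid neuron rather than the RTRL-trained RNNs treated in the paper, that one gradient step with activation slope beta and learning rate eta matches one step of a unit-slope referent neuron whose weights are scaled by beta and whose learning rate is scaled by beta squared. The single-neuron setting, the specific scaling rule, and all variable names are illustrative assumptions, not the paper's exact derivation.

```python
import numpy as np

# Sketch only: a single sigmoid neuron with one gradient-descent step,
# illustrating how activation slope and learning rate can trade off.

def sigmoid(z, beta=1.0):
    """Logistic activation with slope beta."""
    return 1.0 / (1.0 + np.exp(-beta * z))

rng = np.random.default_rng(0)
x, d = rng.normal(size=3), 0.7      # input vector and scalar target
w = rng.normal(size=3)              # weights of the original neuron
beta, eta = 2.5, 0.1                # slope and learning rate

# One gradient step for the neuron with activation slope beta.
y = sigmoid(w @ x, beta)
grad_w = -(d - y) * beta * y * (1.0 - y) * x   # d/dw of 0.5*(d - y)**2
w_new = w - eta * grad_w

# Referent neuron: unit slope, weights scaled by beta, rate scaled by beta**2.
v = beta * w
y_ref = sigmoid(v @ x, 1.0)                     # identical output to y
grad_v = -(d - y_ref) * y_ref * (1.0 - y_ref) * x
v_new = v - (eta * beta**2) * grad_v

# The two trajectories remain related by v = beta * w after the update.
print(np.allclose(v_new, beta * w_new))         # True
```

Fixing such a deterministic coupling between the slope, the weights, and the learning rate is exactly the kind of constraint that removes redundant degrees of freedom from the parameter search.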


Complexity ◽  
2017 ◽  
Vol 2017 ◽  
pp. 1-7 ◽  
Author(s):  
Zhan Li ◽  
Hong Cheng ◽  
Hongliang Guo

This brief proposes a general framework of a nonlinear recurrent neural network for solving the generalized linear matrix equation (GLME) online, with a global convergence property. If the linear activation function is used, the neural state matrix of the recurrent neural network converges globally and exponentially to the unique theoretical solution of the GLME. In addition, two specific types of nonlinear activation functions are proposed for the general model, which achieve convergence superior to that obtained with the linear activation function. Illustrative examples demonstrate the efficacy of the general nonlinear recurrent neural network model and its superior convergence when activated by these nonlinear activation functions.
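For concreteness, the sketch below assumes a GLME of the form A X B = C and a gradient-type recurrent neural dynamics dX/dt = -gamma * A^T Phi(A X B - C) B^T with an element-wise activation Phi, integrated by a plain Euler step. The specific nonlinear activations (a scaled tanh and a signed power function), the matrix shapes, and all parameter values are illustrative assumptions, not necessarily the model or activations proposed in the brief.

```python
import numpy as np

# Sketch only (see assumptions above): gradient-type neural dynamics
#   dX/dt = -gamma * A.T @ Phi(A @ X @ B - C) @ B.T
# for the matrix equation A X B = C, with a selectable element-wise activation.

def linear_act(e):
    return e

def tanh_act(e):
    return np.tanh(3.0 * e)            # a smooth, saturating nonlinearity

def power_act(e, r=0.5):
    # Signed power function; in continuous time this kind of activation is
    # associated with faster convergence near zero, but a fixed Euler step
    # will settle at a small nonzero residual instead of exactly zero.
    return np.sign(e) * np.abs(e) ** r

def solve_glme(A, B, C, act=linear_act, gamma=20.0, dt=1e-3, steps=20000):
    """Euler-integrate the neural dynamics from a zero initial state."""
    X = np.zeros((A.shape[1], B.shape[0]))
    for _ in range(steps):
        E = A @ X @ B - C                       # residual error matrix
        X -= dt * gamma * A.T @ act(E) @ B.T    # negative-gradient direction
    return X

# Well-conditioned square test matrices so the fixed-step integration is stable.
rng = np.random.default_rng(1)
A = np.eye(4) + 0.1 * rng.normal(size=(4, 4))
B = np.eye(3) + 0.1 * rng.normal(size=(3, 3))
X_true = rng.normal(size=(4, 3))
C = A @ X_true @ B                              # consistent right-hand side

for act in (linear_act, tanh_act, power_act):
    X = solve_glme(A, B, C, act=act)
    print(act.__name__, np.linalg.norm(A @ X @ B - C))
```

With the linear activation the update is plain gradient descent on 0.5 * ||A X B - C||_F^2, which is the setting in which the global exponential convergence mentioned above is usually established; the nonlinear activations reshape the error signal element-wise before it is fed back into the state.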


2017 ◽  
Vol 47 (9) ◽  
pp. 1226-1241 ◽  
Author(s):  
Shukai Duan ◽  
Tengteng Guo ◽  
Mengzhe Zhou ◽  
Lidan Wang
