Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions

Author(s):  
Zhang Yi
K. K. Tan

2014
Vol 28 (19)
pp. 1450118
Author(s):
Huaguang Zhang
Yujiao Huang
Tiaoyang Cai
Zhanshan Wang

In this paper, multistability is discussed for delayed recurrent neural networks with a ring structure and multi-step piecewise linear activation functions. Sufficient criteria are obtained for checking the existence of multiple equilibria. A lemma is proposed to determine the number and crossing direction of purely imaginary roots of the characteristic equation corresponding to the neural network model. The stability of all equilibria is investigated. The work improves and extends existing stability results in the literature. Finally, two examples are given to illustrate the effectiveness of the obtained results.
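
To illustrate the class of models involved, the following sketch implements a simple two-step piecewise linear activation and a small delayed ring network integrated with Euler's method. The thresholds, weights, delay, and step size are illustrative assumptions, not the parameters analyzed in the paper.

```python
import numpy as np

def multistep_pwl(x):
    """A two-step piecewise linear activation (illustrative form):
    saturates at -1, 0, and 1 with unit-slope ramps in between."""
    return np.piecewise(
        x,
        [x <= -1, (x > -1) & (x < 0), (x >= 0) & (x <= 1), (x > 1) & (x < 2), x >= 2],
        [-1.0, lambda t: t, 0.0, lambda t: t - 1.0, 1.0],
    )

def simulate_ring(n=4, w=0.8, delay_steps=10, dt=0.01, steps=5000, seed=0):
    """Euler simulation of a delayed ring network
    dx_i/dt = -x_i + w * f(x_{i-1}(t - tau)); all parameters are placeholders."""
    rng = np.random.default_rng(seed)
    x0 = rng.uniform(-2, 2, n)
    buf = [x0.copy() for _ in range(delay_steps + 1)]  # constant initial history
    x = x0.copy()
    for _ in range(steps):
        delayed = buf[0]                                   # x(t - tau)
        dx = -x + w * multistep_pwl(np.roll(delayed, 1))   # ring coupling: i <- i-1
        x = x + dt * dx
        buf.append(x.copy())
        buf.pop(0)
    return x

if __name__ == "__main__":
    print("steady state:", simulate_ring())
```

Different initial histories can settle to different equilibria, which is the multistability phenomenon the paper studies analytically.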


Inventions
2021
Vol 6 (4)
pp. 70
Author(s):  
Elena Solovyeva ◽  
Ali Abdullah

In this paper, the structure of a separable convolutional neural network consisting of an embedding layer, separable convolutional layers, a convolutional layer, and global average pooling is presented for binary and multiclass text classification. The advantage of the proposed structure is the absence of multiple fully connected layers, which are commonly used to increase classification accuracy but raise the computational cost. The combination of low-cost separable convolutional layers with a convolutional layer is proposed to attain high accuracy and, at the same time, to reduce the complexity of the neural classifiers. The advantages are demonstrated on binary and multiclass classification of written texts using the proposed networks with sigmoid and softmax activation functions in the convolutional layer. For both binary and multiclass classification, the accuracy obtained by the separable convolutional neural networks is higher than that of the investigated recurrent neural networks and fully connected networks.
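
A minimal sketch of the described pipeline (embedding, separable convolutions, a convolutional output layer with sigmoid or softmax activation, and global average pooling instead of fully connected layers) could look as follows in Keras. All layer sizes, kernel widths, and training settings are assumptions for illustration, not the authors' configuration.

```python
import tensorflow as tf

def build_text_classifier(vocab_size=20000, seq_len=200, embed_dim=64, num_classes=1):
    """Sketch of an embedding -> separable conv -> conv -> global average pooling
    classifier with no stack of fully connected layers (hyperparameters assumed)."""
    out_activation = "sigmoid" if num_classes == 1 else "softmax"
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(seq_len,)),
        tf.keras.layers.Embedding(vocab_size, embed_dim),
        tf.keras.layers.SeparableConv1D(64, 5, activation="relu", padding="same"),
        tf.keras.layers.SeparableConv1D(64, 5, activation="relu", padding="same"),
        # Convolutional output layer with sigmoid/softmax, then average pooling
        tf.keras.layers.Conv1D(num_classes, 3, activation=out_activation, padding="same"),
        tf.keras.layers.GlobalAveragePooling1D(),
    ])

# Example: a 4-class classifier trained with cross-entropy
model = build_text_classifier(num_classes=4)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

Because global average pooling reduces the convolutional output directly to class scores, the parameter count stays far below that of a classifier ending in several dense layers.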


2007
Vol 19 (8)
pp. 2149-2182
Author(s):  
Zhigang Zeng ◽  
Jun Wang

In this letter, sufficient conditions are obtained to guarantee that recurrent neural networks with linear saturation activation functions and time-varying delays have multiple equilibria located in the saturation region and on its boundaries. These results on pattern characterization are used to analyze and design autoassociative memories directly in terms of the parameters of the neural networks. Moreover, a formula for the number of spurious equilibria is derived. Four design procedures for recurrent neural networks with linear saturation activation functions and time-varying delays are developed based on the stability results. Two of these procedures allow the neural network to learn and forget. Finally, simulation results demonstrate the validity and characteristics of the proposed approach.
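
For orientation, here is a minimal sketch of a recurrent network with a linear saturation activation and a constant delay, integrated with Euler's method. The weight matrices and parameters are arbitrary placeholders and do not implement the design procedures derived in the letter.

```python
import numpy as np

def saturating_linear(x, lower=-1.0, upper=1.0):
    """Linear saturation activation: identity between the bounds, clipped outside."""
    return np.clip(x, lower, upper)

def run_delayed_rnn(W, W_delay, b, x0, delay_steps=5, dt=0.05, steps=2000):
    """Euler simulation of dx/dt = -x + W f(x(t)) + W_d f(x(t - tau)) + b,
    a generic delayed recurrent model (a constant delay stands in for the
    paper's time-varying one)."""
    buf = [x0.copy() for _ in range(delay_steps + 1)]  # constant initial history
    x = x0.copy()
    for _ in range(steps):
        f_now, f_del = saturating_linear(x), saturating_linear(buf[0])
        x = x + dt * (-x + W @ f_now + W_delay @ f_del + b)
        buf.append(x.copy())
        buf.pop(0)
    return x

# Tiny usage example with an arbitrary 3-neuron network (illustrative only)
n = 3
W = 2.0 * np.eye(n)                 # strong self-excitation drives states into saturation
W_delay = 0.1 * np.ones((n, n)) / n
b = np.zeros(n)
print(run_delayed_rnn(W, W_delay, b, x0=np.array([0.3, -0.2, 0.8])))
```

In an autoassociative-memory reading, each stable equilibrium in the saturation region plays the role of a stored pattern, and the initial state selects which pattern the trajectory converges to.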

