Exponential convergence for high-order recurrent neural networks with a class of general activation functions

2011 ◽ Vol 35 (1) ◽ pp. 123-129
Author(s): Hong Zhang, Wentao Wang, Bing Xiao

2013 ◽ Vol 2013 ◽ pp. 1-16
Author(s): Xiaohong Wang, Huan Qi

This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delays, and infinite distributed delays. By employing a new differential inequality, constructing two different kinds of Lyapunov functions, and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are established that guarantee global robust exponential dissipativity for the addressed IRNNs. These conditions are expressed as linear matrix inequalities (LMIs), which can be checked easily with the LMI Control Toolbox in MATLAB. Furthermore, specific estimates of the positive invariant set and the globally exponentially attractive set of the addressed system are derived. Compared with the previous literature, the results obtained in this paper improve and extend earlier global dissipativity conclusions. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results.
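
To illustrate how such LMI conditions are verified numerically, the following is a minimal sketch using Python's cvxpy in place of MATLAB's LMI Control Toolbox. It checks the classical Lyapunov inequality A^T P + P A < 0, P > 0 for a delay-free linear system; the paper's actual dissipativity LMIs involve delay terms and interval uncertainty and are considerably more elaborate. The matrix A below is a hypothetical example, not taken from the paper.

import numpy as np
import cvxpy as cp

# Hypothetical example system matrix (Hurwitz, so the LMI should be feasible).
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # margin to enforce strict inequalities numerically

constraints = [
    P >> eps * np.eye(n),                  # P positive definite
    A.T @ P + P @ A << -eps * np.eye(n),   # Lyapunov inequality
]

# Pure feasibility problem: any solution P is a stability certificate.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

if prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE):
    print("LMI feasible; certificate P =\n", P.value)
else:
    print("LMI infeasible for this A")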


Inventions ◽ 2021 ◽ Vol 6 (4) ◽ pp. 70
Author(s): Elena Solovyeva, Ali Abdullah

In this paper, the structure of a separable convolutional neural network, consisting of an embedding layer, separable convolutional layers, a convolutional layer, and global average pooling, is presented for binary and multiclass text classification. The advantage of the proposed structure is the absence of multiple fully connected layers, which are commonly used to increase classification accuracy but raise the computational cost. The combination of low-cost separable convolutional layers with a single convolutional layer is proposed to achieve high accuracy while reducing the complexity of the neural classifiers. The advantages are demonstrated on binary and multiclass classification of written texts using the proposed networks with the sigmoid and softmax activation functions in the convolutional layer. For both binary and multiclass classification, the accuracy obtained by the separable convolutional neural networks is higher than that of the investigated types of recurrent neural networks and fully connected networks.
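
A minimal sketch of the described architecture in TensorFlow/Keras is given below. The hyperparameters (vocabulary size, embedding dimension, sequence length, filter counts) are assumptions, since the abstract does not specify them; the multiclass softmax variant is shown, while the binary case would use a single-filter convolution with sigmoid activation and binary cross-entropy loss.

import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical hyperparameters; the abstract does not give the authors' values.
vocab_size, embed_dim, seq_len, num_classes = 20000, 128, 200, 4

model = models.Sequential([
    layers.Input(shape=(seq_len,)),                    # token-id sequences
    layers.Embedding(vocab_size, embed_dim),           # embedding layer
    layers.SeparableConv1D(64, 5, padding="same",
                           activation="relu"),         # low-cost separable convolutions
    layers.SeparableConv1D(64, 5, padding="same",
                           activation="relu"),
    layers.Conv1D(num_classes, 5, padding="same",
                  activation="softmax"),               # convolutional layer with softmax
    layers.GlobalAveragePooling1D(),                   # replaces stacked fully connected layers
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

Note that averaging the per-position softmax outputs with global average pooling still yields a valid class distribution, which is what allows the dense classification head to be dropped entirely.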

