Convergence Analysis of Three Classes of Split-Complex Gradient Algorithms for Complex-Valued Recurrent Neural Networks

2010 ◽  
Vol 22 (10) ◽  
pp. 2655-2677 ◽  
Author(s):  
Dongpo Xu ◽  
Huisheng Zhang ◽  
Lijun Liu

This letter presents a unified convergence analysis of split-complex nonlinear gradient descent (SCNGD) learning algorithms for complex-valued recurrent neural networks, covering three classes of SCNGD algorithms: standard, normalized, and adaptive normalized SCNGD. We prove that if the activation functions are of split-complex type and certain conditions are satisfied, the error function decreases monotonically during the training iteration process, and the gradients of the error function with respect to the real and imaginary parts of the weights converge to zero. A strong convergence result is also obtained under the assumption that the error function has only a finite number of stationary points. Simulation results are given to support the theoretical analysis.
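For a concrete picture of how the three step-size rules differ, the sketch below applies them to a single split-complex neuron y = f(w^T x) with f(z) = tanh(Re z) + i tanh(Im z), rather than to the recurrent network analyzed in the paper. The data, target weights, and the particular normalized and adaptive schedules (division by the gradient norm, with an extra 1/sqrt(k) decay in the adaptive case) are illustrative assumptions, not the schedules from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3)) + 1j * rng.standard_normal((50, 3))
w_true = np.array([0.4 + 0.3j, -0.6 + 0.1j, 0.2 - 0.5j])   # hypothetical target weights
d = np.tanh((X @ w_true).real) + 1j * np.tanh((X @ w_true).imag)

def error_and_grads(w):
    """Mean squared error and its gradients w.r.t. the real/imaginary weight parts."""
    z = X @ w                                    # z_R = X_R w_R - X_I w_I, z_I = X_R w_I + X_I w_R
    y = np.tanh(z.real) + 1j * np.tanh(z.imag)   # split-complex activation
    eR, eI = (y - d).real, (y - d).imag
    dR = eR * (1 - np.tanh(z.real) ** 2)         # real-channel delta
    dI = eI * (1 - np.tanh(z.imag) ** 2)         # imaginary-channel delta
    gR = (X.real.T @ dR + X.imag.T @ dI) / len(X)    # dE/dw_R
    gI = (-X.imag.T @ dR + X.real.T @ dI) / len(X)   # dE/dw_I
    return 0.5 * np.mean(eR ** 2 + eI ** 2), gR, gI

def train(rule, eta=0.1, steps=300):
    w = np.zeros(3, dtype=complex)
    for k in range(steps):
        _, gR, gI = error_and_grads(w)
        gnorm = np.sqrt(gR @ gR + gI @ gI)
        if rule == "standard":            # fixed step size
            step = eta
        elif rule == "normalized":        # step divided by the gradient norm (illustrative)
            step = eta / (gnorm + 1e-12)
        else:                             # "adaptive": extra 1/sqrt(k) decay (illustrative)
            step = eta / ((gnorm + 1e-12) * np.sqrt(k + 1))
        w = w - step * (gR + 1j * gI)     # real and imaginary parts updated jointly
    return error_and_grads(w)[0]

for rule in ("standard", "normalized", "adaptive"):
    print(rule, train(rule))
```

The structural point the sketch makes explicit is that the real and imaginary parts of the weights are updated through separate real-valued chain rules, which is what the split-complex setting exploits in the convergence analysis.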

2010 ◽  
Vol 2010 ◽  
pp. 1-27 ◽
Author(s):  
Huisheng Zhang ◽  
Dongpo Xu ◽  
Zhiping Wang

The online gradient method has been widely used in training neural networks. In this paper, we consider an online split-complex gradient algorithm for complex-valued neural networks with an adaptive learning rate chosen during the training procedure. Under certain conditions, by first establishing the monotonicity of the error function, it is proved that the gradient of the error function tends to zero and the weight sequence converges to a fixed point. A numerical example is given to support the theoretical findings.
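A minimal sketch of the online (sample-by-sample) split-complex update is given below. The decaying rate eta_0/(1 + k) is a generic stand-in for the paper's adaptive learning rate, and the single-neuron model and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((30, 2)) + 1j * rng.standard_normal((30, 2))
w_true = np.array([0.7 + 0.1j, -0.2 - 0.5j])                 # hypothetical target weights
d = np.tanh((X @ w_true).real) + 1j * np.tanh((X @ w_true).imag)

def sample_grad(w, x, t):
    """Per-sample split-complex gradient and error for one training pair (x, t)."""
    z = np.dot(x, w)
    y = np.tanh(z.real) + 1j * np.tanh(z.imag)
    eR, eI = (y - t).real, (y - t).imag
    dR = eR * (1 - np.tanh(z.real) ** 2)
    dI = eI * (1 - np.tanh(z.imag) ** 2)
    gR = dR * x.real + dI * x.imag       # dE_n/dw_R
    gI = -dR * x.imag + dI * x.real      # dE_n/dw_I
    return gR + 1j * gI, 0.5 * (eR ** 2 + eI ** 2)

w = np.zeros(2, dtype=complex)
eta0, k = 0.5, 0
for epoch in range(50):
    for x, t in zip(X, d):               # one update per sample: the "online" mode
        g, _ = sample_grad(w, x, t)
        w -= eta0 / (1 + k) * g          # decaying rate; stands in for the paper's adaptive rate
        k += 1

total_error = sum(sample_grad(w, x, t)[1] for x, t in zip(X, d))
print(f"total error after training: {total_error:.6f}")
```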


2009 ◽  
Vol 2009 ◽  
pp. 1-16 ◽  
Author(s):  
Huisheng Zhang ◽  
Chao Zhang ◽  
Wei Wu

The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under a mild additional condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
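The monotonicity claim is easy to probe numerically on a toy model: with a sufficiently small constant learning rate, the batch error should never increase from one iteration to the next. The following sketch checks this on a hypothetical single split-complex neuron, not the network treated in the paper; the data, target weights, and learning rate are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 2)) + 1j * rng.standard_normal((40, 2))
w_true = np.array([0.5 - 0.2j, -0.3 + 0.4j])                 # hypothetical target weights
d = np.tanh((X @ w_true).real) + 1j * np.tanh((X @ w_true).imag)

def grads(w):
    """Batch error and full complex gradient g = dE/dw_R + i * dE/dw_I."""
    z = X @ w
    y = np.tanh(z.real) + 1j * np.tanh(z.imag)
    eR, eI = (y - d).real, (y - d).imag
    dR = eR * (1 - np.tanh(z.real) ** 2)
    dI = eI * (1 - np.tanh(z.imag) ** 2)
    gR = (X.real.T @ dR + X.imag.T @ dI) / len(X)
    gI = (-X.imag.T @ dR + X.real.T @ dI) / len(X)
    return 0.5 * np.mean(eR ** 2 + eI ** 2), gR + 1j * gI

w = np.zeros(2, dtype=complex)
eta = 0.05                                # small constant learning rate
prev = grads(w)[0]
for k in range(300):
    E, g = grads(w)
    assert E <= prev + 1e-12, "monotonicity violated"   # batch error never increases
    prev = E
    w -= eta * g                          # one batch update per iteration
print(f"final error {E:.6f}, max |gradient entry| {np.abs(g).max():.2e}")
```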

