A Fully Complex-Valued Gradient Neural Network for Rapidly Computing Complex-Valued Linear Matrix Equations

2017 ◽  
Vol 26 (6) ◽  
pp. 1194-1197 ◽  
Author(s):  
Lin Xiao ◽  
Rongbo Lu
Author(s):  
Shan Liao ◽  
Jiayong Liu ◽  
Yimeng Qi ◽  
Haoen Huang ◽  
Rongfeng Zheng ◽  
...  

Author(s):  
R. Penrose

This paper describes a generalization of the inverse of a non-singular matrix, as the unique solution of a certain set of equations. This generalized inverse exists for any (possibly rectangular) matrix whatsoever with complex elements. It is used here for solving linear matrix equations, and among other applications for finding an expression for the principal idempotent elements of a matrix. Also a new type of spectral decomposition is given.
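The four defining equations mentioned above can be checked numerically. The sketch below (NumPy; the complex rectangular matrix is an illustrative example of my own, not taken from the paper) verifies the Penrose conditions and then uses the generalized inverse to solve a linear matrix equation, in the sense described in the abstract.

```python
import numpy as np

# Illustrative rectangular complex matrix (assumption: any example works,
# since the generalized inverse exists for every complex matrix).
A = np.array([[1 + 1j, 2.0, 0.0],
              [0.0, 1 - 1j, 3.0]])       # 2x3, rectangular
A_pinv = np.linalg.pinv(A)               # Moore-Penrose generalized inverse

H = lambda M: M.conj().T                 # conjugate transpose

# The four Penrose conditions that characterize X = A^+ uniquely:
assert np.allclose(A @ A_pinv @ A, A)              # A X A = A
assert np.allclose(A_pinv @ A @ A_pinv, A_pinv)    # X A X = X
assert np.allclose(H(A @ A_pinv), A @ A_pinv)      # (A X)* = A X
assert np.allclose(H(A_pinv @ A), A_pinv @ A)      # (X A)* = X A

# Solving the linear matrix equation A X = B: X = A^+ B is the
# minimum-norm least-squares solution; here A has full row rank,
# so the equation is satisfied exactly.
B = np.array([[1.0], [1j]])
X = A_pinv @ B
```

Since `numpy.linalg.pinv` computes the pseudoinverse via the singular value decomposition, the four conditions hold automatically up to floating-point tolerance.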


Filomat ◽  
2012 ◽  
Vol 26 (3) ◽  
pp. 607-613 ◽  
Author(s):  
Xiang Wang ◽  
Dan Liao

A hierarchical gradient-based iterative algorithm [L. Xie et al., Computers and Mathematics with Applications 58 (2009) 1441-1448] has been presented for finding numerical solutions of general linear matrix equations, and its convergence factor has been investigated through numerical experiments. However, the authors pointed out that how to choose the best convergence factor remained an open problem. In this paper, we study the optimal convergence factor for the gradient-based iterative algorithm and derive it explicitly. Moreover, the theoretical results of this paper can be extended to other gradient-type methods. Results of numerical experiments are consistent with the theoretical findings.
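The role of the convergence factor can be illustrated on the simplest case, A X = F. The sketch below is my own minimal example, not the paper's algorithm or data; the factor mu is chosen by the classical Richardson rule 2/(lambda_max + lambda_min) applied to the eigenvalues of A^T A, which is one standard way to make "optimal convergence factor" concrete.

```python
import numpy as np

# Small well-conditioned example (assumption: illustrative matrices only).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
X_true = np.array([[1.0, -1.0],
                   [0.5, 2.0]])
F = A @ X_true

# The eigenvalues of A^T A govern convergence: the iteration converges for
# 0 < mu < 2 / lambda_max(A^T A), and the classical Richardson choice
# mu = 2 / (lambda_max + lambda_min) minimizes the contraction factor.
eigs = np.linalg.eigvalsh(A.T @ A)       # ascending eigenvalues
mu = 2.0 / (eigs.max() + eigs.min())

X = np.zeros_like(F)
for _ in range(500):
    X = X + mu * A.T @ (F - A @ X)       # gradient step on ||A X - F||_F^2
```

With this mu the per-step error contraction factor is (lambda_max - lambda_min) / (lambda_max + lambda_min), so a larger spread of eigenvalues (worse conditioning) means slower convergence.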

