A general class of arbitrary order iterative methods for computing generalized inverses

2021, Vol 409, pp. 126381
Author(s): Alicia Cordero, Pablo Soto-Quiros, Juan R. Torregrosa
Mathematics, 2019, Vol 8 (1), pp. 2
Author(s): Santiago Artidiello, Alicia Cordero, Juan R. Torregrosa, María P. Vassileva

A secant-type method is designed for approximating the inverse and some generalized inverses of a complex matrix A. For a nonsingular matrix, the proposed method gives an approximation of the inverse; when the matrix is singular, approximations of the Moore–Penrose inverse and the Drazin inverse are obtained. The convergence and the order of convergence are established in each case. Numerical tests confirm the theoretical results and compare the performance of our method with that of other known methods. With these results, iterative methods with memory appear for the first time for estimating the solution of nonlinear matrix equations.
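The abstract does not reproduce the secant-type iteration itself. As context for the kind of scheme it is compared against, here is a minimal sketch of the classical Newton–Schulz iteration X_{k+1} = X_k(2I − AX_k), a well-known memoryless method that converges quadratically to the Moore–Penrose inverse from a suitably scaled start; the function name and the iteration count are illustrative choices, not the paper's method:

```python
import numpy as np

def newton_schulz_pinv(A, iters=50):
    """Newton-Schulz iteration X_{k+1} = X_k (2I - A X_k).

    With the standard scaled start X_0 = A* / (||A||_1 ||A||_inf),
    the iterates converge quadratically to the Moore-Penrose inverse,
    also for rectangular or rank-deficient A.
    """
    X = A.conj().T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(A.shape[0])  # identity matching the row dimension of A
    for _ in range(iters):
        X = X @ (2.0 * I - A @ X)
    return X
```

For a singular or rectangular input the limit is the Moore–Penrose inverse rather than a classical inverse, which mirrors the two regimes the abstract describes.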


2011, Vol 2011, pp. 1-10
Author(s): F. Soleymani

This paper contributes a very general class of two-point iterative methods without memory for solving nonlinear equations, developed by means of the weight function approach. Per iteration, each method of the class requires two evaluations of the function and one of its first-order derivative. The analytical study of the main theorem is presented in detail to establish the fourth order of convergence. Furthermore, it is shown that many of the existing fourth-order methods without memory are members of the developed class. Finally, numerical examples demonstrate the accuracy of the derived methods.
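The weight functions defining the class are not given in the abstract. A classical method with exactly the stated cost structure, two evaluations of f and one of f' per step for order four, is Ostrowski's two-point method, sketched here as an assumed illustrative member rather than the paper's general construction:

```python
import math

def ostrowski(f, df, x0, tol=1e-12, max_iter=50):
    """Ostrowski's two-point method: fourth-order convergence using
    two evaluations of f and one of f' per iteration."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        d = df(x)                                  # the single derivative evaluation
        y = x - fx / d                             # Newton predictor
        fy = f(y)
        x = y - (fy / d) * fx / (fx - 2.0 * fy)    # Ostrowski corrector
    return x
```

For example, `ostrowski(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)` converges to the square root of 2 in a handful of iterations.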


2019, Vol 57 (5), pp. 1448-1471
Author(s): Fiza Zafar, Alicia Cordero, Juan R. Torregrosa, Aneeqa Rafi

Filomat, 2017, Vol 31 (10), pp. 2999-3014
Author(s): Igor Stojanovic, Predrag Stanimirovic, Ivan Zivkovic, Dimitrios Gerontitis, Xue-Zhong Wang

Our goal is to investigate and exploit an analogy between the scaled hyperpower family (SHPI family) of iterative methods for computing the matrix inverse and the discretization of Zhang Neural Network (ZNN) models. On the basis of the discovered analogy, a class of ZNN models corresponding to the family of hyperpower iterative methods for computing generalized inverses is defined. The Simulink implementation in Matlab of the introduced ZNN models is described for scaled hyperpower methods of orders 2 and 3. Convergence properties of the proposed ZNN models are investigated, as is their numerical behavior.
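The hyperpower family of order p referenced here has the well-known form X_{k+1} = X_k Σ_{i=0}^{p−1} (I − AX_k)^i, with p = 2 recovering Newton–Schulz and p = 3 the cubically convergent member. A minimal NumPy sketch of the plain (unscaled, non-ZNN) iteration follows; the discretized-ZNN and Simulink machinery of the paper is not reproduced, and the function names are illustrative:

```python
import numpy as np

def hyperpower_step(A, X, p):
    """One hyperpower step of order p:
    X_{k+1} = X_k (I + R + R^2 + ... + R^{p-1}),  with R = I - A X_k.
    p = 2 is Newton-Schulz; p = 3 converges cubically."""
    I = np.eye(A.shape[0])
    R = I - A @ X
    S = I.copy()   # accumulated sum of powers of R
    P = I.copy()   # current power R^i
    for _ in range(p - 1):
        P = P @ R
        S = S + P
    return X @ S

def hyperpower_inverse(A, p=3, iters=30):
    """Iterate from the standard scaled start X_0 = A* / (||A||_1 ||A||_inf)."""
    X = A.conj().T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(iters):
        X = hyperpower_step(A, X, p)
    return X
```

Higher p costs more matrix products per step but raises the convergence order, which is the trade-off the SHPI family and its ZNN analogues are built around.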

