Local Convergence of an Efficient High Convergence Order Method Using Hypothesis Only on the First Derivative

Algorithms ◽  
2015 ◽  
Vol 8 (4) ◽  
pp. 1076-1087 ◽  
Author(s):  
Ioannis Argyros ◽  
Ramandeep Behl ◽  
S.S. Motsa


Algorithms ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 25
Author(s):  
Janak Raj Sharma ◽  
Sunil Kumar ◽  
Ioannis K. Argyros

We discuss the local convergence of a derivative-free eighth order method in a Banach space setting. The present study provides the radius of convergence and bounds on errors under hypotheses on the first Fréchet derivative only. Approaches based on Taylor expansions, which involve higher order derivatives, do not provide such estimates, since those derivatives may not exist or may be costly to compute. By using only the first derivative, the method can be applied to a wider class of functions, and hence its range of applications is expanded. Numerical experiments show that the present results are applicable to cases where previous results cannot be applied.
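To give a flavour of the radius-of-convergence and error-bound computations referred to above, the following minimal Python sketch computes the classical convergence-ball radius for Newton's method when only a center-Lipschitz constant L0 and a Lipschitz constant L of the first derivative are available. It is an illustrative stand-in chosen here, not the eighth-order method of the paper, and the constants L0, L and the initial error e0 are assumed inputs.

```python
# Illustrative sketch only (not the eighth-order method above): convergence-ball
# radius and a pessimistic a priori error bound for Newton's method under
# Lipschitz conditions on the first derivative.  L0, L and e0 are assumed inputs.

def newton_radius(L0: float, L: float) -> float:
    """Radius r = 2 / (2*L0 + L) of the convergence ball, assuming
    ||F'(x*)^{-1}(F'(x) - F'(x*))|| <= L0 ||x - x*||  and
    ||F'(x*)^{-1}(F'(x) - F'(y))|| <= L  ||x - y||."""
    return 2.0 / (2.0 * L0 + L)

def error_bound(e0: float, L0: float, L: float, n: int) -> float:
    """Pessimistic a priori bound ||x_n - x*|| <= q**(2**n - 1) * e0, obtained
    from the quadratic estimate e_{k+1} <= L * e_k**2 / (2 * (1 - L0 * e_k))
    with e_k replaced by the (larger) initial error e0 in the denominator."""
    q = L * e0 / (2.0 * (1.0 - L0 * e0))   # contraction factor; q < 1 if e0 < radius
    return q ** (2 ** n - 1) * e0

# Example: L0 = 1, L = 1.5 gives a radius of 2 / 3.5 ≈ 0.571.
print(newton_radius(1.0, 1.5), error_bound(0.1, 1.0, 1.5, 3))
```

Estimates of this kind, computable from the first derivative alone, are exactly what the Taylor-expansion approach with higher-order derivatives does not supply.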


Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

Abstract In the present paper, we study the local convergence analysis of a fifth convergence order method considered by Sharma and Guha in [15] to solve equations in a Banach space setting. Using our idea of restricted convergence domains, we extend the applicability of this method. Numerical examples where earlier results do not apply but our results do are also given in this study.


Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

Abstract We present a local convergence analysis of a fourth order method considered by Sharma et al. in [19] for solving systems of nonlinear equations. Using conditions on derivatives up to order five, they proved that the method is of order four. In this study, using conditions only on the first derivative, we prove the convergence of the method in [19]. In this way we extend the applicability of the method. Numerical examples which do not satisfy the earlier conditions but satisfy our conditions are presented in this study.


Author(s):  
Ştefan Măruşter

Abstract The aim of this paper is to investigate the local convergence of the Modified Newton method, i.e. the classical Newton method in which the first derivative is re-evaluated periodically after m steps. The convergence order is shown to be m + 1. A new algorithm is proposed for estimating the convergence radius of the method. We also propose a threshold for the number of steps after which it is recommended to re-evaluate the first derivative in the Modified Newton method.
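A minimal sketch of the iteration described above is given below; the test system, Jacobian, starting point and tolerance are illustrative placeholders, not taken from the paper.

```python
import numpy as np

def modified_newton(F, J, x0, m=3, tol=1e-12, max_iter=100):
    """Newton iteration in which the first derivative J is re-evaluated only
    every m steps and reused in between (m = 1 recovers classical Newton)."""
    x = np.asarray(x0, dtype=float)
    Jk = None
    for k in range(max_iter):
        if k % m == 0:                      # periodic re-evaluation of the derivative
            Jk = J(x)
        step = np.linalg.solve(Jk, F(x))    # solve J(x_j) * step = F(x_k)
        x = x - step
        if np.linalg.norm(step) < tol:      # stop when the Newton step is small
            return x, k + 1
    return x, max_iter

# Toy system: F(x, y) = (x^2 + y^2 - 1, x - y), root at (1/sqrt(2), 1/sqrt(2)).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
print(modified_newton(F, J, [1.0, 0.5], m=3))
```

Reusing the factorized derivative for m steps trades a lower convergence order (m + 1 over each cycle) for far fewer derivative evaluations, which is the trade-off the proposed threshold on m addresses.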


Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

Abstract The aim of this article is to provide the local convergence analysis of two novel competing sixth convergence order methods for solving equations involving Banach space valued operators. Earlier studies have used hypotheses reaching up to the sixth derivative, although only the first derivative appears in these methods. Such hypotheses limit the applicability of the methods, which is why we are motivated to present a convergence analysis based only on the first derivative. Numerical examples in which the convergence criteria are tested are provided. It turns out that in these examples the criteria in the earlier works are not satisfied, so those results cannot be used to solve the equations, but our results can.


Algorithms ◽  
2020 ◽  
Vol 13 (6) ◽  
pp. 147
Author(s):  
Samundra Regmi ◽  
Ioannis K. Argyros ◽  
Santhosh George

A local convergence comparison is presented between two ninth order algorithms for solving nonlinear equations. In earlier studies, derivatives up to the 10th order, which do not appear in the algorithms, were utilized to show convergence. Moreover, no error estimates, radius of convergence or results on the uniqueness of the solution that can be computed were given. The novelty of our study is that we address all these concerns by using only the first derivative, which actually appears in these algorithms. In this way we extend the applicability of these algorithms. Our technique provides a direct comparison between these algorithms under the same set of convergence criteria, and it can be used on other algorithms. Numerical experiments are utilized to test the convergence criteria.


Author(s):  
Ioannis K. Argyros ◽  
Munish Kansal ◽  
V. Kanwar

Abstract We present a local convergence analysis of an eighth-order method for approximating a locally unique solution of a non-linear equation. Earlier studies have shown convergence of these methods under hypotheses up to the seventh derivative of the function, although only the first derivative appears in the method. In this study, we expand the applicability of these methods using hypotheses only up to the first derivative of the function, so the methods apply under weaker hypotheses. Moreover, the radius of convergence and computable error bounds on the distances involved are also given in this study. Numerical examples are also presented.


2019 ◽  
Vol 8 (1) ◽  
pp. 74-79
Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

Abstract The aim of this study is to extend the applicability of an eighth convergence order method from the k-dimensional Euclidean space to a Banach space setting. We use hypotheses only on the first derivative to show the local convergence of the method. Earlier studies use hypotheses up to the eighth derivative, although only the first derivative and a divided difference of order one appear in the method. Moreover, we provide computable error bounds based on Lipschitz-type functions.


Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

Abstract This paper is devoted to the study of a multi-step method with divided differences for solving nonlinear equations in Banach spaces. In earlier studies, hypotheses on the Fréchet derivative up to the sixth order of the operator under consideration are used to prove the convergence of the method. That restricts the applicability of the method. In this paper we extend the applicability of the sixth-order multi-step method by using only hypotheses on the first derivative of the operator involved. Our convergence conditions are weaker than the conditions used in earlier studies. Numerical examples where earlier results cannot be applied to solve equations but our results can are also given in this study.
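For concreteness, here is a minimal sketch of one standard first-order divided difference operator for systems, of the kind used in derivative-free and multi-step schemes; it is a common textbook construction and is not claimed to be the specific operator of the paper. The test function is a placeholder.

```python
import numpy as np

def divided_difference(F, x, y):
    """Standard first-order divided difference [y, x; F] for F: R^n -> R^n.
    Column j differences F in the j-th coordinate after replacing the first
    j components of x by those of y, so that [y, x; F](y - x) = F(y) - F(x)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    D = np.zeros((n, n))
    for j in range(n):
        hi = np.concatenate([y[:j + 1], x[j + 1:]])   # (y_1..y_j, x_{j+1}..x_n)
        lo = np.concatenate([y[:j], x[j:]])           # (y_1..y_{j-1}, x_j..x_n)
        D[:, j] = (F(hi) - F(lo)) / (y[j] - x[j])
    return D

# For y close to x this matrix approximates the Fréchet derivative F'(x).
F = lambda v: np.array([v[0]**2 + v[1], np.sin(v[0]) + v[1]**2])
print(divided_difference(F, np.array([1.0, 2.0]), np.array([1.1, 2.1])))
```

Because only function values enter such an operator, convergence conditions stated in terms of the first derivative (or of Lipschitz-type functions for it) are the natural setting for these methods.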

