Convergence Analysis and Complex Geometry of an Efficient Derivative-Free Iterative Method

Mathematics ◽  
2019 ◽  
Vol 7 (10) ◽  
pp. 919
Author(s):  
Deepak Kumar ◽  
Janak Raj Sharma ◽  
Lorentz Jäntschi

To locate a locally unique solution of a nonlinear equation, the local convergence analysis of a derivative-free fifth-order method is studied in a Banach space setting. The approach provides the radius of convergence and error bounds under hypotheses based on the first Fréchet derivative only. Such estimates are not provided by earlier procedures, which employ Taylor expansions of higher-order derivatives that may not exist or may be expensive to compute. The convergence domain of the method is also shown visually by means of basins of attraction. The theoretical results are supported by numerical experiments, including cases where the earlier results are not applicable.
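
To illustrate how such basins of attraction are typically produced (the paper's fifth-order derivative-free scheme itself is not reproduced here), the sketch below colours each starting point of a grid in the complex plane by the root to which a one-point iteration converges. The test polynomial z^3 - 1, the Newton step used as a stand-in for the method, and all tolerances are illustrative assumptions.

```python
import numpy as np

# Illustrative setup: basins of attraction for Newton's iteration on p(z) = z^3 - 1.
# The paper studies a fifth-order derivative-free scheme; any one-point iteration
# can be substituted for `step` below.
p = lambda z: z**3 - 1
dp = lambda z: 3 * z**2
roots = np.array([1, -0.5 + 0.5j * np.sqrt(3), -0.5 - 0.5j * np.sqrt(3)])

def step(z):
    return z - p(z) / dp(z)          # replace with the method under study

def basins(re_lim=(-2, 2), im_lim=(-2, 2), n=400, max_iter=50, tol=1e-6):
    x = np.linspace(*re_lim, n)
    y = np.linspace(*im_lim, n)
    Z = x[None, :] + 1j * y[:, None]
    label = -np.ones(Z.shape, dtype=int)     # -1 = not converged (black region)
    for _ in range(max_iter):
        with np.errstate(all="ignore"):
            Z = step(Z)
        for k, r in enumerate(roots):
            hit = (np.abs(Z - r) < tol) & (label < 0)
            label[hit] = k                   # colour by the root reached first
    return label

if __name__ == "__main__":
    img = basins()
    # e.g. plt.imshow(img, extent=(-2, 2, -2, 2)); plt.show()
```

Replacing `step` with the scheme under study yields the kind of basin pictures discussed in the paper.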

Mathematics ◽  
2021 ◽  
Vol 9 (19) ◽  
pp. 2510
Author(s):  
Deepak Kumar ◽  
Sunil Kumar ◽  
Janak Raj Sharma ◽  
Lorentz Jäntschi

We study the local convergence of a fifth-order method and its multi-step version in Banach spaces. The hypotheses used are based on the first Fréchet derivative only. The new approach provides a computable radius of convergence, error bounds on the distances involved, and estimates on the uniqueness of the solution. Such estimates are not provided by approaches that use Taylor expansions of higher-order derivatives, which may not exist or may be very expensive or impossible to compute. Numerical examples validate the theoretical results. The convergence domains of the methods are also examined through their complex geometry by drawing basins of attraction; the boundaries of the basins show fractal-like shapes, about which the basins are symmetric.
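
Numerical validation of a theoretical convergence order is commonly done with the (approximate) computational order of convergence. The sketch below is a generic implementation of that estimate; Newton's method on f(x) = x^3 - 2 is used purely as a hypothetical stand-in for the fifth-order methods analysed here.

```python
import numpy as np

def acoc(xs):
    """Approximate computational order of convergence from a list of iterates.

    Uses rho_n = ln(e_{n+1}/e_n) / ln(e_n/e_{n-1}) with e_n = |x_{n+1} - x_n|,
    a standard estimate used to validate a method's theoretical order.
    """
    e = np.abs(np.diff(np.asarray(xs, dtype=float)))
    return np.log(e[2:] / e[1:-1]) / np.log(e[1:-1] / e[:-2])

# Hypothetical usage with Newton's method on f(x) = x**3 - 2 (order 2 expected):
f, df = lambda x: x**3 - 2, lambda x: 3 * x**2
x, xs = 1.5, [1.5]
for _ in range(5):
    x = x - f(x) / df(x)
    xs.append(x)
print(acoc(xs))        # entries approach 2 before rounding error dominates
```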


2022 ◽  
Vol 40 ◽  
pp. 1-18
Author(s):  
J. R. Sharma ◽  
Ioannis K. Argyros ◽  
Deepak Kumar

We introduce a new, faster King-Werner-type derivative-free method for solving nonlinear equations. Local as well as semi-local convergence analysis is presented under weak center-Lipschitz and Lipschitz conditions. The convergence order and the convergence radii are provided, and the radii are compared with the corresponding radii of similar methods. Numerical examples further validate the theoretical results.
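
One classical derivative-free King-Werner-type template replaces the derivative f'((x_n + y_n)/2) of the original King-Werner scheme by the divided difference f[x_n, y_n]. The scalar sketch below illustrates that classical template only; it is not the faster variant introduced in the paper, and the test function is a hypothetical example.

```python
import math

def king_werner_df(f, x0, y0, tol=1e-12, max_iter=50):
    """Scalar sketch of a classical King-Werner-type derivative-free iteration:
    the derivative f'((x + y)/2) is replaced by the divided difference f[x, y].
    This is the classical template only; the paper's faster variant modifies it.
    """
    x, y = x0, y0
    for _ in range(max_iter):
        if x == y:                        # divided difference undefined
            break
        dd = (f(x) - f(y)) / (x - y)      # f[x, y] ~ f'((x + y) / 2)
        x_new = x - f(x) / dd
        y = x_new - f(x_new) / dd         # reuse the same divided difference
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage on f(x) = exp(x) - 2x - 1 (positive root near 1.2564):
print(king_werner_df(lambda x: math.exp(x) - 2 * x - 1, 1.0, 1.1))
```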


2018 ◽  
Vol 27 (1) ◽  
pp. 01-08
Author(s):  
Ioannis K. Argyros ◽  
George Santhosh

We present a semi-local convergence analysis for a Newton-like method used to approximate solutions of equations whose derivative is not necessarily invertible, in a Banach space setting. In the special case where the equation is defined on the real line, the convergence domain of this method is improved compared to earlier results. Numerical examples are also presented in which the earlier results do not apply but the new results do.
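
When the derivative may be singular, Newton-like iterations are often written with a generalized inverse in place of the inverse. The sketch below is a generic least-squares (Moore-Penrose) realisation of such a step; the operator A(x) and the test system are illustrative assumptions, not the specific Newton-like method analysed in the paper.

```python
import numpy as np

def newton_like_lstsq(F, A, x0, tol=1e-10, max_iter=100):
    """Illustrative Newton-like iteration x_{k+1} = x_k - A(x_k)^+ F(x_k).

    A(x) is a (possibly singular) approximation to the derivative F'(x);
    a Moore-Penrose least-squares solve replaces the inverse. This is a
    generic sketch, not the specific operator studied in the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step, *_ = np.linalg.lstsq(A(x), F(x), rcond=None)
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Hypothetical usage: F(x, y) = (x**2 + y**2 - 1, x - y), A = exact Jacobian.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[0] - v[1]])
A = lambda v: np.array([[2 * v[0], 2 * v[1]], [1.0, -1.0]])
print(newton_like_lstsq(F, A, [2.0, 0.5]))   # -> approximately (0.7071, 0.7071)
```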


2021 ◽  
Vol 4 (1) ◽  
pp. 34-43
Author(s):  
Samundra Regmi ◽  
Ioannis K. Argyros ◽  
Santhosh George ◽  
...  

In this study, a convergence analysis is presented for a fast multi-step Chebyshev-Halley-type method for solving nonlinear equations involving Banach space valued operators. We introduce a more precise convergence region containing the iterates, leading to tighter Lipschitz constants and functions. In this way, advantages are obtained in both the local and the semi-local convergence cases under the same computational cost: an extended convergence domain, tighter error bounds on the distances involved, and more precise information on the location of the solution. The new technique can also be used to extend the applicability of other iterative methods. Numerical examples further validate the theoretical results.
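
For reference, the classical scalar Chebyshev-Halley family (with Chebyshev's, Halley's and the super-Halley methods as the special cases alpha = 0, 1/2, 1) can be sketched as below. This is only the classical one-step template, not the multi-step Banach-space method analysed in the paper, and the test equation is hypothetical.

```python
def chebyshev_halley(f, df, d2f, x0, alpha=0.5, tol=1e-12, max_iter=50):
    """Classical scalar Chebyshev-Halley family:

        x_{n+1} = x_n - [1 + L / (2 * (1 - alpha * L))] * f(x_n) / f'(x_n),
        L = f(x_n) * f''(x_n) / f'(x_n)**2.

    Shown only as the classical template; the paper analyses a multi-step
    Banach-space variant of this family.
    """
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        L = fx * d2f(x) / dfx**2
        x_new = x - (1 + L / (2 * (1 - alpha * L))) * fx / dfx
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage (Halley's method, alpha = 0.5) on f(x) = x**3 - 10:
print(chebyshev_halley(lambda x: x**3 - 10, lambda x: 3 * x**2,
                       lambda x: 6 * x, x0=2.0))   # ~2.15443469
```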


Mathematics ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 299 ◽  
Author(s):  
Ioannis Argyros ◽  
Á. Magreñán ◽  
Lara Orcos ◽  
Íñigo Sarría

The aim of this paper is to present a new semi-local convergence analysis for Newton’s method in a Banach space setting. The novelty is that, by using more precise Lipschitz constants than in earlier studies together with our new idea of restricted convergence domains, we extend the applicability of Newton’s method as follows: the convergence domain is extended, the error estimates are tighter, and the information on the location of the solution is at least as precise as before. These advantages are obtained using the same information as before, since the new Lipschitz constants are tighter than, and special cases of, the ones used earlier. Numerical examples and applications are used to compare the theoretical results favorably with earlier ones.
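
For context, the classical affine-invariant Newton-Kantorovich criterion that such semi-local analyses sharpen can be stated as follows (standard textbook form, not the paper's refined condition with restricted domains):

```latex
% Classical affine-invariant Newton--Kantorovich criterion (standard form,
% not the refined condition of the paper):
\[
  \|F'(x_0)^{-1}F(x_0)\| \le \eta, \qquad
  \|F'(x_0)^{-1}\bigl(F'(x)-F'(y)\bigr)\| \le K\,\|x-y\| \quad (x,y \in D),
\]
\[
  h \;=\; K\eta \;\le\; \tfrac{1}{2}
  \quad\Longrightarrow\quad
  x_{n+1}=x_n-F'(x_n)^{-1}F(x_n)\ \text{converges to a solution } x^{\ast}
  \ \text{with}\ \|x^{\ast}-x_0\| \le \frac{1-\sqrt{1-2h}}{K},
\]
% provided the closed ball of that radius around x_0 lies in D.
```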


2020 ◽  
Vol 36 (3) ◽  
pp. 365-372
Author(s):  
I. K. ARGYROS ◽  
R. P. IAKYMCHUK ◽  
S. M. SHAKHNO ◽  
H. P. YARMOLA

We present a local convergence analysis of a two-step Gauss-Newton method under generalized and classical Lipschitz conditions for the first- and second-order derivatives. In contrast to earlier works, we use our new idea of a center average Lipschitz condition, through which we define a subset of the original domain that also contains the iterates. The remaining average Lipschitz conditions are then at least as tight as the corresponding ones in earlier works. In this way, we obtain weaker sufficient convergence criteria, a larger radius of convergence, tighter error estimates, and more precise information on the location of the solution. These advantages come at the same computational effort, since the new Lipschitz functions are special cases of the ones in earlier works. Finally, we give a numerical example that confirms the theoretical results and compares favorably with the results from previous works.
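
As a rough illustration of the two-step iteration pattern (a predictor and a corrector step reusing the same Jacobian), the sketch below implements a generic two-step Gauss-Newton step with a least-squares solve. The exact scheme and the average Lipschitz conditions analysed in the paper may differ, and the residual function is a made-up example.

```python
import numpy as np

def two_step_gauss_newton(F, J, x0, tol=1e-10, max_iter=100):
    """Generic two-step Gauss-Newton sketch for nonlinear least squares:
    a predictor and a corrector step that reuse the Jacobian J(x_k).
    The exact scheme analysed in the paper may differ; this only illustrates
    the two-step iteration pattern.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Jx = J(x)
        y = x - np.linalg.lstsq(Jx, F(x), rcond=None)[0]      # predictor
        x_new = y - np.linalg.lstsq(Jx, F(y), rcond=None)[0]  # corrector, same J
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage: overdetermined residuals with a zero-residual point (1, 2):
# F(a, b) = (a + b - 3, a*b - 2, a - 1).
F = lambda v: np.array([v[0] + v[1] - 3, v[0] * v[1] - 2, v[0] - 1])
J = lambda v: np.array([[1.0, 1.0], [v[1], v[0]], [1.0, 0.0]])
print(two_step_gauss_newton(F, J, [0.5, 0.5]))   # -> approximately (1, 2)
```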


2018 ◽  
Vol 15 (03) ◽  
pp. 1850010 ◽  
Author(s):  
Janak Raj Sharma ◽  
Ioannis K. Argyros ◽  
Deepak Kumar

We develop a general class of derivative-free iterative methods with optimal order of convergence, in the sense of the Kung–Traub hypothesis, for solving nonlinear equations. The methods have a very simple design, which makes them easy to remember and hence easy to implement. The methodology is based on the quadratically convergent Traub–Steffensen scheme and is further developed using Padé approximation. A local convergence analysis is provided to show that the iterations are locally well defined and convergent. Numerical examples confirm the theoretical results and show the good performance of the new methods.
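
The base of the class is the quadratically convergent Traub–Steffensen scheme, which replaces the derivative in Newton's step by the divided difference f[w_n, x_n] with w_n = x_n + beta * f(x_n). The sketch below shows this base scheme only; the higher-order Padé-based extensions of the paper are not reproduced, and beta and the test function are illustrative choices.

```python
import math

def traub_steffensen(f, x0, beta=1.0, tol=1e-12, max_iter=50):
    """Base quadratically convergent Traub-Steffensen scheme:

        w_n = x_n + beta * f(x_n),
        x_{n+1} = x_n - f(x_n) / f[w_n, x_n],

    with the divided difference f[w, x] = (f(w) - f(x)) / (w - x).
    Only the base scheme is sketched; the paper's optimal higher-order class
    builds on this step via Pade approximants.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        w = x + beta * fx
        if w == x:                      # f(x) == 0: already at a root
            return x
        dd = (f(w) - fx) / (w - x)      # f[w, x]
        x_new = x - fx / dd
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage on f(x) = x * exp(x) - 1 (root near 0.567143):
print(traub_steffensen(lambda x: x * math.exp(x) - 1, x0=1.0, beta=0.1))
```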


Mathematics ◽  
2020 ◽  
Vol 8 (5) ◽  
pp. 709 ◽  
Author(s):  
Deepak Kumar ◽  
Janak Raj Sharma ◽  
Ioannis K. Argyros

We suggest a derivative-free optimal method of second order, a new variant of the modified Newton method, for computing multiple zeros of nonlinear single-variable functions. Derivative-free iterative methods for multiple zeros are not easy to obtain and hence are rare in the literature. Motivated by this fact, we develop a family of optimal second-order derivative-free methods for multiple zeros that require only two function evaluations per iteration. The stability of the methods is validated through their complex geometry by drawing basins of attraction. Moreover, the applicability of the methods is demonstrated on different functions. The numerical results show that the new derivative-free methods are good alternatives to existing optimal second-order techniques that require derivative calculations.
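
A simple and widely used derivative-free template for a zero of known multiplicity m replaces f' in the modified Newton step x - m*f/f' by the Traub-Steffensen divided difference f[w_n, x_n], w_n = x_n + beta * f(x_n), keeping two function evaluations per iteration. The sketch below shows only this generic template; it is not necessarily the family proposed in the paper, and the test function is hypothetical.

```python
import math

def multiple_zero_df(f, x0, m, beta=1.0, tol=1e-12, max_iter=100):
    """Generic derivative-free template for a zero of known multiplicity m:

        w_n = x_n + beta * f(x_n),
        x_{n+1} = x_n - m * f(x_n) / f[w_n, x_n],

    i.e. a Traub-Steffensen-type replacement of f' in the modified Newton step
    x - m*f/f'. Two function evaluations per iteration; the family proposed in
    the paper may differ in its exact weight function.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        w = x + beta * fx
        if w == x:                      # f(x) underflows: stop refining
            return x
        dd = (f(w) - fx) / (w - x)      # f[w, x]
        x_new = x - m * fx / dd
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage: f(x) = (x - 1)**2 * exp(x) has a double zero at x = 1.
# (Accuracy at a double root is limited to roughly sqrt(machine epsilon).)
print(multiple_zero_df(lambda x: (x - 1)**2 * math.exp(x), x0=1.2, m=2))
```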


Symmetry ◽  
2019 ◽  
Vol 11 (4) ◽  
pp. 518 ◽  
Author(s):  
Janak Raj Sharma ◽  
Deepak Kumar ◽  
Ioannis K. Argyros

Many higher-order multiple-root solvers that require derivative evaluations are available in the literature. In contrast, higher-order multiple-root solvers without derivatives are difficult to obtain, and such techniques are therefore still largely lacking. Motivated by this fact, we focus on developing a new family of higher-order derivative-free solvers for computing multiple zeros using a simple approach. The stability of the techniques is checked through their complex geometry by drawing basins of attraction. Applicability is demonstrated on practical problems, which illustrates the efficient convergence behavior. Moreover, the comparison of numerical results shows that the proposed derivative-free techniques are good competitors of existing techniques that require derivative evaluations in each iteration.

