A Family of Fifth and Sixth Convergence Order Methods for Nonlinear Models

Symmetry ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 715
Author(s):  
Ioannis K. Argyros ◽  
Debasis Sharma ◽  
Christopher I. Argyros ◽  
Sanjaya Kumar Parhi ◽  
Shanta Kumari Sunanda

We study the local convergence of a family of fifth and sixth convergence order derivative-free methods for solving Banach space valued nonlinear models. Earlier results used hypotheses involving derivatives up to the seventh order to show convergence. In contrast, our analysis uses only the first-order divided difference and the first derivative. We also provide a computable radius of convergence, error estimates, and uniqueness-of-solution results not given in earlier studies. Hence, we expand the applicability of these methods. The dynamical analysis of the discussed family is also presented. Numerical experiments complete this article.
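The derivative-free idea can be illustrated with the classical Steffensen iteration, in which the first-order divided difference replaces the derivative in Newton's method. This is a minimal scalar sketch, not the paper's fifth/sixth order family; the test function and starting point are illustrative choices.

```python
# Steffensen-type derivative-free iteration: the derivative in Newton's
# method is replaced by the first-order divided difference
# [x, x + f(x); f] = (f(x + f(x)) - f(x)) / f(x).
def steffensen(f, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        # divided difference approximating f'(x), using only f-evaluations
        dd = (f(x + fx) - fx) / fx
        x = x - fx / dd
    return x

# illustrative example: root of x^2 - 2, i.e. sqrt(2)
root = steffensen(lambda t: t**2 - 2.0, 1.0)
```

Like Newton's method, this converges quadratically to simple roots, but without any derivative evaluations, which is what makes divided-difference hypotheses natural in the convergence analysis.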

Algorithms ◽  
2020 ◽  
Vol 13 (6) ◽  
pp. 147
Author(s):  
Samundra Regmi ◽  
Ioannis K. Argyros ◽  
Santhosh George

A local convergence comparison is presented between two ninth order algorithms for solving nonlinear equations. Earlier studies established convergence using derivatives up to the tenth order, which do not appear in the algorithms. Moreover, no computable error estimates, radius of convergence, or uniqueness-of-solution results were given. The novelty of our study is that we address all these concerns using only the first derivative, which actually appears in these algorithms, thereby extending their applicability. Our technique provides a direct comparison between the algorithms under the same set of convergence criteria and can be applied to other algorithms as well. Numerical experiments are used to test the convergence criteria.
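To make "computable radius of convergence" concrete, here is the classical estimate for plain Newton's method (not the ninth order algorithms of the abstract): under the scaled Lipschitz condition |f'(x*)^{-1}(f'(x) - f'(y))| <= L|x - y| near a simple root x*, Newton iterates converge from any starting point in the ball of radius r = 2/(3L) around x*. The example function is an illustrative choice.

```python
# Classical (Rheinboldt/Traub) local convergence radius for Newton's
# method under a Lipschitz condition on the first derivative, scaled by
# the inverse derivative at the root: r = 2 / (3L).
def newton_radius(L):
    """Radius of the ball around x* inside which Newton converges."""
    return 2.0 / (3.0 * L)

# Example: f(x) = x**2 - 2 near x* = sqrt(2).  f'(x) = 2x, so
# |f'(x*)^{-1}(f'(x) - f'(y))| = |x - y| / sqrt(2), giving L = 1/sqrt(2).
L = 1.0 / 2.0**0.5
r = newton_radius(L)
```

Analyses like the one in the abstract derive analogous (method-specific) radii, which is what makes the convergence criteria directly comparable between algorithms.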


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Samundra Regmi ◽  
Christopher I. Argyros ◽  
Ioannis K. Argyros ◽  
Santhosh George

Abstract The applicability of an efficient sixth convergence order scheme is extended for solving Banach space valued equations. Previous works required the seventh derivative, which does not appear in the scheme; we use only the first derivative, which does. Moreover, bounds on the error distances and uniqueness-of-solution results, not given in earlier works, are provided based on ω-continuity conditions. Numerical examples complete this article.


2021 ◽  
pp. 246-257
Author(s):  
Ioannis K. Argyros ◽  
Santhosh George ◽  
Christopher I. Argyros

The applicability of two competing efficient sixth convergence order schemes is extended for solving Banach space valued equations. Previous works required the seventh derivative, which does not appear in the schemes; we use only the first derivative, which does. Moreover, bounds on the error distances and uniqueness-of-solution results, not given in earlier works, are provided based on ω-continuity conditions. Since our technique is general, it extends analogously to other schemes. Numerical examples complete this work.


Algorithms ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 25
Author(s):  
Janak Raj Sharma ◽  
Sunil Kumar ◽  
Ioannis K. Argyros

We discuss the local convergence of a derivative-free eighth order method in a Banach space setting. The present study provides the radius of convergence and bounds on errors under hypotheses based only on the first Fréchet derivative. Approaches based on Taylor expansions, which involve higher order derivatives, do not provide such estimates, since those derivatives may not exist or may be costly to compute. By using only the first derivative, the method can be applied to a wider class of functions, and hence its range of applications is expanded. Numerical experiments show that the present results are applicable to cases wherein previous results cannot be applied.
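For a mapping F: R^n → R^n, the first Fréchet derivative is simply the Jacobian matrix. A bare Newton step, the building block that higher-order schemes such as the one above refine, is sketched below for a 2x2 system; the system F and Jacobian J are illustrative choices, not taken from the paper.

```python
# Minimal Newton iteration for a system F(x) = 0 in R^2.  The only
# derivative information used is the Jacobian (the first Fréchet
# derivative); the 2x2 linear system is solved by Cramer's rule.
def newton2(F, J, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f1, f2 = F(x)
        if abs(f1) + abs(f2) < tol:
            break
        (a, b), (c, d) = J(x)
        det = a * d - b * c
        # solve J * delta = -F for the Newton correction delta
        dx = (-f1 * d + f2 * b) / det
        dy = (-f2 * a + f1 * c) / det
        x = (x[0] + dx, x[1] + dy)
    return x

# illustrative system: unit circle intersected with the line x = y
F = lambda p: (p[0]**2 + p[1]**2 - 1.0, p[0] - p[1])
J = lambda p: ((2.0 * p[0], 2.0 * p[1]), (1.0, -1.0))
sol = newton2(F, J, (1.0, 0.5))
```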


2021 ◽  
Vol 56 (1) ◽  
pp. 72-82
Author(s):  
I.K. Argyros ◽  
D. Sharma ◽  
C.I. Argyros ◽  
S.K. Parhi ◽  
S.K. Sunanda ◽  
...  

In earlier work, the expensive Taylor formula and conditions on derivatives up to the eighth order were utilized to establish the convergence of a derivative free class of seventh order iterative algorithms. Moreover, no error distances or results on uniqueness of the solution were given. In this study, extended ball convergence analysis is derived for this class by imposing conditions on the first derivative. Additionally, we offer error distances and the convergence radius together with the region of uniqueness for the solution. Therefore, we enlarge the practical utility of these algorithms. Also, convergence regions of a specific member of this class are displayed for solving complex polynomial equations. At the end, standard numerical applications are provided to illustrate the efficacy of our theoretical findings.


2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
M. Sharifi ◽  
S. Karimi Vanani ◽  
F. Khaksar Haghani ◽  
M. Arab ◽  
S. Shateyi

The purpose of this paper is to derive and discuss a three-step iterative scheme for solving nonlinear equations. Specifically, we derive a derivative-free form of one of the existing optimal eighth-order methods while preserving its convergence order. The theoretical results are supported by numerical experiments.


2021 ◽  
Vol 40 (3) ◽  
Author(s):  
Qiumei Huang ◽  
Min Wang

Abstract In this paper, we discuss the superconvergence of the "interpolated" collocation solutions for weakly singular Volterra integral equations of the second kind. Based on the collocation solution $u_h$, two different interpolation postprocessing approximations of higher accuracy are constructed: $I_{2h}^{2m-1}u_h$, based on the collocation points, and $I_{2h}^{m}u_h$, based on a least squares scheme; their convergence order is the same as that of the iterated collocation solution. Such interpolation postprocessing methods are much simpler to compute. We further apply this interpolation postprocessing technique to hybrid collocation solutions, and similar results are obtained. Numerical experiments demonstrate the efficiency of the interpolation postprocessing methods.
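A minimal numerical scheme for a second-kind Volterra equation u(t) = g(t) + ∫₀ᵗ K(t,s) u(s) ds helps fix ideas: the sketch below uses the trapezoidal rule on a uniform grid as a low-order stand-in for the collocation schemes of the abstract. The kernel, right-hand side, and grid are illustrative choices (and the kernel here is smooth, not weakly singular).

```python
# Trapezoidal-rule discretization of u(t) = g(t) + int_0^t K(t,s) u(s) ds
# on a uniform grid.  At each node the unknown u_i appears in the
# quadrature with weight h/2, so it is solved for explicitly.
def volterra_trapezoid(g, K, T, n):
    h = T / n
    t = [i * h for i in range(n + 1)]
    u = [g(t[0])]                         # at t = 0 the integral vanishes
    for i in range(1, n + 1):
        s = 0.5 * K(t[i], t[0]) * u[0]
        s += sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        rhs = g(t[i]) + h * s
        u.append(rhs / (1.0 - 0.5 * h * K(t[i], t[i])))
    return t, u

# u(t) = 1 + int_0^t u(s) ds has exact solution u(t) = e^t
t, u = volterra_trapezoid(lambda t: 1.0, lambda t, s: 1.0, 1.0, 200)
```

The trapezoidal scheme is second-order accurate for smooth kernels; the postprocessing techniques in the abstract raise the accuracy of collocation solutions beyond their base order in an analogous spirit.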


2015 ◽  
Vol 2015 ◽  
pp. 1-5
Author(s):  
M. Sharifi ◽  
S. Karimi Vanani ◽  
F. Khaksar Haghani ◽  
M. Arab ◽  
S. Shateyi

The aim of this paper is to construct a method with memory, based on King's family of methods without memory, for solving nonlinear equations. It is proved that the proposed method possesses a higher R-order of convergence while using the same number of functional evaluations as King's family. Numerical experiments are given to illustrate the performance of the constructed scheme.
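For reference, King's fourth-order family without memory, the starting point for the with-memory construction, uses one derivative and two function evaluations per iteration with a free parameter β. The test function and starting point below are illustrative choices.

```python
# King's fourth-order family (without memory):
#   w = x - f(x)/f'(x)                                 (Newton predictor)
#   x_new = w - f(w)/f'(x) * (f(x) + b*f(w)) / (f(x) + (b - 2)*f(w))
def king(f, df, x, beta=0.5, tol=1e-12, max_iter=20):
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        w = x - fx / df(x)                  # Newton predictor
        fw = f(w)
        # King's beta-weighted corrector step
        x = w - (fw / df(x)) * (fx + beta * fw) / (fx + (beta - 2.0) * fw)
    return x

# illustrative example: real cube root of 2
root = king(lambda t: t**3 - 2.0, lambda t: 3.0 * t**2, 1.5)
```

With-memory variants such as the one in the abstract adapt β from previous iterates, raising the R-order above four at no extra evaluation cost.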


2019 ◽  
Vol 53 (2) ◽  
pp. 657-666
Author(s):  
Mohammad Afzalinejad

A difficulty with rapidly convergent methods for unconstrained optimization, such as Newton's method, is the computational cost arising especially from the second derivative. In this paper, a class of methods for solving unconstrained optimization problems is proposed which implicitly applies approximations to derivatives. This class is based on a modified Steffensen method for finding roots of a function and builds a quadratic model of the objective without using the second derivative. Two computationally inexpensive methods of this kind are proposed which use only the first derivative of the function. Derivative-free versions of these methods are also suggested for cases where the gradient formulas are unavailable or difficult to evaluate. Both the theory and numerical experiments confirm the rapid convergence of this class of methods.
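A sketch in the spirit of the abstract (not the paper's algorithm): drive the first derivative of the objective to zero with a Steffensen-type step, so the second derivative is replaced by a divided difference of the gradient. Only first-derivative evaluations are used; the quadratic objective is an illustrative choice.

```python
# Steffensen-style minimization of a one-dimensional objective phi:
# find a stationary point by solving dphi(x) = 0, approximating the
# curvature phi''(x) with a divided difference of dphi.
def minimize_steffensen(dphi, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        g = dphi(x)
        if abs(g) < tol:
            break
        # divided difference of the gradient stands in for phi''(x)
        curv = (dphi(x + g) - g) / g
        x = x - g / curv
    return x

# phi(t) = (t - 3)^2 + 1, so dphi(t) = 2(t - 3) and the minimizer is t* = 3
xmin = minimize_steffensen(lambda t: 2.0 * (t - 3.0), 0.0)
```

On a quadratic the divided difference reproduces the curvature exactly, so the iteration lands on the minimizer in one step; in general it inherits Steffensen's quadratic local convergence without any second-derivative evaluations.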

