A Modified Ren’s Method with Memory Using a Simple Self-Accelerating Parameter

Mathematics ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 540 ◽  
Author(s):  
Xiaofeng Wang ◽  
Qiannan Fan

In this paper, a self-accelerating method is proposed for solving nonlinear equations; it is a modification of Ren’s method. A simple approach is used to construct the variable self-accelerating parameter of the new method, which does not increase the computational cost. The highest convergence order of the new method is 2 + √6 ≈ 4.4495. Numerical experiments are presented to show the performance of the new method, which supports the theoretical results.
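The fractional order above arises in the standard way for methods with memory. Assuming an error relation of the form e_{k+1} ∼ C_k e_k^p e_{k−1}^q (the paper's exact relation is not reproduced here; (p, q) = (4, 2) is the pair consistent with the quoted value), the R-order is the unique positive root of the indicial equation:
\[
t^{2} - p\,t - q = 0, \qquad (p,q)=(4,2) \;\Longrightarrow\; t = 2 + \sqrt{6} \approx 4.4495 .
\]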

2011 ◽  
Vol 5 (2) ◽  
pp. 298-317 ◽  
Author(s):  
Miodrag Petkovic ◽  
Jovana Dzunic ◽  
Ljiljana Petkovic

An efficient family of two-point derivative-free methods with memory for solving nonlinear equations is presented. It is proved that the convergence order of the proposed family is increased from 4 to at least 2 + √6 ≈ 4.45, 5, (5 + √33)/2 ≈ 5.37, and 6, depending on the accelerating technique. The increase of convergence order is attained by varying a free parameter in each iteration using a suitable accelerating technique. The improvement of the convergence rate is achieved without any additional function evaluations, which makes the proposed methods with memory very efficient. Moreover, the presented methods are more efficient than all existing two-point methods and three-point methods of optimal order eight known in the literature. Numerical examples and a comparison with existing two-point methods are included to confirm the theoretical results and the high computational efficiency. 2010 Mathematics Subject Classification: 65H05.
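The efficiency claim can be made concrete with Ostrowski's efficiency index E = r^(1/θ), where r is the convergence order and θ the number of function evaluations per iteration. The short check below is a sketch; the evaluation counts are the usual ones for these method classes (three for a two-point derivative-free method, four for an optimal three-point method) and are an assumption, not figures taken from the paper.

```python
# Ostrowski efficiency index E = r ** (1 / theta): order r obtained from theta
# function evaluations per iteration. Evaluation counts are the usual ones for
# these method classes (an assumption, not taken from the paper's tables).
methods = {
    "two-point with memory, order 6":           (6.0, 3),
    "two-point with memory, order 2 + sqrt(6)": (2 + 6 ** 0.5, 3),
    "optimal two-point, order 4":               (4.0, 3),
    "optimal three-point, order 8":             (8.0, 4),
}
for name, (order, evals) in methods.items():
    print(f"{name:45s} E = {order ** (1 / evals):.4f}")
# order 6 from 3 evaluations gives E ~ 1.817, beating 8 ** (1/4) ~ 1.682
```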


2014 ◽  
Vol 11 (05) ◽  
pp. 1350078 ◽  
Author(s):  
XIAOFENG WANG ◽  
TIE ZHANG

In this paper, we present some three-point Newton-type iterative methods without memory for solving nonlinear equations, derived by the method of undetermined coefficients. The order of convergence of the new methods without memory is eight, requiring evaluations of three functions and one first-order derivative per full iteration. Hence, the new methods are optimal according to Kung and Traub's conjecture. Based on the presented methods without memory, we construct two families of Newton-type iterative methods with memory. Further acceleration of the convergence speed is obtained by using a self-accelerating parameter, which is calculated from a Hermite interpolating polynomial and applied to improve the order of convergence of the Newton-type method. The corresponding R-order of convergence is increased from 8 to 9, [Formula: see text] and 10. The increase of convergence order is attained without any additional calculations, so the two families of methods with memory possess a very high computational efficiency. Numerical examples are given to confirm the theoretical results.
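For context, the Kung–Traub conjecture states that a multipoint method without memory using n function evaluations per iteration can reach order at most 2^(n−1). With the three function values and one derivative used here, n = 4, so
\[
p_{\max} = 2^{\,n-1} = 2^{3} = 8,
\]
which is exactly the order attained by the methods without memory above; this is what "optimal" refers to.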


Mathematics ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. 108 ◽  
Author(s):  
Xiaofeng Wang ◽  
Yuxi Tao

A new Newton method with memory is proposed by using a variable self-accelerating parameter. First, a modified Newton method without memory with an invariant parameter is constructed for solving nonlinear equations. Replacing the invariant parameter of the method without memory with a variable self-accelerating parameter, we obtain a novel Newton method with memory. The convergence order of the new Newton method with memory is 1 + √2 ≈ 2.414. The acceleration of the convergence rate is attained without any additional function evaluations. The main innovation is that the self-accelerating parameter is constructed in a simple way. Numerical experiments show that the presented method has a faster convergence speed than existing methods.
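The authors' exact iteration is not reproduced here; the sketch below instead shows the classical Traub–Steffensen method with memory, a different but closely related scheme that reaches the same order 1 + √2 through the same kind of simple parameter update, reusing only quantities already computed.

```python
# Hedged sketch: classical Traub-Steffensen method with memory (not the paper's
# scheme). A single self-accelerating parameter gamma is updated each iteration
# from already computed values, lifting the order from 2 to 1 + sqrt(2) ~ 2.414.

def traub_steffensen_with_memory(f, x0, gamma0=-0.01, tol=1e-12, max_iter=50):
    x, gamma = x0, gamma0
    fx = f(x)
    for _ in range(max_iter):
        w = x + gamma * fx                # auxiliary point w_k = x_k + gamma_k * f(x_k)
        dd = (f(w) - fx) / (w - x)        # divided difference f[x_k, w_k] ~ f'(x_k)
        x_new = x - fx / dd               # derivative-free Newton-like step
        fx_new = f(x_new)
        if abs(fx_new) < tol:
            return x_new
        # self-accelerating update gamma_{k+1} = -1 / f[x_k, x_{k+1}]:
        # uses only fx and fx_new, so no extra function evaluations are needed
        gamma = -(x_new - x) / (fx_new - fx)
        x, fx = x_new, fx_new
    return x

root = traub_steffensen_with_memory(lambda t: t**3 - 2.0, 1.5)  # cube root of 2
```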


2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
M. Sharifi ◽  
S. Karimi Vanani ◽  
F. Khaksar Haghani ◽  
M. Arab ◽  
S. Shateyi

The purpose of this paper is to derive and discuss a three-step iterative scheme for solving nonlinear equations. Specifically, we derive a derivative-free form of one of the existing optimal eighth-order methods while preserving its convergence order. The theoretical results are supported by numerical experiments.
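The usual route to such a derivative-free variant (the specific correction terms that keep the order at eight are given in the paper and not reproduced here) is to replace the first derivative by a divided difference at an auxiliary point:
\[
f'(x_n)\;\approx\; f[x_n, w_n] \;=\; \frac{f(w_n)-f(x_n)}{w_n-x_n}, \qquad w_n = x_n + \beta f(x_n),\quad \beta \neq 0 .
\]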


2015 ◽  
Vol 2015 ◽  
pp. 1-5
Author(s):  
M. Sharifi ◽  
S. Karimi Vanani ◽  
F. Khaksar Haghani ◽  
M. Arab ◽  
S. Shateyi

The aim of this paper is to construct a method with memory based on King's family of methods without memory for nonlinear equations. It is proved that the proposed method possesses a higher R-order of convergence using the same number of functional evaluations as King's family. Numerical experiments are given to illustrate the performance of the constructed scheme.
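For reference, the sketch below shows King's fourth-order family without memory (three evaluations per iteration: f(x_n), f'(x_n), f(y_n)). The with-memory variant studied in the paper replaces the fixed parameter beta by a value recomputed each iteration from already available data; that exact update is not reproduced here.

```python
# King's fourth-order family without memory; beta is the free parameter
# (beta = 0 recovers Ostrowski's method). Three evaluations per iteration.

def king(f, df, x0, beta=2.0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx = df(x)
        y = x - fx / dfx                                   # Newton predictor
        fy = f(y)
        # King's corrector with weight (fx + beta*fy) / (fx + (beta - 2)*fy)
        x = y - fy / dfx * (fx + beta * fy) / (fx + (beta - 2.0) * fy)
    return x

root = king(lambda t: t**2 - 2.0, lambda t: 2.0 * t, 1.0)  # sqrt(2)
```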


Mathematics ◽  
2019 ◽  
Vol 8 (1) ◽  
pp. 2
Author(s):  
Santiago Artidiello ◽  
Alicia Cordero ◽  
Juan R. Torregrosa ◽  
María P. Vassileva

A secant-type method is designed for approximating the inverse and some generalized inverses of a complex matrix A. For a nonsingular matrix, the proposed method gives an approximation of the inverse and, when the matrix is singular, approximations of the Moore–Penrose inverse and the Drazin inverse are obtained. The convergence and the order of convergence are presented in each case. Some numerical tests allow us to confirm the theoretical results and to compare the performance of our method with that of other known ones. With these results, iterative methods with memory appear for the first time for estimating the solution of nonlinear matrix equations.
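The authors' secant-type scheme is not reproduced here; as a point of comparison, the sketch below shows the classical Newton–Schulz iteration, an inverse-free matrix iteration of the same general flavour that converges quadratically to the inverse for nonsingular A and, with this starting guess, to the Moore–Penrose inverse otherwise.

```python
import numpy as np

# Hedged sketch: classical Newton-Schulz iteration X_{k+1} = X_k (2I - A X_k),
# not the paper's secant-type method. The starting guess X0 = A^H / (||A||_1 ||A||_inf)
# guarantees convergence to A^{-1} (nonsingular A) or to the Moore-Penrose inverse.

def newton_schulz(A, iters=30):
    A = np.asarray(A)
    X = A.conj().T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

A = np.array([[4.0, 1.0], [2.0, 3.0]])
print(np.allclose(newton_schulz(A) @ A, np.eye(2)))  # True: X ~ A^{-1}
```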


Filomat ◽  
2021 ◽  
Vol 35 (3) ◽  
pp. 723-730
Author(s):  
Wei Ma ◽  
Liuqing Hua

In this paper, we present a two-step Ulm-type method for solving systems of nonlinear equations without computing Jacobian matrices or solving Jacobian equations. We prove that the two-step Ulm-type method converges locally to the solution with R-convergence rate 3. Numerical implementations demonstrate the effectiveness of the new method.
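For context only, the classical (one-step) Ulm iteration maintains an approximation B_k ≈ F'(x_k)^{-1} alongside the iterate, so no linear system is ever solved:
\[
x_{k+1} = x_k - B_k F(x_k), \qquad B_{k+1} = 2B_k - B_k F'(x_{k+1})\, B_k .
\]
This classical form still evaluates the Jacobian F'; according to the abstract, the two-step variant above removes that evaluation as well while attaining R-order 3.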


2013 ◽  
Vol 7 (2) ◽  
pp. 390-403 ◽  
Author(s):  
Janak Sharma ◽  
Himani Arora

We present a derivative-free method of fourth-order convergence for solving systems of nonlinear equations. The method consists of two steps, the first of which is the well-known Traub method. A first-order divided difference operator for functions of several variables and direct computation by Taylor expansion are used to prove the local convergence order. The computational efficiency of the new method in its general form is discussed and compared with that of existing methods of a similar nature. It is proved that for large systems the new method is more efficient. Some numerical tests are performed to compare the proposed method with existing methods and to confirm the theoretical results.
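For readers unfamiliar with the notation: a first-order divided difference of F : R^n → R^n is any linear operator [y, x; F] satisfying [y, x; F](y − x) = F(y) − F(x). A commonly used componentwise realization (stated here as an assumption; the paper may use a different one) is
\[
[y,x;F]_{ij} \;=\; \frac{F_i(y_1,\dots,y_j,x_{j+1},\dots,x_n)-F_i(y_1,\dots,y_{j-1},x_j,\dots,x_n)}{y_j-x_j}, \qquad 1\le i,j\le n .
\]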


2016 ◽  
Vol 11 (10) ◽  
pp. 5774-5780
Author(s):  
Rajinder Thukral

A new one-point iterative method for solving nonlinear equations is constructed. It is proved that the new method has convergence order three and requires two evaluations of the function per iteration. Kung and Traub conjectured that multipoint iterative methods without memory based on n evaluations can achieve a maximum convergence order of 2^(n−1); the new method produces convergence order three, which is better than the expected maximum order of two. Hence, we demonstrate that the conjecture fails for a particular set of nonlinear equations. Numerical comparisons are included to demonstrate the exceptional convergence speed of the proposed method using only a few function evaluations.


2019 ◽  
Vol 13 (2) ◽  
pp. 399-422
Author(s):  
Miodrag Petkovic ◽  
Ljiljana Petkovic ◽  
Beny Neta

Generalized Halley-like one-parameter families of order three and four for finding a multiple root of a nonlinear equation are constructed and studied. The presentation is a mixture of theoretical results, algorithmic aspects, numerical experiments, and computer graphics. Starting from the proposed class of third-order methods and using an accelerating procedure, we construct a new fourth-order family of Halley type. To analyze the convergence behavior of the two presented families, we use two methodologies: (i) testing by numerical examples and (ii) a dynamic study using basins of attraction.
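As background, the sketch below shows the classical Halley iteration for a simple root (cubic convergence); the families studied in the paper generalize this to a root of known multiplicity m and embed a free parameter, and those exact formulas are not reproduced here.

```python
# Hedged sketch: classical Halley method for a simple root, cubically convergent.
# x_{n+1} = x_n - 2 f f' / (2 f'^2 - f f'')

def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx, d2fx = df(x), d2f(x)
        x = x - 2.0 * fx * dfx / (2.0 * dfx**2 - fx * d2fx)
    return x

root = halley(lambda t: t**3 - 2.0, lambda t: 3.0 * t**2, lambda t: 6.0 * t, 1.0)  # cube root of 2
```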

