A New Newton Method with Memory for Solving Nonlinear Equations

Mathematics ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. 108 ◽  
Author(s):  
Xiaofeng Wang ◽  
Yuxi Tao

A new Newton method with memory is proposed by using a variable self-accelerating parameter. Firstly, a modified Newton method without memory, with an invariant parameter, is constructed for solving nonlinear equations. Replacing the invariant parameter of the Newton method without memory with a variable self-accelerating parameter, we obtain a novel Newton method with memory. The convergence order of the new Newton method with memory is 1 + √2 ≈ 2.414. The acceleration of the convergence rate is attained without any additional function evaluations. The main innovation is that the self-accelerating parameter is constructed in a simple way. Numerical experiments show that the presented method has a faster convergence speed than existing methods.
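
The abstract does not reproduce the iteration itself. For orientation, here is a minimal sketch of the classical Traub–Steffensen iteration with memory, which also reaches R-order 1 + √2: its free parameter gamma is re-estimated from the previous step (the "self-accelerating" mechanism), so the acceleration costs no extra function evaluations. This is an illustrative textbook scheme, not the authors' Newton-type method.

```python
def traub_with_memory(f, x0, gamma0=0.01, tol=1e-14, max_iter=50):
    """Steffensen-type iteration whose free parameter gamma is updated from
    the previous step, raising the R-order from 2 to 1 + sqrt(2) ~ 2.414."""
    x, gamma = x0, gamma0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            break
        w = x + gamma * fx                    # auxiliary point, derivative-free
        dd = (f(w) - fx) / (w - x)            # divided difference f[x, w] ~ f'(x)
        x_new = x - fx / dd                   # Steffensen-type step
        fx_new = f(x_new)
        gamma = -(x_new - x) / (fx_new - fx)  # memory: secant estimate of -1/f'(root)
        x, fx = x_new, fx_new
    return x
```

For example, traub_with_memory(lambda t: t**3 - 2, 1.0) converges to 2^(1/3) in a handful of iterations, using two function evaluations per step.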

2011 ◽  
Vol 5 (2) ◽  
pp. 298-317 ◽  
Author(s):  
Miodrag Petkovic ◽  
Jovana Dzunic ◽  
Ljiljana Petkovic

An efficient family of two-point derivative-free methods with memory for solving nonlinear equations is presented. It is proved that the convergence order of the proposed family is increased from 4 to at least 2 + √6 ≈ 4.45, 5, (5 + √33)/2 ≈ 5.37, and 6, depending on the accelerating technique. The increase of the convergence order is attained using a suitable accelerating technique that varies a free parameter in each iteration. The improvement of the convergence rate is achieved without any additional function evaluations, meaning that the proposed methods with memory are very efficient. Moreover, the presented methods are more efficient than all existing methods known in the literature in the class of two-point methods and three-point methods of optimal order eight. Numerical examples and comparisons with existing two-point methods are included to confirm the theoretical results and high computational efficiency. 2010 Mathematics Subject Classification: 65H05.
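
As one concrete representative of this class (a sketch in the spirit of optimal Steffensen-type two-point families, not necessarily the exact family of the paper), the scheme below uses three function evaluations per iteration, has order four for a fixed free parameter beta, and recycles already-computed data to update beta. The simple secant update shown is the weakest of the accelerating techniques mentioned above; stronger ones (Newton-interpolation estimates using the stored y and w points) give the higher orders cited.

```python
def two_point_with_memory(f, x0, beta0=0.01, tol=1e-14, max_iter=30):
    """Derivative-free two-point scheme: three f-evaluations per iteration,
    fourth order for fixed beta; the with-memory update of beta is free."""
    x, beta = x0, beta0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            break
        w = x + beta * fx
        fw = f(w)
        f_xw = (fw - fx) / (w - x)
        y = x - fx / f_xw                     # Steffensen-type first step
        fy = f(y)
        # f'(y) approximated by divided differences only:
        fp_y = (fy - fx) / (y - x) + (fw - fy) / (w - y) - f_xw
        x_new = y - fy / fp_y                 # Newton-like second step
        fx_new = f(x_new)
        beta = -(x_new - x) / (fx_new - fx)   # self-accelerating free parameter
        x, fx = x_new, fx_new
    return x
```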


2016 ◽  
Vol 11 (10) ◽  
pp. 5774-5780
Author(s):  
Rajinder Thukral

A new one-point iterative method for solving nonlinear equations is constructed. It is proved that the new method has convergence order three. Per iteration, the new method requires two evaluations of the function. Kung and Traub conjectured that multipoint iteration methods without memory based on n evaluations can achieve at most convergence order 2^(n−1); the new method, however, produces convergence order three, which is better than the expected maximum order of two. Hence, we demonstrate that the conjecture fails for a particular set of nonlinear equations. Numerical comparisons are included to demonstrate the exceptional convergence speed of the proposed method using only a few function evaluations.
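
Order claims of this kind are usually verified numerically through the computational order of convergence (COC). The standard estimator from consecutive iterates is sketched below; it applies to any iteration under test.

```python
import math

def computational_order(xs):
    """Estimate the computational order of convergence from at least four
    consecutive iterates, with e_k ~ |x_k - x_{k-1}|:
        rho ~ ln(e_k / e_{k-1}) / ln(e_{k-1} / e_{k-2})."""
    e = [abs(b - a) for a, b in zip(xs, xs[1:])]
    return math.log(e[-1] / e[-2]) / math.log(e[-2] / e[-3])
```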


Mathematics ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 540 ◽  
Author(s):  
Xiaofeng Wang ◽  
Qiannan Fan

In this paper, a self-accelerating type method is proposed for solving nonlinear equations, which is a modification of Ren’s method. A simple way is applied to construct a variable self-accelerating parameter for the new method, which does not increase the computational cost. The highest convergence order of the new method is 2 + √6 ≈ 4.4495. Numerical experiments are made to show the performance of the new method, which supports the theoretical results.
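
The stated order is consistent with the usual with-memory error analysis. Assuming the self-accelerating parameter cancels the leading term of the fourth-order error relation up to a factor O(e_{k-1}^2) (an assumption adopted here to match the stated order, since the abstract gives no error equation), the R-order r follows from

```latex
e_{k+1} = O\!\left(e_{k-1}^{2}\, e_{k}^{4}\right), \qquad
e_{k} \sim e_{k-1}^{\,r}
\;\Longrightarrow\; r^{2} = 4r + 2
\;\Longrightarrow\; r = 2 + \sqrt{6} \approx 4.4495.
```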


2014 ◽  
Vol 2014 ◽  
pp. 1-9
Author(s):  
T. Lotfi ◽  
K. Mahdiani ◽  
Z. Noori ◽  
F. Khaksar Haghani ◽  
S. Shateyi

A class of derivative-free methods without memory for approximating a simple zero of a nonlinear equation is presented. The proposed class uses four function evaluations per iteration and has convergence order eight; it is therefore an optimal three-step scheme without memory in the sense of the Kung–Traub conjecture. Moreover, the proposed class has an accelerator parameter that can increase the convergence rate from eight to twelve without any new function evaluations. Thus, we construct a method with memory that considerably increases the efficiency index, from 8^(1/4) ≈ 1.682 to 12^(1/4) ≈ 1.861. Illustrations are also included to support the underlying theory.
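
The quoted efficiency-index figures follow directly from the definition E = p^(1/n), where p is the convergence order and n the number of function evaluations per iteration:

```python
# Efficiency index E = p**(1/n); the accelerator lifts the order p from 8
# to 12 while the evaluation count n stays at 4.
print(8 ** 0.25)    # 1.6817... (optimal eighth-order, four evaluations)
print(12 ** 0.25)   # 1.8612... (with-memory variant, same four evaluations)
```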


2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
Tahereh Eftekhari

Based on the eighth-order iterative methods without memory proposed by Thukral (2012), some iterative methods with memory and a high efficiency index are presented. We show that the order of convergence is increased without any additional function evaluations. Numerical comparisons are made to show the performance of the presented methods.


2015 ◽  
Vol 2015 ◽  
pp. 1-5
Author(s):  
M. Sharifi ◽  
S. Karimi Vanani ◽  
F. Khaksar Haghani ◽  
M. Arab ◽  
S. Shateyi

The aim of this paper is to construct a method with memory based on King’s family of methods without memory for nonlinear equations. It is proved that the proposed method possesses a higher R-order of convergence while using the same number of functional evaluations as King’s family. Numerical experiments are given to illustrate the performance of the constructed scheme.
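
For reference, one step of King’s fourth-order family is sketched below; beta is the free parameter that the with-memory construction tunes from previously stored data. The specific update rule is the paper’s contribution and is not given in the abstract, so it is left as a hook here.

```python
def king_step(f, fprime, x, beta):
    """One step of King's fourth-order family (free parameter beta); three
    evaluations per step: f(x), f'(x), f(y).  A with-memory variant would
    recompute beta before this step from data already available."""
    fx, dfx = f(x), fprime(x)
    y = x - fx / dfx                          # Newton predictor
    fy = f(y)
    # King's corrector: order 4 for any fixed beta
    return y - (fy / dfx) * (fx + beta * fy) / (fx + (beta - 2.0) * fy)
```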


2019 ◽  
Vol 4 (2) ◽  
pp. 34
Author(s):  
Deasy Wahyuni ◽  
Elisawati Elisawati

Newton’s method is one of the most frequently used methods for finding roots of nonlinear equations. Along with the development of science, Newton’s method has undergone various modifications, among them the Hasanov method and the variant of Newton’s method (VMN), both with a higher order of convergence. This paper focuses on a three-step iteration method whose order of convergence is higher than that of these three methods. Determining the convergence order of the three-step iteration method requires a program that can support the analytical results of the methods; here MATLAB is used. The analytical results are then compared with numerical simulations, also carried out in MATLAB. Keywords: Newton method, Newton method variant, Hasanov method, order of convergence


2022 ◽  
pp. 105140
Author(s):  
Ibrahim Mohammed Sulaiman ◽  
Mustafa Mamat ◽  
Maulana Malik ◽  
Kottakkaran Sooppy Nisar ◽  
Ashraf Elfasakhany

Author(s):  
Ştefan Măruşter

The aim of this paper is to investigate the local convergence of the modified Newton method, i.e., the classical Newton method in which the first derivative is re-evaluated periodically after m steps. The convergence order is shown to be m + 1. A new algorithm is proposed for estimating the convergence radius of the method. We also propose a threshold for the number of steps after which it is recommended to re-evaluate the first derivative in the modified Newton method.
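
A minimal sketch of the scheme under study (the derivative is frozen and reused for m consecutive steps); the convergence-radius estimator and the recommended threshold for m are the paper's contributions and are not reproduced here.

```python
def modified_newton(f, fprime, x0, m=3, tol=1e-14, max_iter=200):
    """Newton iteration with the derivative re-evaluated only every m steps;
    one full m-step cycle has local convergence order m + 1."""
    x = x0
    for k in range(max_iter):
        if k % m == 0:
            d = fprime(x)          # periodic refresh of the derivative
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / d
    return x
```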

