A derivative-free line search and global convergence of Broyden-like method for nonlinear equations

2000 ◽  
Vol 13 (3) ◽  
pp. 181-201 ◽  
Author(s):  
Dong-Hui Li ◽  
Masao Fukushima


Author(s):  
Jamilu Sabi'u

In this article, an enhanced conjugate gradient approach for solving symmetric nonlinear equations is proposed without computing the Jacobian matrix. The approach is completely derivative-free and matrix-free. Under classical assumptions, the proposed method is globally convergent with a nonmonotone line search. The reported numerical results show that the approach is promising.
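For context on how such methods stay derivative- and matrix-free: for a symmetric system, the gradient of the merit function f(x) = ½‖g(x)‖² equals J(x)g(x), which can be approximated with one extra evaluation of g. The following is a minimal sketch of that standard trick (due to Li and Fukushima), not the article's specific scheme; the toy linear system is invented for illustration.

```python
import numpy as np

def approx_gradient(g, x, eps=1e-8):
    """For a SYMMETRIC system, grad of f(x) = 0.5*||g(x)||^2 is
    J(x)^T g(x) = J(x) g(x), approximated with one extra g-evaluation:
        J(x) g(x) ~ (g(x + eps*g(x)) - g(x)) / eps.
    No Jacobian is ever formed, so the scheme is derivative- and matrix-free."""
    gx = g(x)
    return (g(x + eps * gx) - gx) / eps

# Toy symmetric system: g(x) = A x - b with A symmetric.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
g = lambda x: A @ x - b

x = np.array([0.5, -0.5])
exact = A @ g(x)                 # exact gradient of the merit function
approx = approx_gradient(g, x)   # matrix-free approximation
print(np.allclose(exact, approx, atol=1e-5))
```

For a linear system the difference quotient is exact up to rounding; for nonlinear g the error is O(eps·‖g(x)‖).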


Author(s):  
Andreas Griewank

Iterative methods for solving a square system of nonlinear equations g(x) = 0 often require that the sum-of-squares residual γ(x) ≡ ½∥g(x)∥² be reduced at each step. Since the gradient of γ depends on the Jacobian ∇g, this stabilization strategy is not easily implemented if only approximations Bk to ∇g are available. Therefore most quasi-Newton algorithms either include special updating steps or reset Bk to a divided-difference estimate of ∇g whenever no satisfactory progress is made. Here the need for such back-up devices is avoided by a derivative-free line search in the range of g. Assuming that the Bk are generated from an arbitrary B0 by fixed-scale updates, we establish superlinear convergence from within any compact level set of γ on which g has a differentiable inverse function g⁻¹.
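A line search of this flavor accepts a step using residual norms only, with no gradient information. One widely used acceptance test (the form popularized by Li and Fukushima, shown here as a generic sketch rather than Griewank's exact rule) is ‖g(x + αd)‖ ≤ ‖g(x)‖ − σα²‖d‖²:

```python
import numpy as np

def df_line_search(g, x, d, sigma=1e-4, rho=0.5, max_back=50):
    """Backtracking line search driven only by residual norms (no gradients):
    accept alpha once ||g(x + alpha*d)|| <= ||g(x)|| - sigma*alpha^2*||d||^2,
    halving alpha (factor rho) until the test passes."""
    gnorm = np.linalg.norm(g(x))
    alpha = 1.0
    for _ in range(max_back):
        if np.linalg.norm(g(x + alpha * d)) <= gnorm - sigma * alpha**2 * np.dot(d, d):
            return alpha
        alpha *= rho
    return alpha
```

Because the test compares only values of g, it works unchanged when the search direction comes from an approximate Jacobian Bk.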


2019 ◽  
Vol 2 (3) ◽  
pp. 1-4
Author(s):  
Abubakar Sani Halilu ◽  
M K Dauda ◽  
M Y Waziri ◽  
M Mamat

An algorithm for solving large-scale systems of nonlinear equations, based on transforming the Newton method with a line search into a derivative-free descent method, is introduced. The main idea of the construction is to approximate the Jacobian by an appropriate diagonal matrix. Furthermore, the step length is calculated by an inexact line search procedure. Under mild conditions, the proposed method is proved to be globally convergent. The numerical results presented show the efficiency of the proposed method.
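To illustrate the structure of such an iteration: replacing the Jacobian by a diagonal matrix D makes the Newton step an elementwise division, x⁺ = x − D⁻¹g(x). The diagonal estimate below uses componentwise forward differences, which is one plausible choice and not necessarily the paper's own diagonal update; the full step (no line search) and the cubic toy system are likewise illustrative assumptions.

```python
import numpy as np

def diag_jacobian_step(g, x, h=1e-7):
    """One Newton-like step with the Jacobian replaced by a diagonal matrix.
    The diagonal is estimated componentwise by forward differences,
    d_i = (g_i(x + h e_i) - g_i(x)) / h. A full step (alpha = 1) is taken
    here; the paper couples this with an inexact line search instead."""
    gx = g(x)
    d = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        d[i] = (g(x + e)[i] - gx[i]) / h
    d[np.abs(d) < 1e-12] = 1.0   # guard against (near-)zero diagonal entries
    return x - gx / d            # D^{-1} g(x) is elementwise division

# Decoupled toy system g_i(x) = x_i^3 - 1, solution x = (1, 1).
g = lambda x: x**3 - 1.0
x = np.array([2.0, 0.5])
for _ in range(50):
    x = diag_jacobian_step(g, x)
print(np.allclose(x, 1.0))
```

On a decoupled system the diagonal approximation coincides with the true Jacobian, so the iteration behaves like componentwise Newton; the storage cost is O(n) rather than O(n²), which is the point for large-scale problems.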


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Abdulkarim Hassan Ibrahim ◽  
Poom Kumam ◽  
Auwal Bala Abubakar ◽  
Jamilu Abubakar

In recent times, various algorithms have been incorporated with an inertial extrapolation step to speed up the convergence of the sequences they generate. As far as we know, very few results exist on inertial derivative-free projection methods for solving convex constrained monotone nonlinear equations. In this article, the convergence analysis of a derivative-free iterative algorithm (Liu and Feng in Numer. Algorithms 82(1):245–262, 2019) with an inertial extrapolation step for solving large-scale convex constrained monotone nonlinear equations is studied. The proposed method generates a sufficient descent direction at each iteration. Under some mild assumptions, the global convergence of the sequence generated by the proposed method is established. Furthermore, some experimental results are presented to support the theoretical analysis of the proposed method.
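The inertial extrapolation step itself is a one-line modification: the k-th search starts not from the current iterate but from a point extrapolated along the previous displacement. A minimal sketch, with the extrapolation parameter θ and its range taken as a typical assumption rather than from this paper:

```python
import numpy as np

def inertial_point(x_k, x_km1, theta=0.5):
    """Inertial extrapolation: w_k = x_k + theta*(x_k - x_{k-1}),
    with 0 <= theta < 1. The k-th iteration then evaluates the mapping
    and computes its search direction at w_k instead of x_k, which
    tends to accelerate the generated sequence."""
    return x_k + theta * (x_k - x_km1)

# The extrapolated point continues the motion from x_{k-1} to x_k.
w = inertial_point(np.array([2.0]), np.array([1.0]), theta=0.5)
print(w)  # overshoots x_k in the direction of recent progress
```

In the projection methods discussed here, w_k replaces x_k in both the direction computation and the subsequent hyperplane projection.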


Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 168 ◽  
Author(s):  
Zhifeng Dai ◽  
Huan Zhu

The goal of this paper is to extend the modified Hestenes-Stiefel method to solve large-scale nonlinear monotone equations. The method is presented by combining the hyperplane projection method (Solodov, M.V.; Svaiter, B.F. A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima, L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers, 1998, pp. 355-369) and the modified Hestenes-Stiefel method of Dai and Wen (Dai, Z.; Wen, F. Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search. Numer. Algor. 2012, 59, 79-93). In addition, we propose a new line search for the derivative-free method. Global convergence of the proposed method is established if the system of nonlinear equations is Lipschitz continuous and monotone. Preliminary numerical results are given to test the effectiveness of the proposed method.
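The Solodov-Svaiter projection step cited above has a short closed form. Given a trial point z with F(z)ᵀ(x − z) > 0 (produced by the line search), monotonicity implies the hyperplane {y : F(z)ᵀ(y − z) = 0} strictly separates x from the solution set, so projecting x onto it moves toward every solution. A sketch of that single step:

```python
import numpy as np

def hyperplane_projection(F, x, z):
    """Solodov-Svaiter projection step for monotone F: given a trial
    point z with F(z)^T (x - z) > 0, project x onto the separating
    hyperplane {y : F(z)^T (y - z) = 0} to obtain the next iterate."""
    Fz = F(z)
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz

# Toy monotone mapping F(y) = y with solution y = 0.
x_new = hyperplane_projection(lambda y: y, np.array([2.0]), np.array([1.0]))
print(x_new)  # strictly closer to the solution than x was
```

The step needs only F-values, which is why it pairs naturally with derivative-free conjugate gradient directions.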


2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Zhong Jin ◽  
Yuqing Wang

An improved line search filter algorithm for systems of nonlinear equations is presented. We divide the equations into two groups: one group is treated as equality constraints, while the squared residuals of the remaining equations form the objective function. The two groups are updated at every iteration in the works by Nie (2004, 2006, and 2006), by Nie et al. (2008), and by Gu (2011), whereas we update them only at the iterations where it is indeed needed. As a consequence, the scale of the calculation is decreased to a certain degree. Under some suitable conditions, global convergence is established. Finally, numerical experiments show that the method in this paper is effective.


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Huiping Cao

Schubert’s method is an extension of Broyden’s method for solving sparse nonlinear equations, which can preserve the zero-nonzero structure defined by the sparse Jacobian matrix and can retain many good properties of Broyden’s method. In particular, Schubert’s method has been proved to be locally and q-superlinearly convergent. In this paper, we globalize Schubert’s method by using a nonmonotone line search. Under appropriate conditions, we show that the proposed algorithm converges globally and superlinearly. Some preliminary numerical experiments are presented, which demonstrate that our algorithm is effective for large-scale problems.
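The classical nonmonotone line search used for such globalizations replaces the usual Armijo reference value f(x_k) by the maximum of the last M merit values, so occasional increases are tolerated. A generic sketch of that acceptance test (the Grippo-Lampariello-Lucidi form; the specific merit function, window size M, and constants below are illustrative assumptions, not this paper's):

```python
import numpy as np

def nonmonotone_armijo(f, x, d, slope, history, M=5, beta=1e-4, rho=0.5, max_back=30):
    """Grippo-style nonmonotone Armijo test: accept alpha once
        f(x + alpha*d) <= max(f over last M iterations) + beta*alpha*slope,
    where `slope` is (an estimate of) the directional derivative
    f'(x; d) < 0. Comparing against the running maximum instead of f(x)
    tolerates occasional increases while retaining global convergence."""
    f_ref = max(history[-M:])
    alpha = 1.0
    for _ in range(max_back):
        if f(x + alpha * d) <= f_ref + beta * alpha * slope:
            return alpha
        alpha *= rho
    return alpha
```

Setting M = 1 recovers the monotone Armijo rule, so the same code covers both regimes.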


Author(s):  
Mohammed Yusuf Waziri ◽  
Jamilu Sabi’u

We suggest a conjugate gradient (CG) method for solving symmetric systems of nonlinear equations without computing the Jacobian or the gradient, by exploiting the special structure of the underlying function. This derivative-free feature gives the proposed method an advantage in solving relatively large-scale problems (up to 500,000 variables) with lower storage requirements than some existing methods. Under appropriate conditions, the global convergence of our method is established. Numerical results on some benchmark test problems show that the proposed method is practically effective.


Symmetry ◽  
2021 ◽  
Vol 13 (2) ◽  
pp. 234
Author(s):  
Jamilu Sabi’u ◽  
Kanikar Muangchoo ◽  
Abdullah Shah ◽  
Auwal Bala Abubakar ◽  
Lateef Olakunle Jolaoso

Inspired by the large number of applications of symmetric nonlinear equations, this article suggests two optimal choices for the modified Polak–Ribière–Polyak (PRP) conjugate gradient (CG) method, obtained by minimizing the measure function of the search direction matrix and combining the proposed direction with the default Newton direction. In addition, the corresponding PRP parameters are incorporated with the Li and Fukushima approximate gradient to propose two robust CG-type algorithms for finding solutions to large-scale systems of symmetric nonlinear equations. We have also demonstrated the global convergence of the suggested algorithms under some classical assumptions. Finally, we demonstrate the numerical advantages of the proposed algorithms compared to some of the existing methods for symmetric nonlinear equations.


2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Masoud Hatamian ◽  
Mahmoud Paripour ◽  
Farajollah Mohammadi Yaghoobi ◽  
Nasrin Karamikabir

In this article, a new nonmonotone line search technique is proposed for solving a system of nonlinear equations. We attempt to answer the question of how to control the degree of nonmonotonicity of line search rules in order to obtain a more efficient algorithm. To this end, we present a novel algorithm that avoids an increase in unsuccessful iterations. We show the robust behavior of the proposed algorithm by solving a few numerical examples. Under some suitable assumptions, the global convergence of our strategy is proved.
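One standard mechanism for dialing the degree of nonmonotonicity (shown here as general background, not as this paper's rule) is Zhang and Hager's averaged reference value, where a single parameter η interpolates between the monotone Armijo test and a fully nonmonotone one:

```python
def zhang_hager_ref(f_new, C_prev, Q_prev, eta=0.85):
    """Zhang-Hager averaged reference value for nonmonotone line searches:
        Q_k = eta*Q_{k-1} + 1,
        C_k = (eta*Q_{k-1}*C_{k-1} + f_k) / Q_k.
    The line search then requires f(x + alpha*d) <= C_k + beta*alpha*slope.
    eta = 0 gives C_k = f_k (the monotone Armijo test); eta -> 1 weights
    all past values equally, so eta directly controls the degree of
    nonmonotonicity."""
    Q = eta * Q_prev + 1.0
    C = (eta * Q_prev * C_prev + f_new) / Q
    return C, Q
```

Initializing C_0 = f(x_0), Q_0 = 1 and updating once per iteration keeps the bookkeeping at O(1) per step, unlike max-based rules that store a window of past values.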

