A derivative-free line search and DFP method for symmetric equations with global and superlinear convergence

1999 ◽  
Vol 20 (1-2) ◽  
pp. 59-77 ◽  
Author(s):  
Li Donghui ◽  
Masao Fukushima
Author(s):  
Andreas Griewank

Iterative methods for solving a square system of nonlinear equations g(x) = 0 often require that the sum-of-squares residual γ(x) ≡ ½∥g(x)∥² be reduced at each step. Since the gradient of γ depends on the Jacobian ∇g, this stabilization strategy is not easily implemented if only approximations Bk to ∇g are available. Therefore most quasi-Newton algorithms either include special updating steps or reset Bk to a divided difference estimate of ∇g whenever no satisfactory progress is made. Here the need for such back-up devices is avoided by a derivative-free line search in the range of g. Assuming that the Bk are generated from an arbitrary B0 by fixed scale updates, we establish superlinear convergence from within any compact level set of γ on which g has a differentiable inverse function g⁻¹.
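The key ingredient above is a line search that accepts a step using only values of g, never the Jacobian or the gradient of γ. A minimal backtracking sketch of one such acceptance test is shown below; the specific condition and parameter names are illustrative of this class of tests, not the paper's exact rule.

```python
import numpy as np

def derivative_free_backtracking(g, x, d, sigma=1e-4, beta=0.5, max_iter=50):
    """Backtrack on alpha until the residual norm decreases sufficiently.

    The acceptance test uses only evaluations of g, never its Jacobian:
        ||g(x + alpha*d)||^2 <= ||g(x)||^2 - sigma * alpha**2 * ||d||^2
    (an illustrative condition of this flavor; parameters are placeholders.)
    """
    gx = g(x)
    gx2 = np.dot(gx, gx)          # squared residual at the current point
    d2 = np.dot(d, d)             # squared norm of the trial direction
    alpha = 1.0
    for _ in range(max_iter):
        gn = g(x + alpha * d)
        if np.dot(gn, gn) <= gx2 - sigma * alpha**2 * d2:
            return alpha          # sufficient decrease achieved
        alpha *= beta             # otherwise shrink the step
    return alpha
```

Because the test compares residual norms directly, it remains usable when ∇g is replaced by an arbitrary approximation Bk.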


2011 ◽  
Vol 18 (9) ◽  
pp. 1303-1309 ◽  
Author(s):  
Zhaocheng Cui ◽  
Boying Wu

In this paper, we propose a new self-adaptive trust region method for unconstrained optimization problems and establish its convergence properties. In our algorithm, we use information from the previous and current iterates to define a suitable trust region radius at each iteration. The global and superlinear convergence of the algorithm is established under reasonable assumptions. Preliminary numerical results show that the new method is efficient and attractive for solving unconstrained optimization problems.
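A self-adaptive radius rule of this kind can be sketched as a function of the agreement ratio between actual and predicted reduction at the current iterate; the constants and the exact update below are generic placeholders, not the rule proposed in the paper.

```python
def update_radius(rho, radius, step_norm,
                  eta1=0.25, eta2=0.75, c1=0.5, c2=2.0):
    """Generic self-adaptive trust region radius update (illustrative only).

    rho       : ratio of actual to model-predicted reduction at this step
    radius    : current trust region radius
    step_norm : length of the step just taken
    """
    if rho < eta1:                     # poor model agreement: shrink
        return c1 * step_norm
    if rho > eta2:                     # very good agreement: expand
        return max(radius, c2 * step_norm)
    return radius                      # acceptable agreement: keep radius
```

Tying the new radius to the norm of the step actually taken, rather than to a fixed schedule, is one simple way to let previous and current iterate information drive the radius.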


2019 ◽  
Vol 2 (3) ◽  
pp. 1-4
Author(s):  
Abubakar Sani Halilu ◽  
M K Dauda ◽  
M Y Waziri ◽  
M Mamat

An algorithm for solving large-scale systems of nonlinear equations is introduced, based on transforming the Newton method with line search into a derivative-free descent method. The main idea of the construction is to approximate the Jacobian by an appropriate diagonal matrix; the step length is then computed by an inexact line search procedure. The proposed method is proved to be globally convergent under mild conditions. The numerical results presented show the efficiency of the proposed method.
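The appeal of a diagonal Jacobian approximation is that the Newton-like step reduces to componentwise division, which scales to large systems. A minimal sketch of one such step is given below; the forward-difference diagonal and the simple backtracking rule are our illustrative stand-ins, not the paper's actual diagonal updating scheme.

```python
import numpy as np

def diagonal_newton_step(g, x, eps=1e-8):
    """One step of a Newton-like method with a diagonal Jacobian surrogate.

    The diagonal of the Jacobian is estimated by forward differences
    (illustrative only); off-diagonal entries are ignored, so the linear
    solve is just a componentwise division.
    """
    gx = g(x)
    diag = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        diag[i] = (g(x + e)[i] - gx[i]) / eps
    # guard against tiny diagonal entries, then form the direction
    safe = np.where(np.abs(diag) > 1e-12, diag, 1.0)
    d = -gx / safe
    # inexact (backtracking) line search on the residual norm
    alpha, beta, sigma = 1.0, 0.5, 1e-4
    while np.linalg.norm(g(x + alpha * d)) > (1 - sigma * alpha) * np.linalg.norm(gx):
        alpha *= beta
        if alpha < 1e-12:
            break
    return x + alpha * d
```

Storing only the diagonal keeps memory and per-iteration cost linear in the problem dimension, which is the point of the transformation for large-scale systems.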


Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 168 ◽  
Author(s):  
Zhifeng Dai ◽  
Huan Zhu

The goal of this paper is to extend the modified Hestenes-Stiefel method to solve large-scale nonlinear monotone equations. The method is presented by combining the hyperplane projection method (Solodov, M.V.; Svaiter, B.F. A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima, L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers, 1998, pp. 355-369) and the modified Hestenes-Stiefel method in Dai and Wen (Dai, Z.; Wen, F. Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search. Numer. Algor. 2012, 59, 79-93). In addition, we propose a new line search for the derivative-free method. Global convergence of the proposed method is established provided the system of nonlinear equations is Lipschitz continuous and monotone. Preliminary numerical results are given to test the effectiveness of the proposed method.
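The hyperplane projection idea of Solodov and Svaiter exploits monotonicity: once a trial point z with g(z) ≠ 0 separates the current iterate from the solution set, projecting onto the separating hyperplane cannot move away from any root. A minimal sketch of one such iteration follows; the backtracking rule and parameter values are our illustrative choices, not the exact line search proposed in the paper.

```python
import numpy as np

def hyperplane_projection_step(g, x, d, sigma=1e-4, beta=0.5, max_back=60):
    """One Solodov-Svaiter-style projection iteration (minimal sketch).

    For monotone g, every root lies in the half-space
        {u : g(z)^T (u - z) <= 0},
    so projecting x onto the boundary hyperplane is a safe update.
    Assumes the line search produces z with g(z) != 0.
    """
    alpha = 1.0
    for _ in range(max_back):
        z = x + alpha * d
        gz = g(z)
        # derivative-free line search test along d
        if -np.dot(gz, d) >= sigma * alpha * np.dot(d, d):
            break
        alpha *= beta
    # project x onto the hyperplane {u : g(z)^T (u - z) = 0}
    return x - (np.dot(gz, x - z) / np.dot(gz, gz)) * gz
```

Note that the whole iteration uses only evaluations of g, which is what makes the combined scheme derivative-free and suitable for large-scale monotone systems.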

