A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations

2017
Vol 95 (2)
pp. 382-395
Author(s):
Xiangrong Li
Xiaoliang Wang
Zhou Sheng
Xiabin Duan

2018
Vol 2018
pp. 1-11
Author(s):
Xiangrong Li
Songhua Wang
Zhongzhou Jin
Hongtruong Pham

This paper presents a modified Hestenes-Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust-region feature; (2) the algorithm is globally convergent for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.
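To make property (1) concrete, the sketch below shows one common way of building a three-term modification of the HS direction: the two extra terms cancel in the inner product with the gradient, so g·d = -||g||^2 (sufficient descent), and keeping the denominator at least mu·||d_prev||·||y|| bounds ||d|| by (1 + 2/mu)·||g|| (a trust-region-like feature). The formula and the safeguard parameter mu are illustrative and are not claimed to be the cited paper's exact construction.

```python
import numpy as np

def modified_hs_direction(g, g_prev, d_prev, mu=0.1):
    """Illustrative three-term modification of the HS direction.

    By construction g @ d = -||g||**2 (sufficient descent), and because the
    denominator is kept at least mu * ||d_prev|| * ||y||, the direction also
    satisfies ||d|| <= (1 + 2/mu) * ||g|| (a trust-region-like bound).
    """
    y = g - g_prev
    denom = max(mu * np.linalg.norm(d_prev) * np.linalg.norm(y),
                abs(np.dot(d_prev, y)),
                1e-30)  # safeguard against division by zero
    return -g + (np.dot(g, y) * d_prev - np.dot(g, d_prev) * y) / denom
```

On the first iteration one simply takes d = -g; afterwards the direction above can be paired with any backtracking or inexact line search.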


Author(s):  
M. Y. Waziri ◽  
L. Muhammad ◽  
J. Sabi’u

This paper presents a simple three-term conjugate gradient algorithm for solving large-scale systems of nonlinear equations without computing the Jacobian or the gradient, by exploiting the special structure of the underlying function. The three-term CG direction gives the proposed method an advantage in solving relatively large-scale problems, with lower storage requirements than some existing methods. By incorporating the Powell restart approach into the algorithm, we prove the global convergence of the proposed method with a derivative-free line search under suitable assumptions. Numerical results are presented which show that the proposed method is promising.
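The following sketch illustrates the general shape of such a derivative-free three-term CG iteration for F(x) = 0: the residual F plays the role of the gradient, no Jacobian is formed, and the step length comes from simple backtracking on the residual norm. The beta/theta coefficients and the line search rule here are illustrative placeholders in the spirit of three-term CG methods, not the cited paper's exact scheme.

```python
import numpy as np

def df_three_term_cg(F, x0, tol=1e-6, max_iter=10000, sigma=1e-4, rho=0.5):
    """Sketch of a derivative-free three-term CG iteration for F(x) = 0."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        fnorm = np.linalg.norm(Fx)
        if fnorm <= tol:
            break
        # derivative-free backtracking: accept alpha once the residual norm drops enough
        alpha = 1.0
        while (np.linalg.norm(F(x + alpha * d)) > (1 - sigma * alpha) * fnorm
               and alpha > 1e-12):
            alpha *= rho
        x_new = x + alpha * d
        F_new = F(x_new)
        y = F_new - Fx
        denom = max(np.dot(d, y), np.linalg.norm(d) * np.linalg.norm(y), 1e-30)
        beta = np.dot(F_new, y) / denom    # HS-like coefficient built from residuals
        theta = np.dot(F_new, d) / denom   # third term keeps F_new @ d = -||F_new||**2
        d = -F_new + beta * d - theta * y
        x, Fx = x_new, F_new
    return x, Fx

# hypothetical usage on a small nonlinear system
F = lambda x: x**3 + x - 1.0
x_star, residual = df_three_term_cg(F, np.zeros(5))
```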


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Xiangrong Li ◽  
Xiaoliang Wang ◽  
Xiabin Duan

A limited memory BFGS (L-BFGS) algorithm is presented for solving large-scale symmetric nonlinear equations, in which a line search technique that requires no derivative information is used. The global convergence of the proposed algorithm is established under suitable conditions. Numerical results show that the given method is competitive with standard BFGS methods.
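For reference, the core of any L-BFGS method is the two-loop recursion that applies the implicit inverse-Hessian approximation to a (gradient-like) vector using only the last few (s, y) pairs. The sketch below is the standard textbook recursion, shown as a generic illustration rather than the cited paper's specific update for symmetric nonlinear equations.

```python
import numpy as np
from collections import deque

def lbfgs_direction(g, pairs):
    """Standard L-BFGS two-loop recursion.

    `pairs` holds the most recent (s, y) pairs, s = x_{k+1} - x_k and
    y = g_{k+1} - g_k, oldest first. Returns -H_k g, the search direction.
    """
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):               # newest pair first
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        q -= a * y
        alphas.append((rho, a))
    if pairs:
        s, y = pairs[-1]
        q *= np.dot(s, y) / np.dot(y, y)       # common scaling H0 = (s.y / y.y) * I
    for (s, y), (rho, a) in zip(pairs, reversed(alphas)):  # oldest pair first
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q
```

A typical driver keeps `pairs = deque(maxlen=m)` and appends (s, y) after each accepted step, skipping pairs with s·y <= 0 so the curvature condition holds.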


Author(s):  
Mohammed Belloufi ◽  
Rachid Benzine ◽  
Laskri Yamina

The Hestenes-Stiefel (HS) conjugate gradient algorithm is a useful tool for unconstrained numerical optimization; it has good numerical performance but no global convergence result under traditional line searches. This paper proposes a line search technique that guarantees the global convergence of the HS conjugate gradient method. Numerical tests are presented to validate the different approaches.
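For context, the sketch below shows the classical HS update, beta = g_{k+1}·y_k / (d_k·y_k) with y_k = g_{k+1} - g_k, paired with a plain backtracking Armijo line search and a steepest-descent restart. The Armijo rule here is a generic stand-in; the cited paper proposes its own line search rule to obtain global convergence for HS.

```python
import numpy as np

def hs_cg(f, grad, x0, tol=1e-6, max_iter=5000, c=1e-4, rho=0.5):
    """Classical HS conjugate gradient with backtracking Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # backtracking Armijo: f(x + a d) <= f(x) + c * a * g.d
        a, fx, slope = 1.0, f(x), np.dot(g, d)
        while f(x + a * d) > fx + c * a * slope and a > 1e-12:
            a *= rho
        x_new = x + a * d
        g_new = grad(x_new)
        y = g_new - g
        denom = np.dot(d, y)
        beta = np.dot(g_new, y) / denom if abs(denom) > 1e-30 else 0.0
        d = -g_new + beta * d
        if np.dot(g_new, d) >= 0:   # restart if the direction is not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```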

