A NEW DERIVATIVE-FREE CONJUGATE GRADIENT METHOD FOR LARGE-SCALE NONLINEAR SYSTEMS OF EQUATIONS

2017 ◽  
Vol 95 (3) ◽  
pp. 500-511
Author(s):  
XIAOWEI FANG ◽  
QIN NI

We propose a new derivative-free conjugate gradient method for large-scale nonlinear systems of equations. The method combines the Rivaie–Mustafa–Ismail–Leong conjugate gradient method for unconstrained optimisation problems and a new nonmonotone line-search method. The global convergence of the proposed method is established under some mild assumptions. Numerical results using 104 test problems from the CUTEst test problem library show that the proposed method is promising.
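The RMIL-style choice β_k = F_k^T (F_k − F_{k−1}) / ∥d_{k−1}∥², with the residual F playing the role of the gradient, can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the plain backtracking rule on ∥F∥² stands in for the paper's new nonmonotone line search, and the function name and safeguards are hypothetical.

```python
import numpy as np

def rmil_df_solve(F, x0, tol=1e-8, max_iter=1000, sigma=1e-4):
    """Derivative-free CG sketch for F(x) = 0 with an RMIL-style beta.

    Illustrative only: a plain backtracking rule on ||F||^2 replaces the
    paper's nonmonotone line search.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        t, merit = 1.0, float(Fx @ Fx)
        # backtrack until the merit function ||F||^2 decreases sufficiently
        while t > 1e-12:
            F_try = F(x + t * d)
            if float(F_try @ F_try) <= merit - sigma * t ** 2 * float(d @ d):
                break
            t *= 0.5
        x = x + t * d
        F_new = F(x)
        # RMIL-style beta: F_k^T (F_k - F_{k-1}) / ||d_{k-1}||^2, kept nonnegative
        beta = max(float(F_new @ (F_new - Fx)) / float(d @ d), 0.0)
        d = -F_new + beta * d
        Fx = F_new
    return x
```

On a mildly nonlinear system such as F(x) = (x₀³ − 1, x₀ + x₁ − 2), the iteration drives ∥F∥ toward zero from a nearby starting point.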

2020 ◽  
Vol 3 (1) ◽  
pp. 43-49
Author(s):  
M K Dauda

In this study, a fully derivative-free method for solving large-scale nonlinear systems of equations via a memoryless DFP (Davidon–Fletcher–Powell) update is presented. The proposed method is an enhanced DFP update that is both matrix-free and derivative-free, and therefore requires little memory storage. Under suitable conditions, the proposed method converges globally. Numerical comparisons using a set of large-scale test problems show that the proposed method is efficient.
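The matrix-free idea can be sketched in a few lines: with the inverse-Hessian approximation reset to the identity, the DFP update applied to −F(x) needs only the two stored difference vectors s and y, so no n×n matrix is ever formed. The function name and the fallback safeguard below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def memoryless_dfp_direction(Fx, s, y):
    """Apply the memoryless DFP update (H_k = I) to -F(x), matrix-free:
    d = -(I - y y^T / (y^T y) + s s^T / (s^T y)) F.
    """
    yTy = float(y @ y)
    sTy = float(s @ y)
    if yTy == 0.0 or sTy <= 1e-12:
        # illustrative safeguard: fall back to the steepest-descent-like step
        return -Fx
    return -(Fx - y * (float(y @ Fx) / yTy) + s * (float(s @ Fx) / sTy))
```

Applying the update to y itself returns −s, which is the secant equation H₊ y = s in matrix-free form.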


2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Zhenhua Su ◽  
Min Li

In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is suitable for large-scale problems owing to its low storage requirement. Numerical experiments show that the new method is practically effective.
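The Liu–Storey update β_k^{LS} = g_k^T (g_k − g_{k−1}) / (−d_{k−1}^T g_{k−1}) can be illustrated on a strictly convex quadratic, where the exact step length along each direction is available in closed form. This is a minimal sketch of the β formula in the smooth-optimization setting, not the paper's descent variant for nonlinear equations; the function name is hypothetical.

```python
import numpy as np

def ls_cg_quadratic(A, b, x0, tol=1e-10, max_iter=200):
    """Liu-Storey CG sketch on f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite), using the exact minimizer along each direction."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = -float(g @ d) / float(d @ (A @ d))  # exact line search
        x = x + alpha * d
        g_new = A @ x - b
        # Liu-Storey beta: g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1})
        beta = float(g_new @ (g_new - g)) / (-float(d @ g))
        d = -g_new + beta * d
        g = g_new
    return x
```

With exact line searches on a quadratic, β^{LS} coincides with the classical CG choice, so the iteration terminates in at most n steps.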


2014 ◽  
Vol 989-994 ◽  
pp. 2406-2409
Author(s):  
Ting Feng Li ◽  
Zhi Yuan Liu ◽  
Zhao Bin Du

In this paper, we introduce an algorithm for solving large-scale box-constrained optimization problems. At each iteration of the proposed algorithm, we first estimate the active set by means of an active set identification technique. The components of the search direction corresponding to the active set are simply defined; the other components are determined by a nonlinear conjugate gradient method. Under some additional conditions, we show that the algorithm converges globally. We also report some preliminary numerical experiments to show that the proposed algorithm is practicable and effective for the test problems.
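The two-stage construction can be sketched as follows: flag a variable as active when it sits numerically at a bound and the gradient pushes it further outside the box, then assemble the direction componentwise. This is a hedged illustration under assumed conventions; the simple −g choice on active components and the PR-style β on the free components are stand-ins for the paper's exact definitions.

```python
import numpy as np

def estimate_active_set(x, g, lo, hi, eps=1e-8):
    """Active-set identification sketch for bounds lo <= x <= hi: a variable
    is active when it is (numerically) at a bound and the negative gradient
    points out of the box."""
    at_lo = (x <= lo + eps) & (g > 0)
    at_hi = (x >= hi - eps) & (g < 0)
    return at_lo | at_hi

def box_cg_direction(x, g, g_prev, d_prev, lo, hi):
    """Assemble the search direction: simple steepest-descent components on
    the estimated active set, a CG step (illustrative PR beta) on the rest."""
    act = estimate_active_set(x, g, lo, hi)
    d = np.empty_like(x)
    d[act] = -g[act]
    free = ~act
    gf, gpf, dpf = g[free], g_prev[free], d_prev[free]
    beta = max(float(gf @ (gf - gpf)) / max(float(gpf @ gpf), 1e-16), 0.0)
    d[free] = -gf + beta * dpf
    return d
```

A component at its lower bound with a positive gradient gets the plain −g step, while interior components follow the CG recurrence.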


2013 ◽  
Vol 2013 ◽  
pp. 1-9
Author(s):  
Shengwei Yao ◽  
Xiwen Lu ◽  
Zengxin Wei

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements. This paper proposes a conjugate gradient method that is similar to the Dai–Liao conjugate gradient method (Dai and Liao, 2001) but has stronger convergence properties. The given method satisfies the sufficient descent condition and is globally convergent under the strong Wolfe–Powell (SWP) line search for general functions. Our numerical results show that the proposed method is very efficient on the test problems.
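For reference, the Dai–Liao parameter that the proposed method builds on is β_k^{DL} = g_k^T (y_{k−1} − t s_{k−1}) / (d_{k−1}^T y_{k−1}), with y_{k−1} = g_k − g_{k−1} and s_{k−1} = α_{k−1} d_{k−1}. A minimal sketch of that formula (function name assumed, t = 0.1 an arbitrary illustrative choice):

```python
import numpy as np

def dai_liao_beta(g_new, g_old, d_old, alpha, t=0.1):
    """Dai-Liao conjugacy parameter:
    beta = g_k^T (y - t * s) / (d_{k-1}^T y),
    where y = g_k - g_{k-1} and s = alpha * d_{k-1}."""
    y = g_new - g_old
    s = alpha * d_old
    return float(g_new @ (y - t * s)) / float(d_old @ y)
```

Setting t = 0 recovers the Hestenes–Stiefel parameter g_k^T y / (d_{k−1}^T y).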


2014 ◽  
Vol 2014 ◽  
pp. 1-7
Author(s):  
Min Sun ◽  
Jing Liu

Recently, Zhang et al. proposed a sufficient descent Polak–Ribière–Polyak (SDPRP) conjugate gradient method for large-scale unconstrained optimization problems and proved its global convergence in the sense that lim inf_{k→∞} ∥∇f(x_k)∥ = 0 when an Armijo-type line search is used. In this paper, motivated by the line searches proposed by Shi et al. and Zhang et al., we propose two new Armijo-type line searches and show that the SDPRP method has strong convergence in the sense that lim_{k→∞} ∥∇f(x_k)∥ = 0 under the two new line searches. Numerical results are reported to show the efficiency of SDPRP with the new Armijo-type line searches in practical computation.
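The sufficient descent property that gives SDPRP its name is built into the direction itself: the three-term construction of Zhang et al. guarantees g_k^T d_k = −∥g_k∥² identically. A minimal sketch, with a plain Armijo backtracking rule standing in for the paper's two modified line searches (both function names are illustrative):

```python
import numpy as np

def sdprp_direction(g, g_prev, d_prev):
    """Three-term PRP direction (Zhang et al. style); by construction it
    satisfies the sufficient descent identity g_k^T d_k = -||g_k||^2."""
    y = g - g_prev
    gg = float(g_prev @ g_prev)
    beta = float(g @ y) / gg              # PRP parameter
    theta = float(g @ d_prev) / gg        # correction coefficient
    return -g + beta * d_prev - theta * y

def armijo_step(f, x, g, d, delta=1e-4, rho=0.5, t0=1.0, max_back=50):
    """Plain Armijo backtracking: accept t with
    f(x + t d) <= f(x) + delta * t * g^T d (illustrative stand-in)."""
    t, fx, slope = t0, f(x), float(g @ d)
    for _ in range(max_back):
        if f(x + t * d) <= fx + delta * t * slope:
            return t
        t *= rho
    return t
```

The descent identity holds for any g, g_prev, d_prev, which is what lets the convergence analysis avoid extra assumptions on the line search.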

