A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations

2009 · Vol. 224 (1) · pp. 11-19
Author(s): Wanyou Cheng, Yunhai Xiao, Qing-Jie Hu

2017 · Vol. 95 (3) · pp. 500-511
Author(s): Xiaowei Fang, Qin Ni

We propose a new derivative-free conjugate gradient method for large-scale nonlinear systems of equations. The method combines the Rivaie–Mustafa–Ismail–Leong conjugate gradient method for unconstrained optimisation problems and a new nonmonotone line-search method. The global convergence of the proposed method is established under some mild assumptions. Numerical results using 104 test problems from the CUTEst test problem library show that the proposed method is promising.
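The combination described above can be illustrated with a minimal sketch of a derivative-free conjugate gradient loop for F(x) = 0. This is not the authors' exact algorithm: the RMIL-style β formula and the simple monotone backtracking acceptance rule below are stand-ins for the paper's nonmonotone line search, and the function name `df_cg_solve` and its parameters are our own.

```python
import numpy as np

def df_cg_solve(F, x0, sigma=1e-4, rho=0.5, tol=1e-8, max_iter=500):
    """Sketch of a derivative-free CG iteration for F(x) = 0.

    beta follows an RMIL-style formula; the line search accepts alpha
    when ||F(x + alpha d)||^2 is sufficiently reduced (a monotone
    stand-in for the paper's nonmonotone rule).
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                   # steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # derivative-free backtracking line search
        alpha = 1.0
        while np.linalg.norm(F(x + alpha * d))**2 > \
              np.linalg.norm(Fx)**2 - sigma * alpha**2 * np.dot(d, d):
            alpha *= rho
            if alpha < 1e-12:                 # give up backtracking
                break
        x_new = x + alpha * d
        F_new = F(x_new)
        # RMIL-style CG parameter: F_{k+1}^T (F_{k+1} - F_k) / ||d_k||^2
        beta = np.dot(F_new, F_new - Fx) / max(np.dot(d, d), 1e-16)
        d = -F_new + beta * d                 # new search direction
        x, Fx = x_new, F_new
    return x
```

Because only residual values F(x) are used, no Jacobian or gradient evaluations are required, which is what makes this class of methods attractive for large-scale systems.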


2020 · Vol. 2020 · pp. 1-10
Author(s): Zhenhua Su, Min Li

In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is well suited to large-scale problems because of its low storage requirement. Numerical results show that the new method is practically effective.
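As context for the extension above, the classical Liu–Storey parameter can be sketched as follows. In the system setting the gradient is played by the residual F, and the paper's descent modification is not reproduced here; the function name `beta_ls` is our own.

```python
import numpy as np

def beta_ls(g_new, g_old, d_old):
    """Classical Liu-Storey conjugate gradient parameter:

        beta_k = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k).

    In a derivative-free method for F(x) = 0, the gradients g would
    be replaced by the residuals F(x_k), F(x_{k+1}).
    """
    y = g_new - g_old                        # gradient difference y_k
    return np.dot(g_new, y) / (-np.dot(d_old, g_old))
```

For the steepest-descent start d_0 = -g_0, the denominator -d_0^T g_0 = ||g_0||^2 is positive, so the parameter is well defined as long as g_0 ≠ 0.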


2014 · Vol. 2014 · pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods that satisfy the descent condition g_k^T d_k ≤ −(1 − 1/(4θ_k))‖g_k‖², where θ_k > 1/4, and that are strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
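As a quick illustration (not taken from the paper), the steepest-descent direction d_k = −g_k satisfies this descent condition for every admissible θ_k:

```latex
g_k^{\mathsf{T}} d_k
  = -\|g_k\|^{2}
  \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^{2},
\qquad \text{since } 0 < 1 - \frac{1}{4\theta_k} < 1
\ \text{ whenever } \theta_k > \tfrac{1}{4}.
```

The condition thus interpolates between steepest descent (θ_k → ∞) and directions with progressively weaker guaranteed descent as θ_k approaches 1/4.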

