A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
The nonlinear conjugate gradient method (CGM) is a very effective approach to solving large-scale unconstrained optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text], and proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient also possesses the sufficient descent condition and global convergence properties under the exact line search.
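As a minimal illustration of the nonlinear CG framework the abstract refers to, the sketch below runs a CG iteration with an exact line search on a convex quadratic. Since the paper's coefficient is left as [Formula: see text], the standard Fletcher–Reeves coefficient is used here purely as a stand-in; the closed-form step length is an assumption valid only for quadratic objectives.

```python
import numpy as np

def cg_exact_line_search(A, b, x0, tol=1e-10, max_iter=100):
    """Nonlinear CG sketch on f(x) = 0.5 x^T A x - b^T x.

    Uses the Fletcher-Reeves coefficient as a placeholder for the
    paper's CG coefficient, and an exact line search, which for a
    quadratic has the closed form alpha = -(g^T d) / (d^T A d).
    """
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic objective
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_exact_line_search(A, b, np.zeros(2))
```

On a strictly convex quadratic in n variables, this iteration with exact line search terminates in at most n steps; for the 2×2 example above it reaches the minimizer A⁻¹b in two iterations.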
2011, Vol 18 (9), pp. 1249-1253