Global Convergence Analysis of a Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems

Author(s):  
Ibrahim Abdullahi ◽  
Rohanin Ahmad

2011 ◽  
Vol 2011 ◽  
pp. 1-22
Author(s):  
Liu Jin-kui ◽  
Zou Li-min ◽  
Song Xiao-qian

A modified PRP nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An important property of the proposed method is that the sufficient descent property is guaranteed independently of any line search. Under the Wolfe line search, the global convergence of the proposed method is established for nonconvex minimization. Numerical results show that the proposed method is effective and promising in comparison with the VPRP, CG-DESCENT, and DL+ methods.
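The abstract's key ingredient, a descent property that holds regardless of the line search, can be illustrated with a three-term modified PRP direction. The sketch below uses the well-known Zhang–Zhou–Li variant (an assumption; the authors' specific modification is not reproduced here), for which g·d = -‖g‖² holds by construction, paired with a simple bisection-style Wolfe line search:

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, alpha=1.0, max_iter=50):
    """Find a step satisfying the weak Wolfe conditions by bisection/expansion."""
    fx, gd = f(x), grad(x) @ d          # gd < 0: d is a descent direction
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:      # Armijo fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:          # curvature fails: grow
            lo = alpha
            alpha = 2 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def modified_prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Three-term modified PRP CG (Zhang-Zhou-Li form, used here as an
    illustration): the direction satisfies g.d = -||g||^2 for every k,
    i.e. sufficient descent independent of the line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / (g @ g)     # classical PRP coefficient
        theta = (g_new @ d) / (g @ g)    # correction term restoring descent
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

Expanding g_new·d for the three-term direction shows the beta and theta terms cancel exactly, leaving -‖g_new‖², which is the "sufficient descent independent of any line search" property the abstract highlights.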



2018 ◽  
Vol 13 (03) ◽  
pp. 2050059
Author(s):  
Amina Boumediene ◽  
Rachid Benzine ◽  
Mohammed Belloufi

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems, and many studies have been devoted to developing and improving them. In this paper, we study the global convergence of the BBB conjugate gradient method under exact line search.



2012 ◽  
Vol 2012 ◽  
pp. 1-14
Author(s):  
Yang Yueting ◽  
Cao Mingyuan

We propose and generalize a new nonlinear conjugate gradient method for unconstrained optimization. Global convergence is proved under the Wolfe line search. Numerical experiments are reported that support the theoretical analysis and show the presented methods outperforming the CG-DESCENT method.


