Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems

2007 ◽  
Vol 22 (3) ◽  
pp. 511-517 ◽  
Author(s):  
Li Zhang ◽  
Weijun Zhou ◽  
Donghui Li
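
(For reference, the two ingredients named in the title have standard textbook forms; the sketch below assumes those forms, not any modified variant the paper may analyze.)

def beta_dy(g_new, g_old, d_old):
    # Textbook Dai-Yuan (DY) parameter: beta = ||g_k||^2 / (d_{k-1}^T y_{k-1}),
    # with y_{k-1} = g_k - g_{k-1}; arrays are NumPy vectors.
    y = g_new - g_old
    return g_new.dot(g_new) / d_old.dot(y)

def armijo_step(f, x, d, g, sigma=1e-4, rho=0.5, alpha=1.0, max_backtracks=50):
    # Backtracking Armijo line search: accept the first alpha with
    # f(x + alpha*d) <= f(x) + sigma * alpha * g^T d.
    fx, slope = f(x), g.dot(d)
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * slope:
            break
        alpha *= rho
    return alpha
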
Author(s):  
Pro Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribière–Polyak (PRP) and Fletcher–Reeves (FR) conjugate gradient methods for solving unconstrained optimization problems. Global convergence of the proposed quadratic hybrid conjugate gradient method is established under the strong Wolfe line search conditions. We also report numerical results showing the competitiveness of the new hybrid method.
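
(The two classical parameters being hybridized are shown below as a minimal sketch; the quadratic rule that produces the mixing weight is the authors' and is only stood in for here by a generic convex combination with a hypothetical weight theta.)

def beta_prp(g_new, g_old):
    # Polak-Ribiere-Polyak parameter: g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2.
    return g_new.dot(g_new - g_old) / g_old.dot(g_old)

def beta_fr(g_new, g_old):
    # Fletcher-Reeves parameter: ||g_k||^2 / ||g_{k-1}||^2.
    return g_new.dot(g_new) / g_old.dot(g_old)

def beta_hybrid(g_new, g_old, theta):
    # Hypothetical stand-in for the paper's quadratic hybridization:
    # a convex combination with weight theta in [0, 1].
    return (1.0 - theta) * beta_prp(g_new, g_old) + theta * beta_fr(g_new, g_old)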


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, with an associated implementation based on an exact line search, is proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably than, and in many situations outperforms, other similar methods.
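
(A note on the exact line search used here: it has a closed form only for strictly convex quadratics; a minimal sketch follows, with the paper's modified CG parameter left abstract.)

def exact_step_quadratic(A, g, d):
    # On f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite,
    # the exact minimizer of f(x + alpha*d) is alpha = -g^T d / (d^T A d).
    # General objectives require a one-dimensional minimization instead.
    return -g.dot(d) / d.dot(A.dot(d))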


2019 ◽  
Vol 2019 ◽  
pp. 1-9
Author(s):  
Jiankun Liu ◽  
Shouqiang Du

We propose a modified three-term conjugate gradient method with the Armijo line search for solving unconstrained optimization problems. The proposed method possesses the sufficient descent property. Under mild assumptions, the global convergence of the proposed method with the Armijo line search is proved. Owing to its simplicity, low storage requirements, and good convergence properties, the proposed method is applied to M-tensor systems and to a class of nonsmooth optimization problems with the l1-norm. Finally, numerical experiments show the efficiency of the proposed method.
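
(A generic three-term template, for orientation; the particular beta and theta that give the paper its sufficient descent and convergence properties are the authors' and are taken as inputs here.)

def three_term_direction(g_new, g_old, d_old, beta, theta):
    # Generic three-term CG direction:
    #   d_k = -g_k + beta * d_{k-1} + theta * y_{k-1},  y_{k-1} = g_k - g_{k-1}.
    # The method picks beta and theta so that g_k^T d_k <= -c ||g_k||^2.
    y = g_new - g_old
    return -g_new + beta * d_old + theta * y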


2012 ◽  
Vol 2012 ◽  
pp. 1-13 ◽  
Author(s):  
Huabin Jiang ◽  
Songhai Deng ◽  
Xiaodong Zheng ◽  
Zhong Wan

A modified spectral PRP conjugate gradient method is presented for solving unconstrained optimization problems. The constructed search direction is proved to be a sufficient descent direction for the objective function. With an Armijo-type line search to determine the step length, a new spectral PRP conjugate gradient algorithm is developed. Under some mild conditions, global convergence is established. Numerical results demonstrate that this algorithm is promising, particularly compared with existing similar ones.
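
(For orientation, a spectral PRP direction couples the PRP parameter with a spectral scaling of the gradient; the sketch below uses the classical PRP beta, while the paper's specific choice of the scaling theta_k, which secures sufficient descent, is taken as an input.)

def spectral_prp_direction(g_new, g_old, d_old, theta_k):
    # Spectral PRP search direction: d_k = -theta_k * g_k + beta_PRP * d_{k-1},
    # with the classical beta_PRP = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2.
    beta = g_new.dot(g_new - g_old) / g_old.dot(g_old)
    return -theta_k * g_new + beta * d_old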


2018 ◽  
Vol 7 (2.14) ◽  
pp. 21
Author(s):  
Omar Alshorman ◽  
Mustafa Mamat ◽  
Ahmad Alhawarat ◽  
Mohd Rivaie

The Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several studies have recently been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new parameter for the CG method is proposed. The method with the new parameter possesses global convergence properties under the Strong Wolfe–Powell (SWP) line search. Numerical results show that the proposed formula is more efficient and robust than the Polak–Ribière–Polyak (PRP), Fletcher–Reeves (FR), and Wei–Yao–Liu (WYL) parameters.
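
(PRP and FR are sketched earlier in this listing; the WYL parameter, the third baseline, is reproduced below from its standard definition. The paper's new parameter is not shown.)

import numpy as np

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu parameter:
    # beta = g_k^T (g_k - (||g_k|| / ||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2.
    ratio = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new.dot(g_new - ratio * g_old) / g_old.dot(g_old)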


2011 ◽  
Vol 2011 ◽  
pp. 1-22
Author(s):  
Liu Jin-kui ◽  
Zou Li-min ◽  
Song Xiao-qian

A modified PRP nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An important property of the proposed method is that the sufficient descent property is guaranteed independently of any line search. With the Wolfe line search, the global convergence of the proposed method is established for nonconvex minimization. Numerical results show that the proposed method is effective and promising in comparison with the VPRP, CG-DESCENT, and DL+ methods.
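
(The Wolfe line search invoked in the analysis consists of the two standard conditions below; a minimal checker, independent of the paper's modified PRP parameter.)

def satisfies_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    # Standard Wolfe conditions, with 0 < delta < sigma < 1:
    #   f(x + alpha*d) <= f(x) + delta * alpha * g^T d   (sufficient decrease)
    #   grad(x + alpha*d)^T d >= sigma * g^T d           (curvature)
    g_d = grad(x).dot(d)
    x_new = x + alpha * d
    return (f(x_new) <= f(x) + delta * alpha * g_d
            and grad(x_new).dot(d) >= sigma * g_d)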


Author(s):  
Ibrahim Abdullahi ◽  
Rohanin Ahmad

In this paper, we propose a new hybrid conjugate gradient method for unconstrained optimization problems. The proposed method combines the parameters beta(DY), beta(WHY), beta(RAMI), and beta(New), where beta(New) was constructed specifically for this hybrid method. The method possesses the sufficient descent property irrespective of the line search. Under the Strong Wolfe–Powell line search, we prove that the method is globally convergent. Numerical experimentation shows the effectiveness and robustness of the proposed method when compared with some hybrid as well as some modified conjugate gradient methods.
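
(The WHY and RAMI parameters are not reproduced here; as a heavily hedged sketch, a hybrid parameter of this kind can be viewed as a weighted combination of its component parameters, with the authors' actual combination rule left abstract.)

def hybrid_beta(component_betas, weights):
    # Generic template: combine already-computed component parameter values
    # (e.g. beta_DY, beta_WHY, beta_RAMI, beta_New) with given weights.
    # The paper's specific combination rule is not reproduced here.
    return sum(w * b for w, b in zip(weights, component_betas))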

