A descent nonlinear conjugate gradient method for large-scale unconstrained optimization

2007 · Vol 187 (2) · pp. 636-643
Author(s): Gaohang Yu, Yanlin Zhao, Zengxin Wei

2014 · Vol 2014 · pp. 1-9
Author(s): Shengwei Yao, Xiwen Lu, Bin Qin

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems owing to its simplicity and very low memory requirements. In this paper, we propose a new conjugacy condition similar to that of Dai and Liao (2001). Based on this condition, a related nonlinear conjugate gradient method is derived. Under some mild conditions, the method is globally convergent for general functions with the strong Wolfe-Powell line search. Numerical experiments show that the proposed method is robust and efficient.
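
For reference, the Dai and Liao (2001) framework that the new conjugacy condition is compared against can be stated as follows; these are the standard formulas, not the paper's modified condition, which the abstract does not reproduce:

```latex
% CG iteration: x_{k+1} = x_k + \alpha_k d_k, with
% s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
% Dai--Liao conjugacy condition (parameter t \ge 0):
d_{k+1}^{\top} y_k = -t \, g_{k+1}^{\top} s_k
% The corresponding direction d_{k+1} = -g_{k+1} + \beta_k d_k uses
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k - t \, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}
% For t = 0 this reduces to the Hestenes--Stiefel method.
```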


2015 · Vol 9 · pp. 2671-2682
Author(s): Ibrahim S. Mohammed, Mustafa Mamat, Abdelrhaman Abashar, Mohd Rivaie, Zabidin Salleh

Algorithms · 2021 · Vol 14 (8) · pp. 227
Author(s): Zabidin Salleh, Ghaliah Alhamzi, Ibitsam Masmali, Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems since, unlike Newton's method and its approximations, it does not require second derivatives. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many sophisticated two- and three-term methods have been proposed for such optimization problems. In this paper, we propose a simple, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method so as to overcome its difficulties with convergence and the descent property. Under some assumptions, the modified method satisfies the convergence properties and the sufficient descent condition. The numerical results, measured in number of iterations and CPU time, show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao.
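
As a rough illustration of the kind of scheme described here, the following Python sketch runs a generic nonlinear CG loop with the classical Liu and Storey beta; the Armijo backtracking step, the restart safeguard, and the test problem are illustrative assumptions, not the paper's modified method.

```python
import numpy as np

def ls_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG sketch with the classical Liu-Storey (LS) beta.

    The backtracking loop below is a simple stand-in for a proper
    strong Wolfe line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1 = 1.0, 1e-4     # Armijo backtracking (placeholder)
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Liu-Storey beta: g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)
        beta = g_new @ (g_new - g) / (-(d @ g))
        d = -g_new + beta * d
        if d @ g_new >= 0:        # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = ls_conjugate_gradient(f, grad, np.array([-1.2, 1.0]))
```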


2018 · Vol 7 (3.28) · pp. 92
Author(s): Talat Alkouli, Mustafa Mamat, Mohd Rivaie, Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, together with an associated implementation based on an exact line search, is proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some drawn from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.
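
To make the exact-line-search setting concrete, here is a minimal Python sketch; the helper `exact_line_search`, the bracket `alpha_max`, and the quadratic test problem are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def exact_line_search(f, x, d, alpha_max=10.0):
    """Exact line search: minimize phi(a) = f(x + a*d) over [0, alpha_max].

    Uses SciPy's bounded scalar minimizer as the "exact" step;
    alpha_max is an assumed bracket.
    """
    res = minimize_scalar(lambda a: f(x + a * d),
                          bounds=(0.0, alpha_max), method="bounded")
    return res.x

# On a convex quadratic f(x) = 0.5 x^T A x - b^T x the exact step has
# a closed form, which sanity-checks the numeric result:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
x0 = np.zeros(2)
d = b - A @ x0                        # steepest descent direction at x0
alpha_num = exact_line_search(f, x0, d)
alpha_closed = (d @ d) / (d @ A @ d)  # exact minimizer of phi
```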

