Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems

2008 · Vol 3 (1) · pp. 11-21
Author(s): Gonglin Yuan

2018 · Vol 7 (2.14) · pp. 25
Author(s): Syazni Shoid, Norrlaili Shapiee, Norhaslinda Zull, Nur Hamizah Abdul Ghani, Nur Syarafina Mohamed, ...

Many researchers seek to improve conjugate gradient (CG) methods as well as their applications in real life. CG methods have become increasingly useful in many disciplines and play an important role in solving large-scale optimization problems. In this paper, three new types of CG coefficient are presented, with an application to data estimation. Numerical experiments show that the proposed methods succeed in solving problems under the strong Wolfe-Powell line search conditions.
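As context only (the abstract does not reproduce the proposed coefficients), here is a minimal sketch of the generic nonlinear CG loop under a strong Wolfe(-Powell) line search. It uses SciPy's line_search, which enforces the strong Wolfe conditions, and the classical Fletcher-Reeves coefficient as a placeholder for the paper's coefficients:

```python
# Minimal sketch of nonlinear CG with a strong Wolfe line search.
# The paper's three new coefficients are not given in the abstract;
# the classical Fletcher-Reeves (FR) coefficient stands in for them.
import numpy as np
from scipy.optimize import line_search  # enforces strong Wolfe conditions

def cg_fr(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # c2 < 1/2 is the usual recommendation for FR under strong Wolfe.
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                 # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # FR coefficient (placeholder)
        d = -g_new + beta * d             # CG direction update
        g = g_new
    return x

# Usage on a simple quadratic: minimize ||x||^2, minimizer at the origin.
x_star = cg_fr(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -2.0]))
```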


2018 · Vol 7 (4.30) · pp. 458
Author(s): Srimazzura Basri, Mustafa Mamat, Puspa Liza Ghazali

Non-linear conjugate gradient methods have been widely used in solving large-scale optimization problems. These methods have been shown to require very little memory in addition to being numerically efficient, so many studies have been conducted to improve them in search of the most efficient method. In this paper, we propose a new non-linear conjugate gradient coefficient that guarantees the sufficient descent condition. Numerical tests indicate that the proposed coefficient performs better than three classical conjugate gradient coefficients.
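The new coefficient itself is not stated in the abstract; for reference, the sufficient descent condition it guarantees is the standard one, with gradient g_k and search direction d_k:

```latex
% Sufficient descent: every d_k is a descent direction, uniformly
% bounded away from orthogonality to the gradient g_k.
g_k^{\top} d_k \le -c \, \lVert g_k \rVert^2, \qquad c > 0, \ \forall k \ge 0.
```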


2009 · Vol 2009 · pp. 1-16
Author(s): Jianguo Zhang, Yunhai Xiao, Zengxin Wei

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, without any line search. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.
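For context (not taken from the paper), CG-type directions follow the usual recursion below, and "always descend" means the descent inequality holds for every iterate, independently of how the step size is chosen:

```latex
d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1} \ (k \ge 1),
\qquad \text{descent:} \quad g_k^{\top} d_k < 0 \ \ \forall k.
```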


2013 · Vol 2013 · pp. 1-5
Author(s): Yuan-Yuan Chen, Shou-Qiang Du

The nonlinear conjugate gradient method is one of the useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with Wolfe type line search for unconstrained optimization problems. Under some mild assumptions, global convergence results for the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with Wolfe type line search are efficient for some unconstrained optimization problems.
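For reference, the standard Wolfe conditions on the step size referred to here are:

```latex
% Weak Wolfe conditions on the step size \alpha_k, with 0 < c_1 < c_2 < 1:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \, g_k^{\top} d_k
\qquad \text{(sufficient decrease)},
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge c_2 \, g_k^{\top} d_k
\qquad \text{(curvature)}.
% The strong variant replaces the curvature condition by
% \lvert g(x_k + \alpha_k d_k)^{\top} d_k \rvert \le c_2 \, \lvert g_k^{\top} d_k \rvert.
```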


Author(s): O.B. Akinduko

In this paper, by linearly combining the numerator and denominator terms of the Dai-Liao (DL) and Bamigbola-Ali-Nwaeze (BAN) conjugate gradient methods (CGMs), a general form of the DL-BAN method is proposed. From this general form, a new hybrid CGM, which was found to possess a sufficient descent property, is generated. Numerical experiments were carried out on the new CGM, in comparison with four existing CGMs, using a set of large-scale unconstrained optimization problems. The results showed superior performance of the new method over the majority of the existing methods.
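Neither the BAN coefficient nor the proposed hybrid is reproduced in the abstract; for reference, the Dai-Liao coefficient that forms one half of the combination is

```latex
\beta_k^{DL} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}
             - t \, \frac{g_k^{\top} s_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\qquad t \ge 0,
```

where y_{k-1} = g_k - g_{k-1} and s_{k-1} = x_k - x_{k-1}.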


2014 · Vol 2014 · pp. 1-9
Author(s): Shengwei Yao, Bin Qin

The conjugate gradient method is an efficient method for solving large-scale nonlinear optimization problems. In this paper, we propose a nonlinear conjugate gradient method which can be considered a hybrid of the DL and WYL conjugate gradient methods. The given method possesses the sufficient descent condition under the Wolfe-Powell line search and is globally convergent for general functions. Our numerical results show that the proposed method is very robust and efficient on the test problems.
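The hybrid formula itself is not given in the abstract; for reference, the WYL (Wei-Yao-Liu) ingredient of the combination is

```latex
\beta_k^{WYL} =
\frac{\lVert g_k \rVert^2
      - \dfrac{\lVert g_k \rVert}{\lVert g_{k-1} \rVert}\, g_k^{\top} g_{k-1}}
     {\lVert g_{k-1} \rVert^2},
```

a PRP-like coefficient that stays nonnegative, while the DL term appears in the edit to the Akinduko entry above.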

