A New Hybrid Conjugate Gradient Method with Guaranteed Descent for Unconstrained Optimization

2018 ◽  
Vol 28 (3) ◽  
pp. 193 ◽  
Author(s):  
Basim A. Hassan

The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we propose a new hybrid nonlinear conjugate gradient method, which has the descent property at every iteration and global convergence properties under certain conditions. The numerical results show that the new hybrid method is efficient for the given test problems.
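The hybrid descent idea running through these abstracts can be sketched generically. The following is an illustrative skeleton only, not the specific method proposed in this paper: the beta rule (a clipped combination of the Hestenes-Stiefel and Dai-Yuan formulas, a common hybridization pattern) and the Armijo backtracking line search (a simple stand-in for the Wolfe conditions assumed in the convergence analyses) are assumptions for the sketch.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG with a hybrid beta (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search: shrink alpha until the
        # sufficient-decrease condition holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference
        # Hybrid beta: max(0, min(beta_HS, beta_DY)).
        dy = max(d.dot(y), 1e-12)        # guard against division by zero
        beta_hs = g_new.dot(y) / dy
        beta_dy = g_new.dot(g_new) / dy
        beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d            # new search direction
        x, g = x_new, g_new
    return x

# Usage: minimize the simple quadratic f(x) = x^T x.
f = lambda x: x.dot(x)
grad = lambda x: 2.0 * x
x_star = hybrid_cg(f, grad, np.array([3.0, -4.0]))
```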

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Bin Qin

The conjugate gradient method is an efficient method for solving large-scale nonlinear optimization problems. In this paper, we propose a nonlinear conjugate gradient method which can be considered as a hybrid of DL and WYL conjugate gradient methods. The given method possesses the sufficient descent condition under the Wolfe-Powell line search and is globally convergent for general functions. Our numerical results show that the proposed method is very robust and efficient for the test problems.


2018 ◽  
Vol 29 (1) ◽  
pp. 133
Author(s):  
Basim A. Hassan ◽  
Haneen A. Alashoor

The nonlinear conjugate gradient method is widely used to solve unconstrained optimization problems. In this paper, different versions of nonlinear conjugate gradient methods are developed and their global convergence properties are proved. Numerical results indicate that the proposed method is very efficient.


2019 ◽  
Vol 14 (1) ◽  
pp. 1-9
Author(s):  
P. Kaelo ◽  
P. Mtagulwa ◽  
M. V. Thuto

Abstract In this paper, we develop a new hybrid conjugate gradient method that inherits the features of the Liu and Storey (LS), Hestenes and Stiefel (HS), Dai and Yuan (DY) and Conjugate Descent (CD) conjugate gradient methods. The new method generates a descent direction independently of any line search and possesses good convergence properties under the strong Wolfe line search conditions. Numerical results show that the proposed method is robust and efficient.


2017 ◽  
Vol 27 (5) ◽  
pp. 68
Author(s):  
Basim A. Hassan ◽  
Haneen A. Alashoor

In this paper, a new type of nonlinear conjugate gradient method based on the scale matrix is derived. The new method has the descent and global convergence properties under some assumptions. Numerical results indicate the efficiency of this method in solving the given test problems.


2000 ◽  
Vol 10 (2) ◽  
pp. 345-358 ◽  
Author(s):  
Yuhong Dai ◽  
Jiye Han ◽  
Guanghui Liu ◽  
Defeng Sun ◽  
Hongxia Yin ◽  
...  

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Xiwen Lu ◽  
Bin Qin

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements. In this paper, we propose a new conjugacy condition which is similar to that of Dai and Liao (2001). Based on this condition, the related nonlinear conjugate gradient method is given. Under some mild conditions, the given method is globally convergent under the strong Wolfe-Powell line search for general functions. The numerical experiments show that the proposed method is very robust and efficient.
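For context, the original Dai-Liao choice of beta (which this paper's conjugacy condition resembles) follows from requiring d_{k+1}^T y_k = -t g_{k+1}^T s_k. A minimal sketch of that published formula, with illustrative variable names (this is not the new method proposed in the abstract):

```python
import numpy as np

def beta_dl(g_new, s, y, d, t=1.0):
    """Dai-Liao (2001) beta: (g_{k+1}^T (y_k - t*s_k)) / (d_k^T y_k).

    g_new : gradient at the new iterate
    s     : step x_{k+1} - x_k
    y     : gradient difference g_{k+1} - g_k
    d     : previous search direction
    t     : nonnegative Dai-Liao parameter
    """
    return g_new.dot(y - t * s) / d.dot(y)
```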


2011 ◽  
Vol 58-60 ◽  
pp. 943-949
Author(s):  
Wan You Cheng ◽  
Xue Jie Liu

In this paper, on the basis of the recently developed HZ (Hager-Zhang) method [SIAM J. Optim., 16 (2005), pp. 170-192], we propose a hybrid descent conjugate gradient method which preserves the sufficient descent property of the HZ method. Under suitable conditions, we prove the global convergence of the proposed method. Extensive numerical experiments show that the method is promising for the test problems from the CUTE library.
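The Hager-Zhang beta referenced above, including its lower-bound truncation from the 2005 SIAM paper, can be sketched as follows. This is a sketch of the published HZ formula only, not of the hybrid modification this abstract proposes; variable names are illustrative:

```python
import numpy as np

def beta_hz(g_new, g_prev, d, y, eta=0.01):
    """Truncated Hager-Zhang beta: max(beta_HZ, eta_k), where
    beta_HZ = (y - 2*d*||y||^2/(d^T y))^T g_{k+1} / (d^T y) and
    eta_k = -1 / (||d|| * min(eta, ||g_k||)).
    """
    dy = d.dot(y)
    beta = (y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy
    eta_k = -1.0 / (np.linalg.norm(d) * min(eta, np.linalg.norm(g_prev)))
    return max(beta, eta_k)   # truncation keeps beta bounded below
```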


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Minglei Fang ◽  
Min Wang ◽  
Min Sun ◽  
Rong Chen

Nonlinear conjugate gradient algorithms are very effective for solving large-scale unconstrained optimization problems. Based on some well-known previous conjugate gradient methods, a modified hybrid conjugate gradient method is proposed. The proposed method generates descent directions at every iteration independently of any line search. Under the Wolfe line search, the proposed method possesses global convergence. Numerical results show that the modified method is efficient and robust.

