Some global convergence properties of the Wei–Yao–Liu conjugate gradient method with inexact line search

2011 ◽  
Vol 217 (17) ◽  
pp. 7132-7137 ◽  
Author(s):  
Sha Lu ◽  
Zengxin Wei ◽  
Liliu Mo


Author(s):
Chergui Ahmed ◽  
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies a sufficient descent condition and is globally convergent under the inexact strong Wolfe–Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in terms of CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
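The abstract does not reproduce the FRA coefficient itself, so the following is only a minimal sketch of a generic nonlinear conjugate gradient loop with a strong Wolfe–Powell line search, using the classical Fletcher–Reeves coefficient as a stand-in for the authors' formula; the function names, tolerances, and restart rule are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def cg_fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG loop; only the beta formula below would
    change for FRA, WYL, DY, PRP, or FR."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                   # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # CG direction update
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function
print(cg_fletcher_reeves(rosen, rosen_der, np.array([-1.2, 1.0])))
```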


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence using the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked using 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods. Importantly, this paper quantifies how much better the three-term performance is compared with two-term methods. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method. We also compare our modification with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
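The modified search direction is not spelled out in the abstract, so the sketch below only illustrates one commonly used three-term HS-type direction, which satisfies the sufficient descent identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 regardless of the line search; the paper's own modification may differ.

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old, eps=1e-12):
    """A common three-term HS-type direction (not necessarily the
    paper's modification).  By construction g_new @ d = -||g_new||^2,
    so sufficient descent holds independently of the line search."""
    y = g_new - g_old
    denom = d_old @ y                       # HS denominator d_k^T y_k
    if abs(denom) < eps:                    # fall back to steepest descent
        return -g_new
    beta = (g_new @ y) / denom              # classical HS coefficient
    theta = (g_new @ d_old) / denom         # weight of the third term
    return -g_new + beta * d_old - theta * y

# quick check of the descent identity on random data
g0, g1, d0 = (np.random.randn(5) for _ in range(3))
d1 = three_term_hs_direction(g1, g0, d0)
assert np.isclose(g1 @ d1, -(g1 @ g1))
```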


2021 ◽  
Vol 5 (1) ◽  
pp. 47
Author(s):  
Sindy Devila ◽  
Maulana Malik ◽  
Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under the exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish its global convergence properties. Using some test functions, numerical results show that the proposed method is more efficient than the MMSIS method. In addition, the new method can be used to solve the problem of minimizing portfolio selection risk.
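The MMSIS formula is not reproduced in the abstract, so the snippet below only sketches the general pattern of hybridizing the PRP coefficient with a second coefficient through a convex combination; the weight mu and the placeholder beta_other are illustrative assumptions, not the authors' formula. The resulting beta would be used in the usual direction update d_{k+1} = -g_{k+1} + beta d_k.

```python
import numpy as np

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak coefficient."""
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def beta_hybrid(g_new, g_old, beta_other, mu=0.5):
    """Illustrative hybrid: a convex combination of the PRP coefficient
    and a second coefficient (a placeholder for the MMSIS term, whose
    exact formula is not given in the abstract)."""
    return mu * beta_prp(g_new, g_old) + (1.0 - mu) * beta_other
```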

