A convergent modified HS-DY hybrid conjugate gradient method for unconstrained optimization problems

2018, Vol. 40(1), pp. 97-113
Author(s): Peter Mtagulwa, P. Kaelo

Author(s): P. Kaelo, Sindhu Narayanan, M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods for solving unconstrained optimization problems. Global convergence of the proposed quadratic hybrid conjugate gradient method is established under the strong Wolfe line search conditions. We also report numerical results that demonstrate the competitiveness of the new hybrid method.
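The abstract does not reproduce the exact hybridization formula, so what follows is only a minimal sketch of a generic convex PRP–FR hybrid conjugate gradient iteration under a strong Wolfe line search (here SciPy's line_search, which enforces the strong Wolfe conditions). The mixing rule for theta, the steepest-descent fallback and the test function are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: a generic convex PRP-FR hybrid conjugate gradient
# method with a strong Wolfe line search. The exact hybridization used in the
# paper is not reproduced here; theta below is a simple clipped ratio chosen
# purely for illustration.
import numpy as np
from scipy.optimize import line_search

def hybrid_prp_fr(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search (c1, c2 are the usual Wolfe constants).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:
            # Line search failed; fall back to the steepest descent direction.
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_fr = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves parameter
        beta_prp = g_new.dot(y) / g.dot(g)      # Polak-Ribiere-Polyak parameter
        # Hybridization: convex combination of PRP and FR (illustrative choice).
        theta = min(max(beta_prp / beta_fr, 0.0), 1.0) if beta_fr != 0 else 0.0
        beta = theta * beta_prp + (1.0 - theta) * beta_fr
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage on a simple separable quadratic test function (illustrative).
if __name__ == "__main__":
    f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
    grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
    print(hybrid_prp_fr(f, grad, np.array([5.0, 5.0])))
```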


Author(s): Ibrahim Abdullahi, Rohanin Ahmad

In this paper, we propose a new hybrid conjugate gradient method for unconstrained optimization problems. The proposed method combines the parameters β(DY), β(WHY), β(RAMI) and β(New), where β(New) is constructed specifically for this hybrid method. The method possesses the sufficient descent property irrespective of the line search. Under the strong Wolfe–Powell line search, we prove that the method is globally convergent. Numerical experiments show the effectiveness and robustness of the proposed method when compared with some hybrid as well as some modified conjugate gradient methods.
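The abstract does not give the formulas for β(WHY), β(RAMI) or β(New), so only the standard quantities it refers to are written out below: the conjugate gradient recursion, the Dai–Yuan parameter β(DY), the sufficient descent property and the strong Wolfe–Powell conditions, with generic constants c > 0 and 0 < δ < σ < 1 rather than the paper's specific choices.

```latex
% Standard CG recursion and conditions referred to in the abstract.
% g_k = \nabla f(x_k); d_k is the search direction; \alpha_k the step length.
\begin{align*}
  & x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,\\
  & \beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{T}(g_{k+1} - g_k)},\\
  & g_k^{T} d_k \le -c\,\|g_k\|^{2} \quad \text{for all } k \quad \text{(sufficient descent)},\\
  & f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k, \qquad
    \bigl|g(x_k + \alpha_k d_k)^{T} d_k\bigr| \le \sigma\,\bigl|g_k^{T} d_k\bigr|
    \quad \text{(strong Wolfe--Powell)}.
\end{align*}
```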

