A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization

2017 · Vol 79 (1) · pp. 195-219
Author(s):
Ming Li
Hongwei Liu
Zexian Liu


Author(s):
Pro Kaelo
Sindhu Narayanan
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribière–Polyak (PRP) and Fletcher–Reeves (FR) conjugate gradient methods for solving unconstrained optimization problems. Global convergence of the proposed quadratic hybrid conjugate gradient method is established under the strong Wolfe line search conditions. We also report numerical results showing the competitiveness of the new hybrid method.
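The article's quadratic rule for choosing the hybridization parameter is not reproduced in this abstract. As a rough illustration only, the sketch below implements a convex-combination PRP/FR hybrid with a fixed mixing weight theta (an assumption, not the paper's rule), using SciPy's line_search, which enforces the strong Wolfe conditions.

    import numpy as np
    from scipy.optimize import line_search  # enforces the strong Wolfe conditions

    def hybrid_prp_fr_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
        """Minimal PRP/FR hybrid conjugate gradient sketch.

        Hypothetical simplification: theta is a fixed constant here, whereas
        the article derives its hybridization parameter from a quadratic model.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:          # line search failed: restart along -g
                d, alpha = -g, 1e-4
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta_fr = (g_new @ g_new) / (g @ g)          # Fletcher-Reeves
            beta_prp = (g_new @ (g_new - g)) / (g @ g)   # Polak-Ribiere-Polyak
            beta = (1.0 - theta) * beta_prp + theta * beta_fr
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Example: minimize a strictly convex quadratic in R^10
    x_star = hybrid_prp_fr_cg(lambda x: x @ x, lambda x: 2.0 * x, np.ones(10))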


2012 · Vol 2012 · pp. 1-12
Author(s):
Jinkui Liu
Youyi Jiang

A new nonlinear spectral conjugate descent method for solving unconstrained optimization problems is proposed on the basis of the conjugate descent (CD) method and the spectral conjugate gradient method. For any line search, the new method satisfies the sufficient descent condition $g_k^T d_k < -\|g_k\|^2$. Moreover, we prove that the new method is globally convergent under the strong Wolfe line search. The numerical results show that the new method is more effective on the given test problems from the CUTE test problem library (Bongartz et al., 1995) than the classical CD, FR, and PRP methods.
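The paper's exact spectral parameter is not stated in this abstract. As a sketch under that caveat, the snippet below shows one standard way to build a spectral CD direction: choosing the spectral parameter theta so that the descent identity $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ holds regardless of the line search. The function name and this particular theta are illustrative assumptions.

    import numpy as np

    def spectral_cd_direction(g_new, g_old, d_old):
        """One direction update of a spectral conjugate-descent (CD) scheme.

        A sketch, not the paper's exact formula: beta is the classical CD
        parameter, and theta is chosen so that g^T d = -||g||^2 holds for
        ANY step size produced by the line search.
        """
        beta = (g_new @ g_new) / (-(d_old @ g_old))              # CD parameter
        theta = 1.0 + beta * (d_old @ g_new) / (g_new @ g_new)   # spectral parameter
        d_new = -theta * g_new + beta * d_old
        # descent identity: g_new @ d_new equals -||g_new||^2 up to rounding
        assert np.isclose(g_new @ d_new, -(g_new @ g_new))
        return d_new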


2016 · Vol 10 · pp. 721-734
Author(s):
Mohamed Hamoda
Mustafa Mamat
Mohd Rivaie
Zabidin Salleh


Author(s):
Chenna Nasreddine
Sellami Badreddine
Belloufi Mohammed

In this paper, we present a new hybrid conjugate gradient method for solving nonlinear unconstrained optimization problems, defined as a convex combination of the Liu–Storey (LS) and Hager–Zhang (HZ) conjugate gradient methods. The method possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. We conclude by illustrating the method with some numerical examples.
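The abstract does not say how the convex-combination parameter is selected. A minimal sketch of mixing the LS and HZ parameters, with a fixed hypothetical weight lam in [0, 1] standing in for the paper's rule, might look as follows.

    import numpy as np

    def hybrid_ls_hz_beta(g_new, g_old, d_old, lam=0.5):
        """Convex combination of the Liu-Storey and Hager-Zhang betas.

        A sketch: the paper derives the mixing parameter from a
        conjugacy/descent condition; here lam is a fixed constant.
        """
        y = g_new - g_old
        beta_ls = (g_new @ y) / (-(d_old @ g_old))                  # Liu-Storey
        dy = d_old @ y
        beta_hz = ((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy   # Hager-Zhang
        return (1.0 - lam) * beta_ls + lam * beta_hz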


2018 · Vol 7 (3.28) · pp. 92
Author(s):
Talat Alkouli
Mustafa Mamat
Mohd Rivaie
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, together with an implementation based on an exact line search, is proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.
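The modified CG parameter itself is not given in this abstract. An exact line search does, however, have a closed form on a strictly convex quadratic $f(x) = \tfrac{1}{2}x^T A x - b^T x$, namely $\alpha = -g^T d / (d^T A d)$; the sketch below illustrates this with the classical FR update (an assumption standing in for the paper's modified formula).

    import numpy as np

    def cg_exact_line_search(A, b, x0, tol=1e-8, max_iter=1000):
        """CG skeleton for f(x) = 0.5*x^T A x - b^T x with exact line search.

        A sketch with the classical FR beta; the paper's modified beta is
        not reproduced here.
        """
        x = np.asarray(x0, dtype=float)
        g = A @ x - b                          # gradient of the quadratic
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d
            x = x + alpha * d
            g_new = A @ x - b
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves
            d = -g_new + beta * d
            g = g_new
        return x

    # Example: solve A x = b for an SPD matrix via the quadratic objective
    A = np.diag(np.arange(1.0, 6.0))
    b = np.ones(5)
    x = cg_exact_line_search(A, b, np.zeros(5))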

