A new class of nonlinear conjugate gradient coefficients for unconstrained optimization

Author(s):  
Amina Boumediene ◽  
Tahar Bechouat ◽  
Rachid Benzine ◽  
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective approach to solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text]. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient also possesses the sufficient descent condition and global convergence properties under the exact line search.
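Since the coefficient itself appears only as a placeholder in this abstract, the sketch below shows the generic nonlinear CG framework that any such coefficient plugs into; the PRP stand-in `beta_prp` and the fixed step size `alpha` are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of the generic nonlinear CG iteration (assumptions:
# beta_prp is a stand-in coefficient, and a fixed step replaces the
# exact/strong Wolfe line searches discussed above).
import numpy as np

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak coefficient, used here purely as a placeholder."""
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def cg_minimize(grad, x0, alpha=1e-3, tol=1e-6, max_iter=10_000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + alpha * d                     # fixed step stands in for a line search
        g_new = grad(x)
        d = -g_new + beta_prp(g_new, g) * d   # CG direction update
        g = g_new
    return x
```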

2011 ◽  
Vol 18 (9) ◽  
pp. 1249-1253 ◽  
Author(s):  
Mehdi Dehghan ◽  
Masoud Hajarian

The conjugate gradient method is one of the most useful and earliest-discovered techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method, which guarantees the sufficient descent condition for any line search. With the exact line search, the introduced method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
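For reference, the classical Dai–Yuan coefficient that this descent variant reduces to under the exact line search has the standard form below (a well-known definition, not quoted from the article itself):

```latex
% Dai-Yuan coefficient and direction update, with g_k the gradient at x_k:
\beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^{\top}\,(g_k - g_{k-1})},
\qquad
d_k = -g_k + \beta_k^{DY}\, d_{k-1}.
```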


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat ◽  
U. A. M. Roslan

A new modified three-term conjugate gradient (CG) method is presented for solving large-scale optimization problems. The idea is related to the famous Polak-Ribière-Polyak (PRP) formula: although the PRP numerator plays a vital role in numerical performance and avoids the jamming issue, the PRP method is not globally convergent. The new three-term CG method therefore keeps the PRP numerator and combines it with the denominator of another well-performing CG formula. The new modification possesses the sufficient descent condition independently of any line search. The novelty is that, under the Wolfe-Powell line search, the new modification possesses global convergence properties for both convex and nonconvex functions. Numerical computations with the Wolfe-Powell line search on standard optimization test functions show the efficiency and robustness of the new modification.
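The abstract does not reproduce the new formula, but a common three-term CG template of the kind it describes, keeping the PRP numerator, looks like this (an illustrative generic form, not the paper's exact coefficients):

```latex
% Generic three-term direction with y_k = g_{k+1} - g_k; beta_k and
% theta_k are method-specific coefficients:
d_{k+1} = -g_{k+1} + \beta_k\, d_k + \theta_k\, y_k,
\qquad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}.
```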


2018 ◽  
Vol 7 (3.28) ◽  
pp. 54
Author(s):  
Yasir Salih ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Abdelrhaman Abashar ◽  
Mohamad Afendee Mohamed

The conjugate gradient (CG) method is a very useful technique for solving large-scale nonlinear optimization problems. In this paper, we propose a new formula for the coefficient βk, which is a hybrid of the PRP and WYL methods. This method possesses the sufficient descent and global convergence properties when used with the exact line search. Numerical results indicate that the new formula is more efficient than other classical CG methods.
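For context, the two parent coefficients of the hybrid have the following standard forms (the hybrid combination itself is not reproduced in this abstract):

```latex
% PRP and WYL coefficients, with g_k the gradient at iterate x_k:
\beta_k^{PRP} = \frac{g_k^{\top}\,(g_k - g_{k-1})}{\|g_{k-1}\|^2},
\qquad
\beta_k^{WYL} = \frac{g_k^{\top}\!\left(g_k - \frac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\right)}{\|g_{k-1}\|^2}.
```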


Lately, many large-scale unconstrained optimization problems have been solved with nonlinear conjugate gradient (CG) methods. Many areas, such as engineering and computer science, have benefited from their simplicity, speed, and low memory requirements. Many modified coefficients have appeared recently, all intended to improve these methods. This paper considers an extension of the Polak–Ribière–Polyak conjugate gradient method under the exact line search and shows that it satisfies properties such as sufficient descent and global convergence. A set of 113 test problems is used to evaluate the performance of the proposed method and compare it with other existing methods using the same line search.
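The exact line search used here takes the step size minimizing f along the search direction; in practice it can be approximated with a scalar minimizer, as in this sketch (the bracket (0, 10) and SciPy's bounded method are illustrative assumptions):

```python
# Numerical approximation of an exact line search along direction d.
import numpy as np
from scipy.optimize import minimize_scalar

def exact_line_search(f, x, d):
    """Return a step size (approximately) minimizing f along d from x."""
    phi = lambda alpha: f(x + alpha * d)
    res = minimize_scalar(phi, bounds=(0.0, 10.0), method="bounded")
    return res.x

# Usage on a simple quadratic, where the exact minimizer along -x is alpha = 1:
f = lambda x: 0.5 * (x @ x)
x = np.array([1.0, -2.0])
alpha = exact_line_search(f, x, -x)   # approximately 1.0
```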


Author(s):  
Nur Syarafina Mohamed ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Shazlyn Milleana Shaharuddin

Nonlinear conjugate gradient (CG) methods are widely used in the optimization field due to their efficiency in solving large-scale unconstrained optimization problems. Many studies and modifications have been developed in order to improve the method. The method is known to possess the sufficient descent condition and global convergence properties under the strong Wolfe-Powell line search. In this paper, a new CG coefficient is presented. The global convergence and sufficient descent properties of the new coefficient are established under the strong Wolfe-Powell line search. Results show that the new coefficient globally converges under certain assumptions and theories.
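For completeness, the strong Wolfe-Powell line search referenced here requires the step size to satisfy the standard pair of conditions below, with constants 0 < c_1 < c_2 < 1:

```latex
% Sufficient decrease and strong curvature conditions:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1\, \alpha_k\, g_k^{\top} d_k,
\qquad
\left| g(x_k + \alpha_k d_k)^{\top} d_k \right| \le c_2 \left| g_k^{\top} d_k \right|.
```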


2018 ◽  
Vol 7 (4.30) ◽  
pp. 458
Author(s):  
Srimazzura Basri ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Non-linear conjugate gradient methods have been widely used and instrumental in solving large-scale optimization problems. These methods have been shown to require very little memory in addition to being numerically efficient. Thus, many studies have been conducted to improve these methods and find the most efficient one. In this paper, we propose a new non-linear conjugate gradient coefficient that guarantees the sufficient descent condition. Numerical tests indicate that the proposed coefficient outperforms three classical conjugate gradient coefficients.
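The sufficient descent condition guaranteed here has the standard form below: each direction d_k must be a genuine descent direction, uniformly in k, for some constant c > 0:

```latex
g_k^{\top} d_k \le -c\, \|g_k\|^2 \quad \text{for all } k.
```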


2018 ◽  
Vol 13 (03) ◽  
pp. 2050059
Author(s):  
Amina Boumediene ◽  
Rachid Benzine ◽  
Mohammed Belloufi

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems. Many studies have been devoted to developing and improving these methods. In this paper, we aim to study the global convergence of the BBB conjugate gradient method with the exact line search.


2017 ◽  
Vol 2017 ◽  
pp. 1-6 ◽  
Author(s):  
Ahmad Alhawarat ◽  
Zabidin Salleh

The conjugate gradient (CG) method is used to find optimal solutions of large-scale unconstrained optimization problems. Owing to its simple algorithm, low memory requirements, and speed in obtaining a solution, the method is widely used in many fields, such as engineering, computer science, and medical science. In this paper, we modify the CG method to achieve global convergence with various line searches. In addition, it satisfies the sufficient descent condition without any line search. Numerical computations under the weak Wolfe-Powell line search show that the new method is more efficient than other conventional methods.
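A check of the weak Wolfe-Powell conditions used in these computations can be sketched as follows (standard conditions with typical constants; the modified method itself is not reproduced in this abstract):

```python
# Weak Wolfe-Powell test for a candidate step size alpha.
import numpy as np

def satisfies_weak_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """True if alpha satisfies sufficient decrease and one-sided curvature."""
    g_dot_d = grad(x) @ d                                      # should be < 0
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g_dot_d   # sufficient decrease
    curvature = grad(x + alpha * d) @ d >= c2 * g_dot_d        # weak curvature
    return bool(armijo and curvature)
```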


Author(s):  
Chergui Ahmed ◽  
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies the sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
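For reference, the four baseline coefficients the FRA method is compared against have well-known definitions, collected in this sketch (standard formulas only; the FRA formula itself is not given in the abstract):

```python
# Classical CG coefficients, with g_new = g_k, g_old = g_{k-1},
# d_old = d_{k-1}, and y = g_new - g_old.
import numpy as np

def classical_betas(g_new, g_old, d_old):
    y = g_new - g_old
    ratio = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return {
        "FR":  (g_new @ g_new) / (g_old @ g_old),
        "PRP": (g_new @ y) / (g_old @ g_old),
        "DY":  (g_new @ g_new) / (d_old @ y),
        "WYL": (g_new @ (g_new - ratio * g_old)) / (g_old @ g_old),
    }
```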

