A Modification of Conjugate Gradient Method for Unconstrained Optimization Problems

2018, Vol. 7 (2.14), pp. 21
Author(s): Omar Alshorman, Mustafa Mamat, Ahmad Alhawarat, Mohd Revaie

Conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems, and several recent studies have been devoted to improving their efficiency and robustness. In this paper, a new CG parameter is proposed. The resulting method possesses global convergence properties under the strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribiere-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
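For readers unfamiliar with the parameters being compared, the sketch below shows a generic nonlinear CG loop under a strong Wolfe line search (via SciPy), with the classical FR and PRP choices of beta. It is a minimal illustration only; the paper's proposed parameter is not reproduced here.

import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, beta_rule="PRP", tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                     # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":                 # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        else:                                 # Polak-Ribiere-Polyak
            beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function
from scipy.optimize import rosen, rosen_der
x_star = cg_minimize(rosen, rosen_der, np.array([-1.2, 1.0]))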

2018, Vol. 7 (3.28), pp. 92
Author(s): Talat Alkouli, Mustafa Mamat, Mohd Rivaie, Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, and an associated implementation based on an exact line search, are proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.
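An exact line search is generally available in closed form only for special objectives; the standard illustration is a convex quadratic f(x) = 0.5 x'Ax - b'x, where the exact step along a direction d is alpha = -g'd / d'Ad. A minimal sketch under that assumption (not the authors' method):

import numpy as np

def cg_quadratic(A, b, x0, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of the quadratic
    d = -g
    while np.linalg.norm(g) > tol:
        alpha = -(g @ d) / (d @ A @ d)   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g) # FR reduces to linear CG here
        d = -g_new + beta * d
        g = g_new
    return x

# usage on a small symmetric positive definite system
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = cg_quadratic(A, b, np.zeros(2))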


Algorithms, 2021, Vol. 14 (8), pp. 227
Author(s): Zabidin Salleh, Ghaliah Alhamzi, Ibitsam Masmali, Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems, since, unlike Newton's method and its approximations, it does not require second derivatives. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many elaborate two- and three-term methods have been proposed for such problems. In this paper, we propose a simple, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method so as to overcome its convergence and descent-property problems. Under some assumptions, the modified method satisfies the convergence properties and the sufficient descent condition. The numerical results, which cover the number of iterations and CPU time, show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao.
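For context, the classical Liu-Storey (LS) parameter that the proposal builds on, together with a numerical check of the sufficient descent condition g'd <= -c ||g||^2, can be sketched as follows; the authors' modified formula itself is not reproduced here. Inputs are NumPy vectors.

def beta_ls(g_new, g_old, d_old):
    # beta^LS = g_{k+1}' y_k / (-g_k' d_k),  with  y_k = g_{k+1} - g_k
    y = g_new - g_old
    return (g_new @ y) / (-(g_old @ d_old))

def satisfies_sufficient_descent(g, d, c=1e-2):
    # sufficient descent: g'd <= -c * ||g||^2 for some fixed c > 0
    return g @ d <= -c * (g @ g)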


2021, Vol. 2021, pp. 1-9
Author(s): Ahmad Alhawarat, Thoi Trung Nguyen, Ramadan Sabra, Zabidin Salleh

To solve unconstrained optimization problems, we often use a conjugate gradient (CG) method, since it does not require the memory or storage for second-derivative information that Newton's method or the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method does. Recently, a modification of the Polak and Ribiere method with a new restart condition was proposed, giving the so-called AZPRP method. In this paper, we propose a further modification of the AZPRP CG method for large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe-Powell line search. The numerical results show that the new CG method is strongly competitive with the CG_Descent method. The comparisons, covering the number of iterations and CPU time, are made on a set of more than 140 standard functions from the CUTEst library.
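The exact AZPRP restart test is not given in this abstract, so the sketch below instead shows the classical Powell-style restart commonly paired with PRP-type methods: revert to steepest descent whenever successive gradients are far from orthogonal. Inputs are NumPy vectors; nu = 0.2 is Powell's traditional threshold.

def prp_direction_with_restart(g_new, g_old, d_old, nu=0.2):
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)   # PRP parameter
    beta = max(beta, 0.0)                              # PRP+ truncation
    # Powell restart: gradients losing orthogonality => discard old direction
    if abs(g_new @ g_old) >= nu * (g_new @ g_new):
        return -g_new
    return -g_new + beta * d_old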


2018, Vol. 13 (03), pp. 2050059
Author(s): Amina Boumediene, Rachid Benzine, Mohammed Belloufi

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems, and many studies have been devoted to developing and improving them. In this paper, we study the global convergence of the BBB conjugate gradient method with an exact line search.
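For reference, an exact line search selects the step length that minimizes the objective along the search direction, which makes the new gradient orthogonal to that direction:

\alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k),
\qquad\text{so that}\qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k = 0 .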


Author(s): Chergui Ahmed, Bouali Tahar

The conjugate gradient method has played a special role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients that possess the sufficient descent condition and global convergence properties; the method is similar to that of Wei et al. [7]. The global convergence result is established under the strong Wolfe-Powell line search. Numerical results on a set of test functions show that the new formula achieves the best CPU time, number of iterations, and number of gradient evaluations when compared with the FR, PRP, DY, and WYL methods.
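For comparison, the Wei-Yao-Liu (WYL) parameter that this family resembles has a simple closed form; the scaling of the second term is what keeps the parameter non-negative. A sketch, not the paper's new coefficient:

import numpy as np

def beta_wyl(g_new, g_old):
    # beta^WYL = g_{k+1}'(g_{k+1} - (||g_{k+1}||/||g_k||) g_k) / ||g_k||^2
    scale = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - scale * g_old) / (g_old @ g_old)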


2018, Vol. 7 (3.28), pp. 84
Author(s): Nurul Aini, Nurul Hajar, Mohd Rivaie, Mustafa Mamat

The conjugate gradient (CG) method is a well-known solver for large-scale unconstrained optimization problems. In this paper, a modified CG method based on the AMR* and CD methods is presented. The resulting algorithm is proved to be globally convergent under an exact line search and some mild conditions. Numerical comparisons are made between the new method and four other CG methods. The results show that the proposed method is more efficient.
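Of the two ingredients, the CD (Conjugate Descent, Fletcher) parameter has a simple closed form; the AMR* formula is not reproduced here. A sketch with NumPy-vector inputs:

def beta_cd(g_new, g_old, d_old):
    # beta^CD = ||g_{k+1}||^2 / (-d_k' g_k); the denominator is positive
    # whenever d_k is a descent direction, which helps preserve descent
    return (g_new @ g_new) / (-(d_old @ g_old))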


2021, Vol. 6 (10), pp. 10742-10764
Author(s): Ibtisam A. Masmali, Zabidin Salleh, Ahmad Alhawarat, ...

The conjugate gradient (CG) method is a method for solving unconstrained optimization problems, with applications in medical science, industry, neural networks, and many other fields. In this paper, a new three-term CG method is proposed. The new CG formula is constructed from the DL and WYL formulas so that it is non-negative and inherits the properties of the HS formula. The new modification satisfies the convergence properties and the sufficient descent property. The numerical results show that the new modification is more efficient than the DL, WYL, and CG-Descent formulas. We use more than 200 functions from the CUTEst library to compare these methods in terms of the number of iterations, function evaluations, gradient evaluations, and CPU time.
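The abstract does not give the new formula, but a generic three-term CG direction has the following shape; beta and theta below are placeholders for the paper's DL/WYL-based choices, not the authors' formulas:

def three_term_direction(g_new, d_old, y_old, beta, theta):
    # d_{k+1} = -g_{k+1} + beta * d_k - theta * y_k,  y_k = g_{k+1} - g_k
    # beta and theta stand in for the paper's formulas (assumptions here)
    return -g_new + beta * d_old - theta * y_old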


Author(s): Fanar N. Jardow, Ghada M. Al-Naemi

Many researchers are interested in developing and improving the conjugate gradient method for solving large-scale unconstrained optimization problems. In this work, a new parameter is presented as a convex combination of RMIL and MMWU. The suggested method always produces a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search conditions, the global convergence of the proposed method is established. Preliminary numerical comparisons with some other CG methods show that the new method is efficient and robust in solving all the given problems.
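The convex-combination construction itself is simple to state: with theta in [0, 1], the new parameter blends the two ingredients, reducing to RMIL at theta = 0 and to MMWU at theta = 1. A sketch in which beta_rmil and beta_mmwu are stand-ins for the two formulas (the paper's choice of theta is not reproduced):

def beta_convex(beta_rmil, beta_mmwu, theta):
    # beta = (1 - theta) * beta^RMIL + theta * beta^MMWU,  theta in [0, 1]
    assert 0.0 <= theta <= 1.0
    return (1.0 - theta) * beta_rmil + theta * beta_mmwu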

