A new hybrid conjugate gradient algorithm for unconstrained optimization with inexact line search

Author(s):  
Fanar N. Jardow ◽  
Ghada M. Al-Naemi

Many researchers are interested in developing and improving the conjugate gradient method for solving large-scale unconstrained optimization problems. In this work, a new parameter is presented as a convex combination of the RMIL and MMWU parameters. The suggested method always produces a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search conditions, the global convergence of the proposed method is established. Preliminary numerical comparisons with some other CG methods show that the new method is efficient and robust on all given problems.
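The hybrid construction described above can be sketched generically: two base CG parameters are blended by a convex weight, and the result feeds the usual direction update. This is a minimal illustration, not the authors' method; the RMIL-type formula below is the commonly cited one, and the second parameter is a Fletcher-Reeves-style placeholder standing in for MMWU, whose exact form is not given in the abstract.

```python
import numpy as np

def hybrid_cg_direction(g_new, g_old, d_old, theta):
    """Direction update for a hybrid CG method whose parameter is a convex
    combination of two base parameters (illustrative sketch only).

    theta in [0, 1] is the convex-combination weight.
    """
    y = g_new - g_old
    # RMIL-type parameter: g_{k+1}^T y_k / ||d_k||^2 (commonly cited form)
    beta_rmil = (g_new @ y) / (d_old @ d_old)
    # Placeholder second parameter (FR-style), standing in for MMWU
    beta_fr = (g_new @ g_new) / (g_old @ g_old)
    # Convex combination of the two base parameters
    beta = (1.0 - theta) * beta_rmil + theta * beta_fr
    # Standard CG direction update: d_{k+1} = -g_{k+1} + beta_k d_k
    return -g_new + beta * d_old
```

With theta = 0 or theta = 1 the update reduces to the respective pure method, which is the usual appeal of convex-combination hybrids.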

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ahmad Alhawarat ◽  
Thoi Trung Nguyen ◽  
Ramadan Sabra ◽  
Zabidin Salleh

To solve unconstrained optimization problems, a conjugate gradient (CG) method is commonly used, since it does not require the memory or storage of second derivatives, unlike Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. Recently, a modification of the Polak–Ribière method with a new restart condition was proposed, giving the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method for solving large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe-Powell line search. The numerical results show that the new CG method is highly competitive with the CG_Descent method. The comparisons, covering the number of iterations and CPU time, are made on a set of more than 140 standard functions from the CUTEst library.
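Several of the abstracts above invoke the strong Wolfe-Powell (SWP) line search conditions. A short checker makes the two conditions concrete: a sufficient-decrease (Armijo) bound and a curvature bound on the absolute directional derivative. This is a textbook-form sketch; the constants c1 and c2 are illustrative defaults, not values taken from any of the cited papers.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe-Powell conditions for step size alpha along
    direction d at point x (textbook form; c1 < c2 < 1 assumed)."""
    phi0 = f(x)
    dphi0 = grad(x) @ d          # directional derivative at alpha = 0
    x_new = x + alpha * d
    # Sufficient decrease: f(x + a d) <= f(x) + c1 a g^T d
    armijo = f(x_new) <= phi0 + c1 * alpha * dphi0
    # Curvature bound: |grad(x + a d)^T d| <= c2 |g^T d|
    curvature = abs(grad(x_new) @ d) <= c2 * abs(dphi0)
    return armijo and curvature
```

On the quadratic f(x) = x^T x with the steepest-descent direction, the exact line minimizer satisfies both conditions, while an overshooting step fails them.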


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, together with an associated implementation based on an exact line search, is proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results on a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Xiangrong Li ◽  
Songhua Wang ◽  
Zhongzhou Jin ◽  
Hongtruong Pham

This paper gives a modified Hestenes-Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses both a sufficient descent property and a trust-region feature; (2) the algorithm is globally convergent for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.


2018 ◽  
Vol 7 (3.28) ◽  
pp. 84 ◽  
Author(s):  
Nurul Aini ◽  
Nurul Hajar ◽  
Mohd Rivaie ◽  
Mustafa Mamat

The conjugate gradient (CG) method is a well-known solver for large-scale unconstrained optimization problems. In this paper, a modified CG method based on the AMR* and CD methods is presented. The resulting algorithm is proved to be globally convergent under the exact line search and some mild conditions. Numerical performance is compared against four other CG methods, and the results show that the proposed method is more efficient.


2018 ◽  
Vol 7 (2.14) ◽  
pp. 21
Author(s):  
Omar Alshorman ◽  
Mustafa Mamat ◽  
Ahmad Alhawarat ◽  
Mohd Revaie

Conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several recent studies have been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new CG parameter is proposed. The new parameter possesses global convergence properties under the strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak–Ribière–Polyak (PRP), Fletcher–Reeves (FR), and Wei–Yao–Liu (WYL) parameters.


Filomat ◽  
2018 ◽  
Vol 32 (6) ◽  
pp. 2173-2191
Author(s):  
Hamid Esmaeili ◽  
Majid Rostami ◽  
Morteza Kimiaei

We present a new spectral conjugate gradient method based on the Dai-Yuan strategy for solving large-scale unconstrained optimization problems, with applications to compressive sensing. In our method, the numerator of the conjugate gradient parameter is a convex combination of the maximum gradient norm over some preceding iterates and the current gradient norm. This combination tends to produce larger step sizes far from the optimizer and smaller step sizes close to it. In addition, the spectral parameter guarantees the descent property of the newly generated direction at each iteration. Global convergence results are established under standard assumptions. Numerical results indicate the promising behavior of the new procedure on large-scale unconstrained optimization and compressive sensing problems.
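The modified-numerator idea in this abstract can be sketched against the classical Dai-Yuan parameter, whose numerator ||g_{k+1}||^2 is replaced here by a convex combination involving the largest gradient norm seen so far. The weight mu and the exact squared-norm form below are assumptions for illustration, not the authors' formula.

```python
import numpy as np

def dy_type_beta(g_new, d_old, y, grad_norm_history, mu=0.5):
    """Dai-Yuan-type CG parameter with a modified numerator: a convex
    combination of the maximum gradient norm among preceding iterates and
    the current gradient norm (illustrative sketch; mu and the squared-norm
    form are assumptions).

    y = g_new - g_old; the denominator d_k^T y_k is the classical DY one.
    """
    g_max = max(grad_norm_history)                       # largest gradient norm so far
    numerator = (1.0 - mu) * g_max**2 + mu * (g_new @ g_new)
    return numerator / (d_old @ y)                       # DY denominator d_k^T y_k
```

Far from the optimizer the history term dominates and inflates the parameter; near the optimizer the current (small) gradient norm pulls it down, matching the step-size behavior the abstract describes.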

