New Conjugate Gradient Method Addressing Large Scale Unconstrained Optimization Problem

An iterative conjugate gradient (CG) method is well known for solving unconstrained optimization problems. A new CG method, obtained by modifying the Wei-Yao-Liu (WYL) method, is tested on standard test functions. The step size is computed using an exact line search. Theoretical proofs of the convergence analysis are given. The new CG method is comparable to other methods in locating optimal points, as measured by the total number of iterations required and the computing time. Numerical results give a detailed comparison of the three CG methods.
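
The abstract does not spell out the modification, so the following is only a minimal sketch of the baseline it starts from: a CG loop with the classical WYL parameter and a numerically "exact" line search. The test function, tolerance, and starting point are illustrative assumptions.

```python
# Minimal sketch (not the paper's modified method): CG with the classical
# Wei-Yao-Liu (WYL) parameter and an exact line search done numerically.
import numpy as np
from scipy.optimize import minimize_scalar

def cg_wyl(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search: minimise f along d (one-dimensional subproblem)
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_new = x + alpha * d
        g_new = grad(x_new)
        # WYL parameter: g_k^T (g_k - (||g_k||/||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2
        beta = g_new @ (g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, k

# Illustrative run on the Rosenbrock test function (assumed example)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_wyl(f, grad, [-1.2, 1.0]))
```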

2021, Vol 2021, pp. 1-9
Author(s): Ahmad Alhawarat, Thoi Trung Nguyen, Ramadan Sabra, Zabidin Salleh

To find a solution of unconstrained optimization problems, we normally use a conjugate gradient (CG) method, since it does not require the memory or storage of second derivatives needed by Newton's method or the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. Recently, a modification of the Polak-Ribière method with a new restart condition, the so-called AZPRP method, was proposed. In this paper, we propose a new modification of the AZPRP CG method, based on a modified restart condition, to solve large-scale unconstrained optimization problems. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe-Powell line search. The numerical results show that the new CG method is highly competitive with the CG_Descent method. The comparisons are made on a set of more than 140 standard functions from the CUTEst library and include the number of iterations and the CPU time.
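
The AZPRP formula and its restart condition are not given in the abstract; the sketch below is therefore only a generic illustration of the same ingredients: a PRP-type parameter, a simple Powell-style restart test (an assumption), and SciPy's strong Wolfe line search.

```python
# Illustrative sketch only: generic PRP+ parameter with an assumed restart
# rule, not the AZPRP method itself.
import numpy as np
from scipy.optimize import line_search

def cg_prp_restart(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # strong Wolfe line search (SciPy implementation)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PRP+ (non-negative PRP)
        # assumed Powell-style restart: drop conjugacy when consecutive
        # gradients are far from orthogonal
        if abs(g_new @ g) > 0.2 * (g_new @ g_new):
            beta = 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, k
```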


Filomat, 2018, Vol 32 (6), pp. 2173-2191
Author(s): Hamid Esmaeili, Majid Rostami, Morteza Kimiaei

We present a new spectral conjugate gradient method based on the Dai-Yuan strategy to solve large-scale unconstrained optimization problems, with applications to compressive sensing. In our method, the numerator of the conjugate gradient parameter is a convex combination of the maximum gradient norm value over some preceding iterates and the current gradient norm value. This combination tends to produce larger step sizes far from the optimizer and smaller step sizes close to it. In addition, the spectral parameter guarantees the descent property of the newly generated direction at each iteration. Global convergence results are established under standard assumptions. Numerical results are reported which indicate the promising behavior of the new procedure on large-scale unconstrained optimization and compressive sensing problems.
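
The exact weights and the spectral parameter are not stated in the abstract, so the following is only a hedged sketch of the numerator idea: a Dai-Yuan-style parameter whose numerator mixes the largest gradient norm over a memory window with the current one. The weight mu, the window length m, and the use of squared norms are assumptions; the spectral scaling of the steepest-descent term is omitted.

```python
# Hedged sketch (not the authors' exact formulas): DY-type CG parameter with a
# convex-combination numerator.
import numpy as np

def dy_mixed_beta(g_new, g_old, d_old, grad_norm_history, mu=0.5, m=5):
    """Return a DY-style CG parameter with a convex-combination numerator."""
    recent_max = max(grad_norm_history[-m:])       # max ||g_j|| over preceding iterates
    numerator = mu * recent_max**2 + (1 - mu) * (g_new @ g_new)
    denominator = d_old @ (g_new - g_old)          # Dai-Yuan denominator
    return numerator / denominator
```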


2018, Vol 7 (3.28), pp. 92
Author(s): Talat Alkouli, Mustafa Mamat, Mohd Rivaie, Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method, and an associated implementation based on an exact line search, are proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results for a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms other similar methods in many situations.


2018, Vol 7 (3.28), pp. 84
Author(s): Nurul Aini, Nurul Hajar, Mohd Rivaie, Mustafa Mamat

The conjugate gradient (CG) method is a well-known solver for large-scale unconstrained optimization problems. In this paper, a modified CG method based on the AMR* and CD methods is presented. The resulting algorithm is proved to be globally convergent under the exact line search and some mild conditions. The numerical performance of the new method is compared with that of four other CG methods. The results show that the proposed method is more efficient.
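
For reference, the sketch below shows only the classical CD (Conjugate Descent, Fletcher) parameter, one of the two ingredients named above; the AMR* parameter and the way the two are combined are not spelled out in the abstract, so they are omitted.

```python
# Classical CD parameter; how the paper blends it with AMR* is not shown here.
import numpy as np

def beta_cd(g_new, g_old, d_old):
    """Fletcher's Conjugate Descent parameter: ||g_k||^2 / (-d_{k-1}^T g_{k-1})."""
    return (g_new @ g_new) / (-(d_old @ g_old))
```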


2014, Vol 2014, pp. 1-14
Author(s): San-Yang Liu, Yuan-Yuan Huang

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ with $\theta_k > 1/4$, and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.
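
As a small numerical illustration of the descent condition stated above, the helper below checks it for a given direction; the vectors and the value of theta in the example are arbitrary.

```python
# Check the guaranteed descent condition g^T d <= -(1 - 1/(4*theta)) ||g||^2.
import numpy as np

def satisfies_guaranteed_descent(g, d, theta):
    """Return True if the direction d satisfies the descent condition for theta > 1/4."""
    assert theta > 0.25
    return g @ d <= -(1.0 - 1.0 / (4.0 * theta)) * (g @ g)

g = np.array([3.0, -1.0])
print(satisfies_guaranteed_descent(g, -g, theta=0.5))   # steepest descent: True
```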

