A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function

Author(s):  
Zahra Khoshgam ◽  
Ali Ashrafi
2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ahmad Alhawarat ◽  
Thoi Trung Nguyen ◽  
Ramadan Sabra ◽  
Zabidin Salleh

To solve unconstrained optimization problems, we often use a conjugate gradient (CG) method, since it does not require the memory or storage for second derivatives that Newton’s method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method does. Recently, a modification of the Polak–Ribière method with a new restart condition was proposed, giving the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method for large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe–Powell line search. The numerical results show that the new CG method is strongly competitive with the CG_Descent method. The comparisons are made on a set of more than 140 standard test functions from the CUTEst library and report the number of iterations and the CPU time.
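For reference, the generic nonlinear CG iteration underlying this and the following abstracts is, in a standard formulation (this is not the authors' specific AZPRP parameter, which the abstract does not reproduce):

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\[2pt]
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]

where \(g_k = \nabla f(x_k)\) and \(\alpha_k\) is chosen by a line search. The classical Polak–Ribière(–Polyak) parameter, together with the usual nonnegativity restart that AZPRP-type methods refine, is

\[
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}},
\qquad
\beta_k^{\mathrm{PRP+}} = \max\{\beta_k^{\mathrm{PRP}}, 0\}.
\]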


The iterative conjugate gradient (CG) method is well known for handling unconstrained optimization problems. A new CG method, obtained by modifying the Wei–Yao–Liu (WYL) method, is tested on standard test functions, with the step size computed by an exact line search. Theoretical convergence analysis is provided. The new CG method is comparable to the other methods in locating optimal points, as measured by the total number of iterations and the computing time. Numerical results compare the performance of the three CG methods in detail.
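For context, the Wei–Yao–Liu parameter that this abstract modifies is usually written as follows (a standard statement of the formula; the abstract itself does not restate it):

\[
\beta_k^{\mathrm{WYL}} = \frac{g_k^{\top}\!\left(g_k - \dfrac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\right)}{\|g_{k-1}\|^{2}},
\]

a PRP-type formula rescaled so that, by the Cauchy–Schwarz inequality, \(\beta_k^{\mathrm{WYL}} \ge 0\).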


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Minglei Fang ◽  
Min Wang ◽  
Min Sun ◽  
Rong Chen

Nonlinear conjugate gradient algorithms are a very effective way of solving large-scale unconstrained optimization problems. Building on several well-known conjugate gradient methods, a modified hybrid conjugate gradient method is proposed. The proposed method generates descent directions at every iteration, independently of any line search. Under the Wolfe line search, the proposed method possesses global convergence. Numerical results show that the modified method is efficient and robust.
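The abstract does not state the specific hybrid parameter; a common hybrid construction and the Wolfe line-search conditions it refers to are, for illustration only,

\[
\beta_k^{\mathrm{hyb}} = \max\bigl\{0, \min\{\beta_k^{\mathrm{HS}}, \beta_k^{\mathrm{DY}}\}\bigr\},
\qquad
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\quad
\beta_k^{\mathrm{DY}} = \frac{\|g_k\|^{2}}{d_{k-1}^{\top} y_{k-1}},
\quad
y_{k-1} = g_k - g_{k-1},
\]

with the step size \(\alpha_k\) required to satisfy

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k,
\qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k,
\qquad
0 < \delta < \sigma < 1.
\]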


Algorithms ◽  
2021 ◽  
Vol 14 (8) ◽  
pp. 227
Author(s):  
Zabidin Salleh ◽  
Ghaliah Alhamzi ◽  
Ibitsam Masmali ◽  
Ahmad Alhawarat

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems since, unlike Newton’s method or quasi-Newton approximations, it does not require second derivatives. Moreover, the conjugate gradient method can be applied in many fields, such as neural networks and image restoration. Many elaborate two- and three-term methods have been proposed to solve such optimization problems. In this paper, we propose a simple, efficient, and robust conjugate gradient method. The new method is constructed from the Liu and Storey method to overcome its convergence and descent-property shortcomings. The modified method satisfies the convergence properties and the sufficient descent condition under some assumptions. The numerical results show that the new method outperforms well-known CG methods such as CG-Descent 5.3, Liu and Storey, and Dai and Liao, in terms of the number of iterations and CPU time.
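For orientation, the Liu–Storey parameter from which the new method is built and the sufficient descent condition mentioned above are usually stated as (standard forms, not the authors' new parameter):

\[
\beta_k^{\mathrm{LS}} = -\,\frac{g_k^{\top}(g_k - g_{k-1})}{d_{k-1}^{\top} g_{k-1}},
\qquad
g_k^{\top} d_k \le -c\, \|g_k\|^{2} \quad \text{for some } c > 0 \text{ and all } k.
\]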


2020 ◽  
Vol 36 (1) ◽  
pp. 141-146
Author(s):  
SIMEON REICH ◽  
ALEXANDER J. ZASLAVSKI

"Given a Lipschitz and convex objective function of an unconstrained optimization problem, defined on a Banach space, we revisit the class of regular vector fields which was introduced in our previous work on descent methods. We study, in particular, the asymptotic behavior of the sequence of values of the objective function for a certain inexact process generated by a regular vector field when the sequence of computational errors converges to zero and show that this sequence of values converges to the infimum of the given objective function of the unconstrained optimization problem."


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method and an associated implementation, based on an exact line search, are proposed and analyzed for solving large-scale unconstrained optimization problems. The method satisfies the sufficient descent property, and a global convergence result is proved. Computational results for a set of unconstrained optimization test problems, some of them from the CUTE library, show that the new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.
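The exact line search referred to here selects the exact minimizer of the objective along the search direction; for a strictly convex quadratic it has the familiar closed form (a standard definition, included only for context):

\[
\alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k),
\qquad
\alpha_k = -\,\frac{g_k^{\top} d_k}{d_k^{\top} A\, d_k}
\quad \text{when } f(x) = \tfrac{1}{2}\, x^{\top} A x - b^{\top} x,\ A \succ 0.
\]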


2013 ◽  
Vol 2013 ◽  
pp. 1-8
Author(s):  
Yuanying Qiu ◽  
Dandan Cui ◽  
Wei Xue ◽  
Gaohang Yu

This paper presents a hybrid spectral conjugate gradient method for large-scale unconstrained optimization, which possesses a self-adjusting property. Under the standard Wolfe conditions, its global convergence result is established. Preliminary numerical results are reported on a set of large-scale problems in CUTEr to show the convergence and efficiency of the proposed method.
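The abstract does not spell out its spectral formula; the generic spectral conjugate gradient direction that such hybrids build on has the template (illustrative only, not the authors' specific choices of \(\theta_k\) and \(\beta_k\))

\[
d_k = -\theta_k\, g_k + \beta_k\, d_{k-1}, \qquad \theta_k > 0,
\]

which reduces to the classical CG direction for \(\theta_k = 1\) and to steepest descent for \(\beta_k = 0\).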

