The global convergence of the Polak–Ribière–Polyak conjugate gradient algorithm under inexact line search for nonconvex functions

2019 ◽  
Vol 362 ◽  
pp. 262-275 ◽  
Author(s):  
Gonglin Yuan ◽  
Zengxin Wei ◽  
Yuning Yang


2019 ◽
Vol 2019 ◽  
pp. 1-6
Author(s):  
Huda I. Ahmed ◽  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

In this work, we deal with constrained optimization methods based on a three-term Conjugate Gradient (CG) technique derived from the Dai–Liao (DL) formula. The proposed technique satisfies the conjugacy property and the descent conditions of Karush–Kuhn–Tucker (KKT). Our constrained technique uses the strong Wolfe line search condition under some assumptions, and we prove the global convergence of the proposed technique. Numerical comparisons on thirty (30) constrained optimization problems confirm the effectiveness of the new formula.
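For reference, the Dai–Liao conjugacy condition and parameter that this class of methods builds on are the standard ones below; the paper's specific three-term direction is not given in this record.

\[
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad
\beta_k^{DL} = \frac{g_{k+1}^{\top} \left( y_k - t\, s_k \right)}{d_k^{\top} y_k}, \qquad t > 0,
\]

where \(g_k\) is the gradient at \(x_k\), \(s_k = x_{k+1} - x_k\), and \(y_k = g_{k+1} - g_k\). The strong Wolfe conditions mentioned in the abstract require, for constants \(0 < \delta < \sigma < 1\),

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
\left| g(x_k + \alpha_k d_k)^{\top} d_k \right| \le \sigma \left| g_k^{\top} d_k \right|.
\]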


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Xiangrong Li ◽  
Songhua Wang ◽  
Zhongzhou Jin ◽  
Hongtruong Pham

This paper gives a modified Hestenes–Stiefel (HS) conjugate gradient algorithm under the Yuan–Wei–Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust region feature; (2) the presented algorithm is globally convergent for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.
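For context, the classical HS parameter that the modification starts from, together with the standard statements of the two direction properties claimed in (1), are given below; the constants \(c, C > 0\) are generic, not the paper's.

\[
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
g_k^{\top} d_k \le -c\, \lVert g_k \rVert^{2} \ \ \text{(sufficient descent)}, \qquad
\lVert d_k \rVert \le C\, \lVert g_k \rVert \ \ \text{(trust region feature)}.
\]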


2019 ◽  
Vol 24 (1) ◽  
pp. 115
Author(s):  
Hind H. Mohammed

In this paper, we present different types of CG algorithms based on the Perry conjugacy condition. The new conjugate gradient training (GDY) algorithm is used to train MFNNs; we prove its descent property and global convergence, then test the behavior of this algorithm in the training of artificial neural networks and compare it with well-known algorithms in this field on two types of problems. http://dx.doi.org/10.25130/tjps.24.2019.020
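Since the abstract does not spell out the GDY update, the following is only a minimal sketch of a generic nonlinear CG training loop on a least-squares stand-in for a network loss: a PRP-style beta serves as a placeholder for the paper's formula, and Armijo backtracking provides the inexact line search. All names here (cg_train, backtrack) are illustrative, not the paper's.

```python
import numpy as np

def f(w, X, y):
    # Least-squares loss, standing in for a network training loss.
    return 0.5 * np.sum((X @ w - y) ** 2)

def grad(w, X, y):
    return X.T @ (X @ w - y)

def backtrack(w, d, g, X, y, alpha=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking: a simple inexact line search.
    while f(w + alpha * d, X, y) > f(w, X, y) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def cg_train(X, y, iters=200, tol=1e-8):
    w = np.zeros(X.shape[1])
    g = grad(w, X, y)
    d = -g                                  # first direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtrack(w, d, g, X, y)
        w = w + alpha * d
        g_new = grad(w, X, y)
        # PRP-style beta, used here as a placeholder for the GDY formula.
        beta = g_new @ (g_new - g) / max(g @ g, 1e-16)
        d = -g_new + beta * d
        if g_new @ d >= 0:                  # restart if descent is lost
            d = -g_new
        g = g_new
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 3.0, -1.0])
print(cg_train(X, y))                       # should recover the true weights
```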


Author(s):  
Amira Hamdi ◽  
Badreddine Sellami ◽  
Mohammed Belloufi

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems; the conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove sufficient descent and global convergence. Numerical results are reported to show the effectiveness of our procedure.
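The two component parameters are elided in this record, so only the generic structure can be shown: with placeholders \(\beta_k^{A}\) and \(\beta_k^{B}\) standing in for the two elided formulas, the hybrid parameter and the Wolfe conditions used take the standard form

\[
\beta_k^{hyb} = (1 - \theta_k)\, \beta_k^{A} + \theta_k\, \beta_k^{B}, \qquad \theta_k \in [0, 1],
\]

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k, \qquad 0 < \delta < \sigma < 1.
\]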


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat ◽  
U. A. M. Roslan

A new modified three-term conjugate gradient (CG) method is presented for solving large-scale optimization problems. The idea relates to the famous Polak–Ribière–Polyak (PRP) formula: the numerator of the PRP parameter plays a vital role in numerical performance and avoids the jamming issue, yet the PRP method is not globally convergent in general. The new three-term CG method therefore keeps the PRP numerator and combines it with the denominator of any well-performing CG formula. The new modification possesses the sufficient descent condition independent of any line search, and, notably, it possesses global convergence properties under the Wolfe–Powell line search for both convex and nonconvex functions. Numerical computation with the Wolfe–Powell line search on standard optimization test functions shows the efficiency and robustness of the new modification.
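For reference, the PRP parameter whose numerator the method retains is

\[
\beta_k^{PRP} = \frac{g_{k+1}^{\top} (g_{k+1} - g_k)}{\lVert g_k \rVert^{2}};
\]

when successive gradients are nearly equal (small steps), the numerator \(g_{k+1}^{\top} y_k\) drives \(\beta_k^{PRP}\) toward zero, effectively restarting along the steepest descent direction, which is why the PRP numerator avoids jamming.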


2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Junyue Cao ◽  
Jinzhao Wu ◽  
Wenjie Liu

It is well known that the nonlinear conjugate gradient algorithm is one of the most effective algorithms for optimization problems, since it has low storage requirements and a simple structure. This motivates us to design a modified conjugate gradient formula for the optimization model. The proposed conjugate gradient algorithm possesses several properties: (1) the search direction uses not only the gradient value but also the function value; (2) the presented direction has both the sufficient descent property and the trust region feature; (3) the proposed algorithm is globally convergent for nonconvex functions; (4) experiments on image restoration and compressed sensing problems demonstrate the performance of the new algorithm.
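The abstract does not state the exact direction; one standard way to make a search direction use function values as well as gradients, assumed here only for illustration, is a modified secant quantity of Zhang–Deng–Chen type:

\[
\bar{y}_k = y_k + \frac{\vartheta_k}{\lVert s_k \rVert^{2}}\, s_k, \qquad
\vartheta_k = 2\left( f_k - f_{k+1} \right) + \left( g_k + g_{k+1} \right)^{\top} s_k,
\]

where \(\vartheta_k\) vanishes for quadratic objectives, so \(\bar{y}_k\) reduces to \(y_k\) in that case.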

