Global convergence of new three terms conjugate gradient for unconstrained optimization

2021 ◽  
Vol 11 (1) ◽  
pp. 1-9
Author(s):  
Ahmed Anwer Mustafa ◽  
Salah Gazi Shareef

In this paper, a new formula for β_k is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. The new proposed CG method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition, and has global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
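For orientation, the sketch below shows the generic shape of such a three-term CG iteration with a Wolfe line search. The abstract does not state the new β_k formula or the cubic step-size rule, so a Hestenes–Stiefel-based placeholder direction is used; this is an illustration of the framework, not the authors' method.

```python
# Minimal three-term CG skeleton (placeholder coefficients, not the paper's).
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe-condition step size
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                          # gradient difference
        dy = d.dot(y)                          # (safeguard for dy ~ 0 omitted)
        beta = g_new.dot(y) / dy               # Hestenes-Stiefel placeholder
        theta = g_new.dot(d) / dy              # third-term weight
        d = -g_new + beta * d - theta * y      # three-term direction
        x, g = x_new, g_new
    return x

x_star = three_term_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
```

With this particular pairing of beta and theta, the identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds exactly, which is one standard way three-term methods obtain descent independently of the line search.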

2020 ◽  
Vol 9 (2) ◽  
pp. 101-105
Author(s):  
Hussein Ageel Khatab ◽  
Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new search direction consists of three terms, the first of which is based on the Hestenes–Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.
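In the standard notation (g_k the gradient, d_k the search direction, y_k = g_{k+1} - g_k), the Hestenes–Stiefel parameter named as the first building block is computed as below; the remaining two terms of the new direction are not specified in the abstract.

```python
# The Hestenes-Stiefel coefficient: beta_HS = g_{k+1}^T y_k / d_k^T y_k.
import numpy as np

def beta_hs(g_new, g, d):
    y = g_new - g                  # y_k = g_{k+1} - g_k
    return float(g_new.dot(y) / d.dot(y))
```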


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Zhongbo Sun ◽  
Yantao Tian ◽  
Hongyang Li

Two modified three-term conjugate gradient algorithms that satisfy both the descent condition and the Dai–Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm modifies the Hager–Zhang type algorithm so that the search direction is a descent direction and satisfies the Dai–Liao type conjugacy condition. The second, a simple three-term conjugate gradient method, generates sufficient descent directions at every iteration; moreover, this property is independent of the step-length line search. The algorithms can also be viewed as modifications of the MBFGS method, but with a different z_k. Under some mild conditions, the given methods are globally convergent for general functions, independently of the Wolfe line search. Numerical experiments show that the proposed methods are very robust and efficient.
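The two properties the abstract emphasizes are straightforward to verify numerically; the exact z_k used in the MBFGS-style modification is not reproduced here.

```python
# Checks for sufficient descent and the Dai-Liao conjugacy condition
# d_{k+1}^T y_k = -t * g_{k+1}^T s_k with t >= 0.
import numpy as np

def sufficient_descent(g, d, c=1e-4):
    return g.dot(d) <= -c * g.dot(g)

def dai_liao_holds(d_new, y, g_new, s, t, tol=1e-10):
    return abs(d_new.dot(y) + t * g_new.dot(s)) <= tol
```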


2013 ◽  
Vol 11 (5) ◽  
pp. 2586-2600
Author(s):  
Gonglin Yuan ◽  
Yong Li

At present, the conjugate gradient (CG) method of Hager and Zhang (SIAM Journal on Optimization, 16 (2005)) is regarded as one of the most effective CG methods for optimization problems. To study the CG method further, we extend Hager and Zhang's method and present two modified CG formulas, in which the formulas use value information from not only the gradient but also the function. Moreover, the sufficient descent condition holds without any line search. Global convergence is established for nonconvex functions under suitable conditions. Numerical results show that the proposed methods are competitive with the standard conjugate gradient method.
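For reference, the baseline Hager–Zhang coefficient that the paper modifies is given below; the modified versions additionally use function values, which the abstract does not spell out.

```python
# beta_HZ = (y_k - 2 d_k ||y_k||^2 / d_k^T y_k)^T g_{k+1} / d_k^T y_k
import numpy as np

def beta_hz(g_new, g, d):
    y = g_new - g
    dy = d.dot(y)
    return float((y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy)
```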


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Shashi Kant Mishra ◽  
Suvra Kanti Chakraborty ◽  
Mohammad Esmael Samei ◽  
Bhagwat Ram

The Polak–Ribière–Polyak (PRP) algorithm is one of the oldest and most popular conjugate gradient algorithms for solving nonlinear unconstrained optimization problems. In this paper, we present a q-variant of the PRP method (q-PRP) for which both the sufficient descent and conjugacy conditions are satisfied at every iteration. The proposed method is globally convergent under the standard Wolfe conditions and the strong Wolfe conditions. The numerical results show that the proposed method is promising on a set of given test problems with different starting points. Moreover, the method reduces to the classical PRP method as the parameter q approaches 1.
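A sketch of the underlying q-calculus ingredient: the q-difference quotient (f(q x_i) - f(x)) / ((q - 1) x_i) replaces each classical partial derivative and tends to it as q approaches 1. The code below illustrates that idea only; the paper's exact q-PRP construction may differ.

```python
# q-gradient by q-difference quotients, and the PRP formula evaluated on it.
import numpy as np

def q_gradient(f, x, q=0.99):
    g = np.empty_like(x, dtype=float)
    fx = f(x)
    for i in range(x.size):
        if x[i] != 0:
            xq = x.copy()
            xq[i] = q * x[i]                        # scale one coordinate by q
            g[i] = (f(xq) - fx) / ((q - 1.0) * x[i])
        else:                                       # q-quotient undefined at 0:
            xh = x.copy()                           # fall back to a small shift
            xh[i] += 1e-8
            g[i] = (f(xh) - fx) / 1e-8
    return g

def beta_q_prp(gq_new, gq):
    return float(gq_new.dot(gq_new - gq) / gq.dot(gq))  # classical PRP form
```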


Author(s):  
Aseel M. Qasim ◽  
Zinah F. Salih ◽  
Basim A. Hassan

The primary objective of this paper, situated in the field of conjugate gradient algorithms for unconstrained optimization problems, is to show the advantage of the newly proposed algorithm in comparison with the standard Hestenes–Stiefel method. Since the conjugate gradient parameter is crucial, we propose a simple modification of it, which is used to derive the new formula for the conjugate gradient update parameter described in this paper. Our modification is based on the conjugacy condition for nonlinear conjugate gradient methods, with a nonnegative parameter added to obtain the new extension of the method. Under mild Wolfe conditions, the global convergence theorem and lemmas are stated and proved. The efficiency of the proposed method is demonstrated by numerical experiments, whose results were very encouraging.
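One plausible reading of the modification, offered here as an assumption rather than the authors' exact formula, is a Dai–Liao-type extension of the Hestenes–Stiefel coefficient with a nonnegative parameter t:

```python
# Assumed form: beta = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / d_k^T y_k, t >= 0,
# which reduces to beta_HS at t = 0 and follows from the conjugacy condition.
import numpy as np

def beta_hs_extended(g_new, g, d, s, t=0.1):
    y = g_new - g                  # gradient difference
    return float((g_new.dot(y) - t * g_new.dot(s)) / d.dot(y))
```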


Author(s):  
Alaa Saad Ahmed ◽  
Hisham M. Khudhur ◽  
Mohammed S. Najmuldeen

In this study, we develop a new parameter of the three-term conjugate gradient kind. The scheme depends principally on the pure conjugacy condition (PCC), an important condition in unconstrained nonlinear optimization in general and in conjugate gradient methods in particular. Under some hypotheses, the proposed method converges and satisfies the descent property. The numerical results display the effectiveness of the new method for solving unconstrained nonlinear optimization test problems compared to other conjugate gradient algorithms, such as the Fletcher–Reeves (FR) algorithm and the three-term Fletcher–Reeves (TTFR) algorithm, as shown in Table 1 (numbers of iterations and function evaluations) and in Figures 1, 2, and 3 (comparisons of the number of iterations, the number of function evaluations, and the time taken).
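The pure conjugacy condition d_{k+1}^T y_k = 0 can be enforced exactly through the third term; the sketch below does so with a Fletcher–Reeves placeholder for the first coefficient, since the abstract does not give the new parameter.

```python
# Three-term direction forced to satisfy the pure conjugacy condition (PCC).
import numpy as np

def pcc_direction(g_new, g, d):
    y = g_new - g
    beta = g_new.dot(g_new) / g.dot(g)      # Fletcher-Reeves (placeholder)
    # choose the third-term weight so that d_new^T y = 0 exactly
    gamma = (g_new.dot(y) - beta * d.dot(y)) / y.dot(y)
    d_new = -g_new + beta * d + gamma * y
    assert abs(d_new.dot(y)) <= 1e-8 * max(1.0, y.dot(y))   # PCC holds
    return d_new
```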


2019 ◽  
Vol 2019 (1) ◽  
Author(s):  
Yuting Chen ◽  
Mingyuan Cao ◽  
Yueting Yang

In this paper, we present a new conjugate gradient method using an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition, independently of the line search. Moreover, the value of the parameter carries more useful information without adding computational cost or storage requirements, which can improve numerical performance. Under proper assumptions, the global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the given method is competitive for unconstrained optimization problems with dimensions up to 100,000.
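The abstract does not detail the acceleration scheme; a common one in the CG literature, reconstructed here from one-dimensional quadratic interpolation along the accepted step, looks as follows (a sketch, not necessarily the authors' scheme).

```python
# Rescale the accepted step by the minimizer of a quadratic model of
# phi(t) = f(x + t * alpha * d) built from the two gradients g and g(z).
import numpy as np

def accelerate(grad, x, d, alpha, g):
    z = x + alpha * d
    gz = grad(z)
    a = alpha * g.dot(d)            # phi'(0), negative for a descent direction
    b = alpha * (gz - g).dot(d)     # phi'(1) - phi'(0): model curvature
    if b > 0:                       # curvature positive under Wolfe conditions
        return x + (-a / b) * alpha * d
    return z                        # otherwise keep the plain step
```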


2016 ◽  
Vol 21 (3) ◽  
pp. 399-411 ◽  
Author(s):  
XiaoLiang Dong ◽  
HongWei Liu ◽  
YuBo He ◽  
Saman Babaie-Kafaki ◽  
Reza Ghanbari

In this paper, we propose a three-term PRP-type conjugate gradient method that always satisfies the sufficient descent condition, independently of the line search employed. An important property of our method is that, as the iterations evolve, its direction is closest to the direction of the Newton method or satisfies the conjugacy condition. In addition, under mild conditions, we prove the global convergence properties of the proposed method. Numerical comparison illustrates that our proposed method is efficient for solving optimization problems.
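One well-known direction of this type, the Zhang–Zhou–Li three-term PRP variant, is shown below purely as an illustration of how sufficient descent can hold independently of the line search; the authors' formula may differ.

```python
# Three-term PRP direction with g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly.
import numpy as np

def ttprp_direction(g_new, g, d):
    y = g_new - g
    gg = g.dot(g)
    beta = g_new.dot(y) / gg        # classical PRP parameter
    theta = g_new.dot(d) / gg       # third-term weight
    return -g_new + beta * d - theta * y
```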


2021 ◽  
Vol 2 (1) ◽  
pp. 33
Author(s):  
Nasiru Salihu ◽  
Mathew Remilekun Odekunle ◽  
Also Mohammed Saleh ◽  
Suraj Salihu

Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve directly, so the development of numerical methods for obtaining approximate solutions became necessary. Gradient methods are more efficient when the function to be minimized is continuously differentiable. Therefore, this article presents a new hybrid conjugate gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives; it overcomes the steepest descent method's shortcoming of slow convergence and does not need to store or compute the second-order derivatives required by the Newton method. The CG update parameter is derived from the Dai–Liao conjugacy condition as a convex combination of the Hestenes–Stiefel and Fletcher–Reeves algorithms, employing an optimal modulating choice parameter to avoid matrix storage. Numerical computation adopts an inexact line search to obtain a step size that generates a descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and schemes like it are suitable for compressive sensing problems and M-tensor systems.
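A sketch of the hybridization idea: choose the convex-combination weight theta so that beta = (1 - theta) * beta_HS + theta * beta_FR matches the Dai–Liao coefficient implied by the conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k. The paper's optimal modulating choice may differ; this is an illustration under that assumption.

```python
# Hybrid CG coefficient: convex combination of HS and FR fitted to Dai-Liao.
import numpy as np

def beta_hybrid(g_new, g, d, s, t=0.1):
    y = g_new - g
    dy = d.dot(y)
    b_hs = g_new.dot(y) / dy                       # Hestenes-Stiefel
    b_fr = g_new.dot(g_new) / g.dot(g)             # Fletcher-Reeves
    b_dl = (g_new.dot(y) - t * g_new.dot(s)) / dy  # Dai-Liao target
    theta = 0.0
    if b_fr != b_hs:
        # solve (1-theta)*b_hs + theta*b_fr = b_dl, clipped to keep convexity
        theta = float(np.clip((b_dl - b_hs) / (b_fr - b_hs), 0.0, 1.0))
    return (1.0 - theta) * b_hs + theta * b_fr
```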

