Wolfe Line Search
Recently Published Documents


TOTAL DOCUMENTS: 52 (FIVE YEARS: 26)
H-INDEX: 4 (FIVE YEARS: 2)

2022 ◽ Vol 20 ◽ pp. 736-744
Author(s): Olawale J. Adeleke ◽ Idowu A. Osinuga ◽ Raufu A. Raji

In this paper, a new conjugate gradient (CG) parameter is proposed through the convex combination of the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) CG update parameters such that the conjugacy condition of Dai-Liao is satisfied. The computational efficiency of the PRP method and the convergence profile of the FR method motivated the choice of these two CG methods. The corresponding CG algorithm satisfies the sufficient descent property and is shown to be globally convergent under the strong Wolfe line search procedure. Numerical tests on selected benchmark test functions show that the algorithm is efficient and very competitive in comparison with some existing classical methods.
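For reference, the two classical parameters being combined have standard closed forms (with g_k the gradient at x_k, s_{k-1} = x_k - x_{k-1}, and y_{k-1} = g_k - g_{k-1}); a minimal sketch of the hybridization, leaving the paper's specific modulating parameter unspecified:

    \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
    \beta_k^{PRP} = \frac{g_k^\top y_{k-1}}{\|g_{k-1}\|^2}, \qquad
    \beta_k = (1-\theta_k)\,\beta_k^{FR} + \theta_k\,\beta_k^{PRP}, \quad \theta_k \in [0,1].

Here \theta_k is chosen so that the direction d_k = -g_k + \beta_k d_{k-1} satisfies the Dai-Liao conjugacy condition d_k^\top y_{k-1} = -t\, g_k^\top s_{k-1} for some t > 0.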


Author(s): Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203-225, 2004] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that our proposed methods can outperform some existing ones.
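For context, the Dai-Liao parameter and the Yabe-Takano modified secant condition take roughly the following forms (a sketch based on the cited literature, with s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, \rho \ge 0, and u_k any vector with s_k^\top u_k \ne 0; see Yabe and Takano (2004) for the precise statement and conditions):

    \beta_k^{DL} = \frac{g_{k+1}^\top (y_k - t\, s_k)}{d_k^\top y_k}, \qquad
    z_k = y_k + \rho\, \frac{6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^\top s_k}{s_k^\top u_k}\, u_k,

and the new family is obtained, broadly speaking, by replacing y_k with z_k in the conjugacy condition.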


Author(s): Chenna Nasreddine ◽ Sellami Badreddine ◽ Belloufi Mohammed

In this paper, we present a new hybrid conjugate gradient method for solving nonlinear unconstrained optimization problems, in which the update parameter is a convex combination of those of the Liu-Storey (LS) and Hager-Zhang (HZ) conjugate gradient methods. The method possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. At the end of the paper, we illustrate the method with some numerical examples.
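A minimal Python sketch of such a convex combination, using the standard Liu-Storey and Hager-Zhang formulas; the function name and the fixed modulating parameter theta are illustrative placeholders, not the paper's derived choice:

    import numpy as np

    def beta_hybrid_ls_hz(g_new, g, d, theta):
        """Convex combination of the Liu-Storey (LS) and Hager-Zhang (HZ)
        CG parameters; assumes d is a descent direction (g @ d < 0) and
        d @ y > 0, as guaranteed by a (strong) Wolfe line search."""
        y = g_new - g                                           # gradient difference
        dy = d @ y
        beta_ls = (g_new @ y) / (-(g @ d))                      # Liu-Storey
        beta_hz = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy   # Hager-Zhang
        return (1.0 - theta) * beta_ls + theta * beta_hz
    # The next direction is then d_new = -g_new + beta * d.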


Author(s): Ladan Arman ◽ Yuanming Xu ◽ Long Liping

Abstract: In this paper, based on the efficient Conjugate Descent (CD) method, two generalized CD algorithms are proposed to solve unconstrained optimization problems. These are three-term conjugate gradient methods in which the generated directions, built from the conjugate gradient parameters and independent of the line search, satisfy the sufficient descent condition. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Preliminary numerical results on the CUTEst collection are presented to show the effectiveness of our methods.
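The third term in such methods is typically constructed so that sufficient descent holds by construction, whatever step the line search returns. A generic Python sketch of this technique (an illustration of the general idea, not the paper's two specific algorithms; here beta could be, for example, the CD parameter ||g_new||^2 / (-d @ g_old)):

    import numpy as np

    def three_term_direction(g_new, d, beta):
        """Three-term CG direction whose third term cancels the
        contribution of beta * d to g_new^T d_new, so that
        g_new @ d_new == -||g_new||^2 holds exactly (up to round-off),
        independent of the step size produced by the line search."""
        gg = g_new @ g_new
        return -g_new + beta * d - beta * ((g_new @ d) / gg) * g_new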


Author(s): Amira Hamdi ◽ Badreddine Sellami ◽ Mohammed Belloufi

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems; the conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove the sufficient descent property and the global convergence. Numerical results are reported to show the effectiveness of our procedure.
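Since every method listed here depends on a (strong) Wolfe line search, a didactic Python sketch may help. This is a simple bracketing/bisection scheme under the usual assumptions (0 < c1 < c2 < 1 and a descent direction d), not the Moré-Thuente routine used in production solvers:

    import numpy as np

    def strong_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.1, alpha=1.0, max_iter=50):
        """Find a step size satisfying the strong Wolfe conditions:
        f(x + a*d) <= f(x) + c1*a*(g0 @ d)   (sufficient decrease)
        |grad(x + a*d) @ d| <= c2*|g0 @ d|   (curvature)."""
        phi0, dphi0 = f(x), grad(x) @ d      # dphi0 < 0 for a descent direction
        lo, hi = 0.0, np.inf
        for _ in range(max_iter):
            phi = f(x + alpha * d)
            dphi = grad(x + alpha * d) @ d
            if phi > phi0 + c1 * alpha * dphi0:
                hi = alpha                   # sufficient decrease fails: step too long
            elif dphi < c2 * dphi0:
                lo = alpha                   # slope still too negative: step too short
            elif dphi > -c2 * dphi0:
                hi = alpha                   # slope too positive: overshot the minimizer
            else:
                return alpha                 # both strong Wolfe conditions hold
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        return alpha                         # best effort after max_iter

    # Example: on f(x) = ||x||^2 from x = (1, 1) along -grad f(x), the exact
    # minimizing step 0.5 satisfies both conditions and is found quickly.
    f = lambda x: x @ x
    grad = lambda x: 2.0 * x
    x0 = np.array([1.0, 1.0])
    print(strong_wolfe_step(f, grad, x0, -grad(x0)))   # -> 0.5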


2021 ◽ Vol 2 (1) ◽ pp. 33
Author(s): Nasiru Salihu ◽ Mathew Remilekun Odekunle ◽ Also Mohammed Saleh ◽ Suraj Salihu

Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve, so the development of numerical methods for obtaining approximate solutions became necessary. Gradient methods are more efficient when the function to be minimized is continuously differentiable. This article therefore presents a new hybrid Conjugate Gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives, yet it overcomes the steepest descent method's shortcoming of slow convergence and need not store or compute the second-order derivatives required by the Newton method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves algorithms, employing an optimal modulating choice parameter so as to avoid matrix storage. The numerical computation adopts an inexact line search to obtain a step size satisfying a descent property, and shows that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and methods of its kind are suitable for compressive sensing problems and M-tensor systems.
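A runnable Python sketch of a hybrid CG loop of this general shape, combining the Hestenes-Stiefel and Fletcher-Reeves parameters with a fixed placeholder theta (the paper derives an optimal, iteration-dependent modulating parameter, which is not reproduced here) and delegating the Wolfe line search to SciPy:

    import numpy as np
    from scipy.optimize import line_search

    def hybrid_hs_fr_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
        """Nonlinear CG with beta = (1-theta)*beta_HS + theta*beta_FR."""
        x = x0
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = line_search(f, grad, x, d)[0]    # Wolfe line search
            if alpha is None:                        # search failed: restart
                d, alpha = -g, 1e-4
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            beta_hs = (g_new @ y) / (d @ y)          # d @ y > 0 under Wolfe
            beta_fr = (g_new @ g_new) / (g @ g)
            beta = (1.0 - theta) * beta_hs + theta * beta_fr
            d = -g_new + beta * d
            if g_new @ d >= 0:                       # safeguard: enforce descent
                d = -g_new
            x, g = x_new, g_new
        return x

    # Example: minimize the Rosenbrock function.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                               200*(x[1] - x[0]**2)])
    print(hybrid_hs_fr_cg(f, grad, np.array([-1.2, 1.0])))  # -> approx (1, 1)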


2021 ◽ Vol 2021 ◽ pp. 1-7
Author(s): Shengwei Yao ◽ Yuping Wu ◽ Jielan Yang ◽ Jieqiong Xu

We propose a three-term gradient descent method that can be applied effectively to the optimization problems addressed in this article. The search direction of the method is generated in a specific subspace, with a quadratic approximation model applied in the process of generating the search direction. To reduce the amount of calculation and make the best use of existing information, the subspace is spanned by the gradients at the current and previous iteration points and by the previous search direction. Using this subspace-based optimization technique, the global convergence result is established under the Wolfe line search. The results of numerical experiments show that the new method is effective and robust.
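Schematically, the direction described in the abstract lives in a three-dimensional subspace and is obtained by minimizing a quadratic model over it (a sketch of the general subspace technique; the paper's exact model and coefficient formulas are not reproduced here):

    d_k = \mu_k\, g_k + \nu_k\, g_{k-1} + \xi_k\, d_{k-1},
    \qquad
    d_k = \arg\min_{d \in \Omega_k} \; g_k^\top d + \tfrac{1}{2}\, d^\top B_k\, d,

where \Omega_k = \mathrm{span}\{g_k, g_{k-1}, d_{k-1}\} and B_k approximates the Hessian; restricting the model to \Omega_k reduces the subproblem to three unknowns per iteration.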


2021
Author(s): Nur Athira Japri ◽ Srimazzura Basri ◽ Mustafa Mamat
