Strong Wolfe Line Search
Recently Published Documents


TOTAL DOCUMENTS

29
(FIVE YEARS 15)

H-INDEX

3
(FIVE YEARS 1)

2022 ◽  
Vol 20 ◽  
pp. 736-744
Author(s):  
Olawale J. Adeleke ◽  
Idowu A. Osinuga ◽  
Raufu A. Raji

In this paper, a new conjugate gradient (CG) parameter is proposed through a convex combination of the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) CG update parameters such that the Dai-Liao conjugacy condition is satisfied. The computational efficiency of the PRP method and the convergence profile of the FR method motivated the choice of these two CG methods. The corresponding CG algorithm satisfies the sufficient descent property and is shown to be globally convergent under the strong Wolfe line search procedure. Numerical tests on selected benchmark functions show that the algorithm is efficient and very competitive with some existing classical methods.
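For orientation, a minimal sketch of the ingredients this abstract names, in standard CG notation (g_k = ∇f(x_k), d_k the search direction, s_{k-1} = x_k − x_{k-1}, y_{k-1} = g_k − g_{k-1}); the paper's specific choice of the combination weight θ_k is not reproduced here:

```latex
% Strong Wolfe conditions on the step length \alpha_k, with 0 < c_1 < c_2 < 1:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k, \qquad
\lvert g(x_k + \alpha_k d_k)^T d_k \rvert \le c_2 \lvert g_k^T d_k \rvert
% The two classical update parameters being combined:
\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
\beta_k^{PRP} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2}
% Convex combination, with \theta_k \in [0,1] chosen so that the Dai-Liao
% conjugacy condition d_k^T y_{k-1} = -t\, g_k^T s_{k-1} (t \ge 0) is satisfied:
\beta_k = (1 - \theta_k)\,\beta_k^{FR} + \theta_k\,\beta_k^{PRP}
```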


Author(s):  
Yutao Zheng

In this paper, a new family of Dai-Liao-type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl., 28: 203--225, 2004] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions under the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
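A sketch of the construction referred to here; the Dai-Liao parameter is standard, while the Yabe-Takano modified secant vector is indicated only schematically (the ρ, θ_{k-1}, and u_{k-1} below follow the cited paper's general pattern and should be checked against it):

```latex
% Dai-Liao parameter, derived from the conjugacy condition
% d_k^T y_{k-1} = -t\, g_k^T s_{k-1} with t \ge 0:
\beta_k^{DL} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}} - t\,\frac{g_k^T s_{k-1}}{d_{k-1}^T y_{k-1}}
% Modified secant equation: y_{k-1} is replaced by a vector z_{k-1} that also
% uses function values (schematically; \rho \ge 0, and u_{k-1} is any vector
% with s_{k-1}^T u_{k-1} \ne 0):
z_{k-1} = y_{k-1} + \rho\,\frac{\theta_{k-1}}{s_{k-1}^T u_{k-1}}\,u_{k-1}, \qquad
\theta_{k-1} = 6\,(f_{k-1} - f_k) + 3\,(g_{k-1} + g_k)^T s_{k-1}
```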


Author(s):  
Chenna Nasreddine ◽  
Sellami Badreddine ◽  
Belloufi Mohammed

In this paper, we present a new hybrid method for solving nonlinear unconstrained optimization problems by conjugate gradient, based on a convex combination of the Liu-Storey (LS) and Hager-Zhang (HZ) conjugate gradient methods. This method possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. At the end of the paper, we illustrate our method with some numerical examples.
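As an illustration, a minimal sketch of such a convex combination in Python; the weight `theta` is left as a free parameter here, whereas the paper derives a specific choice, so this is not the authors' exact formula:

```python
import numpy as np

def hybrid_ls_hz_beta(g_new, g_old, d_old, theta):
    """Convex combination of the Liu-Storey (LS) and Hager-Zhang (HZ)
    conjugate gradient parameters: theta * beta_LS + (1 - theta) * beta_HZ."""
    y = g_new - g_old                            # gradient difference y_{k-1}
    # Liu-Storey parameter: g_k^T y_{k-1} / (-g_{k-1}^T d_{k-1})
    beta_ls = (g_new @ y) / (-(g_old @ d_old))
    # Hager-Zhang parameter: (y - 2 d ||y||^2 / d^T y)^T g_k / (d^T y)
    dy = d_old @ y
    beta_hz = ((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
    return theta * beta_ls + (1.0 - theta) * beta_hz
```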


Author(s):  
Ladan Arman ◽  
Yuanming Xu ◽  
Long Liping

In this paper, based on the efficient Conjugate Descent (CD) method, two generalized CD algorithms are proposed for solving unconstrained optimization problems. These are three-term conjugate gradient methods whose generated directions, through the construction of the conjugate gradient parameters, satisfy the sufficient descent condition independently of the line search. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Preliminary numerical results on the CUTEst collection are also presented to show the effectiveness of our methods.
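For orientation, Fletcher's CD parameter together with one well-known (and here purely illustrative, not necessarily the authors') three-term construction in which the third term enforces sufficient descent for any step length:

```latex
% Fletcher's conjugate descent parameter:
\beta_k^{CD} = \frac{\|g_k\|^2}{-d_{k-1}^T g_{k-1}}
% Illustrative three-term direction: the third term cancels the d_{k-1}
% component along g_k, so g_k^T d_k = -\|g_k\|^2 independently of the line search:
d_k = -g_k + \beta_k^{CD}\, d_{k-1} - \beta_k^{CD}\,\frac{g_k^T d_{k-1}}{\|g_k\|^2}\, g_k
```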


2021 ◽  
Author(s):  
Nur Athira Japri ◽  
Srimazzura Basri ◽  
Mustafa Mamat

Author(s):  
Abbas Younis Al-Bayati ◽  
Muna M. M. Ali

This work suggests several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and the conjugacy conditions. First, we considered a number of well-known three-term CG methods and then suggested two new classes of algorithms of this type, based on the Hestenes-Stiefel (HS) and Polak-Ribière (PR) formulas, with four different versions. Both the descent and conjugacy conditions are satisfied at each iteration for all the proposed algorithms by using the strong Wolfe line search condition and its accelerated version. These new algorithms are modifications of the original HS and PR methods and can be viewed as a sort of memoryless BFGS update. All of our new suggested methods are proved to be globally convergent and are numerically more efficient than similar methods on our selected set of test problems.
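For reference, the two base parameters and the classical memoryless BFGS search direction alluded to above (the paper's multi-step modifications are not reproduced here):

```latex
\beta_k^{HS} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad
\beta_k^{PR} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2}
% Memoryless BFGS (Shanno): d_k = -H_k g_k, with H_k the BFGS update of the
% identity using only the most recent pair (s_{k-1}, y_{k-1}):
H_k = I - \frac{s_{k-1} y_{k-1}^T + y_{k-1} s_{k-1}^T}{s_{k-1}^T y_{k-1}}
    + \left(1 + \frac{y_{k-1}^T y_{k-1}}{s_{k-1}^T y_{k-1}}\right)
      \frac{s_{k-1} s_{k-1}^T}{s_{k-1}^T y_{k-1}}
```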


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Meixing Liu ◽  
Guodong Ma ◽  
Jianghua Yin

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two methods are proved to satisfy the sufficient descent property and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
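To make the overall scheme concrete, a minimal sketch of a CD-based CG loop driven by a strong Wolfe line search, in Python; the paper's two new conjugate parameters are not reproduced, so the plain CD parameter stands in for them, and SciPy's `line_search` supplies step lengths satisfying the strong Wolfe conditions:

```python
import numpy as np
from scipy.optimize import line_search

def cg_cd(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    """Conjugate descent (CD) method with a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # line_search returns a step satisfying the strong Wolfe conditions,
        # or None on failure, in which case we restart along -g.
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        beta_cd = (g_new @ g_new) / (-(d @ g))  # Fletcher's CD parameter
        d = -g_new + beta_cd * d
        g = g_new
    return x
```

For a quick check, `cg_cd(rosen, rosen_der, np.array([-1.2, 1.0]))` with SciPy's `rosen` and `rosen_der` should approach the Rosenbrock minimizer (1, 1).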

