New multi-step three-term conjugate gradient algorithms with inexact line searches

Author(s):  
Abbas Younis Al-Bayati ◽  
Muna M. M. Ali

This work suggests several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and conjugacy conditions. First, we considered a number of well-known three-term CG methods, and we then suggested two new classes of algorithms of this type, based on the Hestenes–Stiefel (HS) and Polak–Ribière (PR) formulas, with four different versions. Both the descent and conjugacy conditions are satisfied at each iteration for all the proposed algorithms by using the strong Wolfe line search condition and its accelerated version. These new algorithms are modifications of the original HS and PR methods and can be viewed as a kind of memoryless BFGS update. All of the suggested methods are proved to be globally convergent and are numerically more efficient than similar methods in the same area on our selected set of test problems.
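The abstract does not reproduce the update formulas; as a point of reference, a generic three-term CG direction built on the classical HS and PR parameters (the quantities the proposed classes modify) takes the form below. The scalar θ_k, which distinguishes three-term methods from classical two-term CG, is where the authors' four versions differ and is not specified here.

```latex
% Generic three-term CG direction in standard notation; the specific
% \theta_k choices are the authors' contribution and are not reproduced.
\begin{align*}
  d_{k+1} &= -g_{k+1} + \beta_k d_k - \theta_k y_k,
  \qquad y_k = g_{k+1} - g_k,\\
  \beta_k^{HS} &= \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
  \beta_k^{PR} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}.
\end{align*}
```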

2018 ◽  
Vol 7 (3.28) ◽  
pp. 12
Author(s):  
Wan Khadijah ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Nurul Hajar ◽  
Nurul ‘Aini ◽  
...  

The conjugate gradient (CG) method is one of the most prominent methods for solving linear and nonlinear optimization problems. In this paper, we propose a CG method with the sufficient descent property under the strong Wolfe line search. The proposed CG method is then applied to solve systems of linear equations. The numerical results obtained from the tests are evaluated based on the number of iterations and the CPU time, and then analyzed through performance profiles. To examine its efficiency, the performance of our CG formula is compared with that of other CG methods. The results show that the proposed CG formula performs better than the other tested CG methods.
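To illustrate the setting, the sketch below applies a CG-type iteration with a strong Wolfe line search to a symmetric positive definite linear system by minimizing the equivalent quadratic. The classical Fletcher–Reeves parameter stands in for the paper's new formula, which is not reproduced here; `scipy.optimize.line_search` enforces the strong Wolfe conditions.

```python
# A minimal sketch (not the authors' exact formula): solve A x = b by
# minimizing f(x) = 0.5 x^T A x - b^T x with a CG-type iteration and a
# strong Wolfe line search. beta_fr is the classical Fletcher-Reeves
# parameter, used only as a stand-in for the paper's new CG formula.
import numpy as np
from scipy.optimize import line_search

def cg_linear_system(A, b, x0, tol=1e-8, max_iter=500):
    f = lambda x: 0.5 * x @ A @ x - b @ x      # quadratic model of A x = b
    grad = lambda x: A @ x - b                 # gradient = residual
    x, g = x0, grad(x0)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # strong Wolfe step length (c2 = 0.1 is a common CG choice)
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)    # stand-in CG parameter
        d = -g_new + beta_fr * d
        x, g = x_new, g_new
    return x

# usage: small SPD test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(cg_linear_system(A, b, np.zeros(2)))     # approx. [0.0909, 0.6364]
```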


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Meixing Liu ◽  
Guodong Ma ◽  
Jianghua Yin

The conjugate gradient method is very effective in solving large-scale unconstrained optimization problems. In this paper, on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two presented methods are proved to possess the sufficient descent property and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
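For reference, the two ingredients the abstract names are the CD conjugate parameter and the second (curvature) inequality of the strong Wolfe line search, shown below in standard notation; the two new parameters derived from them are not reproduced here.

```latex
% Standard definitions; the paper's two new conjugate parameters are
% built from these but are not reproduced here.
\begin{align*}
  \beta_k^{CD} &= \frac{\|g_{k+1}\|^2}{-d_k^{\top} g_k},\\
  \bigl|g(x_k + \alpha_k d_k)^{\top} d_k\bigr|
    &\le -\sigma\, g_k^{\top} d_k, \qquad 0 < \sigma < 1.
\end{align*}
```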


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Zhongbo Sun ◽  
Yantao Tian ◽  
Hongyang Li

Two modified three-term conjugate gradient algorithms which satisfy both the descent condition and the Dai–Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm is a modification of the Hager–Zhang type algorithm such that the search direction is descent and satisfies the Dai–Liao type conjugacy condition. The second, a simple three-term conjugate gradient method, can generate sufficient descent directions at every iteration; moreover, this property is independent of the step-length line search. The algorithms can also be considered as modifications of the MBFGS method, but with a different z_k. Under some mild conditions, the given methods are globally convergent for general functions, independently of the Wolfe line search. The numerical experiments show that the proposed methods are very robust and efficient.
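In standard notation, the two properties the abstract refers to are the Dai–Liao conjugacy condition and the sufficient descent condition, shown below; the particular z_k used in the MBFGS-type modification is the authors' and is not reproduced.

```latex
% Standard forms of the two conditions named in the abstract.
\begin{align*}
  d_{k+1}^{\top} y_k &= -t\, g_{k+1}^{\top} s_k, \qquad t \ge 0,
  \quad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k
  && \text{(Dai--Liao conjugacy)}\\
  g_k^{\top} d_k &\le -c\, \|g_k\|^2, \qquad c > 0
  && \text{(sufficient descent)}
\end{align*}
```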


2014 ◽  
Vol 989-994 ◽  
pp. 1802-1805
Author(s):  
Hong Fang Cui

On the basis of the conjugate descent (CD) conjugate gradient method, this article constructs a new two-parameter P-NCD projected conjugate gradient method. The article establishes the descent property of the two-parameter P-NCD projected directions and the convergence criteria under the strong Wolfe line search. The new algorithm is applied to estimate a model of equations with linear constraints in an instantiated test, and the results show good performance.
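The specific two-parameter P-NCD construction is not given in the abstract; as background, the sketch below shows only the standard projection step that projected CG methods use for linear equality constraints A x = b, which keeps every projected search direction inside the feasible set.

```python
# A minimal sketch of the projection step used by projected CG methods
# for min f(x) s.t. A x = b (the two-parameter P-NCD formula itself is
# not reproduced here). The orthogonal projector P maps any raw search
# direction into the null space of A, so A x_k = b is preserved along
# x_k + alpha * P d_k.
import numpy as np

def null_space_projector(A):
    # P = I - A^T (A A^T)^{-1} A, assuming A has full row rank
    AAt_inv = np.linalg.inv(A @ A.T)
    return np.eye(A.shape[1]) - A.T @ AAt_inv @ A

A = np.array([[1.0, 1.0, 1.0]])     # constraint x1 + x2 + x3 = const
P = null_space_projector(A)
d = np.array([1.0, -2.0, 0.5])      # any raw CG direction
print(A @ (P @ d))                  # ~0: projected step stays feasible
```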


2015 ◽  
Vol 9 ◽  
pp. 3105-3117 ◽  
Author(s):  
Norhaslinda Zull ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Zabidin Salleh ◽  
Zahrahtul Amani

Author(s):  
Pro Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribiere–Polyak and Fletcher–Reeves conjugate gradient method for solving unconstrained optimization problems. Global convergence, with the strong Wolfe line search conditions, of the proposed quadratic hybrid conjugate gradient method is established. We also report some numerical results to show the competitiveness of the new hybrid method.
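For context, the two parameters being hybridized are shown below in standard notation, together with a generic convex hybridization; the specific quadratic weighting of θ_k used by the authors is not reproduced here.

```latex
% Classical FR and PRP parameters and a generic convex hybridization;
% the authors' quadratic form of the weighting is not reproduced.
\begin{align*}
  \beta_k^{FR} &= \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
  \beta_k^{PRP} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\|g_k\|^2},\\
  \beta_k^{hyb} &= (1 - \theta_k)\,\beta_k^{FR}
                  + \theta_k\,\beta_k^{PRP},
  \qquad \theta_k \in [0, 1].
\end{align*}
```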


2019 ◽  
Vol 13 (04) ◽  
pp. 2050081
Author(s):  
Badreddine Sellami ◽  
Mohamed Chiheb Eddine Sellami

In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. We propose a modified Fletcher–Reeves (abbreviated FR) [Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154] conjugate gradient algorithm satisfying a parametrized sufficient descent condition with a parameter [Formula: see text]. The parameter [Formula: see text] is computed by means of the conjugacy condition; thus an algorithm is obtained which is a positive multiplicative modification of the Hestenes–Stiefel (abbreviated HS) [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952) 409–436] algorithm, and which produces a descent search direction at every iteration at which the line search satisfies the Wolfe conditions. Under appropriate conditions, we show that the modified FR method with the strong Wolfe line search is globally convergent for uniformly convex functions. We also present extensive preliminary numerical experiments to show the efficiency of the proposed method.
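Since several of these abstracts hinge on the strong Wolfe conditions, a minimal checker for them follows (a hypothetical helper, not code from any of the papers); `c1` and `c2` are the usual Armijo and curvature constants with 0 < c1 < c2 < 1.

```python
# A minimal sketch (hypothetical helper): check the strong Wolfe
# conditions for a candidate step length alpha along direction d:
#   f(x + a d) <= f(x) + c1 a g^T d      (Armijo / sufficient decrease)
#   |g(x + a d)^T d| <= c2 |g^T d|       (strong curvature condition)
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    g0_d = grad(x) @ d                   # initial slope (negative if descent)
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
    return armijo and curvature

# usage on a simple quadratic with the steepest-descent direction
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
d = -grad(x)
print(satisfies_strong_wolfe(f, grad, x, d, alpha=0.5))   # True: both hold
print(satisfies_strong_wolfe(f, grad, x, d, alpha=0.05))  # False: curvature fails
```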

