A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems

2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Zhan Wang ◽  
Pengyuan Li ◽  
Xiangrong Li ◽  
Hongtruong Pham

Conjugate gradient methods are well-known methods that are widely applied in many practical fields. The CD conjugate gradient method is one of the classical variants. In this paper, a modified three-term type CD conjugate gradient algorithm is proposed. Its main features are as follows: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm achieves global convergence for general functions with the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique. Numerical experiments show that the modified three-term type CD conjugate gradient method is more competitive than the classical CD conjugate gradient method.
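The classical CD (Conjugate Descent) scheme that the paper modifies can be sketched as below. This is a minimal illustration only: a simple Armijo backtracking line search stands in for the MWWP technique used in the paper, and all function and parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cd_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical CD (Conjugate Descent) nonlinear CG sketch.

    CD formula: beta_{k+1} = ||g_{k+1}||^2 / (-d_k^T g_k).
    Uses Armijo backtracking in place of the paper's MWWP line search.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (illustrative choice).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        # CD coefficient: ||g_{k+1}||^2 / (-d_k^T g_k)
        beta = (g_new @ g_new) / (-(d @ g))
        d = -g_new + beta * d
        g = g_new
    return x
```

On a simple strictly convex quadratic this recovers the minimizer in a handful of iterations; the three-term modification in the paper adds a further term to the direction update to secure the descent and trust region properties.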

2007 ◽  
Vol 2007 ◽  
pp. 1-19 ◽  
Author(s):  
Shang Shang ◽  
Jing Bai ◽  
Xiaolei Song ◽  
Hongkai Wang ◽  
Jaclyn Lau

The conjugate gradient method has been verified to be efficient for nonlinear optimization problems with large-dimensional data. In this paper, a penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography (FMT) is presented. The algorithm combines the linear conjugate gradient method and the nonlinear conjugate gradient method based on a restart strategy, in order to exploit the advantages of the two kinds of conjugate gradient methods and compensate for their disadvantages. A quadratic penalty method is adopted to enforce a nonnegativity constraint and reduce the ill-posedness of the problem. Simulation studies show that the presented algorithm is accurate, stable, and fast, and that it performs better than conventional conjugate gradient-based reconstruction algorithms. It offers an effective approach to reconstruct fluorochrome information for FMT.
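For reference, the linear half of such a combined scheme is the standard conjugate gradient iteration for a symmetric positive definite system Ax = b. The sketch below is the textbook version, not the paper's penalized variant; function and parameter names are illustrative.

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Textbook linear conjugate gradient for a symmetric positive
    definite matrix A; converges in at most n steps in exact arithmetic."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # residual
    d = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        if np.sqrt(rs) < tol:
            break
        Ad = A @ d
        alpha = rs / (d @ Ad)          # exact step along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        d = r + (rs_new / rs) * d      # conjugate direction update
        rs = rs_new
    return x
```

In the combined algorithm described above, a restart strategy would switch between this linear iteration and a nonlinear CG step, with the quadratic penalty folded into the objective.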


Author(s):  
Gonglin Yuan ◽  
Tingting Li ◽  
Wujie Hu

Abstract To solve large-scale unconstrained optimization problems, a modified PRP conjugate gradient algorithm is proposed; it is of interest because it combines the steepest descent algorithm with the conjugate gradient method and fully exploits their excellent properties. For smooth functions, the algorithm uses information about the gradient and the previous direction to determine the next search direction. For nonsmooth functions, a Moreau–Yosida regularization is introduced into the proposed algorithm, which simplifies the treatment of complex problems. The proposed algorithm has the following characteristics: (i) a sufficient descent feature as well as a trust region trait; (ii) global convergence; (iii) numerical results for large-scale smooth/nonsmooth functions show that the proposed algorithm outperforms other similar optimization methods; (iv) experiments on image restoration problems demonstrate that the given algorithm is successful.
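For reference, the classical PRP coefficient that the paper modifies, and the Moreau–Yosida regularization it applies to nonsmooth objectives, are commonly written as follows (λ > 0 is the regularization parameter; the paper's modified formula may differ from the classical one shown here):

```latex
\beta_k^{PRP} = \frac{g_k^\top (g_k - g_{k-1})}{\|g_{k-1}\|^2},
\qquad
f_\lambda(x) = \min_{z \in \mathbb{R}^n} \left\{ f(z) + \frac{1}{2\lambda} \|z - x\|^2 \right\}.
```

The regularized function f_λ is smooth even when f is not, which is what allows a gradient-based CG iteration to be applied to nonsmooth problems.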


Author(s):  
Mezher M. Abed ◽  
Ufuk Öztürk ◽  
Hisham M. Khudhur

The nonlinear conjugate gradient method is an effective technique for solving large-scale minimization problems and has a wide range of applications in various fields, such as mathematics, chemistry, physics, engineering, and medicine. This study presents a novel spectral conjugate gradient algorithm (a nonlinear conjugate gradient algorithm) derived from the Hisham–Khalil (KH) and Newton algorithms, based on the pure conjugacy condition. The importance of this research lies in finding an appropriate method to solve all types of linear and nonlinear fuzzy equations, because the Buckley and Qu method is ineffective in solving fuzzy equations. Moreover, the conjugate gradient method does not need a Hessian matrix (the second partial derivatives of the function) in the solution. The descent property of the proposed method is shown, provided that the step size satisfies the strong Wolfe conditions. In numerous circumstances, numerical results demonstrate that the proposed technique is more efficient than the Fletcher–Reeves and KH algorithms in solving fuzzy nonlinear equations.
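The strong Wolfe conditions referred to above are the standard pair of step-size requirements on α_k along the direction d_k:

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^\top d_k,
\qquad
\left| g(x_k + \alpha_k d_k)^\top d_k \right| \le c_2 \left| g_k^\top d_k \right|,
\qquad 0 < c_1 < c_2 < 1.
```

The first inequality guarantees sufficient decrease; the second bounds the directional derivative at the new point, which is what makes the descent property of such CG methods provable.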


Filomat ◽  
2016 ◽  
Vol 30 (11) ◽  
pp. 3083-3100 ◽  
Author(s):  
Snezana Djordjevic

We consider a new hybrid conjugate gradient algorithm, which is obtained from the algorithm of Fletcher–Reeves and the algorithm of Polak–Ribière–Polyak. Numerical comparisons show that the present hybrid conjugate gradient algorithm often behaves better than some known algorithms.
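One standard way to hybridize the two parent formulas is to clamp the Polak–Ribière–Polyak coefficient by the Fletcher–Reeves one, in the style of Gilbert and Nocedal; the specific combination used in this paper may differ, so the sketch below is illustrative only.

```python
import numpy as np

def hybrid_beta(g_new, g_old):
    """Gilbert-Nocedal-style FR/PRP hybrid coefficient:
    clamp beta_PRP to the interval [-beta_FR, beta_FR]."""
    denom = g_old @ g_old
    beta_fr = (g_new @ g_new) / denom
    beta_prp = (g_new @ (g_new - g_old)) / denom
    return max(-beta_fr, min(beta_prp, beta_fr))
```

The clamp keeps the automatic restart behavior of PRP (small beta when consecutive gradients are similar) while inheriting the convergence safeguards of FR.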


2006 ◽  
Vol 2006 ◽  
pp. 1-15 ◽  
Author(s):  
Mohamed Lamine Sahari ◽  
Ilhem Djellit

This work extends the survey on Cayley's problem to the case where the conjugate gradient method is used. We show that, for certain values of the parameters, this method produces beautiful fractal structures.


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Bin Qin

The conjugate gradient method is an efficient method for solving large-scale nonlinear optimization problems. In this paper, we propose a nonlinear conjugate gradient method which can be considered as a hybrid of DL and WYL conjugate gradient methods. The given method possesses the sufficient descent condition under the Wolfe-Powell line search and is globally convergent for general functions. Our numerical results show that the proposed method is very robust and efficient for the test problems.
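For reference, the two parent coefficients (Dai–Liao and Wei–Yao–Liu) are commonly written as follows, with y_{k-1} = g_k − g_{k-1}, s_{k-1} = x_k − x_{k-1}, and t > 0; the hybrid proposed in the paper may combine them differently:

```latex
\beta_k^{DL} = \frac{g_k^\top (y_{k-1} - t\, s_{k-1})}{d_{k-1}^\top y_{k-1}},
\qquad
\beta_k^{WYL} = \frac{g_k^\top \left( g_k - \frac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1} \right)}{\|g_{k-1}\|^2}.
```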


2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Junyue Cao ◽  
Jinzhao Wu ◽  
Wenjie Liu

It is well known that the nonlinear conjugate gradient algorithm is one of the effective algorithms for optimization problems, since it has low storage requirements and a simple structure. This motivates us to design a modified conjugate gradient formula for the optimization model. The proposed conjugate gradient algorithm possesses several properties: (1) the search direction uses not only the gradient value but also the function value; (2) the presented direction has both the sufficient descent property and the trust region feature; (3) the proposed algorithm is globally convergent for nonconvex functions; (4) experiments on image restoration and compressive sensing problems demonstrate the performance of the new algorithm.
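The sufficient descent property and trust region feature mentioned in several of these abstracts are typically stated as the existence of constants c, C > 0 such that, for all k,

```latex
g_k^\top d_k \le -c\, \|g_k\|^2,
\qquad
\|d_k\| \le C\, \|g_k\|.
```

The first inequality ensures every search direction is a genuine descent direction; the second bounds the direction's length by the gradient norm, which is the key ingredient in the global convergence proofs.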


2018 ◽  
Vol 29 (1) ◽  
pp. 133
Author(s):  
Basim A. Hassan ◽  
Haneen A. Alashoor

The nonlinear conjugate gradient method is widely used to solve unconstrained optimization problems. In this paper, different versions of nonlinear conjugate gradient methods are developed and their global convergence properties are proved. Numerical results indicate that the proposed methods are very efficient.


Author(s):  
Samson Akinwale ◽  
O. O. Okundalaye

For solving unconstrained optimization problems, the conjugate gradient method has been shown by researchers to be efficient due to its smaller storage requirements and computational cost. A class of penalty algorithms based on three-term conjugate gradient methods was developed and extended to the solution of unconstrained portfolio-management minimization problems, where the objective function is a piecewise quadratic polynomial. Implementing the proposed algorithm on selected unconstrained optimization problems resulted in improvements in the total number of iterations and CPU time, showing that the algorithm is promising.
