An Alternative Modified Conjugate Gradient Coefficient for Solving Nonlinear System of Equations

2019 ◽  
Vol 2 (3) ◽  
pp. 5-8
Author(s):  
Muhammad Kabir Dauda ◽  
Shehu Usman ◽  
Hayatu Ubale ◽  
M Mamat

In mathematical terms, the process of solving models and finding the best alternative is known as optimization. The conjugate gradient (CG) method is a computational method for solving optimization problems. In this article, an alternative modified conjugate gradient coefficient for solving large-scale nonlinear systems of equations is presented. The method is an improved version of the Rivaie et al. conjugate gradient method for unconstrained optimization problems. The new CG method is tested on a set of test functions under exact line search. The approach is easy to implement owing to its derivative-free nature and has proven effective in solving real-life applications. Under some mild assumptions, the global convergence of the proposed method is established. The new CG coefficient also retains the sufficient descent condition. The performance of the new method is compared with the well-known PRP CG method in terms of number of iterations and CPU time. Numerical results on some benchmark problems show that the proposed method is promising and is the most efficient among all the methods tested.
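To make the setup concrete, the following is a minimal Python sketch of a CG iteration of this family under a numerically approximated exact line search. It is a generic illustration, not the authors' method: the β shown is the RMIL coefficient of Rivaie et al. that the paper improves upon, and f, grad, and x0 are assumed inputs (for a nonlinear system F(x) = 0 one would typically minimize f(x) = ½‖F(x)‖²).

```python
import numpy as np
from scipy.optimize import minimize_scalar  # exact line search surrogate

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic CG skeleton under a (numerical) exact line search.

    Sketch only: beta below is the RMIL coefficient of Rivaie et al.,
    not the paper's modified coefficient; f, grad, x0 are assumed inputs.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search: alpha = argmin_a f(x + a d)
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_new = x + alpha * d
        g_new = grad(x_new)
        # RMIL coefficient: beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2
        beta = g_new @ (g_new - g) / (d @ d)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```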

Author(s):  
Amina Boumediene ◽  
Tahar Bechouat ◽  
Rachid Benzine ◽  
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective method for solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text]. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient possesses the sufficient descent condition and global convergence properties under the exact line search.
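For reference, the two properties at stake are the following standard definitions, with g_k = ∇f(x_k), search direction d_k, and a method-dependent constant c:

```latex
% Sufficient descent condition:
\[
g_k^{\top} d_k \le -c \,\lVert g_k \rVert^2, \qquad c > 0,
\]
% Exact line search: the step length minimizes f along d_k,
% which forces the new gradient to be orthogonal to d_k:
\[
\alpha_k = \arg\min_{\alpha \ge 0} f(x_k + \alpha d_k)
\quad\Longrightarrow\quad
g_{k+1}^{\top} d_k = 0 .
\]
```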


2011 ◽  
Vol 18 (9) ◽  
pp. 1249-1253 ◽  
Author(s):  
Mehdi Dehghan ◽  
Masoud Hajarian

The conjugate gradient method is one of the earliest and most useful techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method, which guarantees the sufficient descent condition for any line search. Under exact line search, the introduced conjugate gradient method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
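As a reminder, the Dai–Yuan coefficient and the Goldstein conditions read as follows (standard formulas, not specific to this paper):

```latex
% Dai–Yuan coefficient (g_k = grad f(x_k), d_{k-1} the previous direction):
\[
\beta_k^{DY} = \frac{\lVert g_k \rVert^2}{d_{k-1}^{\top}\,(g_k - g_{k-1})},
\qquad d_k = -g_k + \beta_k^{DY} d_{k-1}.
\]
% Goldstein conditions on the step length alpha_k, with 0 < c < 1/2:
\[
f(x_k) + (1 - c)\,\alpha_k\, g_k^{\top} d_k
\;\le\; f(x_k + \alpha_k d_k)
\;\le\; f(x_k) + c\,\alpha_k\, g_k^{\top} d_k .
\]
```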


2009 ◽  
Vol 2009 ◽  
pp. 1-16 ◽  
Author(s):  
Jianguo Zhang ◽  
Yunhai Xiao ◽  
Zengxin Wei

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions are always descent directions. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.
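Since the descent claim is central here, note that a direction d is a descent direction at x precisely when ∇f(x)ᵀd < 0, and checking this requires no line search. A minimal sketch (hypothetical helper, not from the paper):

```python
import numpy as np

def is_descent_direction(grad_x: np.ndarray, d: np.ndarray) -> bool:
    """True iff d is a descent direction at x, i.e. grad(x)^T d < 0."""
    return float(grad_x @ d) < 0.0

# Example: the steepest-descent direction d = -g always passes the check.
g = np.array([1.0, -2.0, 0.5])
assert is_descent_direction(g, -g)
```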


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat ◽  
U. A. M. Roslan

A new modified three-term conjugate gradient (CG) method for solving large-scale optimization problems is presented. The idea builds on the famous Polak-Ribière-Polyak (PRP) formula. Although the PRP numerator plays a vital role in numerical performance and avoids the jamming issue, the PRP method is not globally convergent. The idea behind the new three-term CG method is therefore to keep the PRP numerator and combine it with the denominator of another CG formula that performs well. The new modified three-term CG method possesses the sufficient descent condition independently of any line search. Its novelty is that, under the Wolfe-Powell line search, the new modification possesses global convergence properties for both convex and nonconvex functions. Numerical computation with the Wolfe-Powell line search on standard optimization test functions shows the efficiency and robustness of the new modification.
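For context, the classical PRP coefficient and the generic shape of a three-term direction are shown below (standard formulas; the paper's specific denominator and third term are not reproduced here):

```latex
% Polak–Ribière–Polyak coefficient:
\[
\beta_k^{PRP} = \frac{g_k^{\top} (g_k - g_{k-1})}{\lVert g_{k-1} \rVert^{2}} .
\]
% Generic three-term CG direction (theta_k is a method-specific scalar):
\[
d_k = -g_k + \beta_k d_{k-1} + \theta_k (g_k - g_{k-1}).
\]
```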


2018 ◽  
Vol 7 (3.28) ◽  
pp. 36
Author(s):  
Norrlaili Shapiee ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Conjugate gradient (CG) methods are well known for their use in solving unconstrained optimization problems, particularly large-scale problems, and have attracted growing interest in fields such as engineering. In this paper, we propose a new family of CG coefficients and apply it to regression analysis. Global convergence is established under both exact and inexact line searches. Numerical results are presented based on the number of iterations and CPU time. The findings show that our method is more efficient than some previous CG methods on a set of standard test problems and successfully solves a real-life problem.
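To illustrate how a CG method can be applied to regression, here is a minimal sketch that fits a linear least-squares model by minimizing f(w) = ½‖Xw − y‖²; for this quadratic objective the exact line search step has a closed form. The Fletcher-Reeves coefficient is used as a stand-in for the paper's family, which the abstract does not specify:

```python
import numpy as np

def cg_least_squares(X, y, tol=1e-8, max_iter=500):
    """Fit w minimizing f(w) = 0.5 * ||X w - y||^2 with nonlinear CG.

    Sketch only: uses the Fletcher-Reeves coefficient, not the paper's
    new family. For this quadratic, the exact line search step is
    alpha = -g^T d / ||X d||^2.
    """
    w = np.zeros(X.shape[1])
    g = X.T @ (X @ w - y)                 # gradient of f at w
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Xd = X @ d
        alpha = -(g @ d) / (Xd @ Xd)      # exact minimizer along d
        w = w + alpha * d
        g_new = X.T @ (X @ w - y)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return w

# Usage: recover the coefficients of a noisy linear model.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)
print(cg_least_squares(X, y))             # approximately w_true
```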


2017 ◽  
Vol 2017 ◽  
pp. 1-6 ◽  
Author(s):  
Ahmad Alhawarat ◽  
Zabidin Salleh

The conjugate gradient (CG) method is used to find optimal solutions of large-scale unconstrained optimization problems. Owing to its simple algorithm, low memory requirements, and speed in obtaining a solution, this method is widely used in many fields, such as engineering, computer science, and medical science. In this paper, we modify the CG method to achieve global convergence under various line searches. In addition, it satisfies the sufficient descent condition without any line search. Numerical computations under the weak Wolfe-Powell line search show that the new method is more efficient than other conventional methods.
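The weak Wolfe-Powell conditions referred to here are the standard pair, with constants 0 < δ < σ < 1:

```latex
% Weak Wolfe–Powell conditions on the step length alpha_k:
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k
\qquad \text{(sufficient decrease)},
\]
\[
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k
\qquad \text{(curvature)}.
\]
```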


2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
Guanghui Zhou ◽  
Qin Ni

A new spectral conjugate gradient method (SDYCG) is presented in this paper for solving unconstrained optimization problems. Our method provides a new expression for the spectral parameter. This formula ensures that the sufficient descent condition holds. The search direction in SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient direction. The global convergence of SDYCG is also obtained. Numerical results show that SDYCG may be capable of solving large-scale nonlinear unconstrained optimization problems.
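A spectral CG direction of this type has the generic form below, where θ_k is the spectral parameter; the paper's specific expression for θ_k is not reproduced here:

```latex
% Generic spectral Dai–Yuan direction (theta_k is the spectral parameter):
\[
d_k = -\theta_k\, g_k + \beta_k^{DY}\, d_{k-1},
\qquad
\beta_k^{DY} = \frac{\lVert g_k \rVert^{2}}{d_{k-1}^{\top}(g_k - g_{k-1})}.
\]
```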


2018 ◽  
Vol 2018 ◽  
pp. 1-15 ◽  
Author(s):  
Gaoyi Wu ◽  
Yong Li ◽  
Gonglin Yuan

This paper further studies the WYL conjugate gradient (CG) formula with $\beta_k^{WYL} \ge 0$ and presents a three-term WYL CG algorithm, which has the sufficient descent property without any conditions. Global convergence and linear convergence are proved; moreover, n-step quadratic convergence with a restart strategy is established if the initial step length is appropriately chosen. Numerical experiments on large-scale problems, including standard unconstrained optimization problems and engineering problems (benchmark problems), show that the new algorithm is competitive with other similar CG algorithms.
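For reference, the WYL coefficient of Wei, Yao, and Liu is:

```latex
% Wei–Yao–Liu coefficient; scaling the previous gradient by the ratio
% of gradient norms keeps beta_k^{WYL} >= 0 by the Cauchy–Schwarz inequality:
\[
\beta_k^{WYL}
= \frac{g_k^{\top}\!\left(g_k - \dfrac{\lVert g_k \rVert}{\lVert g_{k-1} \rVert}\, g_{k-1}\right)}
       {\lVert g_{k-1} \rVert^{2}} .
\]
```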


2019 ◽  
Vol 2019 (1) ◽  
Author(s):  
Yuting Chen ◽  
Mingyuan Cao ◽  
Yueting Yang

In this paper, we present a new conjugate gradient method using an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition independently of the line search. Moreover, the value of the parameter incorporates more useful information without adding computational cost or storage requirements, which improves numerical performance. Under proper assumptions, the global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the given method is competitive for unconstrained optimization problems with dimensions up to 100,000.
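The Dai–Liao conjugacy condition mentioned here is the standard one, with s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}, and a parameter t ≥ 0:

```latex
% Dai–Liao conjugacy condition on the search direction d_k:
\[
d_k^{\top} y_{k-1} = -\,t\, g_k^{\top} s_{k-1}, \qquad t \ge 0,
\]
% which reduces to the classical conjugacy condition d_k^T y_{k-1} = 0
% when t = 0 or when the line search is exact.
```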

