Sufficient Descent Conjugate Gradient Methods for Solving Convex Constrained Nonlinear Monotone Equations

2014, Vol. 2014, pp. 1-12
Author(s): San-Yang Liu, Yuan-Yuan Huang, Hong-Wei Jiao

Two unified frameworks for a class of sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Global convergence is proven under mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.
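For orientation, here is a minimal sketch of the Solodov–Svaiter hyperplane projection step that such extensions build on; the map F, the projection proj_C onto the feasible set C, and the line-search constants sigma and beta are illustrative assumptions, not the paper's specific choices.

```python
import numpy as np

def solodov_svaiter_step(F, proj_C, x, d, sigma=1e-4, beta=0.5):
    """One hyperplane projection iteration (sketch).

    F      : monotone map R^n -> R^n
    proj_C : Euclidean projection onto the convex feasible set C
    x, d   : current iterate and search direction
    """
    # Backtracking line search: find z = x + t*d with
    # -F(z)^T d >= sigma * t * ||d||^2.
    t = 1.0
    while -(F(x + t * d) @ d) < sigma * t * (d @ d):
        t *= beta
    z = x + t * d
    Fz = F(z)
    if Fz @ Fz < 1e-30:  # z already solves F(z) = 0 (numerically)
        return z
    # Project x onto the hyperplane {u : F(z)^T (u - z) = 0}, which
    # separates x from the solution set by monotonicity, then back onto C.
    return proj_C(x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz)
```

With the simple choice d = -F(x) this already gives a globally convergent derivative-free scheme; the papers listed here differ mainly in how the conjugate gradient direction d is built.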

Author(s): Mompati Koorapetse, P. Kaelo, S. Kooepile-Reikeletseng

In this paper, a new modified Perry-type derivative-free projection method for solving large-scale nonlinear monotone equations is presented. The method is developed by combining a modified Perry conjugate gradient method with the hyperplane projection technique. Global convergence of the proposed method is established. Preliminary numerical results show that it is promising and efficient compared to some existing methods in the literature.
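The abstract does not reproduce the modified formula; as a point of reference, here is a sketch of the classical Perry direction with the residual F taking the role of the gradient. The safeguard threshold and fallback are assumptions.

```python
import numpy as np

def perry_direction(Fk, Fk_prev, x, x_prev, d_prev, eps=1e-12):
    """Classical Perry-type direction with F(x) in place of the gradient (sketch)."""
    y = Fk - Fk_prev               # change in residuals
    s = x - x_prev                 # previous step
    denom = d_prev @ y
    if abs(denom) < eps:           # safeguard: restart with the steepest-descent-like choice
        return -Fk
    beta = (Fk @ (y - s)) / denom  # Perry's conjugate parameter
    return -Fk + beta * d_prev
```

Pairing a direction of this kind with the projection step sketched above yields the overall derivative-free projection method.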


2013, Vol. 30 (01), pp. 1250043
Author(s): Liang Yin, Xiongda Chen

The conjugate gradient method is widely used in unconstrained optimization, especially for large-scale problems. Recently, Zhang et al. proposed a three-term PRP method (TTPRP) and a three-term HS method (TTHS), both of which satisfy the sufficient descent condition. In this paper, the global convergence of the TTPRP and TTHS methods is studied with the line search procedure replaced by a fixed stepsize formula. This feature is significant in applications where the line search is expensive. Relevant computational results are also presented.
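A sketch of the TTPRP direction as it is usually stated in the literature (the TTHS analogue swaps in the Hestenes–Stiefel denominator); the function name is illustrative.

```python
import numpy as np

def ttprp_direction(g, g_prev, d_prev):
    """Three-term PRP (TTPRP) direction of Zhang et al. (sketch).

    By construction g @ d = -||g||^2, so sufficient descent holds
    no matter how the step length is chosen, which is what makes
    a fixed stepsize formula viable.
    """
    y = g - g_prev
    gg = g_prev @ g_prev
    beta = (g @ y) / gg        # classical PRP parameter
    theta = (g @ d_prev) / gg  # third-term coefficient
    return -g + beta * d_prev - theta * y
```

A quick check: g @ ttprp_direction(g, g_prev, d_prev) equals -(g @ g) up to rounding, since the beta and theta terms cancel exactly in the inner product with g.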


2019, Vol. 24 (4), pp. 550-563
Author(s): Mompati Koorapetse, P. Kaelo

A new three-term conjugate gradient-based projection method is presented in this paper for solving large-scale nonlinear monotone equations. The method is derivative-free, and its low storage requirements make it well suited to large-scale problems. It satisfies the sufficient descent condition F_kᵀd_k ≤ −τ‖F_k‖², where τ > 0 is a constant, and its global convergence is also established. Numerical results show that the method is efficient and promising.
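The paper's three-term formula is not given in the abstract; as a minimal illustration of the condition itself, note that the steepest-descent-like choice d_k = −F_k satisfies it with τ = 1, which the following snippet checks numerically.

```python
import numpy as np

def satisfies_sufficient_descent(F_k, d_k, tau):
    """Test the sufficient descent condition F_k^T d_k <= -tau * ||F_k||^2."""
    return F_k @ d_k <= -tau * (F_k @ F_k)

F_k = np.array([1.0, -2.0, 0.5])
assert satisfies_sufficient_descent(F_k, -F_k, tau=1.0)  # holds with equality
```

Conjugate gradient directions aim to retain this guarantee while adding curvature information from previous steps.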


2018, Vol. 2018, pp. 1-13
Author(s): Bakhtawar Baluch, Zabidin Salleh, Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence under an exact line search, this is not guaranteed under an inexact line search, and the HS method does not in general satisfy the descent property. The modified three-term conjugate gradient method proposed here possesses the sufficient descent property regardless of the type of line search and is globally convergent under the inexact Wolfe–Powell line search. Its numerical efficiency is assessed on 75 standard test functions. Three-term conjugate gradient methods are known to be numerically more efficient than two-term methods; this paper quantifies that advantage by comparing the new modification with an efficient two-term conjugate gradient method and with a state-of-the-art three-term HS method. The proposed modification is shown to be globally convergent and numerically efficient.
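For context, a sketch of the baseline three-term HS direction that such modifications start from (the paper's specific modification is not given in the abstract); the safeguard on the HS denominator is an assumption.

```python
import numpy as np

def tths_direction(g, g_prev, d_prev, eps=1e-12):
    """Baseline three-term HS direction (sketch).

    Like TTPRP above, it yields g @ d = -||g||^2, i.e. sufficient
    descent independently of the line search.
    """
    y = g - g_prev
    dy = d_prev @ y
    if abs(dy) < eps:        # safeguard: HS denominator nearly zero
        return -g
    beta = (g @ y) / dy      # classical Hestenes-Stiefel parameter
    theta = (g @ d_prev) / dy
    return -g + beta * d_prev - theta * y
```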


Complexity, 2020, Vol. 2020, pp. 1-13
Author(s): Meixing Liu, Guodong Ma, Jianghua Yin

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, two new conjugate parameters are devised on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, both methods are shown to satisfy the sufficient descent property and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.
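The two new parameters are not given in the abstract; for reference, here is a sketch of the two ingredients they are built from, Fletcher's CD parameter and the strong Wolfe conditions. The constants c1 and c2 are conventional illustrative choices.

```python
import numpy as np

def cd_beta(g, g_prev, d_prev):
    """Fletcher's conjugate descent (CD) parameter (sketch)."""
    return (g @ g) / -(d_prev @ g_prev)

def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Test the strong Wolfe conditions at step length alpha (sketch)."""
    gd = grad(x) @ d  # directional derivative; negative for a descent direction
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * gd
    curvature = abs(grad(x + alpha * d) @ d) <= -c2 * gd  # the "second inequality"
    return sufficient_decrease and curvature
```

The second inequality bounds |∇f(x + αd)ᵀd| by −c₂∇f(x)ᵀd, and it is this bound that the paper combines with the CD parameter to define its two modified schemes.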

