A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints

CALCOLO ◽  
2015 ◽  
Vol 53 (2) ◽  
pp. 133-145 ◽  
Author(s):  
X. Y. Wang ◽  
S. J. Li ◽  
X. P. Kou
2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Sheng Wang ◽  
Hongbo Guan

Based on the scaled conjugate gradient (SCALCG) method presented by Andrei (2007) and the projection method of Solodov and Svaiter, we propose a SCALCG method for solving monotone nonlinear equations with convex constraints. The SCALCG method can be regarded as a combination of the conjugate gradient method and a Newton-type method for solving unconstrained optimization problems, so it inherits the advantages of both methods and is suitable for large-scale problems. It can therefore be applied to large-scale monotone nonlinear equations with convex constraints. Under reasonable conditions, we prove its global convergence. We also report numerical experiments showing that the proposed method is efficient and promising.
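The Solodov–Svaiter projection framework mentioned above can be sketched as follows: take a step along a search direction, accept it via a derivative-free line search, then project back onto the feasible set through a hyperplane separating the current iterate from the solution set. The sketch below uses a plain steepest-descent direction as a placeholder; the paper's scaled CG direction, and the parameter names here, are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def projected_cg_solve(F, project, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Illustrative projection-type method for monotone F(x) = 0 with x in a
    convex set C (`project` is the projection onto C). The direction d = -F(x)
    is a placeholder; a scaled/three-term CG direction would go here."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        d = -Fx  # placeholder search direction
        # Derivative-free line search (Solodov-Svaiter style):
        # find alpha with -F(x + alpha*d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -Fz @ d >= sigma * alpha * (d @ d):
                break
            alpha *= rho
            if alpha < 1e-12:
                break
        denom = Fz @ Fz
        if denom == 0.0:
            return z  # z is already a solution
        # Hyperplane projection step, then projection back onto C
        beta = (Fz @ (x - z)) / denom
        x = project(x - beta * Fz)
    return x
```

On a simple monotone test problem, F(x) = x - a with a box constraint containing a, the iterates contract toward a at each projection step.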


2021 ◽  
Vol 40 (3) ◽  
pp. 64-75
Author(s):  
Kanikar Muangchoo

In this paper, by combining the Solodov and Svaiter projection technique with the conjugate gradient method for unconstrained optimization proposed by Mohamed et al. (2020), we develop a derivative-free conjugate gradient method to solve nonlinear equations with convex constraints. The proposed method involves a spectral parameter which satisfies the sufficient descent condition. The global convergence is proved under the assumption that the underlying mapping is Lipschitz continuous and satisfies a weaker monotonicity condition. Numerical experiments show that the proposed method is efficient.
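For derivative-free methods of this kind, the sufficient descent condition takes the form d^T F(x) <= -c ||F(x)||^2 for some constant c > 0. A minimal check of that inequality, with an illustrative value of c (the paper's spectral parameter is chosen so the condition holds automatically), might look like:

```python
import numpy as np

def has_sufficient_descent(d, Fx, c=1.0):
    """Check the sufficient descent condition d^T F(x) <= -c * ||F(x)||^2.
    The constant c = 1.0 is illustrative; any fixed c > 0 works in the theory."""
    return d @ Fx <= -c * (Fx @ Fx)
```

For example, the steepest-descent-like direction d = -F(x) satisfies the condition with c = 1 by construction.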


Author(s):  
Mezher M. Abed ◽  
Ufuk Öztürk ◽  
Hisham M. Khudhur

The nonlinear conjugate gradient method is an effective technique for solving large-scale minimization problems and has a wide range of applications in various fields, such as mathematics, chemistry, physics, engineering and medicine. This study presents a novel spectral conjugate gradient algorithm (a nonlinear conjugate gradient algorithm) derived from the Hisham–Khalil (KH) and Newton algorithms and based on a pure conjugacy condition. The importance of this research lies in finding an appropriate method to solve all types of linear and nonlinear fuzzy equations, because the Buckley and Qu method is ineffective in solving fuzzy equations. Moreover, the conjugate gradient method does not need a Hessian matrix (the second partial derivatives of the function) in the solution. The descent property of the proposed method is shown provided that the step size meets the strong Wolfe conditions. In numerous circumstances, numerical results demonstrate that the proposed technique is more efficient than the Fletcher–Reeves and KH algorithms in solving fuzzy nonlinear equations.
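The strong Wolfe conditions referenced above combine an Armijo sufficient-decrease test with a curvature bound on the directional derivative. A minimal checker, with standard illustrative constants c1 and c2 (the paper does not specify values), can be sketched as:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for a step size alpha along d:
      (i)  f(x + alpha*d) <= f(x) + c1*alpha*grad(x)^T d   (Armijo decrease)
      (ii) |grad(x + alpha*d)^T d| <= c2*|grad(x)^T d|     (curvature bound)
    Constants c1, c2 with 0 < c1 < c2 < 1 are conventional choices."""
    fx, gx = f(x), grad(x)
    x_new = x + alpha * d
    armijo = f(x_new) <= fx + c1 * alpha * (gx @ d)
    curvature = abs(grad(x_new) @ d) <= c2 * abs(gx @ d)
    return armijo and curvature
```

On the quadratic f(x) = x^T x / 2 starting from x = 1 with d = -1, the exact line minimizer alpha = 1 passes both tests, while an overshooting step does not.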


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Abubakar Sani Halilu ◽  
Arunava Majumder ◽  
Mohammed Yusuf Waziri ◽  
Kabiru Ahmed ◽  
Aliyu Muhammed Awwal

Purpose – The purpose of this research is to propose a new choice of the nonnegative parameter t in the Dai–Liao conjugate gradient method.
Design/methodology/approach – Conjugate gradient algorithms are used to solve both constrained monotone and general systems of nonlinear equations. This is made possible by combining the conjugate gradient method with a Newton-method approach via an acceleration parameter, in order to obtain a derivative-free method.
Findings – A conjugate gradient method is presented by proposing a new Dai–Liao nonnegative parameter. Furthermore, the proposed method is successfully applied to the motion control of a two-joint planar robotic manipulator.
Originality/value – The proposed algorithm is a new approach that has been neither submitted nor published elsewhere.
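The Dai–Liao family that the parameter t belongs to uses the direction d_{k+1} = -g_{k+1} + beta_DL * d_k with beta_DL = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k), where y_k = g_{k+1} - g_k and s_k = x_{k+1} - x_k. A sketch of that direction update follows; the fixed t here is purely illustrative, since the paper's contribution is precisely an adaptive nonnegative choice of t that is not reproduced here.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, x_new, x_old, t=0.1):
    """Compute the Dai-Liao conjugate gradient direction
        d_{k+1} = -g_{k+1} + beta_DL * d_k,
        beta_DL = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    with y_k = g_{k+1} - g_k and s_k = x_{k+1} - x_k.
    The fixed t = 0.1 is an illustrative placeholder for the paper's
    adaptive nonnegative parameter."""
    y = g_new - g_old
    s = x_new - x_old
    denom = d_old @ y
    if abs(denom) < 1e-16:
        return -g_new  # restart with steepest descent when the denominator degenerates
    beta = (g_new @ y - t * (g_new @ s)) / denom
    return -g_new + beta * d_old
```

Note that when g_{k+1} is orthogonal to s_k, the t-term vanishes and beta_DL reduces to the Hestenes–Stiefel formula.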


2018 ◽  
Vol 103 (12) ◽  
pp. 1961-1974
Author(s):  
Usman Abbas Yakubu ◽  
Mustafa Mamat ◽  
Mohamad Afendee Mohamad ◽  
Mohd Rivaie ◽  
Jamilu Sabi’u
