A Limited Memory BFGS Method for Solving Large-Scale Symmetric Nonlinear Equations

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Xiangrong Li ◽  
Xiaoliang Wang ◽  
Xiabin Duan

A limited memory BFGS (L-BFGS) algorithm is presented for solving large-scale symmetric nonlinear equations, using a line search technique that requires no derivative information. The global convergence of the proposed algorithm is established under suitable conditions. Numerical results show that the method is competitive with the standard BFGS method.
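The core of any L-BFGS method is computing the search direction from a short history of curvature pairs instead of a stored matrix. The paper's algorithm applies this to symmetric nonlinear equations with a derivative-free line search; the sketch below shows only the standard two-loop recursion for the direction, not the authors' specific line search.

```python
import numpy as np


def lbfgs_direction(grad, s_hist, y_hist):
    """Standard L-BFGS two-loop recursion: returns -H_k @ grad, where
    H_k is the implicit inverse-Hessian approximation built from the
    last m curvature pairs (s_i, y_i). With an empty history this is
    just the steepest-descent direction -grad."""
    q = grad.copy()
    alphas = []
    # backward pass over the pairs, newest first
    for s, y in reversed(list(zip(s_hist, y_hist))):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    if s_hist:
        # initial scaling H_0 = (s^T y / y^T y) * I from the newest pair
        s, y = s_hist[-1], y_hist[-1]
        q *= s.dot(y) / y.dot(y)
    # forward pass, oldest first
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        rho = 1.0 / y.dot(s)
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q
```

In a full solver this direction would be fed to the line search, and the accepted step would append a new pair (s, y) to the history.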

2009 ◽  
Vol 2009 ◽  
pp. 1-22 ◽  
Author(s):  
Gonglin Yuan ◽  
Shide Meng ◽  
Zengxin Wei

A trust-region-based BFGS method is proposed for solving symmetric nonlinear equations. In this algorithm, if the trial step is unsuccessful, a line search technique is used instead of repeatedly solving the subproblem of the standard trust-region method. We establish the global and superlinear convergence of the method under suitable conditions. Numerical results show that the method is competitive with the standard trust-region method.
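The key idea is the fallback: a rejected trial step triggers a line search along that step rather than a radius reduction and a fresh subproblem solve. The sketch below illustrates this on a generic smooth objective, using a Cauchy-point step for the subproblem and Armijo backtracking as the fallback; both choices are illustrative assumptions, not the authors' exact rules.

```python
import numpy as np


def tr_step_with_fallback(f, grad, B, x, delta, eta=0.1):
    """One trust-region step with a line-search fallback: if the trial
    step fails the ratio test, backtrack along it instead of shrinking
    the radius and re-solving the subproblem."""
    g = grad(x)
    # Cauchy point: minimizer of the quadratic model along -g in the ball
    gBg = g @ B @ g
    tau = 1.0 if gBg <= 0 else min(1.0, np.linalg.norm(g) ** 3 / (delta * gBg))
    p = -tau * delta / np.linalg.norm(g) * g
    pred = -(g @ p + 0.5 * p @ B @ p)      # predicted model decrease
    ared = f(x) - f(x + p)                 # actual decrease
    if pred > 0 and ared / pred >= eta:
        return x + p                       # trial step accepted
    # fallback: Armijo backtracking along p, no subproblem re-solve
    t = 1.0
    while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-8:
        t *= 0.5
    return x + t * p
```

The fallback avoids the main cost of a classical trust-region iteration, where each rejection forces another (possibly expensive) subproblem solve.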


2014 ◽  
Vol 530-531 ◽  
pp. 367-371 ◽
Author(s):  
Ting Feng Li ◽  
Yu Ting Zhang ◽  
Sheng Hui Yan

In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses a global convergence property even without a convexity assumption on the objective function. Results of the algorithm on the CUTE test problems are reported and suggest a modest improvement.
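A common device for obtaining global convergence without convexity is a cautious update: a curvature pair is stored only when it satisfies a safeguarding condition, so the quasi-Newton approximation stays well behaved on nonconvex functions. The sketch below shows this Li-Fukushima-style rule as an assumed illustration; the paper's exact modification may differ.

```python
import numpy as np


def cautious_update(s_hist, y_hist, s, y, m=5, eps=1e-6):
    """Cautious L-BFGS memory update: store the pair (s, y) only when
    the curvature condition s^T y >= eps * ||s||^2 holds. Skipping
    'bad' pairs keeps the implicit Hessian approximation positive
    definite even when the objective is nonconvex."""
    if s.dot(y) >= eps * s.dot(s):
        s_hist.append(s)
        y_hist.append(y)
        if len(s_hist) > m:            # keep only the last m pairs
            s_hist.pop(0)
            y_hist.pop(0)
    return s_hist, y_hist
```

On a convex function the condition holds automatically for exact line searches, so the cautious rule only changes behavior in the nonconvex regions where the plain update could break down.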


2012 ◽  
Vol 17 (2) ◽  
pp. 203-216 ◽  
Author(s):  
Gonglin Yuan ◽  
Zengxin Wei

The Barzilai and Borwein gradient algorithm has received a great deal of attention in recent decades since it is simple and effective for smooth optimization problems. Can it be extended to solve nonsmooth problems? In this paper, we answer this question affirmatively. The Barzilai and Borwein gradient algorithm combined with a nonmonotone line search technique is proposed for nonsmooth convex minimization. The global convergence of the given algorithm is established under suitable conditions. Numerical results show that this method is efficient.
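For context, the smooth Barzilai-Borwein iteration that the paper extends replaces the fixed gradient stepsize with a two-point quasi-Newton estimate. The sketch below shows the classical smooth-case iteration with the BB1 stepsize; the paper's nonsmooth version and its nonmonotone line search are not reproduced here.

```python
import numpy as np


def bb_gradient(grad, x0, iters=50):
    """Classical Barzilai-Borwein gradient iteration (smooth case):
    x_{k+1} = x_k - alpha_k * g_k, with the BB1 stepsize
    alpha_k = s^T s / s^T y, where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-2                       # initial stepsize guess
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 0 else 1e-2   # BB1 stepsize
        x, g = x_new, g_new
    return x
```

The stepsize is a Rayleigh-quotient approximation of an inverse Hessian eigenvalue, which is why the method behaves far better than steepest descent on ill-conditioned quadratics despite its nonmonotone iterates.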


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Xiangrong Li ◽  
Songhua Wang ◽  
Zhongzhou Jin ◽  
Hongtruong Pham

This paper presents a modified Hestenes and Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust region feature; (2) the presented algorithm has global convergence for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.
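For reference, the classical HS scheme that the paper modifies builds each direction from the previous one via the conjugacy parameter beta. The sketch below shows the unmodified HS formula only; the paper's modification (which enforces the sufficient descent and trust-region properties) and the Yuan-Wei-Lu line search are not reproduced.

```python
import numpy as np


def hs_beta(g_new, g_old, d_old):
    """Classical Hestenes-Stiefel CG parameter
    beta_k = g_{k+1}^T y_k / (d_k^T y_k), with y_k = g_{k+1} - g_k.
    The next direction is then d_{k+1} = -g_{k+1} + beta_k * d_k."""
    y = g_new - g_old
    denom = d_old.dot(y)
    # guard against division by a vanishing denominator
    return g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
```

Note that with plain HS the direction d_{k+1} = -g_{k+1} + beta_k * d_k need not be a descent direction under an inexact line search, which is precisely the gap the paper's modified formula is designed to close.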

