An algorithm of sequential systems of linear equations for nonlinear optimization problems with arbitrary initial point

1997 ◽  
Vol 40 (6) ◽  
pp. 561-571 ◽  
Author(s):  
Ziyou Gao ◽  
Guoping He ◽  
Fang Wu

2014 ◽  
Vol 2014 ◽  
pp. 1-6
Author(s):  
Zhijun Luo ◽  
Lirong Wang

A new parallel variable distribution algorithm based on an interior-point SSLE (sequential systems of linear equations) algorithm is proposed for solving inequality constrained optimization problems whose constraints are block-separable. Each iteration of the algorithm needs to solve only three systems of linear equations with the same coefficient matrix to obtain the descent direction. Furthermore, under certain conditions, global convergence is established.
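As a rough illustration of the shared-coefficient-matrix idea mentioned above (a minimal Python/NumPy sketch, not the paper's actual SSLE iteration; the function name and test data are invented for this example), one LU factorization can be reused to solve all three systems cheaply:

import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Sketch only: the three linear systems of one iteration share the coefficient
# matrix A, so A is factorized once and the factors are reused for each
# right-hand side (one O(n^3) factorization, then O(n^2) per solve).
def solve_shared_systems(A, rhs_list):
    lu, piv = lu_factor(A)
    return [lu_solve((lu, piv), b) for b in rhs_list]

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)      # well-conditioned test matrix
b1, b2, b3 = (rng.standard_normal(n) for _ in range(3))
x1, x2, x3 = solve_shared_systems(A, [b1, b2, b3])
print(np.allclose(A @ x1, b1), np.allclose(A @ x2, b2), np.allclose(A @ x3, b3))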


2014 ◽  
Vol 26 (5) ◽  
pp. 566-572 ◽  
Author(s):  
Ailan Liu ◽  
Dingguo Pu

[Figure: Algorithm flow chart]

We propose a nonmonotone QP-free infeasible method for inequality-constrained nonlinear optimization problems based on a 3-1 piecewise linear NCP function. The method is iterative and is based on a nonsmooth reformulation of the KKT first-order optimality conditions. It does not use a penalty function or a filter in its nonmonotone line searches. The algorithm solves only two systems of linear equations with the same nonsingular coefficient matrix, and it is implementable and globally convergent without a linear independence constraint qualification or a strict complementarity condition. Preliminary numerical results are presented.
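The complementarity part of the KKT conditions can be rewritten as a system of equations with a piecewise linear NCP function; the sketch below (Python/NumPy) uses the standard min function phi(a, b) = min(a, b) rather than the paper's specific 3-1 piecewise linear NCP function, and the function names are illustrative only:

import numpy as np

# For constraints g(x) <= 0, complementarity (lambda >= 0, g(x) <= 0,
# lambda_i * g_i(x) = 0) holds exactly when min(lambda_i, -g_i(x)) = 0.
def phi_min(a, b):
    return np.minimum(a, b)

def kkt_residual(grad_f, jac_g, g_val, lam):
    # Stationarity of the Lagrangian plus the NCP-reformulated complementarity.
    stationarity = grad_f + jac_g.T @ lam
    complementarity = phi_min(lam, -g_val)
    return np.concatenate([stationarity, complementarity])

A QP-free method then drives a residual of this kind to zero by solving systems of linear equations rather than quadratic programming subproblems.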


2011 ◽  
Vol 28 (03) ◽  
pp. 361-399 ◽  
Author(s):  
CHUNGEN SHEN ◽  
WENJUAN XUE ◽  
DINGGUO PU

In this paper, we propose a new sequential systems of linear equations (SSLE) filter algorithm, which is an infeasible QP-free method. The new algorithm needs to solve only a few reduced systems of linear equations with the same nonsingular coefficient matrix, and after finitely many iterations only two linear systems need to be solved. Furthermore, a nearly active set technique is used to improve computational efficiency. Under the linear independence condition, global convergence is proved. In particular, the rate of convergence is shown to be one-step superlinear without assuming the strict complementarity condition. Numerical results and a comparison with other algorithms indicate that the new algorithm is promising.
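The "nearly active set" idea can be illustrated with a short sketch (Python/NumPy; the threshold rule and names below are illustrative assumptions, not the rule used in the paper): only constraints whose values lie within a tolerance of zero enter the reduced linear systems, which keeps those systems small.

import numpy as np

# Constraints g_i(x) <= 0 with values within eps of zero are treated as
# (nearly) active; the reduced systems are then assembled over this index set.
def nearly_active_set(g_val, eps=1e-6):
    return np.flatnonzero(g_val >= -eps)

# Example: with g(x) = [-3.0, -1e-8, 0.0], only the last two constraints
# are flagged as nearly active.
print(nearly_active_set(np.array([-3.0, -1e-8, 0.0])))   # [1 2]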


Author(s):  
David Ek ◽  
Anders Forsgren

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the BFGS method, or, equivalently, to those of the method of preconditioned conjugate gradients. In the setting of reduced Hessians, the class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. These methods attain finite termination on quadratic optimization problems in exact arithmetic. We demonstrate the performance of the methods in finite-precision arithmetic by numerical simulations on sequences of related systems of linear equations originating from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem; this representation consists of explicit matrices, with gradients as the only vector components.
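For a convex quadratic objective the exact linesearch step has a closed form, which is the setting studied here; the sketch below (Python/NumPy, with invented variable names) shows only this elementary step, not the authors' limited-memory quasi-Newton class or its compact representation:

import numpy as np

# For f(x) = 0.5 * x^T H x - b^T x with H positive definite, minimizing
# f(x + alpha * p) over alpha gives alpha = -(g^T p) / (p^T H p), g = H x - b.
def exact_linesearch_step(H, b, x, p):
    g = H @ x - b
    alpha = -(g @ p) / (p @ (H @ p))
    return x + alpha * p

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)          # positive definite Hessian
b = rng.standard_normal(n)
x = np.zeros(n)
p = -(H @ x - b)                     # steepest-descent direction
x_new = exact_linesearch_step(H, b, x, p)
# After an exact linesearch the new gradient is orthogonal to the search direction.
print(np.isclose((H @ x_new - b) @ p, 0.0))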

