Local Convergence Analysis of Augmented Lagrangian Methods for Piecewise Linear-Quadratic Composite Optimization Problems

2021, Vol. 31 (4), pp. 2665-2694
Author(s): Nguyen T. V. Hang, M. Ebrahim Sarabi

2020, Vol. 45 (3), pp. 1164-1192
Author(s): James V. Burke, Abraham Engle

This work concerns the local convergence theory of Newton and quasi-Newton methods for convex-composite optimization, where one minimizes an objective that can be written as the composition of a convex function with one that is continuously differentiable. We focus on the case in which the convex function is a potentially infinite-valued piecewise linear-quadratic function. Such problems include nonlinear programming, minimax optimization, and estimation of nonlinear dynamics with non-Gaussian noise, as well as many modern approaches to large-scale data analysis and machine learning. Our approach embeds the optimality conditions for convex-composite optimization problems into a generalized equation. We establish conditions for strong metric subregularity and strong metric regularity of the corresponding set-valued mappings. This allows us to extend the classical convergence theory of Newton and quasi-Newton methods to the broader class of non-finite-valued piecewise linear-quadratic convex-composite optimization problems. In particular, we establish local quadratic convergence of the Newton method under conditions that parallel those in nonlinear programming.
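The local quadratic convergence established in the abstract generalizes the classical behavior of Newton's method on a smooth square system. The following minimal sketch illustrates only that classical baseline, not the paper's generalized-equation framework; the map `c`, its Jacobian, and the starting point are illustrative assumptions.

```python
import numpy as np

# Hypothetical smooth map c: R^2 -> R^2 with root on the unit circle's
# diagonal; Newton's method on c(x) = 0 converges locally quadratically.
def c(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0,   # unit circle
                     x[0] - x[1]])               # diagonal x1 = x2

def jac(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

def newton(x0, tol=1e-12, max_iter=20):
    x = np.asarray(x0, dtype=float)
    residuals = []
    for _ in range(max_iter):
        x = x - np.linalg.solve(jac(x), c(x))   # full Newton step
        residuals.append(np.linalg.norm(c(x)))
        if residuals[-1] < tol:
            break
    return x, residuals

x_star, res = newton([1.0, 0.5])
# The residual norms shrink roughly quadratically: each is on the order
# of the square of the previous one.
```

The root here is x = (1/sqrt(2), 1/sqrt(2)); in the paper's setting the Newton step is instead defined through the generalized equation that encodes the composite optimality conditions.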


Author(s): J. E. Coster, N. Stander, J. A. Snyman

Abstract: The problem of determining the optimal sizing design of truss structures is considered. An augmented Lagrangian optimization algorithm that uses a quadratic penalty term is formulated. The implementation uses a first-order Lagrange multiplier update and a strategy for progressively increasing the accuracy with which the bound-constrained minimizations are performed. The allowed constraint violation is also progressively decreased, but at a slower rate, so as to prevent ill-conditioning due to large penalty values. Individual constraint penalties are used, and only the penalties of the worst-violated constraints are increased. The scheme is globally convergent. The bound-constrained minimizations are performed using the SBMIN algorithm, in which a sophisticated trust-region strategy is employed. The Hessian of the augmented Lagrangian function is approximated using partitioned secant updating: each function contributing to the Lagrangian is individually approximated by a secant update, and the augmented Lagrangian Hessian is formed by appropriate accumulation. The performance of the algorithm is evaluated for a number of different secant updates on standard explicit and truss sizing optimization problems. The results show the formulation to be superior to other implementations of augmented Lagrangian methods reported in the literature and that, under certain conditions, the method approaches the performance of the state-of-the-art SQP and SAM methods. Of the secant updates, the symmetric rank-one update is superior to the other updates, including the BFGS scheme. It is suggested that the individual-function secant updating employed may be usefully applied in contexts where structural analysis and optimization are performed simultaneously, as in the simultaneous analysis and design method. In such cases the functions are partially separable and the associated Hessians are of low rank.
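The core loop the abstract describes — an inner minimization of the augmented Lagrangian, a first-order multiplier update, and a progressively increased penalty — can be sketched on a toy equality-constrained problem. The objective, constraint, and penalty schedule below are illustrative assumptions; the inner quadratic subproblem is solved exactly here rather than by the SBMIN trust-region algorithm the paper uses.

```python
import numpy as np

# Toy problem: minimize f(x) = x1^2 + 2*x2^2  subject to  c(x) = x1 + x2 - 1 = 0.
# KKT solution: x = (2/3, 1/3), multiplier lam = 4/3.
Q = np.diag([2.0, 4.0])          # Hessian of f
a = np.ones(2)                   # gradient of the linear constraint

def c(x):
    return a @ x - 1.0

def augmented_lagrangian_solve(lam=0.0, mu=10.0, outer=15):
    x = np.zeros(2)
    for _ in range(outer):
        # Inner step: minimize f(x) - lam*c(x) + (mu/2)*c(x)^2 over x.
        # That function is quadratic, so the minimizer solves
        # (Q + mu*a a^T) x = (lam + mu) a  exactly.
        H = Q + mu * np.outer(a, a)
        x = np.linalg.solve(H, (lam + mu) * a)
        lam = lam - mu * c(x)    # first-order Lagrange multiplier update
        mu *= 2.0                # progressively increase the penalty
    return x, lam

x, lam = augmented_lagrangian_solve()
# Iterates approach the KKT point x = (2/3, 1/3), lam = 4/3.
```

Growing `mu` slowly (here by doubling) rather than all at once mirrors the abstract's concern about ill-conditioning from large penalty values: the inner Hessian's condition number grows with `mu`, so accuracy is bought gradually as the multiplier estimate improves.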


Mathematics
2019, Vol. 7 (9), pp. 804
Author(s): Ioannis K. Argyros, Neha Gupta, J. P. Jaiswal

The semi-local convergence analysis of a well-defined and efficient two-step Chord-type method in Banach spaces is presented in this study. The recurrence-relation technique is used under some weak assumptions. The applicability of the method is extended to nonlinear non-differentiable operators. A convergence theorem is also established to show the existence and uniqueness of the approximate solution. A numerical illustration is provided to support the theoretical results; it shows that earlier studies fail when the function is non-differentiable.
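A chord-type method replaces the derivative in Newton's iteration with a fixed slope, so it needs only function values and can be applied when the operator is non-differentiable. The one-dimensional sketch below is an illustrative assumption in that spirit, not the paper's Banach-space method: it freezes a divided-difference slope at the starting point and takes two chord steps per iteration, on a function with a non-smooth |x - 1| term.

```python
# Non-differentiable test function: smooth part plus |x - 1|.
# Exact root: x* = 1.4, since 1.4^2 + 0.1*0.4 - 2 = 0.
def f(x):
    return x**2 + 0.1 * abs(x - 1.0) - 2.0

def two_step_chord(x0, h=1e-3, tol=1e-12, max_iter=100):
    # Frozen divided-difference slope at x0 -- no derivative is ever used.
    A = (f(x0 + h) - f(x0 - h)) / (2.0 * h)
    x = x0
    for _ in range(max_iter):
        y = x - f(x) / A          # first chord step
        x_new = y - f(y) / A      # second chord step with the same slope
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

root = two_step_chord(2.0)        # converges to the root x* = 1.4
```

Because the slope `A` is fixed, convergence is linear rather than quadratic, but the iteration is well defined even though f has no derivative at x = 1.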

