A New Augmented Lagrangian Method for Equality Constrained Optimization with Simple Unconstrained Subproblem

2017, Vol. 2017, pp. 1-9
Author(s): Hao Zhang, Qin Ni

We propose a new method for equality constrained optimization based on the augmented Lagrangian method. We construct an unconstrained subproblem by adding an adaptive quadratic term to the quadratic model of the augmented Lagrangian function. At each iteration, we solve this unconstrained subproblem to obtain a trial step. The main feature of this work is that the subproblem is easier to solve than in the standard approach. Numerical results show that the method is effective.
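The abstract outlines the overall scheme but not the algorithmic details. Below is a minimal, hypothetical Python sketch of an augmented-Lagrangian iteration in this spirit: a regularized quadratic subproblem (a Gauss-Newton model of the augmented Lagrangian plus a quadratic term sigma*I) is solved to get a trial step, followed by a first-order multiplier update. The parameter choices, model Hessian, and acceptance rules are simplified placeholders, not the authors' exact algorithm.

```python
# Hypothetical sketch only: classical ALM with a regularized quadratic subproblem.
import numpy as np

def augmented_lagrangian_sketch(f, grad_f, c, jac_c, x0, lam0,
                                mu=10.0, sigma=1.0, tol=1e-8, max_iter=100):
    """Minimize f(x) subject to c(x) = 0 (equality constraints only)."""
    x, lam = x0.copy(), lam0.copy()
    for _ in range(max_iter):
        cx, J = c(x), jac_c(x)
        # Gradient of the augmented Lagrangian L_A(x) = f - lam^T c + (mu/2)||c||^2
        g = grad_f(x) - J.T @ lam + mu * (J.T @ cx)
        # Gauss-Newton model Hessian of L_A plus a quadratic term sigma*I;
        # the paper adapts this extra term, here it is kept fixed for simplicity.
        H = mu * (J.T @ J) + sigma * np.eye(x.size)
        # Unconstrained quadratic subproblem: min_d  g^T d + 0.5 d^T H d
        d = np.linalg.solve(H, -g)          # trial step
        x = x + d
        # First-order multiplier update, as in the classical ALM
        lam = lam - mu * c(x)
        if np.linalg.norm(d) < tol and np.linalg.norm(c(x)) < tol:
            break
    return x, lam

# Toy problem: min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0  (solution x = (0.5, 0.5))
f = lambda x: x @ x
grad_f = lambda x: 2 * x
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac_c = lambda x: np.array([[1.0, 1.0]])

x_opt, lam_opt = augmented_lagrangian_sketch(f, grad_f, c, jac_c,
                                             np.zeros(2), np.zeros(1))
print(x_opt)  # approximately [0.5, 0.5]
```

Because the subproblem is an unconstrained strictly convex quadratic, the trial step reduces to a single linear solve, which is what makes this kind of subproblem "simple" compared with constrained or nonsmooth subproblems.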

2021, Vol. 3 (1), pp. 89-117
Author(s): Yangyang Xu

First-order methods (FOMs) are widely used for solving large-scale problems. However, many existing works consider only unconstrained problems or problems with simple constraints. In this paper, we develop two FOMs for constrained convex programs whose constraint set is described by affine equations and smooth nonlinear inequalities. Both methods are based on the classical augmented Lagrangian function. They update the multipliers in the same way as the augmented Lagrangian method (ALM) but use different primal updates. The first method performs a single proximal gradient step on the primal variable at each iteration, and the second method is a block-update version of the first. For the first method, we establish global iterate convergence as well as global sublinear and local linear convergence; for the second method, we show a global sublinear convergence result in expectation. Numerical experiments on basis pursuit denoising, convex quadratically constrained quadratic programs, and the Neyman-Pearson classification problem illustrate the empirical performance of the proposed methods; their numerical behavior closely matches the established theoretical results.
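To make the primal/dual structure of the first method concrete, here is a minimal Python sketch under simplifying assumptions: only affine equality constraints Ax = b, an l1 objective (as in basis pursuit), and fixed step sizes. The paper's treatment of smooth nonlinear inequality constraints, its parameter rules, and the randomized block-update variant are not reproduced here; this is an illustration of the "one proximal gradient step per multiplier update" pattern, not the authors' exact implementation.

```python
# Hypothetical sketch: ALM-style multiplier updates with a single proximal
# gradient step on the primal variable per iteration.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_alm(A, b, beta=1.0, max_iter=5000, tol=1e-10):
    """Solve min ||x||_1 s.t. Ax = b with one proximal gradient step per ALM iteration."""
    m, n = A.shape
    x, lam = np.zeros(n), np.zeros(m)
    eta = 1.0 / (beta * np.linalg.norm(A, 2) ** 2)    # primal step size
    for _ in range(max_iter):
        # Gradient of the smooth part of the augmented Lagrangian
        # L_beta(x, lam) = ||x||_1 + lam^T (Ax - b) + (beta/2) ||Ax - b||^2
        grad = A.T @ (lam + beta * (A @ x - b))
        x_new = soft_threshold(x - eta * grad, eta)   # proximal gradient step
        lam = lam + beta * (A @ x_new - b)            # ALM multiplier update
        if (np.linalg.norm(x_new - x) < tol
                and np.linalg.norm(A @ x_new - b) < tol):
            x = x_new
            break
        x = x_new
    return x, lam

# Toy basis-pursuit instance: min |x1| + |x2|  s.t.  x1 + 2*x2 = 1
A = np.array([[1.0, 2.0]])
b = np.array([1.0])
x_opt, lam_opt = prox_grad_alm(A, b)
print(x_opt)  # should be close to [0., 0.5]
```

The design point the abstract emphasizes is that the primal update is a single cheap step rather than an (approximately) exact minimization of the augmented Lagrangian, yet the multiplier update is kept in its classical ALM form.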

