Optimal proximal augmented Lagrangian method and its application to full Jacobian splitting for multi-block separable convex minimization problems

2019 · Vol 40 (2) · pp. 1188-1216
Author(s): Bingsheng He, Feng Ma, Xiaoming Yuan

Abstract The augmented Lagrangian method (ALM) is fundamental in solving convex programming problems with linear constraints. The proximal version of ALM, which regularizes ALM's subproblem over the primal variable at each iteration by an additional positive-definite quadratic proximal term, has been well studied in the literature. In this paper we show that it is not necessary to employ a positive-definite quadratic proximal term for the proximal ALM, and that convergence can still be ensured if the positive definiteness is relaxed to indefiniteness by reducing the proximal parameter. An indefinite proximal version of the ALM is thus proposed for the generic setting of convex programming problems with linear constraints. We show that our relaxation is optimal in the sense that the proximal parameter cannot be further reduced. The consideration of indefinite proximal regularization is particularly meaningful for generating larger step sizes in solving ALM's primal subproblems. When the model under discussion is separable, in the sense that its objective function consists of finitely many additive function components without coupled variables, it is desirable to decompose each of ALM's subproblems over the primal variable in a Jacobian manner, replacing the original subproblem with a set of smaller and easier decomposed subproblems so that parallel computation can be applied. This full Jacobian splitting version of the ALM is known not to be necessarily convergent, and it has been shown in the literature that its convergence can be ensured if all the decomposed subproblems are further regularized by sufficiently large proximal terms. How small the proximal parameter can be, however, remains open. The other purpose of this paper is to determine the smallest proximal parameter for the full Jacobian splitting version of ALM for solving multi-block separable convex minimization models.
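The proximal-ALM iteration described above can be sketched on a toy problem. This is a minimal, hypothetical illustration (not the paper's algorithm or data): the objective is a simple quadratic so the linearized subproblem has a closed form, and the proximal matrix D = τI − βAᵀA is made indefinite by choosing τ below the classical bound β‖AᵀA‖, in the spirit of the relaxation the abstract discusses.

```python
import numpy as np

# Hypothetical toy instance: min 0.5*||x - c||^2  s.t.  A x = b.
# Proximal ALM with D = tau*I - beta*A^T A; taking tau below
# beta*||A^T A|| makes D indefinite, as in the abstract's relaxation.
rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)
beta = 1.0
tau = 0.8 * beta * np.linalg.norm(A.T @ A, 2)  # illustrative sub-classical choice

x = np.zeros(n)
lam = np.zeros(m)
for _ in range(2000):
    # Linearized subproblem: argmin_x f(x) + <A^T lam + beta*A^T(Ax^k - b), x>
    #                                      + (tau/2)||x - x^k||^2
    g = A.T @ lam + beta * A.T @ (A @ x - b)
    x = (c + tau * x - g) / (1.0 + tau)   # closed form since f is quadratic
    lam = lam + beta * (A @ x - b)        # multiplier (dual) update

print(np.linalg.norm(A @ x - b))          # feasibility residual
```

The point of the sketch is only the structure of the iteration: one proximal-regularized primal step followed by the standard multiplier update; the optimal lower bound on τ is the subject of the paper itself.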

2020 · Vol 2020 · pp. 1-10
Author(s): Jing Liu, Yongrui Duan, Tonghui Wang

The augmented Lagrangian method (ALM) is one of the most successful first-order methods for convex programming with linear equality constraints. To solve the two-block separable convex minimization problem, a natural approach is the parallel splitting ALM. In this paper, we show that, no matter how small the step size and the penalty parameter are, convergence of the parallel splitting ALM is not guaranteed. We propose a new convergent parallel splitting ALM (PSALM), which regularizes ALM's minimization subproblems with some simple proximal terms. As an application, the new PSALM is used to solve video background extraction problems, and our numerical results indicate that it is efficient.
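A proximal parallel-splitting step of the kind described above can be sketched as follows. This is a hedged illustration on hypothetical quadratic data, not the paper's PSALM: both blocks are updated from the previous iterate (fully parallel), each with a simple proximal term (τᵢ/2)‖xᵢ − xᵢᵏ‖², and the proximal parameters here are simply chosen large enough for convergence.

```python
import numpy as np

# Hypothetical two-block instance:
#   min 0.5||x1 - c1||^2 + 0.5||x2 - c2||^2  s.t.  A1 x1 + A2 x2 = b.
rng = np.random.default_rng(1)
m, n1, n2 = 3, 4, 5
A1, A2 = rng.standard_normal((m, n1)), rng.standard_normal((m, n2))
b = rng.standard_normal(m)
c1, c2 = rng.standard_normal(n1), rng.standard_normal(n2)
beta = 1.0
tau1 = 2.0 * beta * np.linalg.norm(A1.T @ A1, 2)  # deliberately conservative
tau2 = 2.0 * beta * np.linalg.norm(A2.T @ A2, 2)

x1, x2, lam = np.zeros(n1), np.zeros(n2), np.zeros(m)
for _ in range(3000):
    r1 = b + lam / beta - A2 @ x2   # both blocks use the OLD x1, x2:
    r2 = b + lam / beta - A1 @ x1   # a fully parallel (Jacobian) update
    x1_new = np.linalg.solve((1 + tau1) * np.eye(n1) + beta * A1.T @ A1,
                             c1 + tau1 * x1 + beta * A1.T @ r1)
    x2_new = np.linalg.solve((1 + tau2) * np.eye(n2) + beta * A2.T @ A2,
                             c2 + tau2 * x2 + beta * A2.T @ r2)
    x1, x2 = x1_new, x2_new
    lam = lam - beta * (A1 @ x1 + A2 @ x2 - b)  # multiplier update

print(np.linalg.norm(A1 @ x1 + A2 @ x2 - b))    # feasibility residual
```

Without the proximal terms (τᵢ = 0) this parallel scheme is exactly the variant whose divergence the paper establishes; the sketch shows where the regularization enters each subproblem.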


2015 · Vol 32 (01) · pp. 1540008
Author(s): Lei Yang, Zheng-Hai Huang, Yu-Fan Li

This paper studies the recovery task of finding a low multilinear-rank tensor that satisfies some linear constraints in a general setting, which has many applications in computer vision and graphics. This problem is called the low multilinear-rank tensor recovery problem. The variable splitting technique and the convex relaxation technique are used to transform this problem into a tractable constrained optimization problem. Exploiting the favorable structure of the problem, we develop a splitting augmented Lagrangian method (SALM) to solve the resulting problem. The proposed algorithm is easy to implement, and its convergence can be proved under some conditions. Preliminary numerical results on randomly generated and real completion problems show that the proposed algorithm is effective and robust for tackling the low multilinear-rank tensor completion problem.
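The abstract does not spell out the SALM updates, but after convex relaxation of the multilinear rank (typically via nuclear norms of the tensor's unfoldings), the core closed-form step in such splitting schemes is singular-value thresholding, the proximal operator of the nuclear norm. A sketch of that building block, under that assumption:

```python
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: prox of tau * nuclear norm.

    Shrinks each singular value of M by tau (clipping at zero); this is
    the usual closed-form subproblem in nuclear-norm-based splitting ALMs.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Apply to a random matrix (e.g. one unfolding of a tensor iterate).
rng = np.random.default_rng(2)
M = rng.standard_normal((6, 4))
X = svt(M, 0.5)
```

In a multilinear-rank setting, an operator like this would be applied to each mode-unfolding of the tensor variable; the precise scheme is the paper's contribution and is not reproduced here.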


2020 · Vol 14 · pp. 174830262097353
Author(s): Noppadol Chumchob, Ke Chen

Variational methods for image registration basically involve a regularizer to ensure that the resulting well-posed problem admits a solution. Different choices of regularizer lead to different deformations. On the one hand, conventional regularizers, such as the elastic, diffusion and curvature regularizers, are able to generate globally smooth deformations and are generally useful for many applications. On the other hand, these regularizers perform poorly in applications where discontinuities or steep gradients in the deformations are required. As is well known, the total variation (TV) regularizer is more appropriate for preserving discontinuities in the deformations. However, it is difficult to develop an efficient numerical method that ensures numerical solutions satisfy this requirement, because of the non-differentiability and non-linearity of the TV regularizer. In this work we focus on the computational challenges arising in approximately solving the TV-based image registration model. Motivated by many efficient numerical algorithms in image restoration, we propose to use the augmented Lagrangian method (ALM). At each iteration, our ALM requires solving two subproblems: the first admits no exact solution, whereas the second has a closed-form solution. We therefore propose an efficient nonlinear multigrid (NMG) method to obtain an approximate solution to the first subproblem. Numerical results on real medical images confirm not only that the proposed ALM is more computationally efficient than some existing methods, but also that it delivers accurate registration results, with the desired properties of the constructed deformations, in a reasonable number of iterations.
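In TV-regularized ALM schemes of this kind, the closed-form second subproblem is typically an isotropic (vector) shrinkage applied pointwise to gradient-like variables. The sketch below assumes that standard structure; the abstract itself does not spell out the update.

```python
import numpy as np

def shrink(v, tau):
    """Isotropic shrinkage: the usual closed-form TV-subproblem solution.

    v is an array of gradient-like vectors with the last axis as the
    vector dimension; each vector's norm is shrunk by tau, clipped at 0,
    while its direction is preserved.
    """
    norm = np.linalg.norm(v, axis=-1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norm, 1e-12), 0.0)
    return scale * v

# Two sample "gradient" vectors: one with norm 5, one with norm 0.1.
v = np.array([[3.0, 4.0], [0.1, 0.0]])
w = shrink(v, 1.0)
# The first vector is shrunk to norm 4 (same direction);
# the second, with norm below tau, is set to zero.
```

The genuinely hard part of the model, as the abstract notes, is the first subproblem, which is what the proposed NMG solver addresses.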

