An inexact accelerated stochastic ADMM for separable convex optimization

Author(s): Jianchao Bai, William W. Hager, Hongchao Zhang
2021, pp. 1-28
Author(s): Yuan Shen, Yannian Zuo, Liming Sun, Xiayang Zhang

We consider the linearly constrained separable convex optimization problem whose objective function is separable with respect to m blocks of variables. A number of methods have been proposed and extensively studied over the past decade. In particular, a modified strictly contractive Peaceman–Rachford splitting method (SC-PRCM) [S. H. Jiang and M. Li, A modified strictly contractive Peaceman–Rachford splitting method for multi-block separable convex programming, J. Ind. Manag. Optim. 14(1) (2018) 397-412] has been well studied in the literature for the special case of m = 3. Based on the modified SC-PRCM, we present modified proximal symmetric ADMMs (MPSADMMs) to solve the multi-block problem. In MPSADMMs, every subproblem except the first is augmented with a simple proximal term, and the multipliers are updated twice per iteration. At the end of each iteration, the output is corrected via a simple correction step. Without stringent assumptions, we establish global convergence and an O(1/t) convergence rate in the ergodic sense for the new algorithms. Preliminary numerical results show that the proposed algorithms are effective for solving linearly constrained quadratic programming and robust principal component analysis problems.
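The abstract outlines an algorithmic template: the first subproblem is solved without a proximal term, the remaining subproblems carry a simple proximal term, the multiplier is updated twice per iteration, and a simple correction step closes each iteration. The Python sketch below illustrates that template on a toy separable quadratic program. It is not the authors' exact MPSADMM; the function name sketch_mpsadmm, the objective 0.5*||x_i - c_i||^2, and the parameters beta, tau, alpha, gamma are illustrative assumptions, not the quantities analysed in the paper.

# A minimal sketch (assumed setup, not the authors' exact MPSADMM): a symmetric
# proximal ADMM with two multiplier updates per iteration and a simple
# correction step, applied to
#     min  sum_i 0.5*||x_i - c_i||^2   s.t.  sum_i A_i x_i = b.
# The step parameters beta, tau, alpha, gamma are illustrative placeholders.
import numpy as np

def sketch_mpsadmm(A, c, b, beta=1.0, tau=1.0, alpha=0.9, gamma=0.9, iters=500):
    m = len(A)                                   # number of blocks
    x = [np.zeros(Ai.shape[1]) for Ai in A]      # primal blocks
    lam = np.zeros(b.shape[0])                   # Lagrange multiplier

    def residual(xs):
        # constraint residual  sum_i A_i x_i - b
        return sum(Ai @ xi for Ai, xi in zip(A, xs)) - b

    def solve_block(i, xs, lam, prox):
        # closed-form minimizer of
        #   0.5*||x_i - c_i||^2 - lam^T A_i x_i + (beta/2)*||A_i x_i + r||^2
        #   + (prox*beta/2)*||x_i - xs[i]||^2,
        # where r collects the other blocks' contributions minus b
        Ai = A[i]
        r = sum(A[j] @ xs[j] for j in range(m) if j != i) - b
        n = Ai.shape[1]
        H = (1.0 + prox * beta) * np.eye(n) + beta * Ai.T @ Ai
        rhs = c[i] + Ai.T @ lam - beta * Ai.T @ r + prox * beta * xs[i]
        return np.linalg.solve(H, rhs)

    for _ in range(iters):
        xt = [xi.copy() for xi in x]                       # predictor
        xt[0] = solve_block(0, xt, lam, prox=0.0)          # first block: no proximal term
        lam_half = lam - alpha * beta * residual(xt)       # first multiplier update
        for i in range(1, m):                              # remaining blocks: proximal term
            xt[i] = solve_block(i, xt, lam_half, prox=tau)
        lam_new = lam_half - alpha * beta * residual(xt)   # second multiplier update
        # simple correction step: move the iterate a fraction gamma toward the predictor
        x = [xi + gamma * (xti - xi) for xi, xti in zip(x, xt)]
        lam = lam + gamma * (lam_new - lam)
    return x, lam

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = [rng.standard_normal((5, 3)) for _ in range(3)]    # a three-block toy instance
    c = [rng.standard_normal(3) for _ in range(3)]
    b = sum(Ai @ rng.standard_normal(3) for Ai in A)       # feasible right-hand side
    x, lam = sketch_mpsadmm(A, c, b)
    print("constraint violation:",
          np.linalg.norm(sum(Ai @ xi for Ai, xi in zip(A, x)) - b))

On this toy instance the printed constraint violation is expected to shrink as the iterations proceed; the factor gamma blends the predictor with the previous iterate, mirroring the simple correction step described in the abstract.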

