Solving a Separable Convex Optimization Problem Based on a Generalized Proximal Alternating Direction Method of Multipliers

2021 ◽ Vol 11 (04) ◽ pp. 485-495 ◽ Author(s): 倩雯 殷 (Qianwen Yin)
2013 ◽ Vol 25 (8) ◽ pp. 2172-2198 ◽ Author(s): Shiqian Ma, Lingzhou Xue, Hui Zou

Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This problem estimates, from sample data, an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix. Solving it is very challenging, especially for large instances. In this letter, we propose two alternating direction methods for solving this problem. The first applies the classic alternating direction method of multipliers (ADMM) to the problem reformulated as a consensus problem. The second is a proximal gradient-based ADMM. Both methods exploit the special structure of the problem and can therefore solve large instances very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are typically 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.
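Both methods owe their speed to the fact that each ADMM subproblem in this sparse-minus-low-rank setting admits a closed-form solution. The sketch below is a minimal illustration, not the authors' implementation; the function names and the exact splitting are assumed for exposition. It shows the three proximal operators such a splitting typically reduces to: entrywise soft-thresholding for the l1 term, an eigenvalue-based closed form for the log-determinant likelihood term, and eigenvalue shrinkage over the positive semidefinite cone for the trace penalty on the low-rank part.

```python
import numpy as np

def prox_l1(X, tau):
    """Entrywise soft-thresholding: prox of tau * ||X||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def prox_logdet(M, Sigma_hat, mu):
    """Closed-form minimizer of
         <Sigma_hat, R> - logdet(R) + (1 / (2 * mu)) * ||R - M||_F^2
       over symmetric positive definite R, via one eigen-decomposition."""
    w, Q = np.linalg.eigh(M - mu * Sigma_hat)
    d = (w + np.sqrt(w ** 2 + 4.0 * mu)) / 2.0   # positive root keeps R positive definite
    return (Q * d) @ Q.T

def prox_trace_psd(M, tau):
    """Minimizer of tau * tr(L) + 0.5 * ||L - M||_F^2 over PSD L:
       shrink the eigenvalues by tau and clip at zero."""
    w, Q = np.linalg.eigh(M)
    d = np.maximum(w - tau, 0.0)
    return (Q * d) @ Q.T
```

Each operator costs at most one eigen-decomposition of the n-by-n matrix variable, which is why instances with a million matrix entries (n on the order of 1,000) remain tractable.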


2015 ◽ Vol 2015 ◽ pp. 1-14 ◽ Author(s): Lu Li, Xingyu Wang, Guoqiang Wang

The alternating direction method of multipliers (ADMM) has been widely studied because of its broad applications, and its convergence has been established in the real domain. In this paper, an ADMM is presented for separable convex optimization of real functions in complex variables. First, convergence of the proposed method in the complex domain is established using the Wirtinger calculus. Second, the basis pursuit (BP) algorithm is formulated as an ADMM in which the projection step and the soft-thresholding formula are generalized from the real case. Numerical simulations on the reconstruction of electroencephalogram (EEG) signals show that the new ADMM behaves better than the classic ADMM for solving separable convex optimization of real functions in complex variables.
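For concreteness, the following is a minimal sketch of a scaled-form ADMM for complex basis pursuit, min ||x||_1 subject to A x = b, using the complex generalizations of the projection step and the soft-thresholding formula mentioned above. The penalty parameter rho, the fixed iteration count, and the pseudoinverse-based projection are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def soft_threshold_complex(z, tau):
    """Complex soft-thresholding: shrink the magnitude, keep the phase."""
    mag = np.abs(z)
    scale = np.maximum(mag - tau, 0.0) / np.where(mag > 0, mag, 1.0)
    return scale * z

def basis_pursuit_admm(A, b, rho=1.0, n_iter=500):
    """Scaled-form ADMM for complex basis pursuit: min ||x||_1 s.t. A x = b.
       x handles the affine constraint, z carries the l1 term (A: full row rank)."""
    m, n = A.shape
    AAH_inv = np.linalg.inv(A @ A.conj().T)
    P = np.eye(n) - A.conj().T @ AAH_inv @ A      # projector onto the null space of A
    q = A.conj().T @ (AAH_inv @ b)                # a particular solution of A x = b
    x = np.zeros(n, dtype=complex)
    z = np.zeros(n, dtype=complex)
    u = np.zeros(n, dtype=complex)
    for _ in range(n_iter):
        x = P @ (z - u) + q                           # project z - u onto {x : A x = b}
        z = soft_threshold_complex(x + u, 1.0 / rho)  # complex soft-thresholding step
        u = u + x - z                                 # scaled dual update
    return z
```

In an EEG-style compressed sensing setting, A would play the role of the complex sensing matrix and b the measurements; the returned z is the recovered sparse coefficient vector.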


2018 ◽ Vol 2018 ◽ pp. 1-11 ◽ Author(s): Xin-Rong Lv, Youming Li, Yu-Cheng He

An efficient impulsive noise estimation algorithm based on the alternating direction method of multipliers (ADMM) is proposed for OFDM systems using quadrature amplitude modulation (QAM). First, we adopt a compressed sensing (CS) method based on l1-norm optimization to estimate the impulsive noise. Unlike conventional methods, which exploit only the received signal on the null tones as a constraint, we add the received signal on the data tones and the QAM constellations as additional constraints. A relaxation approach is then introduced to convert the discrete constellations into convex box constraints, and the resulting optimization problem is solved by linear programming. Finally, an ADMM framework is developed to solve the problem with reduced computational complexity. Simulation results for 4-QAM and 16-QAM demonstrate the bit-error-rate advantages of the proposed algorithm over competing algorithms.
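As a rough illustration of the CS recovery step only, the sketch below applies ADMM to an l1-regularized least-squares surrogate that uses just the null-tone observations; the data-tone and relaxed QAM box constraints described above are omitted, and the function name, lam, rho, and the iteration count are assumptions made for exposition rather than the paper's algorithm.

```python
import numpy as np

def soft_threshold(z, tau):
    """Entrywise shrinkage of complex magnitudes (phase preserved)."""
    mag = np.abs(z)
    return np.maximum(mag - tau, 0.0) / np.where(mag > 0, mag, 1.0) * z

def estimate_impulsive_noise(F_null, y_null, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for the l1-regularized least-squares surrogate
         min_e 0.5 * ||F_null e - y_null||_2^2 + lam * ||e||_1,
       where F_null collects the DFT rows of the null tones and e is the
       sparse time-domain impulsive-noise vector."""
    m, n = F_null.shape
    FtF = F_null.conj().T @ F_null
    Fty = F_null.conj().T @ y_null
    M = np.linalg.inv(FtF + rho * np.eye(n))   # factor once; e-update is a ridge solve
    e = np.zeros(n, dtype=complex)
    z = np.zeros(n, dtype=complex)
    u = np.zeros(n, dtype=complex)
    for _ in range(n_iter):
        e = M @ (Fty + rho * (z - u))          # data-fidelity (quadratic) step
        z = soft_threshold(e + u, lam / rho)   # sparsity-promoting step
        u = u + e - z                          # scaled dual update
    return z
```

In a typical CS-based cancellation scheme, the estimated time-domain impulsive noise returned here would then be subtracted from the received signal before QAM demodulation.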

