local linear convergence
Recently Published Documents


TOTAL DOCUMENTS

21
(FIVE YEARS 9)

H-INDEX

7
(FIVE YEARS 0)

2021 ◽  
Author(s):  
Min Tao ◽  
Xiao-Ping Zhang

<div>In this paper, we carry out a unified study of L_1 over L_2 sparsity-promoting models, which are widely used in the regime of coherent dictionaries for recovering sparse nonnegative/arbitrary signals. First, we provide exact recovery conditions for both the constrained and the unconstrained models for a broad class of signals. Next, we prove the existence of solutions of these L_{1}/L_{2} models under the assumption that the null space of the measurement matrix satisfies the $s$-spherical section property. Then, by deriving an analytical solution for the proximal operator of L_{1}/L_{2} with a nonnegativity constraint, we develop a new alternating direction method of multipliers based method (ADMM$_p^+$) to solve the unconstrained model. We establish its global convergence to a d-stationary solution (the sharpest kind of stationary point) and its local linear convergence under certain conditions. Numerical simulations on two specific applications confirm the superiority of ADMM$_p^+$ over state-of-the-art methods in sparse recovery. ADMM$_p^+$ reduces computational time by about $95\%\sim99\%$ while achieving much higher accuracy than the commonly used scaled gradient projection method for the wavelength misalignment problem.</div>
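The L_1/L_2 ratio discussed above is attractive as a sparsity measure because, unlike the plain L_1 norm, it is invariant to rescaling of the signal. A minimal sketch illustrating this property (the function name is ours, not from the paper):

```python
import numpy as np

def l1_over_l2(x):
    """L1/L2 ratio, the scale-invariant sparsity measure used by
    the L_1 over L_2 models (name is ours, for illustration)."""
    x = np.asarray(x, dtype=float)
    return np.abs(x).sum() / np.linalg.norm(x)

x = np.array([3.0, 0.0, 0.0, 4.0])
# Scale invariance: the ratio is unchanged under x -> c*x for c > 0,
# whereas the plain L1 norm grows linearly with c.
print(l1_over_l2(x))        # -> 1.4  (7 / 5)
print(l1_over_l2(10 * x))   # -> 1.4  (70 / 50)
```

The ratio attains its minimum of 1 for 1-sparse vectors and its maximum of sqrt(n) for flat vectors, which is why minimizing it promotes sparsity.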



2019 ◽  
Vol 9 (4) ◽  
pp. 785-811
Author(s):  
Saiprasad Ravishankar ◽  
Anna Ma ◽  
Deanna Needell

Abstract Sparsity-based models and techniques have been exploited in many signal processing and imaging applications. Data-driven methods based on dictionary and sparsifying transform learning enable learning rich image features from data and can outperform analytical models. In particular, alternating optimization algorithms have been popular for learning such models. In this work, we focus on alternating minimization for a specific structured unitary sparsifying operator learning problem and provide a convergence analysis. While the algorithm converges to critical points of the problem in general, our analysis establishes, under mild assumptions, local linear convergence of the algorithm to the underlying sparsifying model of the data. Analysis and numerical simulations show that our assumptions hold for standard probabilistic data models. In practice, the algorithm is robust to initialization.
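The alternating minimization scheme analyzed above alternates a sparse-coding step with a unitary transform update. A minimal sketch under standard assumptions, using hard thresholding for the sparse codes and the orthogonal Procrustes solution for the transform (function and variable names are ours, not from the paper):

```python
import numpy as np

def learn_unitary_transform(Y, s, iters=50, seed=0):
    """Alternating minimization sketch for unitary sparsifying operator
    learning: min over (W, Z) of ||W Y - Z||_F^2 with W unitary and
    each column of Z having at most s nonzeros. Illustrative only."""
    n = Y.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthogonal initialization via QR decomposition.
    W, _ = np.linalg.qr(rng.standard_normal((n, n)))
    for _ in range(iters):
        # Sparse-coding step: keep the s largest-magnitude entries
        # in each column of W @ Y (hard thresholding).
        X = W @ Y
        Z = np.zeros_like(X)
        idx = np.argsort(-np.abs(X), axis=0)[:s]
        np.put_along_axis(Z, idx, np.take_along_axis(X, idx, axis=0), axis=0)
        # Transform update: the unitary W minimizing ||W Y - Z||_F is the
        # orthogonal Procrustes solution W = U V^T, where Z Y^T = U S V^T.
        U, _, Vt = np.linalg.svd(Z @ Y.T)
        W = U @ Vt
    return W, Z
```

Each step solves its subproblem exactly, so the objective is monotonically nonincreasing; the paper's contribution is showing that, under mild assumptions on the data, the iterates converge locally at a linear rate to the generating sparsifying model.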


Optimization ◽  
2018 ◽  
Vol 67 (6) ◽  
pp. 821-853 ◽  
Author(s):  
Jingwei Liang ◽  
Jalal Fadili ◽  
Gabriel Peyré
