ℓ1 minimization
Recently Published Documents

TOTAL DOCUMENTS: 37 (FIVE YEARS: 13)
H-INDEX: 10 (FIVE YEARS: 1)

Mathematics ◽ 2021 ◽ Vol 9 (24) ◽ pp. 3224
Author(s): Sining Huang ◽ Yupeng Chen ◽ Tiantian Qiao

This paper proposes an effective extended reweighted ℓ1 minimization algorithm (ERMA) to solve the basis pursuit problem $\min_{u\in\mathbb{R}^{n}}\|u\|_{1}$ subject to $Au=f$ in compressed sensing, where $A\in\mathbb{R}^{m\times n}$ with $m\ll n$. The fast algorithm is based on linearized Bregman iteration with a soft thresholding operator and generalized inverse iteration. It also incorporates the iterative reweighted strategy used to solve the problem $\min_{u\in\mathbb{R}^{n}}\|u\|_{p}^{p}$ subject to $Au=f$, with weight $\omega_{i}(u,p)=(\varepsilon+u_{i}^{2})^{p/2-1}$. Numerical experiments show that this ℓ1 minimization algorithm consistently performs better than other methods; in particular, when $p=0$ the signal restored by the algorithm has the highest signal-to-noise ratio. Moreover, the workload and computation time of the approach are unaffected when the matrix $A$ is ill-conditioned.
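
The abstract names the main ingredients: linearized Bregman iteration, a soft-thresholding operator, and an ℓp-style reweighting with weight $\omega_{i}(u,p)=(\varepsilon+u_{i}^{2})^{p/2-1}$. The Python sketch below combines these pieces in one plausible way; it is illustrative only, not the paper's exact ERMA (the generalized inverse iteration is omitted), and the step size, the parameter mu, and the reweighting schedule are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Component-wise soft-thresholding (shrinkage) operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def reweighted_linearized_bregman(A, f, mu=5.0, p=0.0, eps=1e-3,
                                  n_iter=2000, tol=1e-8):
    """Sketch of linearized Bregman iteration for basis pursuit,
    min ||u||_1 s.t. Au = f, with an occasional reweighting step
    w_i = (eps + u_i^2)^(p/2 - 1). Illustrative, not the exact ERMA."""
    m, n = A.shape
    delta = 1.0 / np.linalg.norm(A, 2) ** 2    # assumed conservative step size
    u, v = np.zeros(n), np.zeros(n)
    w = np.ones(n)                             # first pass is unweighted
    for k in range(n_iter):
        v += A.T @ (f - A @ u)                 # accumulate Bregman/dual variable
        u = delta * soft_threshold(v, mu * w)  # weighted shrinkage
        if (k + 1) % 500 == 0:                 # assumed reweighting schedule
            w = (eps + u ** 2) ** (p / 2.0 - 1.0)
            w /= w.min()                       # smallest weight normalized to 1
        if np.linalg.norm(A @ u - f) <= tol * np.linalg.norm(f):
            break
    return u

# Toy demo: recover an 8-sparse signal from 64 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, s = 256, 64, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
f = A @ x
x_hat = reweighted_linearized_bregman(A, f)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With $p=0$ the weights approach $(\varepsilon+u_i^2)^{-1}$, which raises the threshold on entries near zero and lowers it on large entries, pushing the iterates toward an ℓ0-like solution.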


Author(s): Aaron Berk ◽ Yaniv Plan ◽ Özgür Yilmaz

Abstract The generalized Lasso is a common technique for recovery of structured high-dimensional signals. There are three common formulations of the generalized Lasso; each program has a governing parameter whose optimal value depends on properties of the data. At this optimal value, compressed sensing theory explains why Lasso programs recover structured high-dimensional signals with minimax order-optimal error. Unfortunately, in practice the optimal choice is generally unknown and must be estimated. Thus, we investigate the stability of each of the three Lasso programs with respect to its governing parameter. Our goal is to aid the practitioner in answering the following question: given real data, which Lasso program should be used? We take a step towards answering this by analysing the case where the measurement matrix is the identity (the so-called proximal denoising setup) and we use $\ell_{1}$ regularization. For each Lasso program, we specify settings in which that program is provably unstable with respect to its governing parameter. We support our analysis with detailed numerical simulations. For example, there are settings where a 0.1% underestimate of a Lasso parameter can increase the error significantly, and a 50% underestimate can cause the error to increase by a factor of $10^{9}$.
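
In the identity-measurement (proximal denoising) case with $\ell_1$ regularization studied here, two of the three Lasso programs reduce to soft thresholding, which makes the parameter-sensitivity question easy to probe numerically. The sketch below is a toy illustration under assumed problem sizes and noise levels; it does not reproduce the paper's specific instability regimes.

```python
import numpy as np

def soft_threshold(y, t):
    """Closed-form solution of the unconstrained (Lagrangian) program
    min_x 0.5*||y - x||_2^2 + t*||x||_1 when the measurement map is I."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def constrained_denoise(y, eta, iters=60):
    """Constrained program min ||x||_1 s.t. ||y - x||_2 <= eta, solved by
    bisecting on the threshold t: ||y - soft_threshold(y, t)||_2 is
    nondecreasing in t, so the constraint boundary can be bracketed."""
    if np.linalg.norm(y) <= eta:
        return np.zeros_like(y)
    lo, hi = 0.0, np.abs(y).max()
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(y - soft_threshold(y, mid)) > eta:
            hi = mid          # too much shrinkage: residual exceeds eta
        else:
            lo = mid          # still feasible: shrink more
    return soft_threshold(y, lo)

# Toy sweep: misestimate the constraint level eta = ||noise||_2 by a few
# multiplicative factors and watch how the recovery error responds.
rng = np.random.default_rng(1)
n, s, sigma = 4096, 16, 1e-3
x = np.zeros(n)
x[:s] = 1.0
y = x + sigma * rng.standard_normal(n)
eta_true = sigma * np.sqrt(n)             # approximate noise norm
for scale in [0.5, 0.999, 1.0, 1.001, 1.5]:
    x_hat = constrained_denoise(y, scale * eta_true)
    print(f"eta = {scale:5.3f} * eta_true:  "
          f"error = {np.linalg.norm(x_hat - x):.3e}")
```

The sweep over multiplicative misestimates of the governing parameter mirrors the kind of stability experiment described in the abstract, though the regimes in which the error blows up depend on the sparsity and noise level chosen.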


2020 ◽ Vol 171 ◽ pp. 107487
Author(s): Dunja Alexandra Hage ◽ Miguel Heredia Conde ◽ Otmar Loffeld
