Comparison of l1-Minimization and Iteratively Reweighted Least-Squares lp-Minimization for Image Reconstruction from Compressive Sensing

Author(s):  
Oey Endra ◽  
Dadang Gunawan
2009 ◽  
Vol 57 (6) ◽  
pp. 2424-2431 ◽  
Author(s):  
C.J. Miosso ◽  
R. von Borries ◽  
M. Argaez ◽  
L. Velazquez ◽  
C. Quintero ◽  
...  

2020 ◽  
Author(s):  
Jorge Cormane ◽  
Camila Franco de Sousa

This work presents a compression method for power-grid signals based on Compressive Sensing combined with a dissociative approach. To this end, the Iteratively Reweighted Least-Squares and Conjugate Gradient algorithms are used: the former is suited to reconstructing one-dimensional signals, while the latter is suited to reconstructing the signal in a two-dimensional format. The results demonstrate preservation of the signal after reconstruction (SNR > 40 dB), as well as reduced computational complexity, achieved by dissociating the signal according to its behavior: steady state or disturbance.
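The one-dimensional IRLS reconstruction mentioned above can be sketched generically as follows. This is a standard IRLS scheme for lp-minimization subject to Ax = y, not the authors' implementation; the function name, the smoothing term `eps`, and its annealing schedule are all assumptions for illustration:

```python
import numpy as np

def irls(A, y, p=1.0, n_iter=50, eps=1e-8):
    """Iteratively Reweighted Least Squares for min ||x||_p s.t. A x = y.

    Generic sketch (assumed details, not the paper's code): each iteration
    solves a weighted least-squares problem, which has a closed form for an
    underdetermined system: x = W A^T (A W A^T)^{-1} y, W = diag(|x_i|^{2-p}).
    """
    x = np.linalg.pinv(A) @ y                 # minimum-norm initialization
    for _ in range(n_iter):
        w = (x**2 + eps) ** (1 - p / 2)       # regularized weights |x_i|^{2-p}
        AW = A * w                            # A @ diag(w), via broadcasting
        x = w * (A.T @ np.linalg.solve(AW @ A.T, y))
        eps = max(eps * 0.5, 1e-12)           # anneal the smoothing term
    return x
```

With p = 1 this approximates the l1 minimizer, which for a sufficiently sparse signal and enough random measurements coincides with the original signal.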


2016 ◽  
Vol 78 (5) ◽  
Author(s):  
Indrarini Dyah Irawati ◽  
Andriyan B. Suksmono

We propose compressive sensing to reduce the image sampling rate and improve the accuracy of image reconstruction. Compressive sensing requires the image representation to be sparse on some basis; we use the wavelet transform to provide the sparsifying basis, and we obtain the projection matrix through a random orthonormal process. The images are reconstructed with orthogonal matching pursuit (OMP) and Iteratively Reweighted Least Squares (IRLS). The test results indicate that image quality improves with the number of coefficients M; IRLS achieves better PSNR than OMP, while OMP requires the least reconstruction time.
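A minimal sketch of the OMP reconstruction step, for a generic measurement matrix A and sparsity level k (the function name, stopping rule, and parameters are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A.

    Generic sketch (assumed details): at each step, pick the column most
    correlated with the current residual, then re-fit y on all selected
    columns by least squares and update the residual.
    """
    m, n = A.shape
    support = []
    r = y.copy()
    x = np.zeros(n)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))      # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef             # orthogonalized residual
    x[support] = coef
    return x
```

Each iteration costs one matrix-vector product and one small least-squares solve, which is why OMP tends to be the faster of the two reconstruction algorithms compared above.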


2018 ◽  
Vol 7 (3) ◽  
pp. 563-579
Author(s):  
Paul Hand ◽  
Babhru Joshi

We introduce a convex approach for mixed linear regression over d features. This approach is a second-order cone program, based on L1 minimization, which assigns an estimated regression coefficient in $\mathbb {R}^{d}$ to each data point. These estimates can then be clustered using, for example, k-means. For problems with two or more mixture classes, we prove that the convex program exactly recovers all of the mixture components in the noiseless setting, under technical conditions that include a well-separation assumption on the data. Under these assumptions, recovery is possible if each class has at least d independent measurements. We also explore an iteratively reweighted least squares implementation of this method on real and synthetic data.
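The clustering stage described above can be sketched as follows: given the per-point coefficient estimates produced by the convex program, cluster them with plain k-means and refit one coefficient vector per mixture class by ordinary least squares. This is a hypothetical post-processing sketch, not the paper's solver; the function name, initialization, and refit step are all assumptions:

```python
import numpy as np

def cluster_and_refit(estimates, a, y, k=2, n_iter=50, seed=0):
    """Cluster per-point coefficient estimates, then refit per class.

    `estimates` is an (n, d) array of per-point regression coefficients
    (e.g. the output of the convex program); `a` is the (n, d) design
    matrix and `y` the (n,) responses. Returns cluster labels and one
    least-squares coefficient vector per class.
    """
    rng = np.random.default_rng(seed)
    n, d = estimates.shape
    centers = estimates[rng.choice(n, size=k, replace=False)]
    for _ in range(n_iter):                   # plain Lloyd's k-means
        dist = np.linalg.norm(estimates[:, None, :] - centers[None], axis=2)
        labels = dist.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = estimates[labels == c].mean(axis=0)
    betas = []
    for c in range(k):                        # least-squares refit per class
        idx = labels == c
        beta, *_ = np.linalg.lstsq(a[idx], y[idx], rcond=None)
        betas.append(beta)
    return labels, np.array(betas)
```

Under the well-separation assumption, the per-point estimates concentrate around the true class coefficients, so even this simple clustering recovers the partition and the refit is exact in the noiseless setting.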

