Algorithms on Sparse Representation

2018 ◽  
Vol 7 (4.36) ◽  
pp. 569
Author(s):  
D. Khalandar Basha ◽  
T. Venkateswarlu

Sparse representation of signals and images has become increasingly interesting for applications such as restoration, compression, and recognition, and much research has been carried out in this area. A sparse representation expresses a signal or image as a combination of only a few atoms from a dictionary, and researchers have proposed various algorithms for learning such dictionaries. This paper discusses terms related to sparsity, such as the regularization term and norm-minimization formulations, followed by the pursuit algorithms for solving the sparse coding problem: greedy algorithms and relaxation algorithms. The paper gives algorithmic approaches for these algorithms.
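The greedy pursuit family mentioned above can be illustrated with Orthogonal Matching Pursuit (OMP), a standard representative: at each step it selects the dictionary atom most correlated with the residual, then refits all selected atoms by least squares. A minimal sketch (not the paper's own implementation; dictionary `D`, signal `y`, and sparsity level `k` are illustrative names):

```python
import numpy as np

def omp(D, y, k):
    """Greedy Orthogonal Matching Pursuit: approximate y as a
    k-sparse combination of the columns (atoms) of dictionary D."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # least-squares refit on all atoms selected so far
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x
```

With an orthonormal dictionary (e.g. the identity), OMP simply picks the `k` largest-magnitude entries of `y`, which makes its greedy selection rule easy to verify.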

Electronics ◽  
2022 ◽  
Vol 11 (2) ◽  
pp. 182
Author(s):  
Rongfang Wang ◽  
Yali Qin ◽  
Zhenbiao Wang ◽  
Huan Zheng

Achieving high-quality reconstruction of images is the focus of research in image compressed sensing. Group sparse representation improves the quality of reconstructed images by exploiting the non-local similarity of images; however, block matching and dictionary learning in the image group construction process lead to long reconstruction times and artifacts in the reconstructed images. To solve these problems, a joint regularized image reconstruction model based on group sparse representation (GSR-JR) is proposed. A group sparse coefficient regularization term ensures the sparsity of the group coefficients and reduces the complexity of the model, while a group sparse residual regularization term introduces prior information about the image to improve reconstruction quality. The alternating direction method of multipliers (ADMM) and an iterative thresholding algorithm are applied to solve the optimization problem. Simulation experiments confirm that the optimized GSR-JR model is superior to other advanced image reconstruction models in reconstructed image quality and visual effect. When the sensing rate is 0.1, compared to the group sparse residual constraint with a nonlocal prior (GSRC-NLR) model, the gains in peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) are up to 4.86 dB and 0.1189, respectively.
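The iterative thresholding solver referred to above is typically built around the soft-thresholding operator, the proximal map of the ℓ1 penalty. A minimal sketch of the generic iterative shrinkage-thresholding algorithm (ISTA) for min_x ½‖Ax − y‖² + λ‖x‖₁ — a standard building block, not the GSR-JR model itself; `A`, `y`, `lam`, and `step` are illustrative names:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, step, n_iter=200):
    """Iterative shrinkage-thresholding for min_x 0.5||Ax - y||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                  # gradient of the data-fidelity term
        x = soft_threshold(x - step * grad, step * lam)  # gradient step + shrinkage
    return x
```

When `A` is the identity and `step = 1`, the iteration reduces to a single soft-thresholding of `y`, which shows directly how the λ-shrinkage produces sparse coefficients.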


Author(s):  
Lijuan Song

In view of the complex backgrounds of images and the difficulty of segmentation, sparse representation and supervised discriminative learning were applied to image segmentation. A sparse, over-complete representation can represent images in a compact and efficient manner: most atom coefficients are zero, only a few coefficients are large, and the nonzero coefficients reveal the intrinsic structures and essential properties of images. Sparse representations are therefore beneficial to subsequent image processing applications. We first describe sparse representation theory. This study revolves around three aspects: a trained dictionary, greedy algorithms, and the application of the sparse representation model to image segmentation based on supervised discriminative learning. Finally, we performed image segmentation experiments on standard and natural image datasets. The main focus of this study was supervised discriminative learning, and the experimental results showed that the proposed algorithm was sparse, efficient, and outperformed the compared methods.


Computers ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 28
Author(s):  
Julián Moreno Cadavid ◽  
Hernán Darío Vanegas Madrigal

There is always increasing demand for data storage and transfer; therefore, data compression will always be a fundamental need. In this article, we propose a lossless data compression method focused on a particular kind of data, namely chat messages, which are typically non-formal, short-length strings. The method can be considered a hybrid because it combines two different algorithmic approaches: greedy algorithms, specifically Huffman coding, on the one hand, and dynamic programming on the other (HCDP = Huffman Coding + Dynamic Programming). The experimental results demonstrate that our method provides lower compression ratios than six reference algorithms, with reductions between 23.7% and 39.7%, while remaining below the average value reported in several related works in the literature. This performance comes at a sacrifice in speed, which, however, has no major practical implications in the context of short-length strings.
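The greedy half of the hybrid can be illustrated with classic Huffman coding: repeatedly merge the two lowest-frequency subtrees, so that frequent symbols end up with short codes. A minimal sketch of the standard algorithm (not the paper's HCDP method; the dynamic-programming stage is not shown):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free Huffman code for the symbols of `text`."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, tiebreak id, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)      # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        # merging prepends one bit to every code inside each subtree
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def encode(text, codes):
    """Concatenate the bit-codes of each symbol."""
    return "".join(codes[s] for s in text)
```

Because the code is prefix-free, the bitstream can be decoded unambiguously; the most frequent symbol always receives one of the shortest codes.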


2020 ◽  
Vol 84 (11) ◽  
pp. 1335-1340
Author(s):  
P. Kasprzak ◽  
K. Kazimierczuk ◽  
A. L. Shchukina
2010 ◽  
Vol 30 (11) ◽  
pp. 2956-2958
Author(s):  
Xue-song XU ◽  
Ling-juan LI ◽  
Li-wei GUO
