New Lossless Compression Method using Cyclic Reversible Low Contrast Mapping (CRLCM)

Author(s):  
Hendra Mesra ◽  
Handayani Tjandrasa ◽  
Chastine Fatichah

In general, compression methods are developed to reduce the redundancy of data. This study takes a different approach: it embeds some bits of one datum in the image data into another datum using a Reversible Low Contrast Mapping (RLCM) transformation. Besides using RLCM for embedding, the method also exploits the properties of RLCM to compress each datum before it is embedded. The algorithm employs a queue and recursive indexing and encodes the data in a cyclic manner. In contrast to RLCM itself, the proposed method is a coding method, like Huffman coding. Publicly available image data are used to examine the proposed method; for all test images, it achieves a higher compression ratio than Huffman coding.
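
The abstract does not reproduce the RLCM transform itself, so the sketch below illustrates only the general pattern it describes, embedding payload bits inside a reversible integer pair transform. It uses the well-known S-transform plus Tian-style difference expansion as a stand-in for the authors' RLCM; all names and details are illustrative, not the paper's method.

```python
# Illustrative sketch only: the paper's RLCM transform is not reproduced here.
# This shows the same *kind* of idea -- embedding bits inside a reversible
# integer pair transform -- using Tian-style difference expansion instead.

def embed_pair(x: int, y: int, bit: int) -> tuple[int, int]:
    """Embed one bit into a pixel pair, reversibly (no overflow checks)."""
    l = (x + y) >> 1          # integer average (S-transform low band)
    h = x - y                 # difference (high band)
    h2 = (h << 1) | bit       # difference expansion: make room for the bit
    x2 = l + ((h2 + 1) >> 1)  # inverse S-transform with the expanded difference
    y2 = x2 - h2
    return x2, y2

def extract_pair(x2: int, y2: int) -> tuple[int, int, int]:
    """Recover the original pair and the embedded bit."""
    h2 = x2 - y2
    l = (x2 + y2) >> 1        # the average survives the embedding
    bit = h2 & 1
    h = h2 >> 1               # undo the expansion
    x = l + ((h + 1) >> 1)
    y = x - h
    return x, y, bit

# Round-trip check: pair and payload bit are recovered exactly
assert extract_pair(*embed_pair(103, 101, 1)) == (103, 101, 1)
```

The assertion confirms the defining property such a method relies on: transform and embedding steps that can all be undone exactly, so the pipeline stays lossless.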


2014 ◽  
Vol 926-930 ◽  
pp. 1751-1754
Author(s):  
Hong Mei Song ◽  
Hai Wei Mu ◽  
Dong Yan Zhao

A nearly lossless compression algorithm with progressive transmission and decoding is proposed. The image data are grouped by frequency based on the DCT transform, and the JPEG-LS core algorithms, texture prediction and Golomb coding, are then applied to each group to achieve progressive image transmission and decoding. Experiments on the standard test images show that the compression ratio of this algorithm is very close to that of JPEG-LS; it loses a little image information, but it gains the ability of progressive transmission and decoding.
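
For reference, the texture prediction at the core of JPEG-LS is the standard median edge detector (MED); a minimal sketch of that predictor:

```python
def med_predict(a: int, b: int, c: int) -> int:
    """JPEG-LS MED predictor: a = left, b = above, c = above-left neighbour."""
    if c >= max(a, b):
        return min(a, b)   # c suggests an edge; take the smaller neighbour
    if c <= min(a, b):
        return max(a, b)   # edge in the other direction
    return a + b - c       # smooth region: planar prediction
```

The prediction residuals produced this way are what the Golomb coder then compresses.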


Information ◽  
2020 ◽  
Vol 11 (4) ◽  
pp. 196
Author(s):  
Shmuel T. Klein ◽  
Dana Shapira

It seems reasonable to expect from a good compression method that its output should not be further compressible, because it should behave essentially like random data. We investigate this premise for a variety of known lossless compression techniques, and find that, surprisingly, there is much variability in the randomness, depending on the chosen method. Arithmetic coding seems to produce perfectly random output, whereas that of Huffman or Ziv-Lempel coding still contains many dependencies. In particular, the output of Huffman coding has already been proven to be random under certain conditions, and we present evidence here that arithmetic coding may produce an output that is identical to that of Huffman.
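
The premise is easy to probe empirically: compress some data, then measure the byte-level entropy of the output and try compressing it again. A small illustrative experiment (not the authors' methodology), using DEFLATE as the test compressor:

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """First-order (byte-frequency) entropy in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = b"the quick brown fox jumps over the lazy dog " * 2000
once = zlib.compress(text, 9)      # DEFLATE = LZ77 + Huffman coding
twice = zlib.compress(once, 9)     # try to compress the output again

print(f"entropy of output: {byte_entropy(once):.3f} bits/byte (8.0 = random-looking)")
print(f"recompression: {len(once)} -> {len(twice)} bytes")
```

If the second pass fails to shrink the data, the compressor's output is already close to incompressible, which is exactly the randomness question the paper studies.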


2020 ◽  
Vol 10 (14) ◽  
pp. 4918
Author(s):  
Shaofei Dai ◽  
Wenbo Liu ◽  
Zhengyi Wang ◽  
Kaiyu Li ◽  
Pengfei Zhu ◽  
...  

This paper reports an efficient lossless compression method for periodic signals based on adaptive dictionary predictive coding. Previous methods for data compression, such as differential pulse code modulation (DPCM), the discrete cosine transform (DCT), the lifting wavelet transform (LWT) and the Karhunen–Loève transform (KLT), lack a transformation suited to making such data less redundant and better compressed. A new predictive coding approach based on an adaptive dictionary is proposed to improve the compression ratio of periodic signals. The main criterion of lossless compression is the compression ratio (CR). To verify the effectiveness of adaptive dictionary predictive coding for periodic signal compression, it is compared against different transform coding technologies, including DPCM, 2-D DCT, and 2-D LWT. The results show that adaptive dictionary predictive coding effectively improves data compression efficiency compared with traditional transform coding.
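
The abstract does not specify the dictionary update rule, so the following is a hypothetical sketch of the general idea only: keep a one-period template as the dictionary entry, predict each incoming period from it, emit small residuals for an entropy coder, and adapt the template as the signal drifts. The function name and the update rate `alpha` are assumptions, not parameters from the paper.

```python
import numpy as np

def dict_predict_residuals(signal: np.ndarray, period: int, alpha: float = 0.5):
    """Hypothetical sketch of adaptive dictionary predictive coding:
    predict each period from a running one-period template and return
    the residuals, which an entropy coder would then compress."""
    template = signal[:period].astype(float)   # initial dictionary entry
    residuals = [signal[:period].copy()]       # first period sent as-is
    for start in range(period, len(signal) - period + 1, period):
        block = signal[start:start + period]
        residuals.append(block - np.round(template).astype(block.dtype))
        template = (1 - alpha) * template + alpha * block  # adapt the dictionary
    return np.concatenate(residuals)

# Example: a noisy sawtooth with period 50 yields small, easily coded residuals
sig = (np.arange(1000) % 50) + np.random.randint(-2, 3, 1000)
res = dict_predict_residuals(sig, period=50)
print(res[:60])
```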


2003 ◽  
Vol 13 (01) ◽  
pp. 39-45
Author(s):  
AMER AL-NASSIRI

In this paper we consider a theoretical evaluation of a data and text compression algorithm based on the Burrows–Wheeler Transform (BWT) and a General Bidirectional Associative Memory (GBAM). A new lossless data and text compression method based on the combination of the BWT and GBAM approaches is presented. The algorithm was tested on many texts in different formats (ASCII and RTF). The compression ratio achieved is fairly good, on average 28–36%, and decompression is fast.
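
The GBAM stage is a neural associative memory and does not reduce to a short snippet, but the BWT stage is standard. A naive (quadratic, illustration-only) forward transform shows how it groups equal characters together, which is what makes the downstream coding stage effective:

```python
def bwt(s: bytes, sentinel: bytes = b"\x00") -> bytes:
    """Naive Burrows-Wheeler Transform, for illustration only (O(n^2 log n)).
    The sentinel byte is assumed absent from the input."""
    t = s + sentinel
    rotations = sorted(t[i:] + t[:i] for i in range(len(t)))  # all rotations
    return bytes(rot[-1] for rot in rotations)                # last column

print(bwt(b"banana"))  # b'annb\x00aa' -- equal characters clustered
```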


2016 ◽  
Vol 13 (10) ◽  
pp. 6671-6679
Author(s):  
H Rajasekhar ◽  
B. Prabhakara Rao

In a previous video compression method, the videos were segmented using a novel motion estimation algorithm aided by the watershed method, but the compression ratio (CR) achieved with that algorithm was inadequate, and its performance in the encoding and decoding stages needed improvement. Most video compression methods rely on encoding techniques such as JPEG, run-length, Huffman, and LSK encoding, so improving the encoding stage improves the overall compression result. To overcome these drawbacks, we propose a new video compression method built on a well-established encoding technique. In the proposed method, the motion vectors of the input video frames are estimated by applying the watershed and ARS-ST (Adaptive Rood Search with Spatio-Temporal) algorithms. The vector blocks with high difference values are then encoded with the JPEG-LS encoder, which has excellent coding and computational efficiency and outperforms JPEG2000 and many other image compression methods; it has relatively low complexity and storage requirements, and its compression capability is efficient. To obtain the compressed video, the encoded blocks are subsequently decoded by JPEG-LS. Implementation results show the effectiveness of the proposed method in compressing a large number of videos. Its performance is evaluated against existing video compression techniques; the comparison shows that the proposed method achieves a higher compression ratio and PSNR on the test videos than the existing techniques.
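
The ARS-ST details are not given in the abstract; the sketch below shows only the basic building block of rood-pattern motion search, a small-rood descent on the sum of absolute differences (SAD). Function and parameter names are assumptions for illustration.

```python
import numpy as np

def sad(a: np.ndarray, b: np.ndarray) -> int:
    """Sum of absolute differences between two equal-size blocks."""
    return int(np.abs(a.astype(int) - b.astype(int)).sum())

def rood_search(ref, cur, bx, by, bsize=16, steps=8):
    """Illustrative small-rood motion search (not the paper's ARS-ST):
    repeatedly move to the best of the 4 rood neighbours until no gain."""
    block = cur[by:by + bsize, bx:bx + bsize]
    mx = my = 0
    best = sad(block, ref[by:by + bsize, bx:bx + bsize])
    for _ in range(steps):
        moved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = bx + mx + dx, by + my + dy
            if 0 <= nx and nx + bsize <= ref.shape[1] and 0 <= ny and ny + bsize <= ref.shape[0]:
                cost = sad(block, ref[ny:ny + bsize, nx:nx + bsize])
                if cost < best:
                    best, mx, my, moved = cost, mx + dx, my + dy, True
                    break
        if not moved:
            break
    return (mx, my), best

# Example: a smooth gradient frame shifted 3 pixels, recovered by the search
ref = np.add.outer(2 * np.arange(64), 3 * np.arange(64))
cur = np.roll(ref, -3, axis=1)               # frame content moves left by 3
print(rood_search(ref, cur, bx=24, by=24))   # -> ((3, 0), 0)
```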


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Romi Fadillah Rahmat ◽  
T. S. M. Andreas ◽  
Fahmi Fahmi ◽  
Muhammad Fermi Pasha ◽  
Mohammed Yahya Alzahrani ◽  
...  

Compression, in general, aims to reduce file size, with or without decreasing the data quality of the original file. Digital Imaging and Communications in Medicine (DICOM) is a medical imaging file standard used to store information such as patient data, imaging procedures, and the image itself. With the rising use of medical imaging in clinical diagnosis, there is a need for a fast and secure way to share large numbers of medical images between healthcare practitioners, and compression has always been an option. This work analyses Huffman coding, one of the lossless compression techniques, as an alternative method to compress a DICOM file in open PACS settings. The idea of Huffman coding is to assign codewords with fewer bits to the symbols with higher byte-frequency. Experiments using different types of DICOM images are conducted, and performance is analysed in terms of compression ratio, compression/decompression time, and security. The experimental results show that Huffman coding can compress the DICOM file at up to a 1:3.7010 ratio, with up to 72.98% space savings.
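
The idea stated above, shorter codewords for more frequent bytes, fits in a few lines. A minimal sketch of a Huffman codebook built from byte frequencies, reporting space savings as 1 − (compressed bits)/(original bits):

```python
import heapq
from collections import Counter

def huffman_code(data: bytes) -> dict[int, str]:
    """Build a Huffman codebook from byte frequencies: the most frequent
    bytes receive the shortest codewords. Integer tiebreakers keep the
    heap entries comparable."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, i, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
    return heap[0][2]

data = open(__file__, "rb").read()          # any byte stream, e.g. a DICOM file
code = huffman_code(data)
bits = sum(len(code[b]) for b in data)
print(f"space savings: {1 - bits / (8 * len(data)):.2%}")
```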


Author(s):  
Emy Setyaningsih ◽  
Agus Harjoko

Compression reduces the size of data while maintaining the quality of the information contained therein. This paper presents a survey of research papers discussing improvements to various hybrid compression techniques during the last decade. A hybrid compression technique combines the best properties of each group of methods, as is done in the JPEG compression method: it couples lossy and lossless compression to obtain a high compression ratio while maintaining the quality of the reconstructed image. Lossy compression produces a relatively high compression ratio, whereas lossless compression provides high-quality data reconstruction, since the data can later be decompressed to exactly the same result as before compression. Discussion of the state of, and open issues in, ongoing hybrid compression development indicates opportunities for further research to improve the performance of image compression methods.
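
As a concrete illustration of the hybrid pattern the survey describes, a lossy transform-and-quantize stage followed by a lossless entropy stage, here is a toy JPEG-style pipeline. The step size `q` and the use of DEFLATE as the entropy stage are assumptions for the sketch, not any surveyed method.

```python
import zlib
import numpy as np

def dct2_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix (rows = frequencies)."""
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0] /= np.sqrt(2)
    return m * np.sqrt(2 / n)

def hybrid_compress(block: np.ndarray, q: float = 10.0) -> bytes:
    """Toy hybrid pipeline: lossy stage (2-D DCT + uniform quantization,
    step size q assumed) followed by a lossless stage (DEFLATE standing
    in for the entropy coder)."""
    d = dct2_matrix(block.shape[0])
    coeffs = d @ block @ d.T                          # 2-D DCT
    quantized = np.round(coeffs / q).astype(np.int16) # the only lossy step
    return zlib.compress(quantized.tobytes())         # lossless entropy stage

block = np.arange(64, dtype=float).reshape(8, 8)      # smooth 8x8 test block
print(len(hybrid_compress(block)), "bytes for a 64-sample block")
```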


2015 ◽  
Vol 10 (2) ◽  
Author(s):  
Christian Puji Nugraha ◽  
R. Gunawan Santosa ◽  
Lukas Chrisantyo A.A.

Data compression is a very important process in a world that relies heavily on digital files for texts, images, sounds and videos. These files vary in size and often consume considerable disk storage space. To overcome this problem, many researchers have created compression algorithms, both lossy and lossless. This research tests four lossless compression algorithms applied to text files: LZ77, Static Huffman, LZ77 combined with Static Huffman, and Deflate. The performance of the four algorithms is compared by measuring their compression ratios. From the test results it can be concluded that Deflate is the best of the four, owing to its use of multiple modes: uncompressed mode, LZ77 combined with Static Huffman coding, and LZ77 combined with Dynamic Huffman coding. The results also show that Deflate compresses the text files with an average compression ratio of 38.84%.
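
The Deflate measurement is easy to reproduce with zlib, which implements DEFLATE (LZ77 combined with Huffman coding). The sketch below assumes the paper's convention that a ratio of 38.84% means the output is about 39% of the input size; the file path is a placeholder.

```python
import zlib

def deflate_ratio(path: str) -> float:
    """Compression ratio as compressed size / original size (assuming the
    paper's convention, where ~38.84% means the output is ~39% of the input)."""
    raw = open(path, "rb").read()
    packed = zlib.compress(raw, level=9)   # DEFLATE: LZ77 + Huffman coding
    return len(packed) / len(raw)

# Example: ratio = deflate_ratio("corpus.txt")
```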


Author(s):  
Andreas Soegandi

The purpose of this study was to perform lossless compression on uncompressed audio files to minimize file size without reducing quality. The application is developed using the entropy encoding compression method with the Rice coding technique. The resulting compression ratio is good enough, and the approach is easy to build on because the algorithm is quite simple.
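
A Rice code (a Golomb code whose divisor is a power of two, here with parameter k ≥ 1) is simple enough to show in full; a minimal encoder/decoder for non-negative residuals:

```python
def rice_encode(n: int, k: int) -> str:
    """Rice code: the quotient n >> k in unary (q ones and a terminating
    zero), followed by the k-bit remainder in binary."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits: str, k: int) -> int:
    q = bits.index("0")                    # length of the unary prefix
    return (q << k) | int(bits[q + 1:q + 1 + k], 2)

assert rice_encode(37, 3) == "11110101"
assert rice_decode(rice_encode(37, 3), 3) == 37
```

Choosing k near log2 of the mean residual keeps the unary part short, which is why Rice coding suits the small prediction residuals of audio signals.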

