TEXT COMPRESSION USING HYBRIDS OF BWT AND GBAM

2003 ◽  
Vol 13 (01) ◽  
pp. 39-45
Author(s):  
AMER AL-NASSIRI

This paper presents a theoretical evaluation of a data and text compression algorithm based on the Burrows–Wheeler Transform (BWT) and the General Bidirectional Associative Memory (GBAM). A new lossless data and text compression method, based on the combination of the BWT and GBAM approaches, is presented. The algorithm was tested on many texts in different formats (ASCII and RTF). The compression ratio achieved is fairly good, on average 28–36%, and decompression is fast.
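The abstract does not reproduce either stage of the hybrid, so the following is only a minimal sketch of the BWT stage the method builds on (the GBAM stage is omitted); the `'\0'` sentinel and the naive O(n²)-style inversion are implementation choices for clarity, not from the paper.

```python
def bwt(s: str) -> str:
    """Forward BWT: last column of the sorted matrix of rotations."""
    s = s + "\0"  # unique end-of-string sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(row[-1] for row in rotations)

def ibwt(t: str) -> str:
    """Inverse BWT by repeated column sorting (slow but easy to follow)."""
    table = [""] * len(t)
    for _ in range(len(t)):
        table = sorted(t[i] + table[i] for i in range(len(t)))
    row = next(r for r in table if r.endswith("\0"))
    return row.rstrip("\0")

text = "banana"
assert bwt(text) == "annb\0aa"   # like characters cluster together
assert ibwt(bwt(text)) == text
```

The clustering of identical characters in the output is what makes the transformed text easier for a second-stage coder to compress.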

2014 ◽  
Vol 926-930 ◽  
pp. 1751-1754
Author(s):  
Hong Mei Song ◽  
Hai Wei Mu ◽  
Dong Yan Zhao

A nearly lossless compression algorithm with progressive transmission and decoding is proposed. The image data are grouped by frequency based on the DCT transform, and the JPEG-LS core algorithms, texture prediction and Golomb coding, are then applied to each group of data to achieve progressive image transmission and decoding. Experiments on the standard test images, comparing this algorithm with JPEG-LS, show that its compression ratio is very similar to that of JPEG-LS; the algorithm loses a little image information, but it gains the ability of progressive transmission and decoding.
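JPEG-LS codes its prediction residuals with Golomb codes; as background, here is a minimal sketch of Rice coding, the power-of-two Golomb variant (m = 2^k) that JPEG-LS uses. The parameter k = 2 below is purely illustrative.

```python
def rice_encode(n: int, k: int) -> str:
    """Golomb-Rice code: unary quotient, '0' separator, k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits: str, k: int) -> int:
    q = bits.index("0")              # length of the unary run
    r = int(bits[q + 1 : q + 1 + k], 2)
    return (q << k) | r

# n = 9 with k = 2: quotient 2, remainder 1 -> "11" + "0" + "01"
assert rice_encode(9, 2) == "11001"
assert rice_decode(rice_encode(9, 2), 2) == 9
```

Small residuals get short codewords, which is why the code pairs well with a good predictor.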


Author(s):  
Rian Syahputra

Text compression is used to reduce the repetition of letters so that the storage space used becomes smaller and transmission becomes faster. Lossless compression, in which no data are lost during decompression, is the appropriate technique for text. The Even-Rodeh Code algorithm is a lossless compression algorithm that replaces the initial bit codes of the data with codes from the Even-Rodeh scheme, producing smaller data sizes.
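The abstract does not spell out the codeword construction, so here is a sketch of the standard Even-Rodeh code for non-negative integers: values below 4 take a fixed 3 bits, while larger values prepend their own bit-length recursively and end with a 0 stop bit.

```python
def even_rodeh_encode(n: int) -> str:
    if n < 4:
        return format(n, "03b")          # 0..3 in exactly 3 bits
    parts = ["0"]                        # trailing stop bit
    while n >= 8:
        b = format(n, "b")
        parts.append(b)
        n = len(b)                       # next, encode the length itself
    parts.append(format(n, "03b"))       # 4 <= n <= 7 fits in 3 bits
    return "".join(reversed(parts))

def even_rodeh_decode(bits: str) -> int:
    n, pos = int(bits[:3], 2), 3
    if n < 4:
        return n
    while bits[pos] == "1":              # a 1 opens the next n-bit group
        n, pos = int(bits[pos:pos + n], 2), pos + n
    return n                             # the 0 at pos is the stop bit

assert even_rodeh_encode(4) == "1000"
assert even_rodeh_encode(8) == "10010000"
assert all(even_rodeh_decode(even_rodeh_encode(n)) == n for n in range(300))
```

Because each group announces the length of the next, the code is a prefix code and needs no out-of-band length information.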


Author(s):  
Hendra Mesra ◽  
Handayani Tjandrasa ◽  
Chastine Fatichah

In general, compression methods are developed to reduce the redundancy of data. This study uses a different approach: it embeds some bits of one datum in the image data into another datum using a Reversible Low Contrast Mapping (RLCM) transformation. Besides using RLCM for embedding, the method also applies the properties of RLCM to compress each datum before it is embedded. The algorithm engages a queue and recursive indexing, and it encodes the data in a cyclic manner. In contrast to RLCM itself, the proposed method is a coding method, like Huffman coding. This research uses publicly available image data to examine the proposed method. For all test images, the proposed method achieves a higher compression ratio than Huffman coding.


2020 ◽  
Vol 10 (14) ◽  
pp. 4918
Author(s):  
Shaofei Dai ◽  
Wenbo Liu ◽  
Zhengyi Wang ◽  
Kaiyu Li ◽  
Pengfei Zhu ◽  
...  

This paper reports on an efficient lossless compression method for periodic signals based on adaptive dictionary predictive coding. Previous methods for data compression, such as differential pulse code modulation (DPCM), the discrete cosine transform (DCT), the lifting wavelet transform (LWT) and the Karhunen–Loève transform (KLT), lack a suitable transformation to make these data less redundant and better compressed. A new predictive coding approach, based on an adaptive dictionary, is proposed to improve the compression ratio of periodic signals. The main criterion of lossless compression is the compression ratio (CR). To verify the effectiveness of adaptive dictionary predictive coding for periodic signal compression, different transform coding technologies, including DPCM, 2-D DCT, and 2-D LWT, are compared. The results obtained prove that adaptive dictionary predictive coding can effectively improve data compression efficiency compared with traditional transform coding technology.
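The paper's dictionary-update rule is not given in the abstract, so this is only a sketch of the general idea for a signal with a known period P: the previous period acts as the (adaptive) dictionary/predictor, and only the small residuals would be passed on to an entropy coder.

```python
import math

def periodic_residuals(x, period):
    """Predict sample i from sample i - P; keep the first period verbatim."""
    return x[:period] + [x[i] - x[i - period] for i in range(period, len(x))]

def reconstruct(res, period):
    x = list(res[:period])
    for i in range(period, len(res)):
        x.append(res[i] + x[i - period])   # undo the prediction
    return x

# A perfectly periodic test signal: one 50-sample sine period, tiled 4 times.
base = [round(100 * math.sin(2 * math.pi * t / 50)) for t in range(50)]
signal = base * 4
r = periodic_residuals(signal, 50)
assert reconstruct(r, 50) == signal
assert all(v == 0 for v in r[50:])   # perfect periodicity -> zero residuals
```

On real, only approximately periodic signals the residuals are small rather than zero, which is exactly what makes them cheaper to code than the raw samples.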


Author(s):  
Kamal Al-Khayyat ◽  
Imad Al-Shaikhli ◽  
Mohamad Al-Hagery

This paper details the examination of a particular case of data compression in which the compression algorithm removes redundancy from data, redundancy that arises when edge-based compression algorithms compress (previously compressed) pixelated images. The newly created redundancy can be removed using another round of compression. This work used JPEG-LS as an example of an edge-based compression algorithm for compressing pixelated images. The output of this process was subjected to another round of compression using a more robust but slower compressor (PAQ8f). The compression ratio of the second compression was, on average, 18%, which is high for random data. The results of the second compression were superior to lossy JPEG: on the data set used, lossy JPEG has to sacrifice about 10% on average to approach the nearly lossless compression ratios of the two successive compressions. To generalize the results, fast general-purpose compression algorithms (7z, bz2, and Gzip) were used as well.
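PAQ8f and JPEG-LS are not available in the Python standard library, so this sketch substitutes zlib for the first pass and bz2 for the second, only to show how a two-round pipeline and its second-pass ratio can be measured; the gains reported in the paper apply to real JPEG-LS output, not to this stand-in.

```python
import bz2
import zlib

# Synthetic, highly repetitive input (stands in for a pixelated image).
data = (b"ABABABAB" * 64 + b"CDCD" * 32) * 16

first = zlib.compress(data, 9)       # first round (stand-in for JPEG-LS)
second = bz2.compress(first, 9)      # second round (stand-in for PAQ8f)

# Both rounds must invert exactly for the pipeline to be lossless.
assert zlib.decompress(first) == data
assert bz2.decompress(second) == first

print(f"pass 1: {len(data)} -> {len(first)} bytes; "
      f"pass 2 ratio: {len(second) / len(first):.2f}")
```

Note that on small inputs the second pass can even grow the data (container overhead on near-random bytes); the paper's point is that edge-based coders leave a specific, exploitable redundancy that a stronger context-mixing compressor can still find.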


Symmetry ◽  
2020 ◽  
Vol 12 (10) ◽  
pp. 1654
Author(s):  
Md. Atiqur Rahman ◽  
Mohamed Hamada

Text compression is one of the most significant research fields, and various algorithms for text compression have already been developed. This is a significant issue, as internet bandwidth usage is increasing considerably. This article proposes a Burrows–Wheeler transform and pattern-matching-based lossless text compression algorithm that uses Huffman coding to achieve an excellent compression ratio. We introduce an algorithm with two keys that are used to reduce the most frequently repeated characters after the Burrows–Wheeler transform. We then find patterns of a certain length in the reduced text and apply Huffman encoding. We compare the proposed technique with state-of-the-art text compression algorithms and conclude that it demonstrates a gain in compression ratio over the other techniques. A small limitation of the proposed method is that it does not work very well for symmetric communications, such as Brotli.
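The two-key reduction step is the paper's own contribution and is not reproduced here; this sketch covers only the final Huffman stage applied to the (already transformed and reduced) text, using the classic min-heap construction.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a Huffman codebook {symbol: bitstring} for the given text."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1                              # tie-breaker index
    return heap[0][2]

codes = huffman_codes("mississippi")
# A more frequent symbol never gets a longer code than a rarer one.
assert len(codes["i"]) <= len(codes["m"])
# The codebook is prefix-free.
assert all(not a.startswith(b)
           for a in codes.values() for b in codes.values() if a != b)
```

The BWT and reduction stages in front of this coder exist precisely to skew the symbol statistics so that the Huffman codes become short.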


Author(s):  
Manasi Rath ◽  
Suvendu Rup

This paper is a methodological review of image compression using the Burrows–Wheeler Transform (BWT). BWT is normally used for text compression, but it has recently been applied to the image compression field. It is basically a lossless compression technique used for high-resolution images. This paper surveys several schemes added to BWT to improve image compression performance, which helps formulate a new technique for further improvement of BWT. Many authors have different representations of BWT for better compression.


2018 ◽  
Vol 7 (2.31) ◽  
pp. 69 ◽  
Author(s):  
G Murugesan ◽  
Rosario Gilmary

Text files occupy a substantial amount of memory or disk space, and their transmission across a network consumes a considerable amount of bandwidth. Compression procedures are explicitly advantageous in telecommunications and information technology because they enable devices to transmit or store the same amount of data in fewer bits. Text compression techniques segment an English passage by observing its patterns and provide alternative symbols for larger patterns of text. Compression algorithms are used to diminish the storage of copious information and reduce data storage expenditure, and compression of significant and massive clusters of information can improve retrieval time. Novel lossless compression algorithms have been introduced for better compression ratios. In this work, the various existing compression mechanisms that are particular to compressing text files and Deoxyribonucleic acid (DNA) sequence files are analyzed. Their performance is compared in terms of compression ratio, time taken to compress/decompress the sequence, and file size. In the proposed work, the input file is converted to DNA format and then a DNA compression procedure is applied.
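The paper's text-to-DNA conversion is its own contribution and is not reproduced here, but the basic saving behind DNA-format compression is easy to show: a four-symbol {A, C, G, T} alphabet packs into 2 bits per base instead of 8 bits per ASCII character.

```python
BASE2BITS = {"A": 0, "C": 1, "G": 2, "T": 3}
BITS2BASE = "ACGT"

def pack(seq: str) -> bytes:
    """Pack a DNA string at 2 bits per base, 4 bases per byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for b in seq[i:i + 4]:
            byte = (byte << 2) | BASE2BITS[b]
        byte <<= 2 * (4 - len(seq[i:i + 4]))   # left-pad a short last group
        out.append(byte)
    return bytes(out)

def unpack(data: bytes, n: int) -> str:
    """Recover the original n bases from the packed bytes."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BITS2BASE[(byte >> shift) & 3])
    return "".join(bases[:n])

seq = "ACGTACGTGATTACA"
assert unpack(pack(seq), len(seq)) == seq
assert len(pack(seq)) == 4          # 15 bases -> 4 bytes instead of 15
```

This fixed 4:1 packing is only the floor; the DNA-specific compressors surveyed in the paper exploit repeats and statistics to go below 2 bits per base.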




2016 ◽  
Vol 12 (2) ◽  
Author(s):  
Yosia Adi Jaya ◽  
Lukas Chrisantyo ◽  
Willy Sudiarto Raharjo

Data compression can save storage space and accelerate data transfer. Among many compression algorithms, Run-Length Encoding (RLE) is simple and fast, and it can be used to compress many types of data. However, RLE is not very effective for lossless image compression because there are many small differences between neighboring pixels. This research proposes a new lossless compression algorithm called YRL that improves RLE using the idea of relative encoding. YRL can treat neighboring pixels as having the same value by saving those small differences (relative values) separately. Tests on various standard test images show that YRL has an average compression ratio of 75.805% for 24-bit bitmaps and 82.237% for 8-bit bitmaps, while RLE has an average compression ratio of 100.847% for 24-bit bitmaps and 97.713% for 8-bit bitmaps.
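YRL itself is the authors' algorithm and its exact side-channel format is not given in the abstract; this sketch only illustrates the underlying idea of combining RLE with relative (delta) encoding: differencing neighboring pixels turns slow gradients, which plain RLE cannot exploit, into long runs of small values.

```python
def delta_encode(pixels):
    """Keep the first pixel; store each later pixel relative to its neighbor."""
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def rle_encode(values):
    """Collapse consecutive equal values into (value, run_length) pairs."""
    runs, i = [], 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        runs.append((values[i], j - i))
        i = j
    return runs

gradient = list(range(100, 120))            # a smooth 20-pixel ramp
assert rle_encode(gradient) == [(v, 1) for v in gradient]   # RLE gains nothing
deltas = delta_encode(gradient)             # [100, 1, 1, ..., 1]
assert rle_encode(deltas) == [(100, 1), (1, 19)]            # one long run
```

This is the effect the YRL results reflect: on natural images the relative values cluster around zero, so run-length coding after differencing beats run-length coding alone.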

