LZW Algorithm: Recently Published Documents

Total documents: 44 (five years: 15)
H-index: 3 (five years: 1)

Author(s): Jinhong Di, Pengkun Yang, Chunyan Wang, Lichao Yan

To overcome the large error and low precision of traditional power fault record data compression, this paper proposes a new layered lossless compression method for massive fault record data. The method starts from the LZW (Lempel-Ziv-Welch) algorithm, analyzes its existing problems, and improves it: dictionary index values replace the input string sequences, and unknown strings are added to the dictionary dynamically. For parallel search, the dictionary is divided into several small dictionaries with different bit widths so that they can be searched concurrently, and the compression and decompression behaviour of LZW is used to tune the hardware for the best compression effect. The improved algorithm builds its dictionary as a multi-tree structure and queries it globally with a multi-character parallel search. The dictionary size and update strategy of the LZW algorithm are also analyzed, and optimized parameters are designed for constructing and updating the dictionary. With the optimal parameters selected and the dictionary size and update strategy fixed, lossless dictionary compression completes the hierarchical lossless compression of the large-scale fault record data. The experimental results show that, compared with traditional compression methods, the proposed method effectively reduces the mean square error percentage and the compression error while maintaining the compression rate, ensuring the integrity of the fault record data and achieving the expected compression effect in a short time.
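
For orientation, the textbook LZW loop that the paper builds on can be sketched in a few lines of Python. This is a minimal sketch of baseline LZW only; the paper's parallel, multi-tree dictionary search and hardware tuning are not reproduced here:

    # Minimal sketch of baseline LZW compression: dictionary index values
    # replace input sequences, and unknown strings are added dynamically.
    def lzw_compress(data: bytes) -> list:
        dictionary = {bytes([i]): i for i in range(256)}  # single-byte seeds
        next_code = 256
        current = b""
        codes = []
        for byte in data:
            candidate = current + bytes([byte])
            if candidate in dictionary:
                current = candidate               # keep extending the match
            else:
                codes.append(dictionary[current]) # emit code for longest match
                dictionary[candidate] = next_code # learn the new string
                next_code += 1
                current = bytes([byte])
        if current:
            codes.append(dictionary[current])
        return codes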


Author(s): Ivan Mozghovyi, Anatoliy Sergiyenko, Roman Yershov

Growing requirements for data transfer and storage are among the crucial challenges today. Several approaches to high-speed data transmission exist, but each meets only the limited requirements of its narrowly focused target. Data compression offers a solution to both high-speed transfer and low-volume storage. This paper is devoted to the compression of GIF images using a modified LZW algorithm with a tree-based dictionary. The tree-based dictionary decreases lookup time and increases the speed of data compression, which in turn enables a method of constructing a hardware compression accelerator in future research.
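
The tree-based dictionary can be pictured as a trie in which each node maps the next input byte to a child node, so the longest known match is found in a single downward walk rather than by repeated hash lookups. A minimal Python sketch of this idea (the paper's exact hardware-oriented structure may differ):

    # Trie sketch of a tree-based LZW dictionary.
    class TrieNode:
        def __init__(self, code: int):
            self.code = code              # dictionary index of this string
            self.children = {}            # next byte -> child TrieNode

    def lzw_compress_trie(data: bytes) -> list:
        root = TrieNode(-1)
        for i in range(256):              # seed with all single bytes
            root.children[i] = TrieNode(i)
        next_code, codes, node = 256, [], root
        for byte in data:
            if byte in node.children:
                node = node.children[byte]    # extend the current match
            else:
                codes.append(node.code)       # emit longest match found
                node.children[byte] = TrieNode(next_code)  # grow the trie
                next_code += 1
                node = root.children[byte]    # restart from the new byte
        if node is not root:
            codes.append(node.code)
        return codes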


Electronics, 2021, Vol 10 (11), pp. 1267
Author(s): Yong Liu, Bing Li, Yan Zhang, Xia Zhao

With the development of Internet of Things (IoT) and cloud-computing technologies, cloud servers need to store huge volumes of IoT data with high throughput and robust security. Joint Compression and Encryption (JCAE) schemes based on the Huffman algorithm have been regarded as a promising way to enhance data storage. Existing JCAE schemes still have the following limitations: (1) their keys can be cracked by physical and cloning attacks; (2) rebuilding the Huffman tree reduces operational efficiency; and (3) the compression ratio should be further improved. In this paper, a Huffman-based JCAE scheme using Physical Unclonable Functions (PUFs) is proposed. It provides physically secure keys through PUFs, efficient Huffman tree mutation without rebuilding, and a practical compression ratio by incorporating the Lempel-Ziv-Welch (LZW) algorithm. The performance of the instantiated PUFs and the derived keys was evaluated. Moreover, the scheme was demonstrated in a file protection system with an average throughput of 473 Mbps and an average compression ratio of 0.5586. Finally, the security analysis shows that the scheme resists physical and cloning attacks as well as several classic attacks, thus improving the security level of existing data protection methods.
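
The overall flow can be read as compress-then-encrypt with a key re-derived from the PUF on demand, so no key is ever stored. The Python sketch below is purely illustrative: read_puf_response is a hypothetical stand-in for the PUF hardware, and a simple SHA-256 counter keystream stands in for the paper's Huffman-tree-mutation cipher:

    import hashlib

    def read_puf_response(challenge: bytes) -> bytes:
        # Hypothetical stand-in: a real PUF returns a chip-unique,
        # unclonable response to the challenge.
        raise NotImplementedError("requires PUF hardware")

    def derive_key(challenge: bytes) -> bytes:
        # The key is re-derived from the PUF on demand rather than stored,
        # which is what makes physical and cloning attacks hard.
        return hashlib.sha256(read_puf_response(challenge)).digest()

    def encrypt(plaintext: bytes, key: bytes) -> bytes:
        # Illustrative SHA-256 counter keystream; NOT the paper's
        # Huffman-tree-mutation cipher and not production-grade crypto.
        out = bytearray()
        for i in range(0, len(plaintext), 32):
            block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
            chunk = plaintext[i:i + 32]
            out.extend(b ^ k for b, k in zip(chunk, block))
        return bytes(out)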


This paper proposes an improved data compression technique based on the existing Lempel-Ziv-Welch (LZW) algorithm. LZW is a dictionary-update-based compression technique that stores strings from the data as codes and reuses those codes when the strings recur. In the conventional algorithm, when the dictionary becomes full, every entry is removed to make room for new entries, so frequently used strings are discarded along with the rest. This is ineffective when the data to be compressed are large and contain many frequently occurring strings. This paper presents two new methods that improve on the existing LZW algorithm: when the dictionary becomes full, only the entries that have not been used are removed, rather than the whole dictionary being cleared. This is achieved by adding a flag to every dictionary entry. Whenever an entry is used, its flag is set high; when the dictionary fills, entries whose flags are high are kept and the others are discarded. In the first method, the unused entries are discarded all at once, whereas in the second method they are removed one at a time, which gives nascent dictionary entries more time to prove useful. All three techniques produce similar results on small data sets, since they differ only in how they handle a full dictionary; the improvements therefore pay off only on relatively large data. On a data set representing the best-case scenario, the compression ratio of conventional LZW is lower than that of improved LZW method 1, which in turn is lower than that of improved LZW method 2.
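
A minimal Python sketch of the flag idea in method 1, assuming the dictionary is a plain string-to-code map (code reassignment and encoder/decoder synchronization, which a full implementation must handle, are omitted here):

    # Flag-based eviction (method 1): each entry carries a "used" flag;
    # when the dictionary is full, unflagged entries are discarded all at
    # once instead of clearing the whole dictionary.
    MAX_ENTRIES = 4096  # assumed dictionary capacity

    def prune_unused(dictionary: dict, flags: dict) -> None:
        for key in [k for k, used in flags.items() if not used]:
            del dictionary[key]          # drop entries never referenced
            del flags[key]
        for key in flags:                # reset flags for the next round
            flags[key] = False

    def add_entry(dictionary: dict, flags: dict, string: bytes, code: int) -> None:
        if len(dictionary) >= MAX_ENTRIES:
            prune_unused(dictionary, flags)
        dictionary[string] = code
        flags[string] = False            # new entry starts unused

    def mark_used(flags: dict, string: bytes) -> None:
        flags[string] = True             # set high whenever the entry matches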


2020, Vol 25 (4), pp. 32-40
Author(s): Bouza M.K.

The article examines the JPEG and JPEG-2000 algorithms for compressing various graphic images. The main steps of both algorithms are given, and their advantages and disadvantages are noted. The main differences between JPEG and JPEG-2000 are analyzed; it is noted that JPEG-2000 removes visually unpleasant artifacts, which makes it possible to highlight important areas of the image and improve the quality of their compression. The features of each step of the algorithms are considered and the difficulties of their implementation are compared. The effectiveness of each algorithm is demonstrated on a full-color image of the BSU emblem, and the resulting compression ratios for both algorithms are shown in the corresponding tables. Compression ratios are obtained for a wide range of quality values, from 1 to 10, and various types of images are studied: black-and-white, business graphics, indexed, and full-color. A modified LZW (Lempel-Ziv-Welch) algorithm is presented that is applicable to compressing a variety of information, from text to images. The modification is based on limiting the graphic file to 256 colors, which makes it possible to index a color with one byte instead of three; the efficiency of this modification grows with image size, and the modified algorithm can be adapted to any image, from single-color to full-color. The prepared test images were indexed to the required number of colors using the FastStone Image Viewer program; for each image, seven copies were produced, containing 4, 8, 16, 32, 64, 128, and 256 colors, respectively. Testing showed that the modified version of the LZW algorithm achieves roughly twice the compression ratio on average, although on the class of full-color images both algorithms showed the same results. The developed modification of the LZW algorithm can be successfully applied in the field of site design, especially in the case of so-called flat design. Comparative characteristics of the basic and modified methods are presented.
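
The core of the modification, indexing a color with one byte instead of three, can be sketched as a palette-building pass; the sketch below assumes, as in the prepared tests, that the image has at most 256 distinct colors:

    # Sketch of the 256-color indexing idea: build a palette of distinct
    # RGB triples and replace each 3-byte pixel with a 1-byte palette
    # index, so the LZW stage sees one third of the data.
    def index_colors(pixels: list):
        palette = []          # list of (r, g, b) triples
        lookup = {}           # (r, g, b) -> palette index
        indexed = bytearray()
        for rgb in pixels:
            if rgb not in lookup:
                if len(palette) == 256:
                    raise ValueError("more than 256 distinct colors")
                lookup[rgb] = len(palette)
                palette.append(rgb)
            indexed.append(lookup[rgb])
        return palette, bytes(indexed)   # feed `indexed` to the LZW stage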


2020, Vol 7 (2), pp. 554-563
Author(s): Kazeem B. Adedeji

IoT-based smart water supply network management applications generate huge volumes of data from installed sensing devices, which must be processed (sometimes in-network), stored, and transmitted to a remote centre for decision making. As the volume of data produced by diverse IoT smart sensing devices intensifies, processing and storing these data become a serious issue: large data sizes increase computational complexity, occupy the scarce transmission bandwidth, and enlarge the required storage space. Data size reduction through data compression algorithms is therefore essential in IoT-based smart water network management applications. This paper presents a performance evaluation of four data compression algorithms used for this purpose: RLE, Huffman, LZW, and Shannon-Fano encoding, realised in MATLAB and tested on six water supply system data sets. Each algorithm was evaluated on its compression ratio, compression factor, percentage space savings, and compression gain. The results show that the LZW algorithm performs best on compression ratio, compression factor, space savings, and compression gain, although its execution time is relatively slow compared to RLE and the two other algorithms investigated. Most importantly, the LZW algorithm reduced the data sizes of the tested files significantly more than all the other algorithms.
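
For concreteness, one common set of definitions for the four metrics can be sketched as follows (the paper's exact formulas may differ, particularly for compression gain):

    import math

    def compression_metrics(original_size: int, compressed_size: int) -> dict:
        # One common convention; other papers swap ratio/factor definitions.
        return {
            "compression_ratio": compressed_size / original_size,
            "compression_factor": original_size / compressed_size,
            "space_savings_pct": 100 * (1 - compressed_size / original_size),
            # gain often given as 100 * ln(original / compressed)
            "compression_gain": 100 * math.log(original_size / compressed_size),
        }

    # e.g. compression_metrics(1_000_000, 400_000)
    # -> ratio 0.4, factor 2.5, savings 60%, gain ~91.6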


2020
Author(s): Abdulkarem Almawgani, Adam Alhawari, Wlaed Alarashi, Ali Alshwal

Digital images are commonly used in steganography due to the popularity of digital image transfer and exchange over the Internet. However, the tradeoff between carrying a high capacity of secret data and ensuring high security and quality of the stego image is a major challenge. In this paper, a hybrid steganography method based on the Haar Discrete Wavelet Transform (HDWT), the Lempel-Ziv-Welch (LZW) algorithm, a Genetic Algorithm (GA), and the Optimal Pixel Adjustment Process (OPAP) is proposed. The cover image is divided into non-overlapping blocks of n×n pixels, and the HDWT is used to increase the robustness of the stego image against attacks. To increase the capacity for, and security of, the hidden image, the LZW algorithm is applied to the secret message. The GA then embeds the encoded and compressed secret message into the cover image coefficients by finding the optimal mapping function for each block of the image. Lastly, the OPAP is applied to reduce the error, i.e., the difference between the cover image blocks and the stego image blocks, further improving stego image quality. The proposed method was evaluated using four standard images as covers and three types of secret messages. The results demonstrate higher visual quality of the stego image with a large volume of embedded secret data than is produced by known techniques. The experimental results show that the information-hiding capacity of the proposed method reached 50% with a high PSNR (52.83 dB). Thus, the proposed hybrid image steganography method improves the quality of the stego image over that of state-of-the-art methods.
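
Of the four stages, OPAP has a compact closed form: after the k least significant bits of a cover pixel are replaced with secret bits, the higher bits are nudged by ±2^k whenever that brings the stego pixel closer to the original. A minimal sketch, assuming k-bit LSB substitution on 8-bit pixel values:

    # OPAP step: shrink the embedding error without touching the payload.
    def opap_adjust(original: int, stego: int, k: int) -> int:
        delta = stego - original
        if delta > 2 ** (k - 1) and stego - 2 ** k >= 0:
            stego -= 2 ** k        # high bits landed one step too high
        elif delta < -(2 ** (k - 1)) and stego + 2 ** k <= 255:
            stego += 2 ** k        # high bits landed one step too low
        return stego               # low k bits (the secret bits) unchanged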


2020, Vol 27 (1)
Author(s): MB Ibrahim, KA Gbolagade

Data compression is the science and art of presenting information in a compact form. This compact representation is generated by recognizing structures that exist in the data. The Lempel-Ziv-Welch (LZW) algorithm is known to be one of the best text compressors, achieving a high degree of compression on text files with many redundancies: the greater the redundancy, the greater the compression achieved. In this paper, the LZW algorithm is further enhanced, without compromising its performance, through the introduction of the Chinese Remainder Theorem (CRT). Compression time and compression ratio were used as performance metrics. Simulations were carried out in MATLAB on five text files of varying sizes to determine the efficiency of the proposed CRT-LZW technique, which opens a way to compress data faster than traditional LZW. The results show that CRT-LZW performs better than LZW in computational time (0.12 s versus 15.15 s), while the compression ratio remains the same at 2.56%. The proposed technique's compression time also outperformed investigative papers implementing LZW-RNS (0.12 s versus 2.86 s in one comparison and 0.12 s versus 0.14 s in another). Keywords: Data Compression, Lempel-Ziv-Welch (LZW) algorithm, Enhancement, Chinese Remainder Theorem (CRT), Text files.
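
The abstract does not spell out how the CRT is combined with LZW; one plausible reading, consistent with the LZW-RNS work cited for comparison, is that each LZW output code is split into residues modulo pairwise-coprime moduli so that later stages operate on small numbers in parallel. A hypothetical Python sketch of that residue step (the moduli are illustrative, not taken from the paper):

    import math

    MODULI = (251, 256)  # assumed pairwise-coprime; product covers 12-bit codes

    def to_residues(code: int) -> tuple:
        # Forward step: each LZW code becomes a tuple of small residues.
        return tuple(code % m for m in MODULI)

    def from_residues(residues: tuple) -> int:
        # Inverse step: classic CRT reconstruction.
        M = math.prod(MODULI)
        total = 0
        for r, m in zip(residues, MODULI):
            Mi = M // m
            total += r * Mi * pow(Mi, -1, m)  # pow(..., -1, m) = modular inverse
        return total % M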


2020, Vol 17 (2), pp. 372-380
Author(s): Aries Suharso, Jejen Zaelani, Didi Juardi

In the field of information technology, data communication is closely tied to file delivery, and file size is sometimes a constraint in the delivery process: large files take longer to deliver than smaller ones. One way to handle this problem is compression. This study uses an experimental method with a waterfall development model covering analysis, design, coding, and testing. The application applies the Lempel-Ziv-Welch (LZW) algorithm, a lossless compression technique, i.e., one that does not alter the original data. Compression tests using the LZW algorithm show an average compression ratio of 51.04% for all types of text files, with an average time of 2.56 seconds, and 37.26% for image files, with an average time of 0.44 seconds. Across all tested file types, the average compression ratio with the LZW algorithm is 40.40%, and the average time required is 1.81 seconds.
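
LZW is lossless because the decoder rebuilds exactly the dictionary the encoder built, so nothing but the code stream is transmitted. A minimal decoder sketch in Python, matching the textbook compressor sketched earlier in this listing, including the classic corner case where a code is used before the decoder has finished defining it:

    # Minimal sketch of LZW decompression: the decoder reconstructs the
    # encoder's dictionary on the fly, so the output is bit-exact.
    def lzw_decompress(codes: list) -> bytes:
        if not codes:
            return b""
        dictionary = {i: bytes([i]) for i in range(256)}
        next_code = 256
        previous = dictionary[codes[0]]
        out = bytearray(previous)
        for code in codes[1:]:
            if code in dictionary:
                entry = dictionary[code]
            else:
                entry = previous + previous[:1]  # "KwKwK": code defined this step
            out.extend(entry)
            dictionary[next_code] = previous + entry[:1]  # mirror the encoder
            next_code += 1
            previous = entry
        return bytes(out)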

