A Lightweight Huffman-based Differential Encoding Lossless Compression Technique in IoT for Smart Agriculture

2022, Vol 11 (1), pp. 117-127
Author(s): Ali Kadhum M. Al-Qurabat
2019, Vol 11 (21), pp. 2461
Author(s): Kevin Chow, Dion Tzamarias, Ian Blanes, Joan Serra-Sagristà

This paper proposes a lossless coder for real-time processing and compression of hyperspectral images. After applying either a predictor or a differential encoder to reduce the bit rate of an image by exploiting the close similarity between pixels in neighboring bands, it uses a compact data structure called k²-raster to further reduce the bit rate. The advantage of this data structure is its compactness: its size is comparable to that produced by some classical compression algorithms, yet it still provides direct access to its content for queries without any need for full decompression. Experiments show that using k²-raster alone already achieves much lower rates (up to 55% reduction), and with preprocessing the rates are reduced further, by up to 64%. Finally, we provide experimental results showing that the predictor produces a greater rate reduction than differential encoding.
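To make the preprocessing step concrete, here is a minimal sketch of inter-band differential encoding for a hyperspectral cube, assuming a NumPy array of shape (bands, rows, cols). The function names and the synthetic test cube are illustrative, not the paper's implementation, and the k²-raster stage that would follow is not shown.

```python
import numpy as np

def band_differential_encode(cube):
    """Differentially encode a hyperspectral cube of shape (bands, rows, cols).

    The first band is stored verbatim; every later band stores its pixel-wise
    difference from the previous band. Because neighboring bands are highly
    correlated, the residuals are small and cheaper to code.
    """
    cube = cube.astype(np.int64)          # widen so the subtraction cannot overflow
    residuals = np.empty_like(cube)
    residuals[0] = cube[0]
    residuals[1:] = cube[1:] - cube[:-1]  # inter-band differences
    return residuals

def band_differential_decode(residuals):
    """Invert the encoder by cumulative summation along the band axis."""
    return np.cumsum(residuals, axis=0)

# Round-trip check on a synthetic cube whose bands drift slowly, mimicking
# the inter-band similarity the paper exploits.
rng = np.random.default_rng(0)
base = rng.integers(0, 4096, size=(1, 64, 64))
cube = np.concatenate(
    [base + i + rng.integers(-3, 4, size=(1, 64, 64)) for i in range(8)], axis=0
)
assert np.array_equal(band_differential_decode(band_differential_encode(cube)), cube)
```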


2012, Vol 433-440, pp. 6540-6545
Author(s): Vineet Khanna, Hari Singh Choudhary, Prakrati Trivedi

This paper presents a new lossless compression technique for natural images. The proposed algorithm switches between the existing Edge-Directed Prediction (EDP) and Gradient Adaptive Predictor (GAP) methods, with the switching criterion based on an adaptive threshold. EDP has high computational complexity due to its least-squares (LS) parameter estimation, whereas GAP has relatively low computational complexity. To reduce the overall computational complexity, we therefore combine EDP and GAP into a hybrid scheme. The proposed algorithm is generic and produces strong results across different varieties of images in terms of both compression ratio and computational complexity.
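For reference, the low-complexity branch of such a switching scheme can be sketched as the textbook GAP rule (the gradient-adjusted prediction published with CALIC, on which GAP is based). The boundary convention and helper names below are assumptions; the paper's EDP branch and adaptive switching threshold are not reproduced.

```python
def gap_predict(img, r, c):
    """Gradient-adjusted prediction (GAP) for pixel (r, c), as in CALIC.

    Neighbor notation: W=west, N=north, NW, NE, WW, NN, NNE. Out-of-range
    neighbors fall back to 0 here (a simplifying boundary assumption; real
    coders handle borders more carefully).
    """
    def px(y, x):
        return int(img[y][x]) if 0 <= y < len(img) and 0 <= x < len(img[0]) else 0

    W, N   = px(r, c - 1), px(r - 1, c)
    NW, NE = px(r - 1, c - 1), px(r - 1, c + 1)
    WW, NN = px(r, c - 2), px(r - 2, c)
    NNE    = px(r - 2, c + 1)

    dh = abs(W - WW) + abs(N - NW) + abs(N - NE)    # horizontal gradient estimate
    dv = abs(W - NW) + abs(N - NN) + abs(NE - NNE)  # vertical gradient estimate

    if dv - dh > 80:   # sharp horizontal edge: copy the west neighbor
        return W
    if dh - dv > 80:   # sharp vertical edge: copy the north neighbor
        return N
    pred = (W + N) / 2 + (NE - NW) / 4              # smooth-region base prediction
    if dv - dh > 32:
        pred = (pred + W) / 2
    elif dv - dh > 8:
        pred = (3 * pred + W) / 4
    elif dh - dv > 32:
        pred = (pred + N) / 2
    elif dh - dv > 8:
        pred = (3 * pred + N) / 4
    return pred        # integer implementations round this value
```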


The growth of cloud-based remote healthcare and diagnosis services has led Medical Service Providers (MSPs) to share diagnostic data across diverse environments. These medical data are accessed across diverse platforms, such as mobile and web services, and require large amounts of storage. Compression addresses the storage requirement and enables sharing of medical data over a transmission medium. Since loss of data is not acceptable in medical image processing, this work considers lossless compression for medical images in particular and grayscale images in general. Modified Huffman (MH) encoding is one of the most widely used techniques for achieving lossless compression. However, due to the long bit lengths of its codewords, the existing MH encoding technique is not efficient for medical image processing. Firstly, this work presents Modified Refined Huffman (MRH) for compressing grayscale and binary images using a diagonal scanning method. Secondly, to minimize computing time, a parallel encoding method is used. Experiments are conducted on a wide variety of images, and performance is evaluated in terms of compression ratio, computation time, and memory utilization. The proposed MRH achieves significant improvements in compression ratio, computation time, and memory usage over state-of-the-art techniques such as LZW, CCITT G4, JBIG2, and the Levenberg–Marquardt (LM) neural network algorithm. The overall results show the applicability of MRH to different application services.
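As background for the codeword-length issue the abstract raises, the following is a minimal textbook Huffman table construction over symbol frequencies (e.g. grayscale intensities). It is not the MRH scheme itself; the diagonal scanning and parallel encoding steps are not reproduced.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a prefix-free Huffman code table from symbol frequencies.

    Minimal textbook construction: repeatedly merge the two least frequent
    subtrees. Internal nodes are 2-tuples, so symbols themselves must not be
    tuples (ints or characters are fine).
    """
    freq = Counter(symbols)
    if len(freq) == 1:                       # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    # Heap entries are (frequency, unique tiebreaker, subtree); the tiebreaker
    # keeps Python from ever comparing two subtrees directly.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tick, (a, b)))
        tick += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse left/right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix             # leaf: record its codeword
    walk(heap[0][2], "")
    return codes

print(huffman_codes(b"abracadabra"))  # frequent symbols get shorter codewords
```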


A massive volume of medical data is generated by advanced medical imaging modalities. With advances in telecommunications, telemedicine and teleradiology have become the most common and viable methods for effective health-care delivery around the globe. For efficient storage, medical images should be compressed using lossless compression techniques. In this paper, we aim to develop a lossless compression technique that achieves a better compression ratio together with reversible data hiding. The proposed work segments the foreground and background areas of medical images using semantic segmentation with the Hierarchical Neural Architecture Search (HNAS) network model. After segmenting the medical image, confidential patient data is hidden in the foreground area using the parity check method. Following data hiding, the foreground and background are losslessly compressed using the Huffman and Lempel-Ziv-Welch methods. The performance of our proposed method is compared with standard lossless compression algorithms and existing reversible data hiding methods. The proposed method achieves a better compression ratio and is one hundred percent reversible upon data extraction.
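To illustrate the parity-check idea, below is a minimal sketch that forces each cover pixel's least-significant bit to carry one payload bit. The abstract does not detail the paper's reversible variant, so this sketch returns the replaced LSBs as side information to make extraction fully reversible; all names are illustrative.

```python
def embed_bits_parity(pixels, bits):
    """Hide one bit per pixel by setting each pixel's LSB parity to that bit.

    Returns the stego pixels plus the original LSBs as side information,
    so the cover image can be restored exactly (an assumption standing in
    for the paper's unspecified reversible mechanism).
    """
    original_lsbs = [p & 1 for p in pixels[:len(bits)]]
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b   # force LSB parity to the payload bit
    return stego, original_lsbs

def extract_bits_parity(stego, n_bits, original_lsbs):
    """Read the payload back from LSB parity and restore the cover pixels."""
    bits = [p & 1 for p in stego[:n_bits]]
    restored = list(stego)
    for i, lsb in enumerate(original_lsbs):
        restored[i] = (restored[i] & ~1) | lsb
    return bits, restored

cover = [120, 121, 122, 123, 124, 125]
payload = [1, 0, 1]
stego, side = embed_bits_parity(cover, payload)
bits, restored = extract_bits_parity(stego, len(payload), side)
assert bits == payload and restored == cover   # hundred-percent reversible
```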


Author(s): Restu Maulunida, Achmad Solichin

At present, data access has largely shifted to digital data, and its use has grown very rapidly. This transformation is driven by the rapid growth of Internet use and the massive spread of mobile devices. People tend to store many files and transfer them from one storage medium to another; as a storage medium approaches its limit, fewer files can be stored, so a compression technique is required to reduce file sizes. Dictionary coding is one of the lossless compression techniques, and LZW is an algorithm that applies it. In the LZW algorithm, the dictionary is built using a future-based dictionary and the encoding process uses fixed-length codes, which allows the encoding process to produce output sequences that are still quite long. This study modifies the dictionary construction process and uses variable-length codes in order to optimize the compression ratio. Based on tests with the data used in this study, the average compression ratio of the LZW algorithm is 42.85%, while that of our proposed algorithm is 38.35%. This shows that our proposed modification of the dictionary construction was not able to improve the compression ratio of the LZW algorithm.
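For context, here is the classic LZW encoder that the study modifies: it emits integer dictionary codes for the longest known prefix. The paper's modified dictionary construction and variable-length bit packing are not reproduced; in classic LZW the codes below would be packed at a fixed width, which is the output the study replaces with variable-length codes.

```python
def lzw_encode(data: bytes):
    """Classic LZW: emit the dictionary code of the longest matching prefix,
    then learn the prefix extended by one byte as a new dictionary entry."""
    dictionary = {bytes([i]): i for i in range(256)}  # seed with all single bytes
    next_code = 256
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                         # keep extending the current match
        else:
            out.append(dictionary[w])      # emit code for the longest match
            dictionary[wc] = next_code     # learn the new phrase
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(dictionary[w])          # flush the final match
    return out

print(lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT"))
```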

