A Computationally Efficient Switching Based Lossless Compression Algorithm for Natural Images

2012 ◽  
Vol 433-440 ◽  
pp. 6540-6545
Author(s):  
Vineet Khanna ◽  
Hari Singh Choudhary ◽  
Prakrati Trivedi

This paper presents a new lossless compression technique for natural images. The proposed algorithm switches between the existing Edge Directed Prediction (EDP) algorithm and the Gradient Adaptive Predictor (GAP), with the switching criterion based on an adaptive threshold. EDP has higher computational complexity because of its least-squares (LS) parameter estimation, whereas GAP has relatively low computational complexity. To reduce the overall computational complexity, we therefore combine EDP and GAP into a hybrid method. The proposed algorithm is generic and produces the best results across different varieties of images in terms of both compression ratio and computational complexity.
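
As a point of reference, the sketch below (Python, illustrative only) shows the standard GAP predictor formulation from CALIC together with a hypothetical activity-threshold switch; the paper's exact adaptive-threshold rule and its LS-based EDP predictor are not reproduced, so `edp_predict` and `threshold` are placeholders.

```python
import numpy as np

def gap_predict(img, i, j):
    """Gradient Adaptive/Adjusted Predictor (GAP) in its standard CALIC
    formulation; assumes the causal neighbours exist (border handling omitted)."""
    W,  N  = int(img[i, j - 1]), int(img[i - 1, j])
    WW, NW = int(img[i, j - 2]), int(img[i - 1, j - 1])
    NE, NN = int(img[i - 1, j + 1]), int(img[i - 2, j])
    NNE    = int(img[i - 2, j + 1])

    dh = abs(W - WW) + abs(N - NW) + abs(N - NE)    # horizontal activity
    dv = abs(W - NW) + abs(N - NN) + abs(NE - NNE)  # vertical activity

    if dv - dh > 80:
        return W                                    # sharp horizontal edge
    if dh - dv > 80:
        return N                                    # sharp vertical edge
    pred = (W + N) / 2.0 + (NE - NW) / 4.0
    if dv - dh > 32:
        pred = (pred + W) / 2.0
    elif dv - dh > 8:
        pred = (3 * pred + W) / 4.0
    elif dh - dv > 32:
        pred = (pred + N) / 2.0
    elif dh - dv > 8:
        pred = (3 * pred + N) / 4.0
    return pred

def switched_predict(img, i, j, edp_predict, threshold=32):
    """Hypothetical switching rule: use the cheap GAP predictor in smooth
    regions and call the LS-based EDP predictor only where the local gradient
    activity exceeds the threshold."""
    W, WW = int(img[i, j - 1]), int(img[i, j - 2])
    N, NW, NE = int(img[i - 1, j]), int(img[i - 1, j - 1]), int(img[i - 1, j + 1])
    activity = abs(W - WW) + abs(N - NW) + abs(N - NE)
    if activity > threshold:
        return edp_predict(img, i, j)   # edge-like area: LS estimation pays off
    return gap_predict(img, i, j)       # smooth area: low-complexity GAP suffices
```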

The growth of cloud-based remote healthcare and diagnosis services has led Medical Service Providers (MSPs) to share diagnostic data across diverse environments. These medical data are accessed across diverse platforms, such as mobile and web services, and require huge amounts of memory for storage. Compression techniques help address these storage requirements and enable sharing of medical data over a transmission medium. Because loss of data is not acceptable in medical image processing, this work considers lossless compression for medical images in particular and greyscale images in general. Modified Huffman (MH) encoding is one of the most widely used techniques for achieving lossless compression. However, due to the longer bit lengths of its codewords, the existing MH encoding technique is not efficient for medical image processing. Firstly, this work presents Modified Refined Huffman (MRH), which compresses greyscale and binary images using a diagonal scanning method. Secondly, a parallel encoding method is used to minimize computing time. Experiments are conducted on a wide variety of images, and performance is evaluated in terms of compression ratio, computation time and memory utilization. The proposed MRH achieves significant performance improvements in compression ratio, computation time and memory usage over state-of-the-art techniques such as LZW, CCITT G4, JBIG2 and the Levenberg–Marquardt (LM) neural network algorithm. The overall results show the applicability of MRH to different application services.
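
The following sketch illustrates the two generic ingredients named in the abstract, an anti-diagonal scan of the image and chunk-wise parallel encoding; the actual MRH codeword construction is not described here, so `encode_chunk` is a placeholder for any per-chunk entropy coder.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def diagonal_scan(img):
    """Linearize a 2-D image along its anti-diagonals.  A generic diagonal
    scan used for illustration; MRH's exact scan order may differ."""
    h, w = img.shape
    order = []
    for d in range(h + w - 1):              # d = i + j indexes an anti-diagonal
        for i in range(max(0, d - w + 1), min(d, h - 1) + 1):
            order.append(img[i, d - i])
    return np.array(order, dtype=img.dtype)

def parallel_encode(scanned, encode_chunk, n_workers=4):
    """Split the scanned pixel stream into chunks and encode them in parallel;
    encode_chunk stands in for a Huffman-style per-chunk coder."""
    chunks = np.array_split(scanned, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(encode_chunk, chunks))
```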


A massive volume of medical data is being generated by advanced medical imaging modalities. With advances in telecommunications, telemedicine and teleradiology have become the most common and viable methods for effective health care delivery around the globe. For efficient storage, medical images should be compressed using lossless compression techniques. In this paper, we aim to develop a lossless compression technique that achieves a better compression ratio together with reversible data hiding. The proposed work segments the foreground and background areas of medical images using semantic segmentation with the Hierarchical Neural Architecture Search (HNAS) network model. After segmenting the medical image, confidential patient data are hidden in the foreground area using the parity check method. Following data hiding, the foreground and background are losslessly compressed using the Huffman and Lempel-Ziv-Welch methods. The performance of the proposed method is compared with standard lossless compression algorithms and existing reversible data hiding methods. The proposed method achieves a better compression ratio and is one hundred percent reversible upon data extraction.
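
Below is a minimal illustration of parity-check embedding in the segmented foreground, assuming one secret bit is carried by the parity (LSB) of each foreground pixel; the bookkeeping that makes the paper's scheme fully reversible is omitted.

```python
import numpy as np

def embed_parity(foreground_pixels, secret_bits):
    """Hide one secret bit per foreground pixel by forcing the pixel's parity
    (its LSB) to match the bit -- a bare-bones sketch of parity-check
    embedding.  The paper's scheme additionally restores the cover image
    exactly on extraction; that reversibility bookkeeping is not shown."""
    stego = foreground_pixels.astype(np.uint8).copy()
    for k, bit in enumerate(secret_bits):
        if stego[k] % 2 != bit:      # parity mismatch -> flip the LSB
            stego[k] ^= 1
    return stego

def extract_parity(stego_pixels, n_bits):
    """Read the hidden bits back as the parity of each marked pixel."""
    return [int(stego_pixels[k] % 2) for k in range(n_bits)]
```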


Author(s):  
Restu Maulunida ◽  
Achmad Solichin

At present, much of the data people need to access has been transformed into digital form, and its use has grown very rapidly. This transformation is driven by the rapid growth of Internet use and the massive spread of mobile devices. People tend to store many files and to transfer files from one medium to another, and as a storage medium approaches its limit, fewer files can be stored. A compression technique is therefore required to reduce file size. Dictionary coding is one of the lossless compression techniques, and LZW is an algorithm that applies dictionary-based compression. In the LZW algorithm, the dictionary is formed using a future-based approach and the encoding process uses fixed-length codes, which allows the encoder to produce output sequences that are still quite long. This study modifies the dictionary-formation process and uses variable-length codes in order to optimize the compression ratio. Based on tests with the data used in this study, the average compression ratio of the LZW algorithm is 42.85%, while that of our proposed algorithm is 38.35%. This shows that our proposed modification of the dictionary formation has not been able to improve the compression ratio of the LZW algorithm.
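
For orientation, a generic LZW encoder with a variable-length output code (the code width grows by one bit whenever the dictionary outgrows the current width) looks roughly as follows; this is the textbook scheme, not the authors' modified dictionary construction.

```python
def lzw_encode(data: bytes):
    """Classic LZW with variable-length output codes.  Returns a list of
    (code, width) pairs; a real encoder would pack them into a bit stream."""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code, width = 256, 9             # start with 9-bit codes
    w, codes = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                         # keep extending the current phrase
        else:
            codes.append((dictionary[w], width))
            dictionary[wc] = next_code     # register the new phrase
            next_code += 1
            if next_code > (1 << width):   # dictionary outgrew the code width
                width += 1
            w = bytes([byte])
    if w:
        codes.append((dictionary[w], width))
    return codes

# Rough compressed size in bits: sum(width for _, width in lzw_encode(data))
```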


Author(s):  
Emy Setyaningsih ◽  
Agus Harjoko

A compression process reduces the size of data while maintaining the quality of the information contained therein. This paper presents a survey of research papers discussing improvements to various hybrid compression techniques during the last decade. A hybrid compression technique combines the best properties of each group of methods, as is done in the JPEG compression method: it combines lossy and lossless compression to obtain a high compression ratio while maintaining the quality of the reconstructed image. Lossy compression produces a relatively high compression ratio, whereas lossless compression yields high-quality data reconstruction, as the data can later be decompressed with exactly the same content as before compression. Discussion of the state of the art and open issues in hybrid compression indicates that further research could improve the performance of image compression methods.
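
A toy example of the lossy-plus-lossless split described above, using plain quantisation followed by run-length coding of zeros; real hybrid codecs such as JPEG add a transform stage (e.g. the DCT) before quantisation.

```python
import numpy as np

def hybrid_compress_block(block, q_step=16):
    """Lossy stage (quantisation) followed by a lossless stage (run-length
    coding of zero runs).  Purely illustrative of the hybrid principle."""
    quantised = np.round(block / q_step).astype(np.int32)
    runs, zeros = [], 0
    for v in quantised.flatten():
        if v == 0:
            zeros += 1
        else:
            runs.append((zeros, int(v)))
            zeros = 0
    runs.append((zeros, None))             # trailing-zeros marker
    return runs

def hybrid_decompress_block(runs, shape, q_step=16):
    """Inverse: expand the runs and de-quantise (quantisation error remains)."""
    flat = []
    for zeros, v in runs:
        flat.extend([0] * zeros)
        if v is not None:
            flat.append(v)
    return np.array(flat).reshape(shape) * q_step
```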


Author(s):  
B. J.S. Sadiq ◽  
V. Yu. Tsviatkou ◽  
M. N. Bobov

The aim of this work is to reduce the computational complexity of lossless compression in the spatial domain through combined coding (arithmetic and run-length encoding) of the series of bits in the bit planes. Known efficient compression encoders encode the bit planes of the image or of the transform coefficients separately, which increases computational complexity because each pixel is processed multiple times. The paper proposes rules for combined coding, along with combined encoders for the bit planes of image pixel differences with both tunable and fixed structures, which have lower computational complexity and the same compression ratio as an arithmetic bit-plane encoder.
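
The sketch below shows the bit-plane view the encoder operates on: horizontal pixel differences are sign-folded to non-negative values and each bit plane is turned into runs of equal bits. The paper's combined arithmetic + RLE encoder and its tunable structure are not reproduced.

```python
import numpy as np

def bit_plane_runs(image):
    """Form horizontal pixel differences, fold their sign, and run-length
    encode each bit plane of the result (LSB plane first)."""
    img = image.astype(np.int64)
    diff = np.empty_like(img)
    diff[:, 0] = img[:, 0]
    diff[:, 1:] = img[:, 1:] - img[:, :-1]                  # horizontal residuals
    mapped = np.where(diff >= 0, 2 * diff, -2 * diff - 1)   # sign fold: 0,1,2,...

    planes = []
    n_planes = max(int(mapped.max()).bit_length(), 1)
    for p in range(n_planes):
        flat = ((mapped >> p) & 1).flatten()
        runs, run_val, run_len = [], int(flat[0]), 0
        for b in flat:
            if b == run_val:
                run_len += 1
            else:
                runs.append((run_val, run_len))
                run_val, run_len = int(b), 1
        runs.append((run_val, run_len))
        planes.append(runs)
    return planes
```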


Author(s):  
Hikka Sartika ◽  
Taronisokhi Zebua

The storage space required by an application is one of the problems on smartphones, and it can lead to wasted storage space because not all smartphones have a very large storage capacity. One application with a large file size is the RPUL application, which is widely accessed by students and the general public. Its large file size often prevents the application from running effectively on smartphones. One solution to this problem is to compress the application's files so that much less storage space is needed on the smartphone. This study describes how the Elias gamma code algorithm, one of the compression technique algorithms, is applied to compress the RPUL application's database file so that the RPUL application can run effectively on a smartphone after installation. Based on trials conducted on 64 bits of text as a sample, compression based on the Elias gamma code algorithm is able to compress text from a database file with a compression factor of 2, i.e., a compression ratio of 50% and a redundancy of 50%.
Keywords: Compression, RPUL, Smartphone, Elias Gamma Code
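
For reference, the Elias gamma code itself is straightforward to implement, as sketched below; how the RPUL database text is mapped to positive integers before coding (e.g. by frequency rank) is an assumption not taken from the abstract.

```python
def elias_gamma_encode(n: int) -> str:
    """Elias gamma code for a positive integer n: floor(log2 n) zero bits,
    followed by the binary representation of n."""
    if n < 1:
        raise ValueError("Elias gamma is defined for integers >= 1")
    binary = bin(n)[2:]                      # e.g. 9 -> '1001'
    return "0" * (len(binary) - 1) + binary  # 9 -> '0001001'

def elias_gamma_decode(bits: str) -> int:
    """Decode a single gamma codeword from the front of a bit string."""
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    return int(bits[zeros:2 * zeros + 1], 2)

# e.g. elias_gamma_encode(1) == '1', elias_gamma_encode(4) == '00100'
```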


Author(s):  
Winda Winda ◽  
Taronisokhi Zebua

The size of the data owned by an application today strongly influences the amount of memory space required, and this includes mobile-based applications. One mobile application widely used by students and the public at this time is the Complete Natural Knowledge Summary (Rangkuman Pengetahuan Alam Lengkap, RPAL) application. The RPAL application requires a large amount of storage space in mobile memory after installation, which can make the application ineffective (slow). Data compression can be used as a solution to reduce data size and thus minimize the memory space required. The Levenstein algorithm is a compression technique that can be used to compress the material stored in the RPAL application database so that the database becomes small. This study describes how to compress the RPAL application database records so as to minimize the memory space needed. Based on tests conducted on 128 characters of data (200 bits), the compression result is 136 bits (17 characters), giving a compression ratio of 68% and a redundancy of 32%.
Keywords: compression, Levenstein, application, RPAL, text, database, mobile
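
Assuming "Levenstein" here refers to the Levenstein universal code for non-negative integers, a compact encoder looks like the sketch below; the mapping from database text to integers is not specified in the abstract.

```python
def levenstein_encode(n: int) -> str:
    """Levenstein universal code: n = 0 -> '0'; otherwise the binary of n
    (minus its leading 1) is prefixed recursively, and the recursion depth C
    is sent as C one-bits followed by a zero."""
    if n == 0:
        return "0"
    code, c, m = "", 1, n
    while True:
        stripped = bin(m)[3:]        # binary of m without its leading '1'
        code = stripped + code
        if not stripped:             # empty: recursion bottoms out
            break
        c += 1
        m = len(stripped)
    return "1" * c + "0" + code

# e.g. levenstein_encode(0) == '0', levenstein_encode(1) == '10',
#      levenstein_encode(2) == '1100', levenstein_encode(4) == '1110000'
```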


2018 ◽  
Vol 8 (9) ◽  
pp. 1471 ◽  
Author(s):  
Seo-Joon Lee ◽  
Gyoun-Yon Cho ◽  
Fumiaki Ikeno ◽  
Tae-Ro Lee

Due to the development of high-throughput DNA sequencing technology, genome-sequencing costs have been significantly reduced, which has led to a number of revolutionary advances in the genetics industry. However, compared with the decrease in the time and cost needed for DNA sequencing, the management of such large volumes of data remains an issue. This research therefore proposes Blockchain Applied FASTQ and FASTA Lossless Compression (BAQALC), a lossless compression algorithm that allows efficient transmission and storage of the immense amounts of DNA sequence data being generated by next-generation sequencing (NGS). Security and reliability issues in public sequence databases are also addressed. For evaluation, compression ratios were compared for genetic biomarkers corresponding to the five diseases with the highest mortality rates according to the World Health Organization. The results showed an average compression ratio of approximately 12 across all the genetic datasets used. BAQALC performed especially well on lung cancer genetic markers, with a compression ratio of 17.02. BAQALC achieved not only higher compression ratios than widely used compression algorithms, but also higher ratios than algorithms described in previously published research. The proposed solution is envisioned to provide an efficient and secure transmission and storage platform for next-generation medical informatics based on smart devices, for both researchers and healthcare users.
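
BAQALC itself is not described in enough detail here to sketch, but a simple 2-bits-per-base packing gives a useful baseline for interpreting the reported ratios: packing alone yields a ratio of about 4 over 8-bit ASCII, so ratios of 12 to 17 imply substantial additional entropy coding on top of such a representation.

```python
def pack_2bit(seq: str) -> bytes:
    """Baseline lossless packing of an A/C/G/T-only sequence at 2 bits per
    base.  Not BAQALC -- just a reference point for the reported ratios."""
    codes = {"A": 0, "C": 1, "G": 2, "T": 3}
    out, acc, nbits = bytearray(), 0, 0
    for base in seq:
        acc = (acc << 2) | codes[base]
        nbits += 2
        if nbits == 8:
            out.append(acc)
            acc, nbits = 0, 0
    if nbits:                          # flush a partial final byte
        out.append(acc << (8 - nbits))
    return bytes(out)

# Compression ratio relative to ASCII storage:
# ratio = len(seq) / len(pack_2bit(seq))   # ~4 for long A/C/G/T sequences
```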

