Joint Image Compression and Encryption Using IWT with SPIHT, Kd-Tree and Chaotic Maps

2018, Vol. 8 (10), pp. 1963
Author(s): Jun Sang, Muhammad Azeem Akbar, Bin Cai, Hong Xiang, ...

Confidentiality and efficient bandwidth utilization require a combination of compression and encryption of digital images. In this paper, a new method for joint image compression and encryption based on set partitioning in hierarchical trees (SPIHT) with an optimized Kd-tree and multiple chaotic maps was proposed. First, lossless compression and encryption of the original images were performed based on the integer wavelet transform (IWT) with SPIHT; the wavelet coefficients underwent diffusion and permutation before being encoded by SPIHT. Second, maximum confusion, diffusion and compression of the SPIHT output were performed via a modified Kd-tree, a wavelet tree and Huffman coding. Finally, the compressed output was further encrypted with varying-parameter logistic maps and modified quadratic chaotic maps. The performance of the proposed technique was evaluated through compression ratio (CR), peak signal-to-noise ratio (PSNR), key space and histogram analyses. Moreover, the scheme passes several security tests, such as sensitivity, entropy and differential analysis tests. According to the theoretical analysis and experimental results, the proposed method is more secure and removes more redundant information from the image than existing hybrid compression and encryption techniques.
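As a rough illustration of the chaotic-map stage, the sketch below generates a keystream from a logistic map and uses it for permutation (confusion) and XOR diffusion of a byte stream. This is a minimal sketch only, assuming a single fixed-parameter logistic map; the function names and parameter values are illustrative and are not taken from the paper.

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x) and quantize the orbit to bytes."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    # Map (0, 1) floats to 0..255 keystream bytes.
    return (np.floor(xs * 256) % 256).astype(np.uint8)

def encrypt_bytes(data, x0=0.3567, r=3.99):
    """Permute, then XOR-diffuse a byte stream with logistic-map sequences."""
    data = np.frombuffer(bytes(data), dtype=np.uint8).copy()
    n = data.size
    # Permutation: the sort order of a chaotic sequence gives a scrambling index.
    perm = np.argsort(logistic_keystream(x0 * 0.5, r, n))
    scrambled = data[perm]
    # Diffusion: XOR with a second keystream.
    key = logistic_keystream(x0, r, n)
    return scrambled ^ key, perm

def decrypt_bytes(cipher, perm, x0=0.3567, r=3.99):
    """Undo the diffusion, then invert the permutation."""
    key = logistic_keystream(x0, r, cipher.size)
    scrambled = cipher ^ key
    plain = np.empty_like(scrambled)
    plain[perm] = scrambled
    return plain

if __name__ == "__main__":
    msg = b"SPIHT bitstream bytes ..."
    c, perm = encrypt_bytes(msg)
    assert bytes(decrypt_bytes(c, perm)) == msg
```

In a hybrid scheme along these lines, the input bytes would be the SPIHT-coded output rather than raw pixel data.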

Author(s): Ali Iqbal, Imran Touqir, Asim Ashfaque, Natasha Khan, Fahim Ashraf

The wavelet transform (WT) is considered a landmark for image compression because it represents a signal in terms of functions that are localized in both the time and frequency domains. Wavelet sub-band coding exploits the self-similarity of pixels in images and arranges the resulting coefficients in different sub-bands. SPIHT (Set Partitioning in Hierarchical Trees), a simple and fully embedded codec algorithm, is widely used for the compression of wavelet-transformed images. It encodes the transformed coefficients depending on their significance relative to a given threshold. Statistical analysis reveals that the output bit-stream of SPIHT contains long runs of zeroes that can be compressed further, so SPIHT is not advocated as the sole means of compression. In this paper, wavelet-transformed images are first compressed using SPIHT and, to attain more compression, the output bit streams of SPIHT are then fed to entropy encoders, namely Huffman and arithmetic encoders, for further de-correlation. The two concatenations are compared by evaluating factors such as bit-saving capability, peak signal-to-noise ratio (PSNR), compression ratio and elapsed time. The experimental results of these cascades demonstrate that SPIHT combined with arithmetic coding yields a better compression ratio than SPIHT cascaded with Huffman coding, whereas SPIHT combined with Huffman coding proves to be comparatively more efficient.
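To make the entropy-coding stage concrete, here is a small sketch of Huffman coding applied to a zero-dominated bitstream grouped into 4-bit symbols. It is a generic Huffman implementation under our own assumptions (symbol grouping, toy data), not the exact coder used in the paper.

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code (symbol -> bitstring) from an iterable of symbols."""
    freq = Counter(symbols)
    if len(freq) == 1:  # degenerate case: only one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap items: (frequency, tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merging two subtrees prepends one more code bit to every leaf.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    # A toy stand-in for a SPIHT output bitstream dominated by zeroes,
    # grouped into 4-bit symbols before entropy coding.
    bits = "0000" * 200 + "0001" * 30 + "0110" * 10 + "1011" * 5
    symbols = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    code = huffman_code(symbols)
    coded_len = sum(len(code[s]) for s in symbols)
    print(f"raw: {len(bits)} bits, Huffman-coded: {coded_len} bits")
```

Because the symbol distribution is heavily skewed toward the all-zero symbol, the Huffman code spends close to one bit on it and saves most of the remaining length, which is the effect the cascade exploits.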


2020, Vol. 20 (02), pp. 2050008
Author(s): S. P. Raja

This paper presents a complete analysis of wavelet-based image compression encoding techniques. The techniques involved are embedded zerotree wavelet (EZW), set partitioning in hierarchical trees (SPIHT), wavelet difference reduction (WDR), adaptively scanned wavelet difference reduction (ASWDR), set partitioned embedded block coder (SPECK), compression with reversible embedded wavelet (CREW) and spatial orientation tree wavelet (STW). Experiments are done by varying the decomposition level, bits per pixel and compression ratio. The evaluation uses parameters such as peak signal-to-noise ratio (PSNR), mean square error (MSE), image quality index (IQI), structural similarity index (SSIM), average difference (AD), normalized cross-correlation (NK), structural content (SC), maximum difference (MD), Laplacian mean squared error (LMSE) and normalized absolute error (NAE).
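Two of the evaluation parameters mentioned above, MSE and PSNR, have simple closed-form definitions; the sketch below shows the standard versions for 8-bit images. The test data are synthetic and only illustrate the formulas.

```python
import numpy as np

def mse(original, reconstructed):
    """Mean squared error between two images of the same shape."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    err = mse(original, reconstructed)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    noisy = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
    print(f"MSE = {mse(img, noisy):.2f}, PSNR = {psnr(img, noisy):.2f} dB")
```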


2021, Vol. 17 (3), pp. 219-234
Author(s): Rajamandrapu Srinivas, N. Mayur

Compression and encryption of images is an active research topic aimed at improving data security. A joint lossless image compression and encryption algorithm based on the Integer Wavelet Transform (IWT) and a hybrid hyperchaotic system is proposed to enhance the security of data transmission. Initially, IWT is used to compress the digital images, and then encryption is accomplished using the hybrid hyperchaotic system. The hybrid hyperchaotic system, consisting of a Fractional-Order Hyperchaotic Cellular Neural Network (FOHCNN) and a Fractional-Order Four-Dimensional Modified Chua's Circuit (FOFDMCC), is used to generate the pseudorandom sequences. Pixel substitution and scrambling are realized simultaneously using Global Bit Scrambling (GBS), which improves cipher unpredictability and efficiency. In this study, a Deoxyribonucleic Acid (DNA) sequence operation is adopted instead of a binary operation, which gives the cipher image high resistance against cropping attacks and salt-and-pepper noise. The simulation results show that the proposed hybrid hyperchaotic system with IWT performs image compression and encryption more effectively than existing models in terms of parameters such as the unified average changing intensity (UACI), the number of pixels change rate (NPCR) and the correlation coefficient.
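The two differential-attack metrics named at the end of the abstract, NPCR and UACI, have simple standard definitions; the sketch below computes them for a pair of cipher images. The random test arrays are placeholders for cipher images that would be produced from plaintexts differing in a single pixel.

```python
import numpy as np

def npcr_uaci(cipher1, cipher2):
    """NPCR and UACI between two 8-bit cipher images of the same shape.

    NPCR: percentage of pixel positions whose values differ.
    UACI: mean absolute intensity difference, normalized by 255.
    """
    c1 = cipher1.astype(np.float64)
    c2 = cipher2.astype(np.float64)
    npcr = np.mean(c1 != c2) * 100.0
    uaci = np.mean(np.abs(c1 - c2)) / 255.0 * 100.0
    return npcr, uaci

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Hypothetical cipher images from plaintexts that differ in one pixel.
    a = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    b = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    n, u = npcr_uaci(a, b)
    print(f"NPCR = {n:.2f}%  UACI = {u:.2f}%")
```

For a strong cipher the expected values are close to 99.6% (NPCR) and 33.5% (UACI), which is what such schemes report against.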


2014, Vol. 14 (04), pp. 1450020
Author(s): Ranjan Kumar Senapati, Prasanth Mankar

In this paper, two simple yet efficient embedded block-based image compression algorithms are presented. These algorithms not only improve the rate-distortion performance of set partitioning in hierarchical trees (SPIHT) and the set partitioned embedded block coder (SPECK) at lower bit rates but also reduce the dynamic memory requirement by 91.1% compared with SPIHT. The former objective is achieved by better exploiting the coefficient decaying spectrum of wavelet-transformed images, and the latter is realized by an improved listless implementation of the algorithms. The proposed algorithms explicitly perform a breadth-first search like SPECK. Extensive simulations conducted on various standard grayscale and color images indicate significant peak signal-to-noise ratio (PSNR) improvement over most state-of-the-art wavelet-based embedded coders, including JPEG2000, at lower rates. The reduction in encoding and decoding time, as well as the improvement in coding efficiency at lower bit rates, makes these coders better candidates for multimedia applications.


In recent years, the importance of image compression techniques has increased enormously due to the generation of massive amounts of data that need to be stored or transmitted. Numerous approaches have been presented for effective image compression based on the principle of representing images in compact form by avoiding unnecessary pixels. Vector quantization (VQ) is an effective method in image compression, and the construction of the quantization table is an important task. The compression performance and the quality of the reconstructed data depend on the quantization table, which is a matrix of 64 integers. Quantization table selection is a complex combinatorial problem that can be solved by evolutionary algorithms (EA), which have become popular for solving real-world problems in a reasonable amount of time. This chapter introduces the Firefly (FF) algorithm with Teaching and Learning Based Optimization (TLBO), termed the FF-TLBO algorithm, for the selection of the quantization table, and the Firefly algorithm with Tumbling, termed the FF-Tumbling algorithm, for the selection of the search space. As the FF algorithm faces a problem when brighter fireflies are insignificant, the TLBO algorithm is integrated with it to resolve this problem, and Tumbling efficiently trains the algorithm to explore all directions in the solution space (see the sketch after this paragraph). The algorithm determines the best fitness value for every block as the local best, and the best fitness value for the entire image is considered the global best. Once these values are found by the FF algorithm, compression is carried out with efficient image compression algorithms such as run-length encoding and Huffman coding. The proposed FF-TLBO and FF-Tumbling algorithms are evaluated by comparing their results with the existing FF algorithm on the same set of benchmark images in terms of mean square error (MSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR). The obtained results confirm the superior performance of the FF-TLBO and FF-Tumbling algorithms over the FF algorithm and make them highly useful for real-time applications.
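For readers unfamiliar with the firefly mechanics referred to above, the sketch below implements one iteration of the standard firefly movement rule over a population of candidate 64-entry quantization tables. The fitness function is a deliberately trivial placeholder (a real one would reconstruct blocks and measure PSNR or MSE), and none of the parameter values come from the chapter.

```python
import numpy as np

def firefly_step(tables, fitness, beta0=1.0, gamma=0.01, alpha=2.0, rng=None):
    """One iteration of the standard firefly movement rule.

    tables  : (n_fireflies, 64) float array of candidate quantization tables
    fitness : callable mapping a 64-entry table to a score (higher = brighter)
    """
    rng = rng or np.random.default_rng()
    brightness = np.array([fitness(t) for t in tables])
    new = tables.copy()
    n = len(tables)
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:
                r2 = np.sum((tables[j] - tables[i]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)      # attractiveness decays with distance
                step = alpha * (rng.random(64) - 0.5)   # random exploration term
                new[i] = new[i] + beta * (tables[j] - new[i]) + step
    # Quantization tables must stay positive integers.
    return np.clip(np.rint(new), 1, 255).astype(np.int32)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pop = rng.integers(1, 100, size=(8, 64)).astype(np.float64)
    # Placeholder fitness: prefer tables with small mean step size
    # (a real fitness would quantize DCT blocks and measure PSNR/MSE).
    toy_fitness = lambda t: -float(np.mean(t))
    pop = firefly_step(pop, toy_fitness, rng=rng)
    print(pop[0][:8])
```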


2014, Vol. 984-985, pp. 1276-1281
Author(s): C. Priya, T. Kesavamurthy, M. Uma Priya

Recently, many new algorithms for image compression based on wavelets have been developed. This paper gives a detailed explanation of the SPIHT algorithm combined with the Lempel-Ziv-Welch (LZW) compression technique for image compression, implemented in MATLAB. Set Partitioning in Hierarchical Trees (SPIHT) is one of the most efficient algorithms known today; it creates pyramid structures based on a wavelet decomposition of an image. Lempel-Ziv-Welch is a universal lossless data compression algorithm that guarantees the original information can be exactly reproduced from the compressed data. The proposed method gives a better compression ratio, higher computational speed and good reconstruction quality of the image. To analyze the proposed lossless method, the performance metrics compression ratio, mean square error and peak signal-to-noise ratio are calculated. Keywords: Lempel-Ziv-Welch (LZW), SPIHT, wavelet.
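As an illustration of the LZW stage, the sketch below gives a minimal dictionary-based LZW compressor and decompressor over bytes. It assumes the SPIHT output has already been packed into bytes; the toy input stream is ours and not from the paper.

```python
def lzw_compress(data: bytes):
    """LZW compression: emit dictionary indices for the longest known prefixes."""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            codes.append(dictionary[w])
            dictionary[wc] = next_code
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

def lzw_decompress(codes):
    """Invert lzw_compress by rebuilding the same dictionary on the fly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    next_code = 256
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:                      # special case: code not yet in the dictionary
            entry = w + w[:1]
        out.append(entry)
        dictionary[next_code] = w + entry[:1]
        next_code += 1
        w = entry
    return b"".join(out)

if __name__ == "__main__":
    # A toy stand-in for a SPIHT output bitstream packed into bytes.
    stream = b"\x00" * 500 + b"\x01\x00" * 50 + b"\x7f" * 20
    codes = lzw_compress(stream)
    assert lzw_decompress(codes) == stream
    print(f"{len(stream)} bytes -> {len(codes)} codes")
```

The decompressor reproduces the input exactly, which is the lossless property the abstract relies on when cascading LZW after SPIHT.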


2010, Vol. 07 (04), pp. 309-317
Author(s): G. MUTHULAKSHMI, V. SADASIVAM

The objective of image compression is to reduce the redundant amount of data and to achieve a low bit rate without any apparent loss of image quality. In this paper, compression is achieved using the wavelet transform with the lifting technique for decomposition and tree-structured vector quantization (TSVQ) for quantization; Huffman coding is used in the encoding stage. The wavelet technique provides a promising tool for high-quality image compression, and the lifting transform provides a flexible tool for constructing sub-band decompositions with perfect reconstruction. TSVQ has an advantage over conventional vector quantization in that it minimizes computational complexity, and Huffman coding allows efficient coding of images at lower bit rates with minimal loss of information. The performance of the proposed work has been assessed with parameters such as bits per pixel (bpp), compression ratio (CR), peak signal-to-noise ratio (PSNR), codebook size (CS) and mean squared error (MSE). The experimental results show that the proposed work yields higher compression with a better-quality reconstructed image than other conventional compression methods.
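To show what a lifting step looks like, the sketch below implements one level of integer-to-integer Haar lifting (the simplest lifting wavelet) together with its exact inverse. This is a generic illustration of the lifting idea, not the specific lifting filters used in the paper.

```python
import numpy as np

def haar_lift_forward(x):
    """One level of integer-to-integer Haar lifting on a 1-D signal of even length."""
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = odd - even              # predict step: detail coefficients
    s = even + (d >> 1)         # update step: approximation coefficients
    return s, d

def haar_lift_inverse(s, d):
    """Undo the lifting steps in reverse order, recovering the signal exactly."""
    even = s - (d >> 1)
    odd = d + even
    x = np.empty(2 * s.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

if __name__ == "__main__":
    sig = np.array([12, 14, 200, 202, 50, 53, 7, 7], dtype=np.int64)
    s, d = haar_lift_forward(sig)
    assert np.array_equal(haar_lift_inverse(s, d), sig)
    print("approx:", s, "detail:", d)
```

Because every lifting step is inverted exactly, the decomposition gives the perfect-reconstruction property the abstract attributes to the lifting transform.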


Author(s): A. Suruliandi, S. P. Raja

This paper discusses the embedded zerotree wavelet (EZW) and other wavelet-based encoding techniques employed in lossy image compression. The objective of the paper is twofold. First, wavelet-based encoding techniques such as EZW, set partitioning in hierarchical trees (SPIHT), wavelet difference reduction (WDR), adaptively scanned wavelet difference reduction (ASWDR), set partitioned embedded block coder (SPECK), compression with reversible embedded wavelet (CREW) and space frequency quantization (SFQ) are implemented and their performance is analyzed. Second, wavelet families such as Haar, Daubechies and biorthogonal wavelets are used to evaluate the performance of the encoding techniques. The performance parameters peak signal-to-noise ratio (PSNR) and mean square error (MSE) are used for evaluation. From the results, it is observed that the SPIHT encoding technique provides better results than the other encoding schemes.
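A simple way to reproduce the flavour of such a comparison is to decompose an image with different wavelet families, keep only the largest detail coefficients and measure the reconstruction PSNR. The sketch below does this with the PyWavelets (pywt) library on a synthetic image; the wavelet names, retention fraction and test image are our own choices, not the paper's experimental setup.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def psnr(a, b, peak=255.0):
    err = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return 10 * np.log10(peak ** 2 / err) if err > 0 else float("inf")

def compress_reconstruct(img, wavelet, level=3, keep=0.05):
    """Decompose, zero all but the largest `keep` fraction of detail coefficients, reconstruct."""
    coeffs = pywt.wavedec2(img.astype(np.float64), wavelet, level=level)
    details = np.concatenate([np.abs(d).ravel() for band in coeffs[1:] for d in band])
    thresh = np.quantile(details, 1.0 - keep)   # hard threshold keeping roughly the top 5 %
    new_coeffs = [coeffs[0]]
    for band in coeffs[1:]:
        new_coeffs.append(tuple(np.where(np.abs(d) >= thresh, d, 0.0) for d in band))
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[:img.shape[0], :img.shape[1]]

if __name__ == "__main__":
    # A smooth synthetic test image (a real study would use standard test images).
    x, y = np.meshgrid(np.arange(256), np.arange(256))
    img = (128 + 100 * np.sin(x / 20.0) * np.cos(y / 25.0)).astype(np.uint8)
    for w in ("haar", "db4", "bior4.4"):
        rec = compress_reconstruct(img, w)
        print(f"{w:8s} PSNR = {psnr(img, rec):.2f} dB")
```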


2017, Vol. 4 (1), pp. 113-126
Author(s): Jide Julius Popoola, Michael Elijah Adekanye

The advent of computers and the internet has brought about massive changes in the way images are managed. This revolution has changed image processing and management and created a huge space requirement for uploading, downloading, transferring and storing images. To guard against this huge space requirement, images need to be compressed before being stored or transmitted. Several image compression algorithms and techniques have been developed in the literature. In this study, three of these algorithms were developed using MATLAB code: discrete cosine transform (DCT), discrete wavelet transform (DWT) and set partitioning in hierarchical trees (SPIHT). In order to ascertain which of them is most appropriate for image storage and transmission, comparative performance evaluations were conducted on the three developed algorithms using five performance indices. The results show that all three algorithms are effective in image compression but with different efficiency rates. In addition, they show that DWT has the highest compression ratio and distortion level, while the corresponding values for SPIHT are the lowest, with those of DCT falling in between. The results also show that the lower the mean square error and the higher the peak signal-to-noise ratio, the lower the distortion level in the compressed image.
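As a small illustration of the transform-based compression idea underlying the DCT variant, the sketch below applies an orthonormal 2-D DCT to an 8x8 block, keeps only the largest-magnitude coefficients and measures the reconstruction error. The block data and the crude coefficient-count "compression ratio" are illustrative only, not the study's methodology.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix: C @ x applies the 1-D DCT to a length-n vector."""
    k = np.arange(n).reshape(-1, 1)
    m = np.arange(n).reshape(1, -1)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def compress_block(block, keep=10):
    """2-D DCT of a square block, zero everything below the keep-th largest magnitude."""
    C = dct_matrix(block.shape[0])
    coef = C @ block.astype(np.float64) @ C.T   # forward 2-D DCT
    thresh = np.sort(np.abs(coef).ravel())[-keep]
    coef[np.abs(coef) < thresh] = 0.0
    return C.T @ coef @ C                        # inverse DCT (orthonormal transform)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    block = rng.integers(100, 156, size=(8, 8)).astype(np.float64)  # a low-contrast block
    rec = compress_block(block, keep=10)
    mse = np.mean((block - rec) ** 2)
    ratio = 64 / 10                              # coefficients kept vs. total: a crude ratio
    print(f"MSE = {mse:.2f}, approximate compression ratio = {ratio:.1f}:1")
```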

