Encryption of Stereo Images after Compression by Advanced Encryption Standard (AES)

2018 ◽  
Vol 28 (2) ◽  
pp. 156 ◽  
Author(s):  
Marwah K Hussien

New partial encryption schemes are proposed, in which a secure encryption algorithm is used to encrypt only part of the compressed data. Partial encryption is applied after the image compression algorithm. Only 0.0244%-25% of the original data is encrypted for two pairs of different grayscale images of size 256 × 256 pixels. As a result, we see a significant reduction of time in the encryption and decryption stages. In the compression step, the Orthogonal Search Algorithm (OSA) for motion estimation (the difference between the stereo images) is used. The resulting disparity vectors and the residual image were compressed by the Discrete Cosine Transform (DCT), quantization, and arithmetic encoding. The compressed image was encrypted by the Advanced Encryption Standard (AES). The images were then decoded and compared with the original images. Experimental results showed good performance in terms of Peak Signal-to-Noise Ratio (PSNR), Compression Ratio (CR), and processing time. The proposed partial encryption schemes are fast, secure, and do not reduce the compression performance of the underlying selected compression methods.
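
A minimal sketch of the pipeline's last two stages, assuming an 8×8 DCT block, uniform quantization, and PyCryptodome for AES; the 25% fraction and the key/IV values are illustrative placeholders, and the arithmetic-coded stream is abstracted as raw bytes:

```python
import numpy as np
from scipy.fftpack import dct
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

def compress_block(block, q=16):
    """2D DCT of an 8x8 block followed by uniform quantization."""
    coeffs = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')
    return np.round(coeffs / q).astype(np.int16)

def partial_encrypt(stream: bytes, fraction: float, key: bytes, iv: bytes):
    """AES-CBC encrypt only the first `fraction` of the byte stream."""
    cut = max(16, int(len(stream) * fraction))   # at least one AES block
    cipher = AES.new(key, AES.MODE_CBC, iv)
    return cipher.encrypt(pad(stream[:cut], AES.block_size)) + stream[cut:]

block = np.random.rand(8, 8) * 255
coded = compress_block(block).tobytes()          # stand-in for the coded data
secured = partial_encrypt(coded, 0.25, b'0' * 16, b'1' * 16)
```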

2012 ◽  
Vol 8 (8) ◽  
pp. 1-11
Author(s):  
Hameed Younis ◽  
Abdulkareem Abdalla ◽  
Turki Abdalla

Cryptography is one of the technological means to provide security to data being transmitted on information and communication systems. When it is necessary to securely transmit data over limited bandwidth, both compression and encryption must be performed. Researchers have combined compression and encryption to reduce the overall processing time. In this paper, new partial encryption schemes are proposed to encrypt only part of the compressed image. Soft and hard threshold compression methods are used in the compression step, and the Advanced Encryption Standard (AES) cipher is used for the encryption step. The effect of different threshold values on the performance of the proposed schemes is studied. The proposed partial encryption schemes are fast, secure, and do not reduce the compression performance of the underlying selected compression methods.
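
A minimal sketch of the two threshold rules named above, assuming a Haar wavelet decomposition (the abstract does not specify the transform) and an arbitrary threshold value:

```python
# Hard thresholding zeroes small coefficients; soft thresholding also
# shrinks the surviving coefficients toward zero.
import numpy as np
import pywt

def hard_threshold(c, t):
    return np.where(np.abs(c) > t, c, 0.0)

def soft_threshold(c, t):
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

image = np.random.rand(256, 256)
coeffs, slices = pywt.coeffs_to_array(pywt.wavedec2(image, 'haar', level=2))
sparse = soft_threshold(coeffs, t=0.1)   # compressed representation to be partially encrypted
```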


Algorithms ◽  
2019 ◽  
Vol 12 (4) ◽  
pp. 78
Author(s):  
Muhammed Oğuzhan Külekci ◽  
Yasin Öztürk

Non-uniquely-decodable (non-UD) codes can be defined as codes that cannot be uniquely decoded without additional disambiguation information. These are mainly the class of non-prefix-free codes, where a code word can be a prefix of others, and thus the code-word boundary information is essential for correct decoding. Due to this inherent unique-decodability problem, such non-UD codes have not received much attention except for a few studies, in which compressed data structures were proposed to represent the disambiguation information efficiently. It had been shown previously that the compression ratio can get quite close to Huffman/arithmetic codes, with the additional capability of direct access into the compressed data, which is a missing feature of regular Huffman codes. In this study, we investigate non-UD codes in another dimension, addressing the privacy of high-entropy data. We particularly focus on massive volumes, typical examples being encoded video or similar multimedia files. Representing such a volume with non-UD coding creates two elements, the disambiguation information and the payload, and decoding the original data becomes hard when either of them is missing. We make use of this observation for privacy purposes and study the space consumption as well as the hardness of that decoding. We conclude that non-uniquely-decodable codes can be an alternative to selective encryption schemes that aim to secure only part of the data when the data is huge. We also provide a freely available software implementation of the proposed scheme.
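
A toy sketch of the payload/disambiguation split described above; the minimal-binary code used here is a hypothetical stand-in for the paper's actual coding scheme:

```python
# Each symbol maps to its minimal binary representation, which is not
# prefix-free, so the payload alone is ambiguous; the code-word length
# sequence is the disambiguation information, and both are needed to decode.
def encode(symbols):
    payload, lengths = "", []
    for s in symbols:
        word = bin(s)[2:]          # minimal binary code word
        payload += word
        lengths.append(len(word))  # boundary (disambiguation) information
    return payload, lengths

def decode(payload, lengths):
    out, pos = [], 0
    for n in lengths:
        out.append(int(payload[pos:pos + n], 2))
        pos += n
    return out

payload, lengths = encode([5, 1, 3, 7])
assert decode(payload, lengths) == [5, 1, 3, 7]
```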


Entropy ◽  
2021 ◽  
Vol 23 (5) ◽  
pp. 535
Author(s):  
Karim H. Moussa ◽  
Ahmed I. El Naggary ◽  
Heba G. Mohamed

Multimedia wireless communications have developed rapidly over the years. Accordingly, there is an increasing demand for more secure media transmission to protect multimedia content. Many image encryption schemes have been proposed over the years, but the most secure and reliable are those based on chaotic maps, owing to the high pixel correlation and data-handling requirements intrinsic to such multimedia content. The novel encryption algorithm introduced in this article is based on a 3D hopping chaotic map instead of fixed chaotic logistic maps. The non-linear behavior of the proposed algorithm, in terms of both position permutation and value transformation, results in a more secure encryption algorithm due to its non-convergence, non-periodicity, and sensitivity to the applied initial conditions. Several statistical and analytical tests, such as entropy, correlation, key sensitivity, key space, peak signal-to-noise ratio, noise attacks, number of pixels changing rate (NPCR), unified average changing intensity (UACI), and other tests, were applied to measure the strength of the proposed encryption scheme. The obtained results prove that the proposed scheme is very robust against different cryptographic attacks compared to similar encryption schemes.
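
The abstract does not give the 3D hopping map's equations, so the sketch below uses a classic 1D logistic map as a simplified stand-in, only to illustrate the two stages named above: position permutation and value transformation.

```python
import numpy as np

def logistic_stream(x0, r, n):
    """Iterate x -> r*x*(1-x) and collect n chaotic samples in (0, 1)."""
    x, out = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def chaotic_encrypt(img, x0=0.3141, r=3.9999):   # x0, r act as the secret key
    flat = img.flatten()
    stream = logistic_stream(x0, r, flat.size)
    perm = np.argsort(stream)                    # position permutation
    keybytes = (stream * 255).astype(np.uint8)   # value transformation
    return (flat[perm] ^ keybytes).reshape(img.shape), perm

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cipher, perm = chaotic_encrypt(img)              # perm is needed for decryption
```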


2021 ◽  
Vol 21 (1) ◽  
pp. 1-20
Author(s):  
A. K. Singh ◽  
S. Thakur ◽  
Alireza Jolfaei ◽  
Gautam Srivastava ◽  
MD. Elhoseny ◽  
...  

Recently, due to the increase in popularity of the Internet, the problem of digital data security over the Internet is growing at a phenomenal rate. Watermarking is used in various notable applications to secure digital data from unauthorized individuals. To achieve this, in this article we propose a joint encryption-then-compression based watermarking technique for digital document security. This technique offers confidentiality, copyright protection, and strong compression performance. The proposed method involves three major steps: (1) embedding of multiple watermarks through the non-sub-sampled contourlet transform, the redundant discrete wavelet transform, and singular value decomposition; (2) encryption and compression via SHA-256 and Lempel-Ziv-Welch (LZW), respectively; and (3) extraction/recovery of the multiple watermarks from the possibly distorted cover image. Performance estimations are carried out on various images under different attacks, and the efficiency of the system is determined in terms of peak signal-to-noise ratio (PSNR), normalized correlation (NC), structural similarity index measure (SSIM), number of changing pixel rate (NPCR), unified averaged changed intensity (UACI), and compression ratio (CR). Furthermore, a comparative analysis of the proposed system with similar schemes indicates its superiority over them.
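
For reference, a sketch of two of the reported metrics, NPCR and UACI, assuming two 8-bit cipher images produced from plain images that differ in a single pixel:

```python
import numpy as np

def npcr(c1, c2):
    """Percentage of pixel positions whose values differ."""
    return 100.0 * np.mean(c1 != c2)

def uaci(c1, c2):
    """Mean absolute intensity difference, normalized to the 8-bit range."""
    return 100.0 * np.mean(np.abs(c1.astype(float) - c2.astype(float)) / 255.0)
```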


Author(s):  
VINCENT T. WOOD ◽  
ROBERT P. DAVIES-JONES ◽  
ALAN SHAPIRO

Single-Doppler radar data are often missing in important regions of a severe storm due to low return power, low signal-to-noise ratio, ground clutter associated with normal and anomalous propagation, and missing radials associated with partial or total beam blockage. Missing data impact the ability of WSR-88D algorithms to detect severe weather. To aid the algorithms, we develop a variational technique that fills in Doppler velocity data voids smoothly by minimizing Doppler velocity gradients while not modifying good data. This method provides estimates of the analysed variable in data voids without creating extrema. Actual single-Doppler radar data of four tornadoes are used to demonstrate the variational algorithm. In two cases, data are missing in the original data, and in the other two, data are voided artificially. The filled-in data match the voided data well in smoothly varying Doppler velocity fields. Near singularities such as tornadic vortex signatures, the match is poor, as anticipated. The algorithm does not create any velocity peaks in the former data voids, thus preventing false triggering of tornado warnings. Doppler circulation is used herein as a far-field tornado detection and advance-warning parameter. In almost all cases, the measured circulation is quite insensitive to the data that have been voided and then filled. The tornado threat is still apparent.
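
A minimal sketch of the fill-in idea, not the authors' implementation: minimizing velocity gradients over a void with good data fixed on its rim amounts to solving Laplace's equation there, done here by Jacobi relaxation; the mask convention and iteration count are assumptions.

```python
import numpy as np

def fill_voids(v, void_mask, n_iter=500):
    """Smoothly fill masked grid points; good data (mask False) stay untouched."""
    v = v.copy()
    v[void_mask] = v[~void_mask].mean()          # initial guess for the voids
    for _ in range(n_iter):
        # Four-neighbour average; np.roll's periodic wrap at the grid edges
        # is a simplification of the real boundary handling.
        avg = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                      + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        v[void_mask] = avg[void_mask]            # update void points only
    return v
```

The converged field is discretely harmonic inside the void, so by the maximum principle it cannot introduce new velocity extrema, consistent with the behaviour described above.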


2014 ◽  
Vol 10 (4) ◽  
pp. 221 ◽  
Author(s):  
Mokhtar Ouamri ◽  
Kamel M. Faraoun

The emerging High Efficiency Video Coding (HEVC) standard is expected to be widely adopted in network applications for high-definition devices and mobile terminals. Thus, the construction of HEVC encryption schemes that maintain format compliance and the bit rate of the encrypted bitstream has become an active area of security research. This paper presents a novel selective encryption technique for HEVC videos, based on enciphering the bins of selected Golomb-Rice code suffixes with the Advanced Encryption Standard (AES) in CBC operating mode. The scheme preserves the format compliance and size of the encrypted HEVC bitstream, and provides high visual degradation with an optimized encryption space defined by the selected Golomb-Rice suffixes. Experimental results show the reliability and robustness of the proposed technique.
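
A toy sketch of why encrypting only Golomb-Rice suffixes preserves format and size; for brevity it masks suffix bits with a generic keystream XOR rather than the paper's AES-CBC, and the parameter values are illustrative:

```python
def rice_encode(value, k):
    """Rice code: unary quotient prefix plus a fixed-length k-bit suffix."""
    q, r = value >> k, value & ((1 << k) - 1)
    prefix = '1' * q + '0'                 # unary quotient, left in the clear
    suffix = format(r, f'0{k}b')           # k-bit remainder, the part to encipher
    return prefix, suffix

def mask_suffix(suffix, keystream_bits):
    """XOR suffix bits with key bits: same length, so the bitstream layout holds."""
    return ''.join(str(int(b) ^ int(kb)) for b, kb in zip(suffix, keystream_bits))

prefix, suffix = rice_encode(23, k=3)      # q=2, r=7 -> '110' and '111'
cipher_suffix = mask_suffix(suffix, '101') # '010', still exactly 3 bits
```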


Author(s):  
Erwin Erwin ◽  
Saparudin Saparudin ◽  
Wulandari Saputri

This paper proposes a new method for image segmentation: hybrid multilevel thresholding with an improved harmony search algorithm. The improved harmony search algorithm is a method for finding solution vectors with increased accuracy. The proposed method generates random candidate solutions, whose quality is evaluated through the Otsu objective function. The operators then continue to evolve the candidate solutions until the optimal solution is found. The datasets used in this study are the retina, tongue, lenna, baboon, and cameraman images. The experimental results show that this method produces high performance as measured by peak signal-to-noise ratio (PSNR) analysis. The PSNR for the retinal images averaged 40.342 dB, and for the tongue images 35.340 dB. The lenna, baboon, and cameraman images averaged 33.781 dB, 33.499 dB, and 34.869 dB, respectively. Object recognition and identification processes are expected to use this method to achieve a high degree of accuracy.
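
A sketch of the Otsu objective that the harmony search maximizes, assuming an 8-bit grayscale histogram; the fitness of a candidate threshold vector is the between-class variance:

```python
import numpy as np

def otsu_fitness(hist, thresholds):
    """Between-class variance of the classes induced by `thresholds`."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()
    bounds = [0] + sorted(thresholds) + [len(hist)]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()                 # class probability
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

hist = np.bincount(np.random.randint(0, 256, 10000), minlength=256)
print(otsu_fitness(hist, thresholds=[85, 170]))   # fitness of one candidate
```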


2017 ◽  
Vol 21 (2) ◽  
Author(s):  
Edgar Garcia ◽  
Ivan Amaya ◽  
Rodrigo Correa

This work considers the real-time prediction of physicochemical parameters of a sample heated in a uniform electromagnetic field. The thermal conductivity (K) and the combined density and heat capacity term (ρc) were estimated as a demonstrative example. The sample (of known geometry) was subjected to electromagnetic radiation, generating a uniform, time-constant volumetric heat flow within it. A realistic temperature profile was simulated by adding white Gaussian noise to the original data obtained from the theoretical model. To minimize the objective function, simulated annealing and genetic algorithms, along with the traditional Levenberg-Marquardt method, were used for comparative purposes. Results show similar findings for all algorithms across three simulation scenarios, as long as the signal-to-noise ratio is at least 30 dB. For practical purposes, this means that the estimation procedure presented here requires both a good experimental design and correctly specified electronic instrumentation. If both requirements are satisfied simultaneously, it is possible to estimate these types of parameters on-line, without the need for an additional experimental setup.
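
A generic sketch of the inverse-estimation step, fitting (K, ρc) to the noisy profile with Levenberg-Marquardt via SciPy; the exponential forward model and all numeric values are illustrative stand-ins, not the authors' heat-conduction solution:

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(t, K, rho_c, g=1e4, T0=20.0):
    """Hypothetical lumped response to uniform volumetric heating g."""
    tau = rho_c / K                        # illustrative time constant
    return T0 + (g / K) * (1.0 - np.exp(-t / tau))

t = np.linspace(0.0, 100.0, 200)
true = forward_model(t, K=0.5, rho_c=2e3)
noisy = true + np.random.normal(0.0, true.std() / 31.6, t.size)  # ~30 dB SNR

res = least_squares(lambda p: forward_model(t, *p) - noisy,
                    x0=[1.0, 1e3], method='lm')   # Levenberg-Marquardt
K_est, rho_c_est = res.x
```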


2019 ◽  
Vol 29 (1) ◽  
pp. 1441-1452 ◽  
Author(s):  
G.K. Shailaja ◽  
C.V. Guru Rao

Privacy-preserving data mining (PPDM) is a novel approach that has emerged in the market to take care of privacy issues. The intention of PPDM is to build up data-mining techniques without raising the risk of mishandling the data exploited to generate those schemes. The conventional works include numerous techniques, most of which employ some form of transformation on the original data to guarantee privacy preservation. However, these schemes are quite complex and memory intensive, thus leading to restricted exploitation of these methods. Hence, this paper intends to develop a novel PPDM technique, which involves two phases, namely, data sanitization and data restoration. Initially, the association rules are extracted from the database before proceeding with the two phases. In both the sanitization and restoration processes, key extraction plays a major role; the key is selected optimally using the Opposition Intensity-based Cuckoo Search Algorithm, a modified version of the Cuckoo Search Algorithm. Here, four research issues, namely hiding failure rate, information preservation rate, false rule generation, and degree of modification, are minimized using the adopted sanitization and restoration processes.
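
As a concrete illustration of one of the four metrics above, a sketch of the hiding failure rate, i.e. the fraction of sensitive rules still minable from the sanitized database; rule mining itself is abstracted here as a set of rule strings:

```python
def hiding_failure_rate(sensitive_rules, rules_after_sanitization):
    """|sensitive rules that survive sanitization| / |sensitive rules|."""
    surviving = set(sensitive_rules) & set(rules_after_sanitization)
    return len(surviving) / len(sensitive_rules)

print(hiding_failure_rate({'A->B', 'C->D'}, {'A->B', 'E->F'}))  # 0.5
```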


2013 ◽  
Vol 827 ◽  
pp. 209-212
Author(s):  
Xiao Li Yang ◽  
Fan Wang ◽  
Wen Chao Wang ◽  
Yun Xiu Chen ◽  
Ji Shu Chen

We studied moisture determination in bituminous and lignitic coal samples using near-infrared (NIR) spectra. The approach applies partial least squares regression (PLS) and the discrete wavelet transform (DWT). First, the NIR spectra were pre-processed by DWT for fitting and compression. Then, the compressed data were used to build a PLS regression model for moisture determination in the coal samples. Compression performance at different resolution scales was investigated. Using the compressed data, PLS obtains more accurate results than using the raw spectra. The number of principal components in the PLS model was also investigated. The results show that DWT-PLS achieves satisfactory performance for moisture analysis in bituminous and lignitic coal.
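
A minimal sketch of the DWT-PLS pipeline, assuming Daubechies wavelets, a fixed decomposition level, and placeholder arrays standing in for the NIR spectra and moisture values:

```python
import numpy as np
import pywt
from sklearn.cross_decomposition import PLSRegression

def dwt_compress(spectrum, wavelet='db4', level=3):
    """Keep only the coarse approximation coefficients of a spectrum."""
    return pywt.wavedec(spectrum, wavelet, level=level)[0]

X = np.random.rand(40, 1024)                 # NIR spectra (placeholder data)
y = np.random.rand(40)                       # moisture content (placeholder)
Xc = np.array([dwt_compress(s) for s in X])  # compressed spectra

pls = PLSRegression(n_components=5).fit(Xc, y)
moisture_pred = pls.predict(Xc)
```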

