Generation of random keys for cryptographic systems

Author(s):  
Mariusz Borowski ◽  
Marek Leśniewicz ◽  
Robert Wicik ◽  
Marcin Grzonkowski
2009 ◽  
Vol 40 (1-6) ◽  
pp. 227-240 ◽  
Author(s):  
J. F. Aguilar Madeira ◽  
H. L. Pina ◽  
H. C. Rodrigues

2017 ◽  
Vol 2017 ◽  
pp. 1-15 ◽  
Author(s):  
M. A. Mohamed ◽  
Ahmed Shaaban Samrah ◽  
Mohamed Ismail Fath Allah

Intensive studies have been conducted to develop robust encryption algorithms. Due to the importance of image information, optical encryption plays a vital role in information security. Many optical encryption schemes have been proposed, but most of them suffer from poor robustness. In this paper, six algorithms are proposed for optical encryption that are robust to a severe attack: the composite attack. Three of these approaches are based on a one-level Discrete Wavelet Transform (DWT) and the others are based on the Wavelet Packet (WP) decomposition. In addition to the new techniques, a new chaotic map is developed to provide random keys for all algorithms. An extensive comparative study shows that the proposed algorithms outperform conventional ones, and that the WP-based algorithms outperform the DWT-based ones against severe composite attacks.
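
As an illustration of how a chaotic map can supply the random keys for a DWT-based scheme, the following minimal Python sketch XOR-masks the four sub-bands of a one-level Haar DWT with a logistic-map key stream. The logistic map, the Haar wavelet, and the XOR masking are stand-in assumptions; the paper's six algorithms and its proposed chaotic map are not reproduced here.

```python
# Sketch: one-level DWT image encryption keyed by a chaotic map.
# Assumptions (not from the paper): logistic map as the chaotic source,
# Haar wavelet via pywt, and XOR masking of quantized sub-band coefficients.
import numpy as np
import pywt

def logistic_keystream(x0, n, r=3.99):
    """Generate n key bytes from the logistic map x -> r*x*(1-x)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def encrypt(image, x0):
    # A one-level 2-D Haar DWT splits the image into four sub-bands.
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), 'haar')
    bands = [LL, LH, HL, HH]
    # One long key stream, sliced so every sub-band gets a distinct segment.
    ks = logistic_keystream(x0, sum(b.size for b in bands)).astype(np.int64)
    masked, pos = [], 0
    for b in bands:
        q = np.round(b).astype(np.int64)              # quantize coefficients
        masked.append(q ^ ks[pos:pos + q.size].reshape(q.shape))
        pos += q.size
    return masked
```

Decryption XORs the same key-stream segments back and applies pywt.idwt2; key sensitivity comes from the chaotic map's rapid divergence for nearby seeds x0.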


2021 ◽  
Vol 7 ◽  
pp. e628
Author(s):  
Ravinder Rao Peechara ◽  
Sucharita V

Data exchange over the Internet and other access channels is on the rise, which raises security concerns. Many studies have investigated time-efficient, highly randomized encryption methods for such data, but the latest results remain contested for several reasons. Existing methods do not yield completely random keys beyond a certain length, and the resulting repetition makes the processes predictable and susceptible to attacks. Furthermore, newly generated keys require up-to-date algorithms to handle high volumes of transactional data successfully. In this article, solutions to these two critical issues are proposed. First, it is shown that using a chaotic series of events to generate keys is sufficient to obtain a high degree of randomness. This work also proposes a novel, non-traditional validation test that uses a correlation algorithm to determine the true randomness of the generated keys. An approximately 100% probability of a non-repeating key phase over almost infinitely long time intervals minimizes the algorithms' complexity when securing large volumes of data. These algorithms are intended mainly for cloud-based transactions, where data volume is potentially high and extremely variable; the suggested algorithms yield a 3% to 4% improvement in data transmission time. This research has the potential to improve communication systems over the next decade by unblocking decades-long bottlenecks.
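
A minimal sketch of the two ideas follows, under stated assumptions: the logistic map stands in for the article's chaotic series, and a low-lag autocorrelation check stands in for its correlation-based validation test; neither is specified in the abstract.

```python
# Sketch: chaotic key-bit generation plus a correlation-style randomness check.
# Assumptions (not from the article): logistic map as the chaotic series,
# and near-zero low-lag autocorrelation as the acceptance criterion.
import numpy as np

def chaotic_key_bits(x0, n, r=3.99):
    """n key bits from the logistic map, thresholded at 0.5."""
    x, bits = x0, np.empty(n, dtype=np.int8)
    for i in range(n):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x >= 0.5 else 0
    return bits

def autocorrelation(bits, lag):
    """Normalized autocorrelation of the +/-1 version of the bit stream."""
    s = 2.0 * bits - 1.0
    return float(np.dot(s[:-lag], s[lag:]) / (len(s) - lag))

key = chaotic_key_bits(x0=0.6180339887, n=100_000)
# Accept the key only if every low-lag correlation is close to zero.
ok = all(abs(autocorrelation(key, lag)) < 0.01 for lag in range(1, 9))
print("bit balance:", key.mean(), "passes correlation test:", ok)
```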


In this paper, we propose a data encryption algorithm based on randomized XORing of the sub-blocks of data blocks. The data is divided into a number of square blocks (in terms of bits) of equal size; the last block is padded with zeros if required. The proposed algorithm uses pseudo-random keys to generate the order of the sub-blocks within each data block for encryption. Because encryption operates on sub-blocks, it is much faster and highly sensitive to small changes in the keys. A sketch of the scheme is given below.
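
A minimal sketch of the scheme as described, with assumed parameters (64-bit "square" blocks split into eight byte-sized sub-blocks, and Python's seeded PRNG standing in for the paper's pseudo-random key schedule):

```python
# Sketch: encryption by key-driven permutation and XOR of sub-blocks.
# Assumptions (not from the paper): 8x8-bit square blocks, byte-sized
# sub-blocks, and random.Random(key) as the pseudo-random key source.
import random

PER_BLOCK = 8    # byte-sized sub-blocks per 64-bit block

def encrypt_block(block, key):
    """Permute the sub-blocks of one block in a key-derived order,
    then XOR each with a key-derived pad byte."""
    rng = random.Random(key)
    order = list(range(PER_BLOCK))
    rng.shuffle(order)                                # key-driven order
    pad = bytes(rng.randrange(256) for _ in range(PER_BLOCK))
    return bytes(block[i] ^ pad[j] for j, i in enumerate(order))

def encrypt(data, key):
    data += b'\x00' * (-len(data) % PER_BLOCK)        # zero-pad last block
    return b''.join(encrypt_block(data[i:i + PER_BLOCK], key + i)
                    for i in range(0, len(data), PER_BLOCK))
```

Decryption inverts the permutation using the same seeded PRNG. All work happens on small sub-blocks, so the per-block cost is constant, and a one-bit key change reshuffles every order and pad, which is the source of the key sensitivity.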


2012 ◽  
Vol 12 (5&6) ◽  
pp. 395-403
Author(s):  
Jan Bouda ◽  
Matej Pivoluska ◽  
Martin Plesch

The lack of perfect randomness can cause significant problems in securing communication between two parties. McInnes and Pinkas (1991) proved that unconditionally secure encryption is impossible when the key is sampled from a weak random source. The adversary can always gain some information about the plaintext, regardless of the cryptosystem design. Most notably, the adversary can obtain full information about the plaintext if he has access to just two bits of information about the source (irrespective of the length of the key). In this paper we show that for every weak random source there is a cryptosystem with a classical plaintext, a classical key, and a quantum ciphertext that bounds the adversary's probability $p$ of correctly guessing the plaintext strictly below the McInnes-Pinkas bound, except for a single case, where it coincides with the bound. In addition, regardless of the source of randomness, the adversary's probability $p$ is strictly smaller than $1$ as long as there is some uncertainty in the key (the Shannon/min-entropy is non-zero). These results are another demonstration that quantum information processing can solve cryptographic tasks with strictly higher security than classical information processing.
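
The final claim can be illustrated numerically: the adversary's best single guess of a key drawn from a distribution $P$ succeeds with probability $\max_k P(k) = 2^{-H_\infty}$, so non-zero min-entropy keeps the guessing probability strictly below $1$. A small sketch with an arbitrary example distribution (not taken from the paper):

```python
# Sketch: Shannon and min-entropy of a weak key source, and the induced
# best-guess probability. The distribution below is an arbitrary example.
import math

def shannon_entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    return -math.log2(max(p))

weak_source = [0.7, 0.1, 0.1, 0.1]      # a heavily biased 2-bit key
H, Hmin = shannon_entropy(weak_source), min_entropy(weak_source)
p_guess = 2 ** -Hmin                    # adversary's best single guess
print(f"H = {H:.3f} bits, H_min = {Hmin:.3f} bits, best guess = {p_guess:.2f}")
# As long as H_min > 0, the best-guess probability max(p) stays below 1.
```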

