High-throughput Von Neumann post-processing for random number generator

Author(s):  
Ruilin Zhang ◽  
Sijia Chen ◽  
Chao Wan ◽  
Hirofumi Shinohara
2018 ◽  
Vol 27 (06) ◽  
pp. 1850095
Author(s):  
Chenyang Guo ◽  
Yujie Zhou

In this paper, a new method called the dynamic equilibrium algorithm (DEA) is proposed to enhance randomness and to address problems existing in true random number generators (TRNGs). First, the advantages and defects of the LFSR as a post-processing module are discussed: when 1000 groups of data were sampled, only 517 groups passed all 15 tests in NIST SP 800-22 at a pass rate of 0.981. The DEA addresses this problem: its essence is to keep the distribution of overlapping templates approximately uniform, improving the bit entropy through compression of the data. The method is easy to implement in both software and hardware, and the pass rate increases by more than 40% at a low compression rate.
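The abstract does not give the DEA itself, but the LFSR post-processing it compares against is standard: raw TRNG bits are XORed into the feedback of a linear feedback shift register, which whitens short-range bias. A minimal sketch (the tap positions, register width, and seed here are illustrative assumptions, not the paper's parameters):

```python
def lfsr_whiten(raw_bits, taps=(16, 14, 13, 11), width=16, seed=1):
    """Whiten raw TRNG bits with a Fibonacci LFSR.

    Each raw bit is XORed into the LFSR feedback; the register's
    low bit is emitted as the post-processed output. Illustrative
    sketch only -- taps/width/seed are assumptions.
    """
    mask = (1 << width) - 1
    state = seed & mask
    out = []
    for b in raw_bits:
        fb = b
        for t in taps:                      # XOR the tapped register bits
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & mask  # shift in the feedback bit
        out.append(state & 1)
    return out
```

Unlike the von Neumann corrector, this keeps the output rate equal to the input rate (compression rate 1), which is why an LFSR alone may fail SP 800-22 on strongly correlated input, as the reported 517/1000 result suggests.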


2020 ◽  
Vol 28 (10) ◽  
pp. 2171-2181
Author(s):  
Naoya Onizawa ◽  
Shogo Mukaida ◽  
Akira Tamakoshi ◽  
Hitoshi Yamagata ◽  
Hiroyuki Fujita ◽  
...  

2019 ◽  
Vol 2019 ◽  
pp. 1-11
Author(s):  
Hojoong Park ◽  
Yongjin Yeom ◽  
Ju-Sung Kang

We propose a new lightweight BCH code corrector for random number generators in which the bitwise dependence of the output is controllable. The proposed corrector is suitable for a lightweight environment, and the degree of dependence among its output bits can be adjusted according to the bias of the input bits. Hitherto, most correctors based on linear codes have focused on reducing the bias among the output bits under the assumption that the biased input bits are independent. However, the output bits of a linear code corrector are inherently not independent even when the input bits are, and there have been no results dealing with the independence of the output bits. The well-known von Neumann corrector has an inefficient compression rate, and the length of its output is nondeterministic. Since NIST's conditioning component uses heavyweight cryptographic algorithms to reduce the bias of the input bits, it is not appropriate for a lightweight environment. We have therefore concentrated on linear code correctors and obtained a lightweight BCH code corrector with measurable dependence among the output bits as well as measurable bias. Moreover, we provide simulations to examine our results.
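The von Neumann corrector criticized above is the classic debiasing scheme: input bits are consumed in non-overlapping pairs, `01` emits `0`, `10` emits `1`, and `00`/`11` are discarded. A minimal sketch illustrating why its output length is nondeterministic and its compression rate poor (at best one output bit per two input bits, 1/4 on unbiased input):

```python
def von_neumann(bits):
    """Classic von Neumann corrector.

    Consumes non-overlapping pairs: (0,1) -> 0, (1,0) -> 1,
    equal pairs are discarded. Output length depends on the data.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)   # first bit of an unequal pair
    return out
```

On input with bias p toward 1, each pair yields a bit with probability 2p(1 - p), so heavily biased sources produce very few output bits; this variable yield is exactly the drawback the abstract's fixed-rate linear code correctors avoid.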


2017 ◽  
Vol 88 (9) ◽  
pp. 096105 ◽  
Author(s):  
Yi Qian ◽  
Futian Liang ◽  
Xinzhe Wang ◽  
Feng Li ◽  
Lian Chen ◽  
...  

2015 ◽  
Vol 61 (2) ◽  
pp. 199-204 ◽  
Author(s):  
Szymon Łoza ◽  
Łukasz Matuszewski ◽  
Mieczysław Jessa

Abstract Today, cryptographic security depends primarily on having strong keys and keeping them secret. The keys should be produced by generators of random numbers that are reliable and robust against external manipulation. To hamper various attacks, such a generator should be implemented in the same chip as the cryptographic system that uses the random numbers, which forces the designer to build the random number generator purely digitally. Unfortunately, the resulting sequences are biased and fail many statistical tests; the output of the random number generator therefore has to be subjected to a transformation called post-processing. In this paper, the hash function SHA-256 is proposed as post-processing for bits produced by a combined random bit generator exploiting the jitter observed in ring oscillators (ROs). All components, the random number generator and the SHA-256 core, are implemented in a single Field Programmable Gate Array (FPGA). We expect that the proposed solution, implemented in the same FPGA together with a cryptographic system, is more attack-resistant owing to its many sources of randomness with significantly different nominal frequencies.
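The hash-based post-processing described above can be sketched in software terms: raw jitter-derived bits are packed into fixed-size blocks and each block is conditioned through SHA-256. This is a behavioral model only, not the paper's FPGA design; the 512-bit block size is an assumption chosen to match SHA-256's input block width:

```python
import hashlib

def sha256_postprocess(raw_bits, block_bits=512):
    """Condition raw RO-jitter bits with SHA-256.

    Packs each full block of raw bits into bytes and hashes it;
    partial trailing blocks are dropped. Behavioral sketch only.
    """
    out = bytearray()
    n_bytes = block_bits // 8
    for i in range(0, len(raw_bits) - block_bits + 1, block_bits):
        block = raw_bits[i:i + block_bits]
        raw = int(''.join(map(str, block)), 2).to_bytes(n_bytes, 'big')
        out += hashlib.sha256(raw).digest()   # 32 conditioned bytes per block
    return bytes(out)
```

Note the built-in compression: each 512-bit block yields 256 output bits, so residual bias and correlation in the raw stream are absorbed at a fixed 2:1 rate.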

