Information Theory Based Evaluation of the RC4 Stream Cipher Outputs

Entropy ◽ 2021 ◽ Vol 23 (7) ◽ pp. 896
Author(s):  
Evaristo José Madarro-Capó ◽  
Carlos Miguel Legón-Pérez ◽  
Omar Rojas ◽  
Guillermo Sosa-Gómez

This paper presents a criterion, based on information theory, to measure the amount of average information that the output sequences of RC4 provide about its internal state. The test statistic used is the sum of the maximum-likelihood estimates of the entropies $H(j_t \mid z_t)$, corresponding to the probability distributions $P(j_t \mid z_t)$ of the sequences of random variables $(j_t)_{t \in T}$ and $(z_t)_{t \in T}$, which are independent but not identically distributed, where $z_t$ are the known output values and $j_t$ is one of the unknown elements of the internal state of RC4. It is demonstrated experimentally that the test statistic identifies the most vulnerable RC4 outputs, and it is proposed as a vulnerability metric for each RC4 output sequence with respect to the iterative probabilistic attack.
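The statistic is straightforward to estimate empirically. Below is a minimal sketch (not the authors' code) of one way to do so: run RC4 under many random keys, tally the joint occurrences of $z_t$ and $j_t$ at each output step, and sum the maximum-likelihood entropy estimates. N, T, NUM_KEYS, and the 16-byte key length are illustrative choices, and ML entropy estimates from limited samples are biased low.

```python
# Sketch: empirical estimate of sum_t H(j_t | z_t) for RC4, under the
# assumptions stated above. Not the authors' implementation.
import os
from collections import defaultdict
from math import log2

N = 256          # RC4 state size
T = 32           # number of output positions to analyze
NUM_KEYS = 20000 # number of random keys to sample

def rc4_trace(key, t_max):
    """Run RC4 and return the first t_max (z_t, j_t) pairs."""
    S = list(range(N))
    j = 0
    for i in range(N):                      # key-scheduling algorithm (KSA)
        j = (j + S[i] + key[i % len(key)]) % N
        S[i], S[j] = S[j], S[i]
    i = j = 0
    trace = []
    for _ in range(t_max):                  # pseudo-random generation (PRGA)
        i = (i + 1) % N
        j = (j + S[i]) % N
        S[i], S[j] = S[j], S[i]
        z = S[(S[i] + S[j]) % N]
        trace.append((z, j))
    return trace

# counts[t][z][j] = number of keys with output z and index j at step t
counts = [defaultdict(lambda: defaultdict(int)) for _ in range(T)]
for _ in range(NUM_KEYS):
    key = list(os.urandom(16))
    for t, (z, j) in enumerate(rc4_trace(key, T)):
        counts[t][z][j] += 1

# ML estimate of H(j_t | z_t) = sum_z P(z) * H(j_t | z_t = z)
statistic = 0.0
for t in range(T):
    h_t = 0.0
    for z, j_counts in counts[t].items():
        n_z = sum(j_counts.values())
        p_z = n_z / NUM_KEYS
        h_z = -sum((c / n_z) * log2(c / n_z) for c in j_counts.values())
        h_t += p_z * h_z
    print(f"t={t:2d}  H(j_t|z_t) ~ {h_t:.3f} bits")
    statistic += h_t

print(f"sum over t: {statistic:.3f} bits")
```

Output positions whose estimated $H(j_t \mid z_t)$ is smallest leak the most information about $j_t$, which is the sense in which the statistic ranks outputs by vulnerability.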

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (more precisely, with its probability distribution). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept covered is relative entropy, also known as the Kullback–Leibler divergence, which measures the "disparity" between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
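A small numerical illustration (not from the chapter) of the three quantities mentioned: entropy $H(X)$, conditional entropy $H(X \mid Y)$, and the Kullback–Leibler divergence $D(P \| Q)$, for discrete distributions.

```python
import numpy as np

def entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), with 0 log 0 taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def conditional_entropy(joint):
    """H(X|Y) = H(X,Y) - H(Y), given the joint distribution p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    p_y = joint.sum(axis=0)               # marginalize out X to get p(y)
    return entropy(joint.ravel()) - entropy(p_y)

def kl_divergence(p, q):
    """D(P||Q) = sum_x p(x) log2(p(x)/q(x)); requires q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# A fair coin has one bit of entropy; a biased coin has less.
print(entropy([0.5, 0.5]))        # 1.0
print(entropy([0.9, 0.1]))        # ~0.469

# Joint p(x, y): rows index x, columns index y.
joint = np.array([[0.25, 0.25],
                  [0.25, 0.25]])  # X and Y independent and uniform
print(conditional_entropy(joint)) # 1.0: knowing Y tells us nothing about X

# D(P||Q) is zero iff P = Q, and asymmetric in general.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.737
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.531
```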


2007 ◽ Vol 177 (7) ◽ pp. 1715-1727
Author(s):  
Violeta Tomašević ◽  
Slobodan Bojanić ◽  
Octavio Nieto-Taladriz

Author(s):  
RONALD R. YAGER

We look at the issue of obtaining a variance-like measure associated with probability distributions over ordinal sets; we call these dissonance measures. We specify some general properties desired in such measures and point out the centrality of the cumulative distribution function in formulating the concept of dissonance. We introduce some specific examples of measures of dissonance.
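To make the CDF-based idea concrete, here is a minimal sketch of one well-known dispersion measure for ordinal data, the sum of $F_i(1 - F_i)$ over the cumulative distribution. It is offered as an illustration in the spirit of the abstract, not as Yager's own definition: it is zero exactly when all mass sits on a single category (consensus) and maximal when mass splits between the two extreme categories (maximal dissonance).

```python
# Illustrative CDF-based ordinal dispersion measure (an assumption, not
# necessarily one of Yager's dissonance measures).
def dissonance(p):
    """p: probabilities over ordered categories c_1 < ... < c_n."""
    total, F = 0.0, 0.0
    for p_i in p[:-1]:          # F at the last category is always 1
        F += p_i
        total += F * (1.0 - F)
    return total

# All mass on one category: no dissonance.
print(dissonance([0.0, 1.0, 0.0, 0.0]))      # 0.0
# Mass split between the two extremes: maximal dissonance.
print(dissonance([0.5, 0.0, 0.0, 0.5]))      # 0.75
# Uniform over four categories: intermediate.
print(dissonance([0.25, 0.25, 0.25, 0.25]))  # 0.625
```

Note that, unlike variance, this measure uses only the ordering of the categories (through the CDF), never their numeric values, which is what makes it suitable for ordinal sets.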

