A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution-free. A quadratic-time algorithm is described for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show the superiority of our bound to a method inspired by Fano's inequality in which the continuous random variable is discretized.
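The abstract's bound rests on the Dvoretzky-Kiefer-Wolfowitz inequality, which gives a distribution-free confidence band around an empirical CDF. The sketch below is not the authors' quadratic-time algorithm; it only illustrates the DKW ingredient: a band of half-width eps = sqrt(ln(2/alpha) / (2n)) (using Massart's tight constant) around each class-conditional empirical CDF, one per binary channel input. The function name `dkw_band` and the Gaussian example data are illustrative assumptions.

```python
import numpy as np

def dkw_band(samples, alpha=0.05):
    """Empirical CDF at the sorted sample points, with a distribution-free
    DKW confidence band at level 1 - alpha.

    The DKW inequality (with Massart's constant) states
        P( sup_x |F_n(x) - F(x)| > eps ) <= 2 * exp(-2 * n * eps**2),
    so eps = sqrt(ln(2/alpha) / (2n)) yields a 1 - alpha band for the
    true CDF F, whatever the underlying distribution.
    """
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    ecdf = np.arange(1, n + 1) / n                 # F_n evaluated at sorted points
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    lower = np.clip(ecdf - eps, 0.0, 1.0)          # keep the band inside [0, 1]
    upper = np.clip(ecdf + eps, 0.0, 1.0)
    return x, ecdf, lower, upper

# One band per channel input symbol (class-conditional output samples);
# the Gaussian channel outputs here are purely illustrative.
rng = np.random.default_rng(0)
x0, F0, lo0, hi0 = dkw_band(rng.normal(0.0, 1.0, size=200))
x1, F1, lo1, hi1 = dkw_band(rng.normal(1.0, 1.0, size=200))
```

Because the band holds simultaneously over all x, any functional of the CDFs that is monotone in them, such as the mutual information bound described above, can be bounded with the same confidence level.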