A computationally efficient estimator for mutual information
2008 ◽ Vol 464 (2093) ◽ pp. 1203-1215
Mutual information quantifies the degree of determinism in a relationship between random variables, and thus plays an important role in exploratory data analysis. We investigate a class of non-parametric estimators for mutual information based on the nearest-neighbour structure of observations in both the joint and marginal spaces. We demonstrate that, unless both marginal spaces are one-dimensional, a well-known estimator of this type can be computationally expensive under certain conditions, and we propose a computationally efficient alternative with a time complexity of order O(N log N) as the number of observations N → ∞.
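The class of nearest-neighbour estimators described in the abstract is exemplified by the Kraskov–Stögbauer–Grassberger (KSG) estimator, which combines k-th-neighbour distances in the joint space with neighbour counts in the marginals. The sketch below is an illustration of that well-known estimator using SciPy k-d trees, not the computationally efficient alternative proposed in this paper; the function name `ksg_mi` and the choice of k are our own.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """Sketch of a KSG-style nearest-neighbour mutual information estimate.

    x, y : arrays of N observations (each may be multi-dimensional).
    k    : number of nearest neighbours used in the joint space.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])

    # Distance to the k-th nearest neighbour in the joint space,
    # using the Chebyshev (max-norm) metric as in the KSG construction.
    dists, _ = cKDTree(xy).query(xy, k=k + 1, p=np.inf)
    eps = dists[:, -1]

    # Count marginal points strictly closer than eps (subtract 1 for the
    # point itself); the small offset enforces the strict inequality.
    nx = cKDTree(x).query_ball_point(
        x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(
        y, eps - 1e-12, p=np.inf, return_length=True) - 1

    # KSG estimate: psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>.
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

For correlated Gaussians with correlation ρ, the true mutual information is −½ log(1 − ρ²), which this estimate should approach for moderate sample sizes; the O(N log N) behaviour discussed in the paper concerns how such neighbour searches scale when the marginal spaces are not one-dimensional.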