The derivation of mutual information and covariance function using centered random variables

Author(s):  
Fatimah Abdul Razak


1999 ◽  
Vol 36 (4) ◽  
pp. 1031-1044 ◽  
Author(s):  
Hwai-Chung Ho ◽  
William P. McCormick

Let {Xn, n ≥ 0} be a stationary Gaussian sequence of standard normal random variables with covariance function r(n) = EX0Xn. Let Sn = X0 + ⋯ + Xn denote the partial sum and Mn = max{Xi : 0 ≤ i ≤ n} the maximum. Under some mild regularity conditions on r(n) and the condition that r(n) ln n = o(1) or (r(n) ln n)−1 = O(1), the asymptotic distribution of the pair (Sn, Mn), suitably normalized, is obtained. Continuous-time results are also presented, as well as a tube-formula tail-area approximation to the joint distribution of the sum and maximum.
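The flavor of the result can be seen in a small Monte Carlo sketch (not the paper's derivation) of the i.i.d. special case r(n) = 0 for n ≥ 1, which trivially satisfies r(n) ln n = o(1); here the normalized sum is approximately standard normal, the normalized maximum approximately Gumbel, and the two are asymptotically independent. The constants a_n, b_n below are the classical normalizing sequences for the Gaussian maximum; everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 2000
x = rng.standard_normal((reps, n))   # i.i.d. rows: covariance r(n) = 0 for n >= 1

s = x.sum(axis=1) / np.sqrt(n)       # normalized sum, approximately N(0, 1)
m = x.max(axis=1)                    # maximum M_n of each sample path

# Classical normalizing constants for the maximum of n standard normals.
a_n = np.sqrt(2 * np.log(n))
b_n = a_n - (np.log(np.log(n)) + np.log(4 * np.pi)) / (2 * a_n)
g = a_n * (m - b_n)                  # approximately standard Gumbel

# Under r(n) ln n = o(1) the sum and maximum are asymptotically independent,
# so the empirical correlation between s and g should be close to zero.
corr = float(np.corrcoef(s, g)[0, 1])
```

For finite n the correlation is small but positive (Stein's identity gives Cov(Sn, Mn) = 1 in the i.i.d. case), and it vanishes as n grows.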


2015 ◽  
Vol 2015 ◽  
pp. 1-12 ◽  
Author(s):  
Guoping Zeng

There are various definitions of mutual information. Essentially, these definitions can be divided into two classes: (1) definitions with random variables and (2) definitions with ensembles. However, there are some mathematical flaws in these definitions. For instance, Class 1 definitions either neglect the probability spaces or assume that the two random variables share the same probability space. Class 2 definitions redefine the marginal probabilities from the joint probabilities. In fact, the marginal probabilities are given by the ensembles and should not be redefined from the joint probabilities. Both Class 1 and Class 2 definitions assume that a joint distribution exists. Yet they all ignore the important fact that the joint distribution, or joint probability measure, is not unique. In this paper, we first present a new unified definition of mutual information that covers the various existing definitions and fixes their mathematical flaws. Our idea is to define the joint distribution of two random variables by taking the marginal probabilities into consideration. Next, we establish some properties of the newly defined mutual information. We then propose a method to calculate mutual information in machine learning. Finally, we apply our newly defined mutual information to credit scoring.
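In the discrete case, any such definition ultimately evaluates the standard formula I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))]. A minimal sketch (not the paper's unified construction) that computes this from a joint probability table, recovering the marginals by summation:

```python
# Illustrative sketch: mutual information of two discrete random variables
# from their joint probability table; all names here are our own, not the paper's.
import numpy as np

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ), in nats."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (row sums)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (column sums)
    mask = joint > 0                        # convention: 0 * log 0 = 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# Independent variables: joint is the outer product of the marginals, so I = 0.
p_indep = np.outer([0.3, 0.7], [0.4, 0.6])
# Perfectly dependent fair bits: I(X;Y) = log 2 nats.
p_dep = np.array([[0.5, 0.0], [0.0, 0.5]])

print(mutual_information(p_indep))   # ~0.0
print(mutual_information(p_dep))     # ~log 2 ≈ 0.693
```

The paper's point is precisely that the table `joint` is not determined by the two marginals alone, which is why its unified definition fixes the joint distribution with the marginals taken into account.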


2017 ◽  
Vol 25 (1) ◽  
pp. 1-10 ◽  
Author(s):  
Tetiana O. Ianevych ◽  
Yuriy V. Kozachenko ◽  
Viktor B. Troshki

Abstract. In this paper we construct goodness-of-fit tests that incorporate several components: the expectation and covariance function for the identification of a non-centered univariate random sequence, or the auto-covariances and cross-covariances for the identification of a centered multivariate random sequence. To construct the corresponding estimators and investigate their properties, we utilize the theory of square Gaussian random variables.
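The discrepancies such tests are built on can be illustrated with a toy sketch: compare the empirical auto-covariances of a centered sequence with a hypothesized covariance function. This is only the raw ingredient; the paper's actual test statistics and critical values, derived via square Gaussian random variables, are different.

```python
# Illustrative sketch only, not the paper's test statistic.
import numpy as np

def sample_autocov(x, lag):
    """Empirical auto-covariance of a centered sequence at a given lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return float(np.dot(x[: n - lag], x[lag:]) / n)

rng = np.random.default_rng(1)
n = 20000
x = rng.standard_normal(n)          # white noise: r(0) = 1, r(k) = 0 for k > 0

hypothesized = {0: 1.0, 1: 0.0, 2: 0.0}
for lag, r in hypothesized.items():
    print(lag, round(sample_autocov(x, lag) - r, 3))  # discrepancies near 0
```

A goodness-of-fit test then asks whether these discrepancies, suitably normalized, are larger than sampling variability allows under the hypothesized covariance function.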


2018 ◽  
Vol 115 (40) ◽  
pp. 9956-9961 ◽  
Author(s):  
Xianli Zeng ◽  
Yingcun Xia ◽  
Howell Tong

Quantifying the dependence between two random variables is a fundamental issue in data analysis, and thus many measures have been proposed. Recent studies have focused on the renowned mutual information (MI) [Reshef DN, et al. (2011) Science 334:1518–1524]. However, “Unfortunately, reliably estimating mutual information from finite continuous data remains a significant and unresolved problem” [Kinney JB, Atwal GS (2014) Proc Natl Acad Sci USA 111:3354–3359]. In this paper, we examine the kernel estimation of MI and show that the bandwidths involved should be equalized. We consider a jackknife version of the kernel estimate with equalized bandwidth and allow the bandwidth to vary over an interval. We estimate the MI by the largest value among these kernel estimates and establish the associated theoretical underpinnings.
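The core idea of a kernel MI estimate with a single equalized bandwidth can be sketched as follows. This is a hedged toy version: it uses leave-one-out Gaussian kernel density estimates with the same bandwidth h for both variables, but it does not jackknife or maximize over an interval of bandwidths as the paper's estimator does.

```python
# Toy kernel estimate of mutual information with one equalized bandwidth h.
import numpy as np

def kde_mi(x, y, h):
    """Leave-one-out Gaussian-kernel estimate of I(X;Y) in nats."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    kx = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    ky = np.exp(-0.5 * ((y[:, None] - y[None, :]) / h) ** 2)
    np.fill_diagonal(kx, 0.0)            # leave each point out of its own estimate
    np.fill_diagonal(ky, 0.0)
    fx = kx.sum(axis=1)                  # unnormalized marginal density estimates;
    fy = ky.sum(axis=1)                  # the kernel constants cancel in the ratio,
    fxy = (kx * ky).sum(axis=1)          # leaving a single factor of (n - 1)
    return float(np.mean(np.log((n - 1) * fxy / (fx * fy))))

rng = np.random.default_rng(2)
n = 500
x = rng.standard_normal(n)
y_indep = rng.standard_normal(n)             # independent of x
y_dep = x + 0.1 * rng.standard_normal(n)     # strongly dependent on x

mi_indep = kde_mi(x, y_indep, h=0.3)   # close to 0
mi_dep = kde_mi(x, y_dep, h=0.3)       # clearly positive
```

Using one bandwidth for both coordinates is the "equalized" choice the paper argues for; the value of h itself (0.3 here) is an arbitrary illustration, whereas the paper lets it range over an interval and takes the largest resulting estimate.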


2019 ◽  
Vol 148 ◽  
pp. 9-16 ◽  
Author(s):  
Aleksandr Beknazaryan ◽  
Xin Dang ◽  
Hailin Sang
