variational distance
Recently Published Documents


TOTAL DOCUMENTS: 29 (FIVE YEARS: 2)

H-INDEX: 6 (FIVE YEARS: 0)

Foundations ◽ 2021 ◽ Vol 1 (2) ◽ pp. 256-264 ◽ Author(s): Takuya Yamano

A non-uniform (skewed) mixture of probability density functions occurs in various disciplines, and one needs a measure of similarity between the mixture and its respective constituents, together with bounds on that measure. We introduce a skewed Jensen–Fisher divergence based on relative Fisher information and provide bounds in terms of the skewed Jensen–Shannon divergence and the variational distance. The defined measure coincides with the one obtained from the skewed Jensen–Shannon divergence via the de Bruijn identity. Our results follow from applying the logarithmic Sobolev inequality and the Poincaré inequality.





2020 ◽ Vol 57 (1) ◽ pp. 314-331 ◽ Author(s): Michael Falk, Simone A. Padoan, Stefano Rizzelli

Abstract: It is well known, and readily seen, that the maximum of n independent random variables uniformly distributed on [0, 1], suitably standardised, converges in total variation distance, as n increases, to the standard negative exponential distribution. We extend this result to higher dimensions by considering copulas. We show that the strong convergence result holds for copulas that are in a differential neighbourhood of a multivariate generalised Pareto copula. Sklar's theorem then implies convergence in variational distance of the maximum of n independent and identically distributed random vectors with arbitrary common distribution function and, under conditions on the marginals, of its appropriately normalised version. We illustrate how these convergence results can be exploited to establish the almost-sure consistency of some estimation procedures for max-stable models using sample maxima.
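The one-dimensional starting point of the abstract is easy to check numerically: if M_n is the maximum of n i.i.d. Uniform(0, 1) variables, then n(M_n − 1) has CDF (1 + x/n)^n on [−n, 0], which converges to e^x, the CDF of the standard negative exponential distribution. A small Monte Carlo sketch (simulation only, not the paper's copula argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 20000

# Standardised maxima n*(M_n - 1) of n i.i.d. Uniform(0,1) samples.
u = rng.random((reps, n))
z = n * (u.max(axis=1) - 1.0)  # values lie in (-n, 0]

# CDF of the limiting standard negative exponential: F(x) = exp(x), x <= 0.
grid = np.linspace(-5.0, 0.0, 11)
emp = np.array([(z <= x).mean() for x in grid])
lim = np.exp(grid)
print(np.max(np.abs(emp - lim)))  # small: empirical CDF tracks the limit
```

The paper's result is stronger than this pointwise CDF comparison: it gives convergence in total variation, and extends it to random vectors via generalised Pareto copulas.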





2018 ◽ Vol 33 (2) ◽ pp. 186-204 ◽ Author(s): Jianping Yang, Wanwan Xia, Taizhong Hu

This paper studies the relation between extropy and variational distance. We determine the distributions that attain the minimum and maximum extropy among all distributions within a given variational distance of any given probability distribution, obtain the tightest upper bound on the difference between the extropies of any two probability distributions subject to a variational distance constraint, and establish an analytic formula for the confidence interval of an extropy. Such a study parallels that of Ho and Yeung [3] concerning entropy; however, the proofs of the main results in this paper differ from those in Ho and Yeung [3], and in fact our arguments simplify several of their proofs.
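The two quantities the paper relates can be sketched directly. This is a minimal illustration, assuming the usual discrete extropy J(p) = −Σ_i (1 − p_i) log(1 − p_i) (the complementary dual of Shannon entropy introduced by Lad, Sanfilippo, and Agró) and the L1 variational distance; the extremal distributions and the tight bound themselves are the paper's results and are not reproduced here:

```python
import numpy as np

def extropy(p):
    # Discrete extropy J(p) = -sum_i (1 - p_i) * log(1 - p_i), natural log;
    # terms with p_i = 1 contribute zero by convention.
    q = 1.0 - p
    mask = q > 0
    return float(-np.sum(q[mask] * np.log(q[mask])))

def variational_distance(p, q):
    # Variational (L1) distance between discrete distributions.
    return float(np.sum(np.abs(p - q)))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])
print(extropy(p), extropy(q), variational_distance(p, q))
```

For a fair coin the extropy equals log 2, matching the entropy, which is the two-point case where the two dual measures coincide.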


