rearrangement inequality
Recently Published Documents


TOTAL DOCUMENTS: 28 (five years: 2)
H-INDEX: 6 (five years: 0)

2020, Vol. 0 (0)
Author(s): Chunxia Tao

Abstract: Through a conformal map, isoperimetric inequalities are equivalent to Hardy–Littlewood–Sobolev (HLS) inequalities involving the Poisson-type kernel on the upper half space. From the analytical point of view, we want to know whether there exists a reverse analogue for the Poisson-type kernel. In this work, we give an affirmative answer to this question. We first establish the reverse Stein–Weiss inequality with the Poisson-type kernel, finding that the indices p and q′ appearing in the reverse inequality lie in the interval (0, 1), consistent with the index ranges of the classical reverse HLS and Stein–Weiss inequalities. We then establish the existence and asymptotic behavior of the extremal functions of this inequality. Furthermore, for the reverse HLS inequalities involving the Poisson-type kernel, we establish the regularity of the positive solutions to the corresponding Euler–Lagrange system and give necessary and sufficient conditions for the existence of solutions. Finally, for the conformally invariant index, we classify the extremal functions of the latter reverse inequality and compute the sharp constant. Our methods are based on a reversed version of the Hardy inequality in high dimensions, the Riesz rearrangement inequality, and the method of moving spheres.
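For orientation, the classical (non-reverse) HLS inequality that this work inverts is sketched below in its standard textbook form; the paper's version replaces the kernel |x−y|^(−λ) by a Poisson-type kernel on the upper half space and reverses the inequality for indices in (0, 1):

```latex
% Classical HLS inequality on \mathbb{R}^n (standard form, stated for context only;
% not the Poisson-type version studied in the paper).
\iint_{\mathbb{R}^n\times\mathbb{R}^n}\frac{f(x)\,g(y)}{|x-y|^{\lambda}}\,dx\,dy
\;\le\; C_{n,\lambda,p}\,\lVert f\rVert_{L^{p}(\mathbb{R}^n)}\,\lVert g\rVert_{L^{q}(\mathbb{R}^n)},
\qquad 0<\lambda<n,\quad \frac{1}{p}+\frac{1}{q}+\frac{\lambda}{n}=2,\quad 1<p,\,q<\infty.
```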


Entropy, 2018, Vol. 20 (12), pp. 959
Author(s): Mateu Sbert, Min Chen, Jordi Poch, Anton Bardera

Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory and are widely used in many fields. Since cross entropy is the negated logarithm of the likelihood, minimizing cross entropy is equivalent to maximizing the likelihood, and cross entropy is therefore used as an optimization objective in machine learning. K-L divergence also stands on its own as a commonly used measure of the difference between two distributions. In this paper, we introduce new inequalities for cross entropy and K-L divergence by exploiting the fact that cross entropy is the negated logarithm of a weighted geometric mean. We first apply the well-known rearrangement inequality, then a recent theorem on weighted Kolmogorov means, and finally introduce a new theorem that applies directly to inequalities between K-L divergences. To illustrate our results, we show numerical examples with concrete distributions.
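A minimal numerical sketch of the ordering effect the rearrangement inequality captures here (the distributions are illustrative, not taken from the paper): since H(p, q) = −Σᵢ pᵢ log qᵢ and log is increasing, pairing the largest pᵢ with the largest qᵢ maximizes Σᵢ pᵢ log qᵢ and therefore minimizes the cross entropy, while the opposite ordering maximizes it.

```python
import numpy as np

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i)."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i)."""
    return np.sum(p * np.log(p / q))

# Two discrete distributions (illustrative values, not from the paper).
p = np.array([0.5, 0.3, 0.2])          # already sorted in decreasing order
q = np.array([0.1, 0.6, 0.3])          # a second distribution, unsorted

# Rearrangement inequality: sum_i p_i * log(q_i) is largest when p and q
# are similarly ordered, so the cross entropy H(p, q) is smallest for
# that pairing and largest for the opposite ordering.
q_similar = np.sort(q)[::-1]           # q in decreasing order, like p
q_opposite = np.sort(q)                # q in increasing order, against p

print(cross_entropy(p, q_similar))     # ~1.077: minimum over permutations of q
print(cross_entropy(p, q_opposite))    # ~1.615: maximum over permutations of q
print(kl_divergence(p, q_similar))     # K-L divergence for the similar ordering
```

Running this prints roughly 1.077 for the similarly ordered pairing and 1.615 for the opposite ordering, confirming the predicted direction of the inequality.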


2018, pp. 127–132
Author(s): Fernando Albiac, José L. Ansorena, Denny Leung, Ben Wallis

2017, Vol. 69 (5), pp. 1036–1063
Author(s): Eric Carlen, Francesco Maggi

Abstract: We provide a simple, general argument to obtain improvements of concentration-type inequalities from improvements of their corresponding isoperimetric-type inequalities. We apply this argument to obtain robust improvements of the Brunn–Minkowski inequality (for Minkowski sums between generic sets and convex sets) and of the Gaussian concentration inequality. The former is then used to obtain a robust improvement of the Riesz rearrangement inequality under certain natural conditions. These conditions are compatible with applications to a finite-range nonlocal isoperimetric problem arising in statistical mechanics.
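For reference, the classical (unimproved) statements that the paper makes robust are the following; f*, g*, h* denote symmetric decreasing rearrangements, and |·| is Lebesgue measure. The paper's contribution is a quantitative deficit version of these under the stated conditions, which the sketch below does not attempt to reproduce:

```latex
% Riesz rearrangement inequality (classical form):
\iint_{\mathbb{R}^n\times\mathbb{R}^n} f(x)\,g(x-y)\,h(y)\,dx\,dy
\;\le\;
\iint_{\mathbb{R}^n\times\mathbb{R}^n} f^{*}(x)\,g^{*}(x-y)\,h^{*}(y)\,dx\,dy.

% Brunn--Minkowski inequality for the Minkowski sum A+B=\{a+b : a\in A,\ b\in B\}:
|A+B|^{1/n} \;\ge\; |A|^{1/n} + |B|^{1/n}.
```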


Inequalities, 2012, pp. 61–67
Author(s): Zdravko Cvetkovski
