A New Refinement of the Jensen Inequality with Applications in Information Theory

2020 ◽  
Vol 44 (1) ◽  
pp. 267-278 ◽  
Author(s):  
Muhammad Adil Khan ◽  
Đilda Pečarić ◽ 
Josip Pečarić


2012 ◽ 
Vol 8 (1) ◽  
pp. 17-32 ◽  
Author(s):  
K. Jain ◽  
Ram Saraswat

A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures

An information inequality is established in terms of Csiszár f-divergence measures by using convexity arguments and the Jensen inequality. This inequality is applied to compare particular divergences that play a fundamental role in information theory, such as the Kullback–Leibler distance, Hellinger discrimination, chi-square distance, J-divergence, and others.
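
For context, the Csiszár f-divergence referred to above is recalled here in its standard discrete form (the authors' notation may differ): for probability distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$ and a convex function $f$ with $f(1)=0$,

\[
C_f(P,Q)=\sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right),
\]

and the divergences compared in the paper are special cases: $f(t)=t\log t$ gives the Kullback–Leibler distance, $f(t)=(t-1)^2$ the chi-square distance, $f(t)=\tfrac{1}{2}(\sqrt{t}-1)^2$ the Hellinger discrimination, and $f(t)=(t-1)\log t$ the J-divergence.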


2021 ◽  
Vol 12 (5) ◽  
pp. 1-27
Author(s):  
Faiza Rubab ◽  
Hira Nabi ◽  
Asif R. Khan

We give generalizations and refinements of the Jensen and Jensen–Mercer inequalities by using weights which satisfy the conditions of the Jensen and Jensen–Steffensen inequalities. We also give some refinements of the discrete and integral versions of the generalized Jensen–Mercer inequality, which are shown to improve the upper bound for the Jensen difference given in [32]. Applications of our work include new bounds for some important inequalities used in information theory and generalizations of the relations among means.
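
For reference, a standard form of the Jensen–Mercer inequality being refined (stated with positive weights; the paper's weight conditions are weaker) is: for a convex function $f$ on $[a,b]$, points $x_1,\dots,x_n\in[a,b]$, and weights $w_i>0$ with $\sum_{i=1}^{n} w_i=1$,

\[
f\!\left(a+b-\sum_{i=1}^{n} w_i x_i\right)\le f(a)+f(b)-\sum_{i=1}^{n} w_i f(x_i).
\]

The Jensen–Steffensen conditions mentioned above allow real weights, requiring only $0\le \sum_{i=1}^{k} w_i\le \sum_{i=1}^{n} w_i$ for every $k$ and $\sum_{i=1}^{n} w_i>0$, together with a monotone sequence $x_1,\dots,x_n$.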


Information ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 39
Author(s):  
Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The new proposed inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples that are relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable to generating bounds on expectations of multivariate convex/concave functions, as well as functions that are not necessarily convex or concave.
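
The Wunder et al. bound itself is not reproduced here; a classical prototype of a reverse Jensen inequality, recalled only to fix ideas, is the Lah–Ribarič (Edmundson–Madansky) bound: if $f$ is convex on $[a,b]$ and the random variable $X$ takes values in $[a,b]$, then

\[
\mathbb{E}[f(X)]\le \frac{b-\mathbb{E}[X]}{b-a}\,f(a)+\frac{\mathbb{E}[X]-a}{b-a}\,f(b),
\]

which, together with the Jensen lower bound $f(\mathbb{E}[X])\le\mathbb{E}[f(X)]$, sandwiches the expectation of a convex function of $X$.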


Author(s):  
László Horváth

The main purpose of this work is to present essential extensions of the results in [7] and [8] and to apply them to some special situations. Of particular interest is the refinement of the integral Jensen inequality for vector-valued integrable functions. The applications relate to four topics: f-divergences in information theory (an interesting refinement of the weighted geometric mean–arithmetic mean inequality is obtained as a consequence), norm inequalities, quasi-arithmetic means, and Hölder's and Minkowski's inequalities.
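
As an illustration of the kind of consequence mentioned (stated here in its classical, unrefined form), the weighted geometric mean–arithmetic mean inequality follows from the Jensen inequality applied to the concave logarithm: for $x_i>0$ and weights $w_i>0$ with $\sum_{i=1}^{n} w_i=1$,

\[
\prod_{i=1}^{n} x_i^{w_i}\le \sum_{i=1}^{n} w_i x_i,
\]

since $\sum_i w_i\log x_i\le \log\bigl(\sum_i w_i x_i\bigr)$.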


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Yongping Deng ◽  
Hidayat Ullah ◽  
Muhammad Adil Khan ◽  
Sajid Iqbal ◽  
Shanhe Wu

In this study, we present some new refinements of the Jensen inequality with the help of majorization results. We use the concept of convexity along with the theory of majorization and obtain refinements of the Jensen inequality. Moreover, as consequences of the refined Jensen inequality, we derive some bounds for power means and quasi-arithmetic means. Furthermore, as applications of the refined Jensen inequality, we give some bounds for divergences, Shannon entropy, and various distances associated with probability distributions.
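
For the reader's convenience, the standard definitions behind these consequences are recalled (the paper's notation may differ): the weighted power mean and the quasi-arithmetic mean of positive numbers $x_1,\dots,x_n$ with weights $w_i>0$, $\sum_{i=1}^{n} w_i=1$, are

\[
M_r(\mathbf{x};\mathbf{w})=\Bigl(\sum_{i=1}^{n} w_i x_i^{r}\Bigr)^{1/r}\quad(r\ne 0),\qquad
M_\varphi(\mathbf{x};\mathbf{w})=\varphi^{-1}\Bigl(\sum_{i=1}^{n} w_i\,\varphi(x_i)\Bigr),
\]

with $\varphi$ continuous and strictly monotone, while the Shannon entropy of a probability distribution $\mathbf{p}=(p_1,\dots,p_n)$ is $H(\mathbf{p})=-\sum_{i=1}^{n} p_i\log p_i$.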


Mathematics ◽  
2021 ◽  
Vol 9 (23) ◽  
pp. 3132
Author(s):  
Hidayat Ullah ◽  
Muhammad Adil Khan ◽  
Tareq Saeed

The Jensen inequality has been reported as one of the most consequential inequalities, with a lot of applications in diverse fields of science. For this reason, the Jensen inequality has become one of the most discussed and developed inequalities in the current literature on mathematical inequalities. The main intention of this article is to find some novel bounds for the Jensen difference while using some classes of twice differentiable convex functions. We obtain the proposed bounds by utilizing the power mean and Hölder inequalities, the notion of convexity, and the prominent Jensen inequality for concave functions. We deduce several inequalities for power and quasi-arithmetic means as consequences of the main results. Furthermore, we also establish different improvements of the Hölder inequality with the help of the obtained results. Moreover, we present some applications of the main results in information theory.
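
To fix terminology (again in standard form rather than the authors' exact notation), the Jensen difference whose bounds are sought is the nonnegative gap

\[
\sum_{i=1}^{n} w_i f(x_i)-f\!\Bigl(\sum_{i=1}^{n} w_i x_i\Bigr)\ge 0
\]

for a convex function $f$ and weights $w_i>0$ with $\sum_{i=1}^{n} w_i=1$, and the Hölder inequality used in the proofs reads $\sum_{i=1}^{n}|a_i b_i|\le\bigl(\sum_{i=1}^{n}|a_i|^{p}\bigr)^{1/p}\bigl(\sum_{i=1}^{n}|b_i|^{q}\bigr)^{1/q}$ for $p,q>1$ with $1/p+1/q=1$.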


2013 ◽  
Vol 87 (2) ◽  
pp. 177-194 ◽  
Author(s):  
S. S. DRAGOMIR

Two new reverses of the celebrated Jensen's inequality for convex functions in the general setting of the Lebesgue integral, with applications to means, Hölder's inequality, and $f$-divergence measures in information theory, are given.
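
The inequality being reversed is the integral Jensen inequality: for a probability measure $\mu$ on a measurable space $\Omega$, a $\mu$-integrable function $g$ taking values in an interval $I$, and a convex function $f$ on $I$,

\[
f\!\left(\int_\Omega g\,d\mu\right)\le \int_\Omega (f\circ g)\,d\mu,
\]

and a reverse bounds the nonnegative difference of the right-hand and left-hand sides from above.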

