New bounds for Shannon, Relative and Mandelbrot entropies via Hermite interpolating polynomial

2018 ◽  
Vol 51 (1) ◽  
pp. 112-130
Author(s):  
Nasir Mehmood ◽  
Saad Ihsan Butt ◽  
Ðilda Pečarić ◽  
Josip Pečarić

Abstract To procure inequalities for divergences between probability distributions, Jensen's inequality is the key to success. Shannon, Relative, and Zipf-Mandelbrot entropies have many applications in applied sciences such as information theory, biology, and economics. We consider discrete and continuous cyclic refinements of Jensen's inequality and extend them from convex functions to higher-order convex functions by means of different new Green functions, employing the Hermite interpolating polynomial whose error term is approximated by the Peano kernel. As an application of our results, we give new bounds for Shannon, Relative, and Zipf-Mandelbrot entropies.
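As a minimal numerical illustration of the classical Jensen bound underlying these results (not the paper's refined Hermite-interpolation bounds), the sketch below builds a Zipf-Mandelbrot distribution and checks the well-known consequence of Jensen's inequality that Shannon entropy never exceeds log(n); the function names and parameter values are ours.

```python
# Minimal sketch: Jensen's inequality applied to the concave log function gives
# H(p) <= log(n) for Shannon entropy, checked here on a Zipf-Mandelbrot law.
import math

def zipf_mandelbrot(n, q, s):
    """Probabilities p_i = 1 / ((i + q)**s * H) for i = 1..n, H the normalizer."""
    weights = [1.0 / (i + q) ** s for i in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i (natural log), skipping zero-probability terms."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = zipf_mandelbrot(n=100, q=2.0, s=1.2)
print(f"H(p) = {shannon_entropy(p):.4f} <= log(n) = {math.log(len(p)):.4f}")
```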

2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Yongping Deng ◽  
Hidayat Ullah ◽  
Muhammad Adil Khan ◽  
Sajid Iqbal ◽  
Shanhe Wu

In this study, we present some new refinements of the Jensen inequality with the help of majorization results. We use the concept of convexity along with the theory of majorization and obtain refinements of the Jensen inequality. Moreover, as consequences of the refined Jensen inequality, we derive some bounds for power means and quasiarithmetic means. Furthermore, as applications of the refined Jensen inequality, we give some bounds for divergences, Shannon entropy, and various distances associated with probability distributions.
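The following sketch illustrates the means mentioned above (the data, weights, and helper names are ours, not the paper's): weighted power means satisfy the Jensen-type ordering M_r <= M_s for r <= s, and the quasi-arithmetic mean with g(t) = t recovers the arithmetic mean.

```python
# Illustrative sketch of power means and quasi-arithmetic means.
import math

def power_mean(x, w, r):
    """Weighted power mean M_r(x; w); the weights w must sum to 1."""
    if r == 0:  # geometric mean as the limiting case r -> 0
        return math.exp(sum(wi * math.log(xi) for xi, wi in zip(x, w)))
    return sum(wi * xi ** r for xi, wi in zip(x, w)) ** (1.0 / r)

def quasi_arithmetic_mean(x, w, g, g_inv):
    """Quasi-arithmetic mean M_g(x; w) = g^{-1}(sum_i w_i g(x_i))."""
    return g_inv(sum(wi * g(xi) for xi, wi in zip(x, w)))

x = [1.0, 2.0, 5.0, 9.0]
w = [0.1, 0.2, 0.3, 0.4]
for r, s in [(-1, 0), (0, 1), (1, 2), (2, 3)]:
    assert power_mean(x, w, r) <= power_mean(x, w, s) + 1e-12  # Jensen-type ordering

# g(t) = t recovers the ordinary weighted arithmetic mean.
print(quasi_arithmetic_mean(x, w, g=lambda t: t, g_inv=lambda t: t))
```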


1992 ◽  
Vol 29 (3) ◽  
pp. 733-739
Author(s):  
K. B. Athreya

If φ is a convex function and X a random variable, then (by Jensen's inequality) ψ_φ(X) = Eφ(X) − φ(EX) is non-negative, and it is 0 iff either φ is linear on the range of X or X is degenerate. So if φ is not linear, then ψ_φ(X) is a measure of non-degeneracy of the random variable X. For φ(x) = x², ψ_φ(X) is simply the variance V(X), which is additive in the sense that V(X + Y) = V(X) + V(Y) if X and Y are uncorrelated. In this note it is shown that if φ″(·) is monotone non-increasing, then ψ_φ is sub-additive for all (X, Y) such that EX ≥ 0, P(Y ≥ 0) = 1 and E(X | Y) = EX w.p.1, and is additive essentially only if φ is quadratic. Thus, it confirms the unique role of variance as a measure of non-degeneracy. An application to branching processes is also given.
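A Monte Carlo sketch of the quantities in this abstract (our own check, not the paper's proof): for φ(x) = x² the gap ψ_φ(X) = Eφ(X) − φ(EX) is the variance, and for a convex φ with non-increasing second derivative, here φ(x) = x^1.5 on [0, ∞), the gap is sub-additive for independent non-negative X and Y.

```python
# Sketch: estimate psi_phi(X) = E[phi(X)] - phi(E[X]) from samples.
import random

def psi(phi, sample):
    mean = sum(sample) / len(sample)
    return sum(phi(x) for x in sample) / len(sample) - phi(mean)

random.seed(0)
n = 200_000
X = [random.expovariate(1.0) for _ in range(n)]    # X >= 0, so EX >= 0
Y = [random.uniform(0.0, 3.0) for _ in range(n)]   # P(Y >= 0) = 1, independent of X
XY = [x + y for x, y in zip(X, Y)]

# phi(x) = x^2: psi_phi is (up to sampling error) the variance, additive here.
sq = lambda x: x * x
print(psi(sq, X), psi(sq, Y), psi(sq, XY))

# phi(x) = x^1.5: phi'' is non-increasing, so psi_phi(X + Y) <= psi_phi(X) + psi_phi(Y).
phi = lambda x: x ** 1.5
print(psi(phi, XY) <= psi(phi, X) + psi(phi, Y))   # expected: True
```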


2020 ◽  
Vol 18 (1) ◽  
pp. 1748-1759
Author(s):  
Lei Xiao ◽  
Guoxiang Lu

Abstract In this paper, we present a new refinement of Jensen's inequality with applications in information theory. The refinement of Jensen's inequality is obtained based on the general functional in the work of Popescu et al. As applications in information theory, we provide new, tighter bounds for Shannon's entropy and some f-divergences.
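For context, a brief generic sketch of the f-divergences bounded in such results (not the paper's refined bound): D_f(p‖q) = Σ_i q_i f(p_i/q_i) with convex f satisfies D_f ≥ f(1) = 0 by Jensen's inequality, and f(t) = t log t recovers the Kullback-Leibler divergence. The helper name below is ours.

```python
# Generic f-divergence; by Jensen's inequality it is >= f(1) = 0 for convex f.
import math

def f_divergence(p, q, f):
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

kl = lambda t: t * math.log(t) if t > 0 else 0.0   # f for the KL divergence

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))   # non-negative, 0 iff p == q
```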


Information ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 39
Author(s):  
Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The new proposed inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples that are relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable to generate bounds to expectations of multivariate convex/concave functions, as well as functions that are not necessarily convex or concave.


2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Tahir Rasheed ◽  
Saad Ihsan Butt ◽  
Đilda Pečarić ◽  
Josip Pečarić ◽  
Ahmet Ocak Akdemir

We generalize Jensen's integral inequality for real Stieltjes measure by using the Montgomery identity for n-convex functions; we also give different versions of Jensen's discrete inequality along with its converses for real weights. As an application, we give generalized variants of the Hermite–Hadamard inequality. The Montgomery identity is of great importance, as many inequalities in q-calculus and fractional integrals can be obtained from it. We also give applications of our results in information theory, especially for Zipf and Hybrid Zipf–Mandelbrot entropies.
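The sketch below computes the entropies named in the abstract under one commonly used parametrization of the hybrid Zipf–Mandelbrot law, p_i proportional to w^i / (i + q)^s (the plain Zipf law being the special case w = 1, q = 0); the exact parametrization used in the paper may differ, and the parameter values are illustrative.

```python
# Assumed parametrization: p_i ∝ w**i / (i + q)**s for i = 1..n.
import math

def hybrid_zipf_mandelbrot(n, q, s, w):
    weights = [w ** i / (i + q) ** s for i in range(1, n + 1)]
    total = sum(weights)
    return [x / total for x in weights]

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(entropy(hybrid_zipf_mandelbrot(n=50, q=0.0, s=1.1, w=1.0)))  # Zipf entropy
print(entropy(hybrid_zipf_mandelbrot(n=50, q=2.0, s=1.1, w=0.9)))  # hybrid case
```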


Filomat ◽  
2016 ◽  
Vol 30 (3) ◽  
pp. 803-814 ◽  
Author(s):  
Adil Khan ◽  
T. Ali ◽  
A. Kılıçman ◽  
Q. Din

In this paper, our aim is to give refinements of Jensen's type inequalities for convex functions defined on the co-ordinates of a bidimensional interval in the plane.


2022 ◽  
Vol 7 (4) ◽  
pp. 5328-5346
Author(s):  
Tareq Saeed ◽  
Muhammad Adil Khan ◽  
Hidayat Ullah
The principal aim of this research work is to establish refinements of the integral Jensen's inequality. For the intended refinements, we mainly use the notion of convexity and the concept of majorization. We derive some inequalities for power and quasi-arithmetic means while utilizing the main results. Moreover, we acquire several refinements of the Hölder inequality and also an improvement of the Hermite–Hadamard inequality as consequences of the obtained results. Furthermore, we secure several applications of the acquired results in information theory, which consist of bounds for Shannon entropy, different divergences, the Bhattacharyya coefficient, triangular discrimination, and various distances.
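Two of the quantities bounded in these applications have standard closed forms, shown in the short sketch below (the bounds themselves are not reproduced here): the Bhattacharyya coefficient BC(p, q) = Σ_i √(p_i q_i) and the triangular discrimination Δ(p, q) = Σ_i (p_i − q_i)² / (p_i + q_i). The example distributions are ours.

```python
# Standard definitions of the Bhattacharyya coefficient and triangular discrimination.
import math

def bhattacharyya(p, q):
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def triangular_discrimination(p, q):
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q) if pi + qi > 0)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
print(bhattacharyya(p, q), triangular_discrimination(p, q))
```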


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Huixia Mo ◽  
Xin Sui ◽  
Dongyan Yu

We introduce the generalized convex function on fractal sets ℝ^α (0 < α ≤ 1) of real numbers and study its properties. Based on these properties, we establish the generalized Jensen inequality and the generalized Hermite–Hadamard inequality. Furthermore, some applications are given.
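For reference, the sketch below numerically checks the classical Hermite–Hadamard inequality, which is the α = 1 special case of the generalized inequality discussed above: f((a+b)/2) ≤ (1/(b−a)) ∫_a^b f(x) dx ≤ (f(a)+f(b))/2 for convex f. The test function and helper name are ours.

```python
# Midpoint-rule check of the classical Hermite-Hadamard inequality for a convex f.
def midpoint_integral(f, a, b, n=10_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x ** 4          # a convex test function
a, b = 0.0, 2.0
avg = midpoint_integral(f, a, b) / (b - a)
print(f((a + b) / 2) <= avg <= (f(a) + f(b)) / 2)   # expected: True
```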

