Refinements of Jensen’s Inequality via Majorization Results with Applications in the Information Theory

2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Yongping Deng ◽  
Hidayat Ullah ◽  
Muhammad Adil Khan ◽  
Sajid Iqbal ◽  
Shanhe Wu

In this study, we present some new refinements of the Jensen inequality with the help of majorization results. We use the concept of convexity along with the theory of majorization and obtain refinements of the Jensen inequality. Moreover, as consequences of the refined Jensen inequality, we derive some bounds for power means and quasiarithmetic means. Furthermore, as applications of the refined Jensen inequality, we give some bounds for divergences, Shannon entropy, and various distances associated with probability distributions.
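For orientation (not taken from the paper), the discrete Jensen inequality that such refinements tighten states that for a convex function f and probability weights p_i, f(Σ p_i x_i) ≤ Σ p_i f(x_i). A minimal numerical sketch with illustrative values:

```python
import math

# Convex function f(x) = exp(x); weights form a probability distribution.
p = [0.2, 0.5, 0.3]
x = [0.0, 1.0, 2.0]

lhs = math.exp(sum(pi * xi for pi, xi in zip(p, x)))  # f(E[X])
rhs = sum(pi * math.exp(xi) for pi, xi in zip(p, x))  # E[f(X)]
assert lhs <= rhs  # Jensen: f(E[X]) <= E[f(X)] for convex f
```

The refinements in the paper insert intermediate quantities between these two sides; the sketch only verifies the baseline inequality.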

2018 ◽  
Vol 51 (1) ◽  
pp. 112-130
Author(s):  
Nasir Mehmood ◽  
Saad Ihsan Butt ◽  
Ðilda Pečarić ◽  
Josip Pečarić

To procure inequalities for divergences between probability distributions, Jensen’s inequality is the key to success. The Shannon, relative, and Zipf–Mandelbrot entropies have applications in many applied sciences, such as information theory, biology, and economics. We consider discrete and continuous cyclic refinements of Jensen’s inequality and extend them from convex functions to higher-order convex functions by means of different new Green functions, employing the Hermite interpolating polynomial whose error term is approximated by the Peano kernel. As an application of our obtained results, we give new bounds for the Shannon, relative, and Zipf–Mandelbrot entropies.
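As background (not from the paper itself), the classical entropy bound obtained from Jensen’s inequality applied to the concave logarithm is H(p) ≤ log n; the refinements above tighten bounds of this kind. A small illustrative check:

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i (natural log), with 0 log 0 := 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]  # illustrative distribution
H = shannon_entropy(p)
# Jensen applied to the concave log gives the classical bound H(p) <= log n.
assert H <= math.log(len(p))
```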


Information ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 39
Author(s):  
Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The new inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable to generating bounds on expectations of multivariate convex/concave functions, as well as functions that are not necessarily convex or concave.


2022 ◽  
Vol 7 (4) ◽  
pp. 5328-5346
Author(s):  
Tareq Saeed ◽  
Muhammad Adil Khan ◽  
Hidayat Ullah

The principal aim of this research work is to establish refinements of the integral Jensen inequality. For the intended refinements, we mainly use the notion of convexity and the concept of majorization. We derive some inequalities for power and quasi-arithmetic means while utilizing the main results. Moreover, we acquire several refinements of the Hölder inequality and an improvement of the Hermite–Hadamard inequality as consequences of the obtained results. Furthermore, we secure several applications of the acquired results in information theory, consisting of bounds for the Shannon entropy, different divergences, the Bhattacharyya coefficient, triangular discrimination, and various distances.
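The Bhattacharyya coefficient and triangular discrimination mentioned in the abstract have standard definitions; a small Python sketch with illustrative distributions (not values from the paper):

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya coefficient: sum_i sqrt(p_i q_i); equals 1 iff p == q."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def triangular_discrimination(p, q):
    """Triangular discrimination: sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q) if pi + qi > 0)

p = [0.4, 0.4, 0.2]
q = [0.3, 0.5, 0.2]
bc = bhattacharyya(p, q)                # at most 1, by concavity of sqrt
td = triangular_discrimination(p, q)    # nonnegative, zero iff p == q
assert 0 < bc <= 1 and td >= 0
```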


2013 ◽  
Vol 87 (2) ◽  
pp. 177-194 ◽  
Author(s):  
S. S. DRAGOMIR

Two new reverses of the celebrated Jensen’s inequality for convex functions in the general setting of the Lebesgue integral, with applications to means, Hölder’s inequality and $f$-divergence measures in information theory, are given.
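For context (standard material, not from this paper), an $f$-divergence is defined by $D_f(p\,\|\,q) = \sum_i q_i f(p_i/q_i)$ for a convex generator $f$ with $f(1)=0$, and Jensen’s inequality gives its nonnegativity; the reverses above bound it from the other side. A minimal sketch, assuming strictly positive $q$:

```python
import math

def f_divergence(p, q, f):
    """D_f(p||q) = sum_i q_i f(p_i/q_i); Jensen gives D_f >= f(1) = 0
    for convex f with f(1) = 0. Assumes q_i > 0 wherever p_i > 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

kl = lambda t: t * math.log(t) if t > 0 else 0.0  # generator of KL divergence

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]
assert f_divergence(p, q, kl) >= 0  # nonnegativity from Jensen's inequality
```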


2016 ◽  
Vol 31 ◽  
pp. 125-133 ◽  
Author(s):  
László Horváth ◽  
Khuram Khan ◽  
Josip Pečarić

Refinements of the operator Jensen’s inequality for convex and operator convex functions are given by using cyclic refinements of the discrete Jensen’s inequality. Similar refinements are fairly rare in the literature. Some applications of the results to norm inequalities, the Hölder–McCarthy inequality, and generalized weighted power means for operators are presented.


2020 ◽  
Vol 18 (1) ◽  
pp. 1748-1759
Author(s):  
Lei Xiao ◽  
Guoxiang Lu

In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement of Jensen’s inequality is obtained based on the general functional in the work of Popescu et al. As applications in information theory, we provide new tighter bounds for Shannon’s entropy and some f-divergences.


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Iqrar Ansari ◽  
Khuram Ali Khan ◽  
Ammara Nosheen ◽  
Ðilda Pečarić ◽  
Josip Pečarić

The main purpose of this paper is to obtain some time scale inequalities for different divergences and distances by using the weighted time-scale Jensen inequality. These results offer new inequalities in h-discrete calculus and quantum calculus and extend some known results in the literature. The lower bounds of some divergence measures are also presented. Moreover, the obtained discrete results are given in the light of the Zipf–Mandelbrot law and the Zipf law.
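For reference (standard form, not taken from this paper), the Zipf–Mandelbrot law on {1, …, N} assigns probabilities proportional to 1/(i + q)^s; the Zipf law is the special case q = 0. A small sketch with illustrative parameters:

```python
import math

def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot pmf on {1,...,N}: p_i proportional to 1/(i + q)^s."""
    w = [1.0 / (i + q) ** s for i in range(1, N + 1)]
    Z = sum(w)  # normalizing constant
    return [wi / Z for wi in w]

p = zipf_mandelbrot(N=10, q=0.5, s=1.2)
H = -sum(pi * math.log(pi) for pi in p)  # Shannon entropy of the law
assert abs(sum(p) - 1.0) < 1e-12 and 0 < H <= math.log(10)
```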


2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Tahir Rasheed ◽  
Saad Ihsan Butt ◽  
Đilda Pečarić ◽  
Josip Pečarić ◽  
Ahmet Ocak Akdemir

We generalize Jensen’s integral inequality for a real Stieltjes measure by using the Montgomery identity for n-convex functions; we also give different versions of Jensen’s discrete inequality, along with its converses, for real weights. As an application, we give generalized variants of the Hermite–Hadamard inequality. The Montgomery identity is of great importance, as many inequalities in q-calculus and fractional integrals can be obtained from it. We also give applications of our results in information theory, especially for the Zipf and hybrid Zipf–Mandelbrot entropies.
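As background (not from the paper), the classical Hermite–Hadamard inequality states that for a convex f on [a, b], f((a+b)/2) ≤ (1/(b−a)) ∫_a^b f(x) dx ≤ (f(a)+f(b))/2; the generalized variants above extend this. A minimal numerical check with an illustrative convex function:

```python
# Hermite-Hadamard for convex f on [a, b]:
#   f((a+b)/2) <= (1/(b-a)) * integral_a^b f(x) dx <= (f(a) + f(b)) / 2
a, b = 0.0, 1.0
f = lambda x: x * x  # a simple convex example

n = 100_000
# Midpoint rule approximation of the average value of f over [a, b]
# (here b - a = 1, so this is also the integral).
mean = sum(f(a + (i + 0.5) * (b - a) / n) for i in range(n)) / n

assert f((a + b) / 2) <= mean <= (f(a) + f(b)) / 2
```

For f(x) = x² on [0, 1] the three quantities are 1/4, 1/3, and 1/2, so both inequalities are strict.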


2011 ◽  
Vol 2011 ◽  
pp. 1-14 ◽  
Author(s):  
Jadranka Mićić ◽  
Zlatko Pavić ◽  
Josip Pečarić

We give an extension of Jensen's inequality for n-tuples of self-adjoint operators, unital n-tuples of positive linear mappings, and real-valued continuous convex functions with conditions on the operators' bounds. We also study operator quasiarithmetic means under the same conditions.

