Generalizations of cyclic refinements of Jensen's inequality by Lidstone's polynomial with applications in information theory

2020
pp. 249-271
Author(s):
Nasir Mehmood
Saad Ihsan Butt
Đilda Pečarić
Josip Pečarić


2018
Vol 51 (1)
pp. 112-130
Author(s):
Nasir Mehmood
Saad Ihsan Butt
Đilda Pečarić
Josip Pečarić

Abstract To procure inequalities for divergences between probability distributions, Jensen’s inequality is the key to success. The Shannon, relative and Zipf–Mandelbrot entropies have many applications in applied sciences such as information theory, biology and economics. We consider discrete and continuous cyclic refinements of Jensen’s inequality and extend them from convex functions to higher-order convex functions by means of different new Green functions, employing the Hermite interpolating polynomial whose error term is approximated by the Peano kernel. As an application of the obtained results, we give new bounds for the Shannon, relative and Zipf–Mandelbrot entropies.
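For orientation only (these are the standard definitions, not the paper's refined bounds): the discrete Jensen inequality for a convex f and weights p_i ≥ 0 with ∑_i p_i = 1, together with the Shannon and relative entropies, reads

\[ f\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr) \le \sum_{i=1}^{n} p_i f(x_i), \qquad H(\mathbf{p}) = -\sum_{i=1}^{n} p_i \log p_i, \qquad D(\mathbf{p}\,\|\,\mathbf{q}) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i}, \]

and the Zipf–Mandelbrot entropy is the Shannon entropy of the Zipf–Mandelbrot law \( p_i = \bigl((i+q)^{s} H_{N,q,s}\bigr)^{-1} \) with \( H_{N,q,s} = \sum_{j=1}^{N} (j+q)^{-s} \).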


2020
Vol 18 (1)
pp. 1748-1759
Author(s):
Lei Xiao
Guoxiang Lu

Abstract In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement of Jensen’s inequality is obtained from the general functional introduced in the work of Popescu et al. As applications in information theory, we provide new, tighter bounds for Shannon’s entropy and some f-divergences.
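For reference (standard definitions only; the refined functional of Popescu et al. is not reproduced here), an f-divergence between discrete probability distributions p and q is

\[ D_f(\mathbf{p}\,\|\,\mathbf{q}) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad f \text{ convex},\ f(1)=0, \]

and a single application of Jensen's inequality already gives the baseline bound \( D_f(\mathbf{p}\,\|\,\mathbf{q}) \ge f\bigl(\sum_i p_i\bigr) = f(1) = 0 \), which refinements of Jensen's inequality then tighten.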


Information
2022
Vol 13 (1)
pp. 39
Author(s):
Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The newly proposed inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples that are relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable for generating bounds on expectations of multivariate convex/concave functions, as well as functions that are not necessarily convex or concave.
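As background on what a "reverse" Jensen bound looks like (this is the classical Lah–Ribarič/Edmundson–Madansky converse, stated only as a familiar example, not the inequality of Wunder et al. that the paper improves): for f convex on [m, M] and a random variable X taking values in [m, M],

\[ \mathbb{E}\,f(X) \;\le\; \frac{M-\mathbb{E}X}{M-m}\, f(m) \;+\; \frac{\mathbb{E}X-m}{M-m}\, f(M), \]

an upper bound on \( \mathbb{E} f(X) \) complementing Jensen's lower bound \( f(\mathbb{E}X) \le \mathbb{E} f(X) \).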


2021
Vol 2021
pp. 1-17
Author(s):
Tahir Rasheed
Saad Ihsan Butt
Đilda Pečarić
Josip Pečarić
Ahmet Ocak Akdemir

We generalize Jensen’s integral inequality for real Stieltjes measures by using the Montgomery identity for n-convex functions; we also give different versions of Jensen’s discrete inequality, along with its converses, for real weights. As an application, we give generalized variants of the Hermite–Hadamard inequality. The Montgomery identity is of great importance, as many inequalities in q-calculus and fractional integrals can be obtained from it. We also give applications of the obtained results in information theory, especially for the Zipf and hybrid Zipf–Mandelbrot entropies.
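Two of the classical ingredients named above, stated for reference in their standard forms (the paper's generalized variants are not reproduced): the Hermite–Hadamard inequality for a convex f on [a, b],

\[ f\!\left(\frac{a+b}{2}\right) \;\le\; \frac{1}{b-a}\int_a^b f(x)\,dx \;\le\; \frac{f(a)+f(b)}{2}, \]

and the Montgomery identity, which represents a differentiable f through its integral mean and a Peano-type kernel P(x, t):

\[ f(x) = \frac{1}{b-a}\int_a^b f(t)\,dt + \int_a^b P(x,t)\, f'(t)\,dt, \qquad P(x,t)=\begin{cases} \dfrac{t-a}{b-a}, & a \le t \le x,\\[4pt] \dfrac{t-b}{b-a}, & x < t \le b. \end{cases} \]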


2022
Vol 7 (4)
pp. 5328-5346
Author(s):
Tareq Saeed
Muhammad Adil Khan
Hidayat Ullah

The principal aim of this research work is to establish refinements of the integral Jensen’s inequality. For the intended refinements, we mainly use the notion of convexity and the concept of majorization. Utilizing the main results, we derive some inequalities for power and quasi-arithmetic means. Moreover, we acquire several refinements of the Hölder inequality and also an improvement of the Hermite–Hadamard inequality as consequences of the obtained results. Furthermore, we secure several applications of the acquired results in information theory, which consist of bounds for the Shannon entropy, different divergences, the Bhattacharyya coefficient, triangular discrimination and various distances.
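The two inequalities most directly involved, in their standard unrefined forms (background only): the integral Jensen inequality for a convex f and a probability measure μ on Ω,

\[ f\!\left(\int_\Omega g\,d\mu\right) \;\le\; \int_\Omega (f\circ g)\,d\mu, \]

and the Hölder inequality

\[ \int_\Omega |g h|\,d\mu \;\le\; \left(\int_\Omega |g|^{p}\,d\mu\right)^{1/p} \left(\int_\Omega |h|^{q}\,d\mu\right)^{1/q}, \qquad p>1,\ \tfrac{1}{p}+\tfrac{1}{q}=1; \]

the information-theoretic quantities mentioned, e.g. the Bhattacharyya coefficient \( B(\mathbf{p},\mathbf{q}) = \sum_i \sqrt{p_i q_i} \) and the triangular discrimination \( \Delta(\mathbf{p},\mathbf{q}) = \sum_i (p_i - q_i)^2/(p_i + q_i) \), are then bounded through such refinements.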


2010
Vol 82 (1)
pp. 44-61
Author(s):
S. S. DRAGOMIR

Abstract Some new results related to Jensen’s celebrated inequality for convex functions defined on convex sets in linear spaces are given. Applications for norm inequalities in normed linear spaces and f-divergences in information theory are provided as well.
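A simple instance of the linear-space setting (an illustration, not one of the paper's new results): in a normed linear space the map x ↦ ‖x‖^r with r ≥ 1 is convex, so for weights p_i ≥ 0 with ∑_i p_i = 1,

\[ \Bigl\|\sum_{i=1}^{n} p_i x_i\Bigr\|^{r} \;\le\; \Bigl(\sum_{i=1}^{n} p_i \|x_i\|\Bigr)^{r} \;\le\; \sum_{i=1}^{n} p_i \|x_i\|^{r}, \]

the first step by the triangle inequality and the second by Jensen's inequality applied to t ↦ t^r.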


2016
Vol 66 (1)
Author(s):
Hossein Dehghan

Abstract In this paper we present some new inequalities for the normalized Jensen functional which considerably improve the famous Jensen inequality. We also give some applications in analysis and information theory. Our results refine and generalize the corresponding ones announced in [DRAGOMIR, S. S.:
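The functional in question is the normalized Jensen functional, and the Dragomir-type bounds being refined compare it under two weight vectors p and q (recalled here from the cited work only as orientation):

\[ J(f,\mathbf{x},\mathbf{p}) \;=\; \sum_{i=1}^{n} p_i f(x_i) \;-\; f\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr) \;\ge\; 0, \]
\[ \min_{1\le i\le n}\left\{\frac{p_i}{q_i}\right\} J(f,\mathbf{x},\mathbf{q}) \;\le\; J(f,\mathbf{x},\mathbf{p}) \;\le\; \max_{1\le i\le n}\left\{\frac{p_i}{q_i}\right\} J(f,\mathbf{x},\mathbf{q}). \]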


2021
Vol 2021
pp. 1-12
Author(s):
Yongping Deng
Hidayat Ullah
Muhammad Adil Khan
Sajid Iqbal
Shanhe Wu

In this study, we present some new refinements of the Jensen inequality with the help of majorization results. We use the concept of convexity along with the theory of majorization to obtain these refinements. Moreover, as consequences of the refined Jensen inequality, we derive some bounds for power means and quasi-arithmetic means. Furthermore, as applications of the refined Jensen inequality, we give some bounds for divergences, Shannon entropy, and various distances associated with probability distributions.
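The means for which bounds are derived are, in their standard forms (definitions only, not the paper's refined estimates), the weighted power mean and the quasi-arithmetic mean:

\[ M_r(\mathbf{x};\mathbf{p}) = \Bigl(\sum_{i=1}^{n} p_i x_i^{\,r}\Bigr)^{1/r}\ (r\ne 0), \qquad M_\varphi(\mathbf{x};\mathbf{p}) = \varphi^{-1}\Bigl(\sum_{i=1}^{n} p_i\,\varphi(x_i)\Bigr), \]

with φ continuous and strictly monotone; the classical monotonicity M_r ≤ M_s for r ≤ s is itself a consequence of Jensen's inequality, which is why refining Jensen's inequality refines such mean inequalities.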


Complexity
2020
Vol 2020
pp. 1-8
Author(s):
Muhammad Adil Khan
Zakir Husain
Yu-Ming Chu

Jensen’s inequality is one of the fundamental inequalities, with applications in almost every field of science. In 2003, Mercer gave a variant of Jensen’s inequality, now known as the Jensen–Mercer inequality. The purpose of this article is to propose new bounds for the Csiszár and related divergences by means of the Jensen–Mercer inequality. We also investigate several new bounds for the Zipf–Mandelbrot entropy. The ideas of this article may further stimulate research in information theory based on the Jensen–Mercer inequality.
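For reference, Mercer's 2003 variant states that for a convex f on [a, b], points x_1, …, x_n ∈ [a, b] and weights p_i ≥ 0 with ∑_i p_i = 1,

\[ f\Bigl(a + b - \sum_{i=1}^{n} p_i x_i\Bigr) \;\le\; f(a) + f(b) - \sum_{i=1}^{n} p_i f(x_i), \]

while the Csiszár divergence to be bounded is \( D_f(\mathbf{p}\,\|\,\mathbf{q}) = \sum_i q_i f(p_i/q_i) \) for convex f.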


2013
Vol 87 (2)
pp. 177-194
Author(s):
S. S. DRAGOMIR

Abstract Two new reverses of the celebrated Jensen’s inequality for convex functions in the general setting of the Lebesgue integral, with applications to means, Hölder’s inequality and f-divergence measures in information theory, are given.
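The object these reverses control is the Jensen gap in the Lebesgue setting: for a probability measure μ on Ω, a μ-integrable g and a convex f,

\[ 0 \;\le\; \int_\Omega (f\circ g)\,d\mu \;-\; f\!\left(\int_\Omega g\,d\mu\right), \]

and the paper's bounds estimate this difference from above, which then transfers to means, Hölder's inequality and f-divergence measures.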

