Some Recent Applications of Functional Equations and Inequalities to Characterizations of Probability Distributions, Combinatorics, Information Theory and Mathematical Economics

Author(s):
J. Aczél


Author(s):
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or, more precisely, with the probability distribution of that random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept introduced is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, the uniqueness of the entropy function, and the Kullback–Leibler divergence.
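A minimal sketch of the three quantities the chapter discusses, using the standard discrete definitions (our illustration, not code from the chapter; all function names are our own):

```python
# Shannon entropy H(p), conditional entropy H(Y|X) and Kullback-Leibler
# divergence D(p || q) for discrete distributions, in bits (log base 2).
import numpy as np

def entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X), given the joint distribution p(x, y)."""
    joint = np.asarray(joint, dtype=float)   # rows: x, columns: y
    return entropy(joint.ravel()) - entropy(joint.sum(axis=1))

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log2(p_i / q_i); needs q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

print(entropy([0.5, 0.5]))                     # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))                     # ~0.469: less uncertainty
print(conditional_entropy([[0.4, 0.1],
                           [0.1, 0.4]]))       # ~0.722: Y still uncertain given X
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))   # ~0.531, and D(p||q) >= 0 always
```

The fair coin attains the maximum of one bit, and the divergence vanishes exactly when the two distributions coincide, which is why it serves as a measure of “disparity.”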


2020, pp. 464-490
Author(s):
Miquel Feixas, Mateu Sbert

Around seventy years ago, Claude Shannon, then working at Bell Laboratories, introduced information theory with the main purpose of modelling the communication channel between source and receiver. The communication channel, or information channel as it later became known, captures the information shared between the source (input) and the receiver (output), both of which are represented by random variables, that is, by probability distributions over their possible states. The generality and flexibility of the information channel concept allow it to be applied robustly to many different areas of science and technology, and even to the social sciences. In this chapter, we present examples of its application to selecting the best viewpoints of an object, segmenting an image, and computing the global illumination of a three-dimensional virtual scene. We hope these examples illustrate how practitioners of different disciplines can use the information channel to organize and understand the interplay of information between the corresponding source and receiver.
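The shared information that a channel establishes is the mutual information between input and output. A minimal sketch (our illustration, not from the chapter; the binary symmetric channel is a standard textbook example, not one of the chapter's applications):

```python
# Mutual information I(X;Y) of an information channel: an input distribution
# p(x) together with a row-stochastic channel matrix p(y|x).
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, computed as sum_xy p(x,y) log2[ p(x,y) / (p(x) p(y)) ]."""
    p_x = np.asarray(p_x, dtype=float)
    channel = np.asarray(channel, dtype=float)   # rows: x, columns: y
    joint = p_x[:, None] * channel               # p(x, y) = p(x) p(y|x)
    p_y = joint.sum(axis=0)                      # output marginal p(y)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2(joint[nz] / (p_x[:, None] * p_y)[nz]))

# Binary symmetric channel that flips its input with probability 0.1:
bsc = [[0.9, 0.1],
       [0.1, 0.9]]
print(mutual_information([0.5, 0.5], bsc))   # ~0.531 bits shared per channel use
```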


2019, Vol. 16 (157), pp. 20190162
Author(s):
Roland J. Baddeley, Nigel R. Franks, Edmund R. Hunt

At a macroscopic level, part of the ant colony life cycle is simple: a colony collects resources; these resources are converted into more ants, and these ants in turn collect more resources. Because more ants collect more resources, this is a multiplicative process, and the expected logarithm of the amount of resources determines how successful the colony will be in the long run. Over 60 years ago, Kelly showed, using information theoretic techniques, that the rate of growth of resources in such a situation is optimized by betting in proportion to the probability of pay-off, a result widely applied in the mathematics of gambling. Thus, in the case of ants, the fraction of the colony foraging at a given location should be proportional to the probability that resources will be found there. This theoretical optimum leads to predictions as to which collective ant movement strategies might have evolved. Here, we show how colony-level optimal foraging behaviour can be achieved by mapping movement to Markov chain Monte Carlo (MCMC) methods, specifically Hamiltonian Monte Carlo (HMC). This can be done by the ants following a (noisy) local measurement of the gradient of the logarithm of the resource probability (possibly supplemented with momentum, i.e. a propensity to keep moving in the same direction). This maps the problem of foraging (via the information theory of gambling, stochastic dynamics and techniques employed within Bayesian statistics to sample efficiently from probability distributions) onto simple models of ant foraging behaviour. The identification has broad applicability, facilitates the application of information theory approaches to understanding movement ecology, and unifies insights from existing biomechanical, cognitive, random and optimality movement paradigms. At the cost of requiring ants to obtain (noisy) resource gradient information, we show that this model is both efficient and matches a number of characteristics of real ant exploration.
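Kelly's proportional-betting result is easy to check numerically. A minimal sketch (our illustration, not code from the paper; the site probabilities are arbitrary):

```python
# Long-run log-growth of a colony splitting its foragers in fractions f over
# sites, where the resource appears at site i with probability p_i each round.
# The expected log-growth is sum_i p_i log f_i, maximized at f = p (Kelly).
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.6, 0.3, 0.1])   # probability the resource is at each site

def growth_rate(f, rounds=100_000):
    """Empirical mean log-growth when fraction f[i] forages at site i."""
    winners = rng.choice(len(p), size=rounds, p=p)   # winning site each round
    return np.mean(np.log(f[winners]))

print(growth_rate(p))                   # ~ -0.898: bet in proportion to p
print(growth_rate(np.full(3, 1 / 3)))   # ~ -1.099: a uniform split does worse
```

With f = p the expected log-growth equals -H(p) (in nats, and up to the pay-off odds, which are omitted here), and any other split is strictly worse.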


2020
Author(s):
Milan Paluš

The mathematical formulation of causality in measurable terms of predictability was given by N. Wiener, the father of cybernetics [1], and formulated for time series by C. W. J. Granger [2]. Granger causality is based on evaluating predictability in bivariate autoregressive models. The concept has been generalized to nonlinear systems using methods rooted in information theory [3, 4]. The information-theoretic approach, which defines causality as information transfer, has been successful in many applications and has been generalized to multivariate data and causal networks [e.g., 5]. Rooted in Shannon's information theory, this approach usually ignores two important properties of complex systems such as the Earth's climate: such systems evolve on multiple time scales, and their variables have heavy-tailed probability distributions. While the multiscale character of complex dynamics, such as air temperature variability, can be studied within the Shannonian framework [6, 7], the entropy concepts of Rényi and Tsallis have been proposed to cope with variables with heavy-tailed probability distributions. We will discuss how such non-Shannonian entropy concepts can be applied to the inference of causality in systems with heavy-tailed probability distributions and extreme events, using examples from the climate system.

This study was supported by the Czech Science Foundation, project GA19-16066S.

[1] N. Wiener, in: E. F. Beckenbach (ed.), Modern Mathematics for Engineers (McGraw-Hill, New York, 1956).
[2] C. W. J. Granger, Econometrica 37 (1969) 424.
[3] K. Hlaváčková-Schindler et al., Phys. Rep. 441 (2007) 1.
[4] M. Paluš, M. Vejmelka, Phys. Rev. E 75 (2007) 056211.
[5] J. Runge et al., Nature Communications 6 (2015) 8502.
[6] M. Paluš, Phys. Rev. Lett. 112 (2014) 078702.
[7] N. Jajcay, J. Hlinka, S. Kravtsov, A. A. Tsonis, M. Paluš, Geophys. Res. Lett. 43(2) (2016) 902–909.
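Returning to the bivariate autoregressive formulation in which Granger causality was originally cast: a minimal sketch of a lag-1 test via residual variances (our illustration, not from the study; the coupling coefficients are arbitrary):

```python
# X "Granger-causes" Y if adding past X improves the least-squares prediction
# of Y over what past Y alone achieves; we compare residual variances.
import numpy as np

def granger_ratio(x, y):
    """Residual variance of y[t] ~ y[t-1], divided by that of
    y[t] ~ y[t-1] + x[t-1]; a ratio well above 1 suggests x -> y."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    A = np.column_stack([ylag, np.ones_like(ylag)])          # restricted model
    B = np.column_stack([ylag, xlag, np.ones_like(ylag)])    # full model
    res_r = yt - A @ np.linalg.lstsq(A, yt, rcond=None)[0]
    res_f = yt - B @ np.linalg.lstsq(B, yt, rcond=None)[0]
    return res_r.var() / res_f.var()

# Unidirectionally coupled pair: x drives y, but not vice versa.
rng = np.random.default_rng(1)
n = 5000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_ratio(x, y))   # >> 1: the x -> y coupling is detected
print(granger_ratio(y, x))   # ~ 1: no coupling in the reverse direction
```

The information-theoretic generalization replaces this variance ratio with a conditional mutual information (transfer entropy), which also captures nonlinear dependence.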

