Information dynamics: temporal behavior of uncertainty measures

Open Physics ◽  
2008 ◽  
Vol 6 (1) ◽  
Author(s):  
Piotr Garbaczewski

Abstract: We carry out a systematic study of uncertainty measures that are generic to dynamical processes of varied origins, provided they induce suitable continuous probability distributions. The major technical tools are information-theoretic methods and the inequalities satisfied by the Fisher and Shannon information measures. We focus on the compatibility of these inequalities with the prescribed (deterministic, random, or quantum) temporal behavior of the pertinent probability densities.
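As an illustrative sketch (my own, not taken from the paper), the interplay of the Fisher and Shannon measures can be checked numerically in the benchmark Gaussian case, where both have closed forms and Stam's entropy-Fisher inequality is saturated:

```python
import numpy as np

# For a Gaussian density with standard deviation sigma, the Shannon
# entropy is S = 0.5*ln(2*pi*e*sigma^2) and the Fisher information is
# F = 1/sigma^2; the inequality exp(2S) * F >= 2*pi*e holds for any
# continuous density and becomes an equality for Gaussians.
def shannon_entropy_gaussian(sigma):
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def fisher_information_gaussian(sigma):
    return 1.0 / sigma**2

sigma = 2.0
S = shannon_entropy_gaussian(sigma)
F = fisher_information_gaussian(sigma)
print(np.exp(2 * S) * F, 2 * np.pi * np.e)  # equal for the Gaussian case
```

Here the product exp(2S)·F is scale-invariant, which is why the bound does not depend on sigma.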

2021 ◽  
Author(s):  
Uwe Ehret

In this contribution, I will, with examples from hydrology, make the case for information theory as a general language and framework for i) characterizing systems, ii) quantifying the information content in data, iii) evaluating how well models can learn from data, and iv) measuring how well models do in prediction. In particular, I will discuss how information measures can be used to characterize systems by the state-space volume they occupy, their dynamical complexity, and their distance from equilibrium. Likewise, I will discuss how we can measure the information content of data through systematic perturbations, and how much information a model absorbs (or ignores) from data during learning. This can help build hybrid models that optimally combine information in data with general knowledge from physical and other laws, which is currently among the key challenges in machine learning applied to earth science problems.

While I will try my best to convince everybody to take an information perspective henceforth, I will also name the related challenges: data demands, binning choices, estimation of probability distributions from limited data, and issues with excessive data dimensionality.
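A minimal sketch (my own construction, not the author's code) of the binning challenge named above: the histogram-based entropy estimate of the same data set depends visibly on the number of bins chosen.

```python
import numpy as np

# Estimate the Shannon entropy of a sample by histogram binning.
# The estimate is in bits and grows with the number of bins, which
# illustrates why the binning choice is a practical challenge.
def binned_entropy(samples, bins):
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # drop empty bins
    return -np.sum(p * np.log2(p))     # discrete entropy in bits

rng = np.random.default_rng(0)
data = rng.normal(size=10_000)
for b in (5, 20, 100):
    print(b, binned_entropy(data, b))  # entropy estimate rises with bin count
```

Any systematic comparison of entropies across data sets therefore has to fix the binning scheme up front.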


Axioms ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 59 ◽  
Author(s):  
Bruno Carbonaro ◽  
Marco Menale

A complex system is a system of particles whose pairwise interactions cannot be composed in the same way as in classical mechanics, i.e., the result of the interaction of each particle with all the remaining ones cannot be expressed as a sum of its interactions with each of them (we cannot even know the functional dependence of the total interaction on the single interactions). Moreover, in view of the wide range of its applications to biological, social, and economic problems, the variables describing the state of the system (i.e., the states of all of its particles) are not always (only) the usual mechanical variables (position and velocity), but (also) many additional variables describing, e.g., health, wealth, social condition, social rôle, and so on. Thus, in order to achieve a mathematical description of the problems of everyday life of any human society, at either a microscopic or a macroscopic scale, a new mathematical theory (or, more precisely, a scheme of mathematical models), called KTAP, has been devised; it provides an equation, a generalized version of the Boltzmann equation, that describes in terms of probability distributions the evolution of a non-mechanical complex system. In connection with applications, the classical problems about existence, uniqueness, continuous dependence, and stability of its solutions turn out to be particularly relevant. As far as we are aware, however, the problem of continuous dependence and stability of solutions with respect to perturbations of the parameters expressing the interaction rates of particles and the transition probability densities (see Section The Basic Equations) has not been tackled yet. Accordingly, the present paper aims to give some initial results concerning these two basic problems.
In particular, Theorem 2 shows that solutions are stable with respect to small perturbations of parameters, and, as far as instability of solutions with respect to perturbations of parameters is concerned, Theorem 3 shows that solutions are unstable with respect to "large" perturbations of interaction rates; these hints are illustrated by numerical simulations that point out how far solutions corresponding to different values of parameters stay away from each other as t→+∞.
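A toy illustration of the continuous-dependence question (my own two-state construction, far simpler than the KTAP framework of the paper): integrating a linear kinetic equation for two nearby interaction rates and comparing where the solutions settle.

```python
import numpy as np

# Two-state kinetic toy model: dp1/dt = -a*p1 + b*(1 - p1), with p1 + p2 = 1.
# The parameter a plays the role of an interaction rate; a small perturbation
# of a moves the equilibrium b/(a+b) only slightly, mimicking continuous
# dependence of solutions on parameters.
def evolve(p1, a, b, dt, steps):
    for _ in range(steps):
        p1 += dt * (-a * p1 + b * (1.0 - p1))   # explicit Euler step
    return p1

base = evolve(0.9, a=1.00, b=0.5, dt=0.01, steps=1000)
pert = evolve(0.9, a=1.05, b=0.5, dt=0.01, steps=1000)
print(base, pert)   # both trajectories settle near their equilibria b/(a+b)
```

In this linear toy case the distance between the two solutions stays small for all times; the paper's Theorem 3 concerns the genuinely nonlinear setting, where large rate perturbations can drive solutions apart as t→+∞.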


Author(s):  
Daniel Fulger ◽  
Enrico Scalas ◽  
Guido Germano

Abstract: The speed of many one-line transformation methods for the production of, for example, Lévy alpha-stable random numbers, which generalize Gaussian ones, and Mittag-Leffler random numbers, which generalize exponential ones, is very high and satisfactory for most purposes. However, fast rejection techniques like the ziggurat by Marsaglia and Tsang promise a significant speed-up for the class of decreasing probability densities, if it is possible to complement them with a method that samples the tails of the infinite support. This requires the fast generation of random numbers greater or smaller than a certain value. We present a method to achieve this, and also to generate random numbers within any arbitrary interval. We demonstrate the method by showing the properties of the transformation maps of the above-mentioned distributions as examples of stable and geometric stable random numbers used for the stochastic solution of the space-time fractional diffusion equation.
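A hedged sketch of the underlying idea, using the exponential distribution as a stand-in (the paper treats stable and geometric stable laws, whose inverse CDFs are not elementary): restricting an inverse-transform generator to an interval [a, b) by drawing the uniform variate from [F(a), F(b)) instead of [0, 1).

```python
import numpy as np

# Inverse-transform sampling confined to an interval: if F is the CDF and
# F^-1 its inverse, then F^-1(U) with U uniform on [F(a), F(b)) samples the
# distribution conditioned on [a, b). This is the idea behind sampling the
# tail region that a ziggurat table does not cover.
def exponential_in_interval(rng, lam, a, b, size):
    F = lambda x: 1.0 - np.exp(-lam * x)   # exponential CDF
    u = rng.uniform(F(a), F(b), size)
    return -np.log1p(-u) / lam             # inverse CDF, numerically stable

rng = np.random.default_rng(1)
x = exponential_in_interval(rng, lam=1.0, a=3.0, b=np.inf, size=100_000)
print(x.min())   # all samples lie at or above the cutoff a = 3.0
```

The same recipe works for any distribution with a computable CDF and inverse; the paper's contribution is making this step fast for the transformation maps of stable laws.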


1970 ◽  
Vol 1 (12) ◽  
pp. 25 ◽  
Author(s):  
J. Ian Collins

Utilizing the hydrodynamic relationships for shoaling and refraction of waves approaching a shoreline over parallel bottom contours, a procedure is developed to transform an arbitrary probability density of wave characteristics in deep water into the corresponding breaking characteristics in shallow water. A number of probability distributions for breaking wave characteristics are derived in terms of assumed deep-water probability densities of wave heights, wave lengths, and angles of approach. Some probability densities for wave heights at specific locations in the surf zone are computed for a Rayleigh distribution in deep water. The probability computations are used to derive the expectation of energy flux and its distribution.
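A schematic Monte Carlo sketch of the transformation step (the shoaling coefficient and scalings are placeholders of mine, not values from the paper): Rayleigh-distributed deep-water heights are mapped to shallow-water heights and the expected energy flux, which scales with H², is estimated.

```python
import numpy as np

# Draw Rayleigh deep-water wave heights with E[H0^2] = Hrms^2, apply a
# constant shoaling coefficient Ks as a stand-in for the shoaling/refraction
# transformation, and estimate the expectation of the energy flux ~ H^2.
rng = np.random.default_rng(2)
Hrms = 2.0                                                # rms deep-water height
H0 = Hrms * np.sqrt(-np.log(rng.uniform(size=200_000)))   # Rayleigh draws
Ks = 1.1                                                  # assumed shoaling coefficient
Hb = Ks * H0                                              # transformed heights
flux = Hb**2                                              # energy flux up to constants
print(flux.mean(), (Ks * Hrms)**2)                        # Monte Carlo vs. exact E[Hb^2]
```

In the paper the transformation additionally depends on wave length and approach angle and is truncated by the breaking criterion, so the resulting densities are no longer Rayleigh.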


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
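The chapter's three central quantities can be sketched for finite distributions as follows (toy numbers are my own, not the chapter's):

```python
import numpy as np

# Entropy H(p), conditional entropy H(Y|X) from a joint table, and the
# Kullback-Leibler divergence D(p||q), all in bits.
def entropy(p):
    p = np.asarray(p, float)
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def conditional_entropy(joint):
    joint = np.asarray(joint, float)          # rows: X, columns: Y
    return entropy(joint.ravel()) - entropy(joint.sum(axis=1))  # H(X,Y) - H(X)

def kl_divergence(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

joint = np.array([[0.25, 0.25],
                  [0.25, 0.25]])              # two independent fair coins
print(conditional_entropy(joint))             # H(Y|X) = 1 bit
print(kl_divergence([0.5, 0.5], [0.9, 0.1])) # positive; zero only if p == q
```

The identity H(Y|X) = H(X,Y) − H(X) used here is one of the basic properties of the entropy function that the chapter derives from convexity arguments.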


This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as the Shannon entropy and mutual information, using the formalization of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include the formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010); the latter provides a unified framework within which most of the commonly used measures of information can be defined. The chapter provides general definitions that are valid for both the discrete and continuous cases, and then proves the corresponding reduced expressions where the measures considered are absolutely continuous over finite spaces.


2020 ◽  
pp. 464-490
Author(s):  
Miquel Feixas ◽  
Mateu Sbert

Around seventy years ago, Claude Shannon, who was working at Bell Laboratories, introduced information theory with the main purpose of dealing with the communication channel between source and receiver. The communication channel, or information channel as it later became known, establishes the shared information between the source or input and the receiver or output, both of which are represented by random variables, that is, by probability distributions over their possible states. The generality and flexibility of the information channel concept allow it to be robustly applied to numerous different areas of science and technology, even the social sciences. In this chapter, we will present examples of its application to selecting the best viewpoints of an object, segmenting an image, and computing the global illumination of a three-dimensional virtual scene. We hope that our examples will illustrate how practitioners of different disciplines can use it to organize and understand the interplay of information between the corresponding source and receiver.
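The shared information the channel establishes is the mutual information of input and output. A hedged sketch with two toy channels of my own (not examples from the chapter):

```python
import numpy as np

# Mutual information of an input-output channel from its joint distribution:
# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), in bits.
def mutual_information(joint):
    joint = np.asarray(joint, float)
    px = joint.sum(axis=1, keepdims=True)     # input marginal
    py = joint.sum(axis=0, keepdims=True)     # output marginal
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / (px * py)[mask]))

noiseless = np.array([[0.5, 0.0],
                      [0.0, 0.5]])            # output copies the input: 1 bit
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])        # output ignores the input: 0 bits
print(mutual_information(noiseless), mutual_information(independent))
```

In the viewpoint-selection and image-segmentation applications, the same formula is evaluated with the appropriate joint distribution playing the role of the channel.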

