kullback information
Recently Published Documents

TOTAL DOCUMENTS: 24 (FIVE YEARS: 2)
H-INDEX: 9 (FIVE YEARS: 0)

2019 ◽  
Vol 487 (3) ◽  
pp. 310-316
Author(s):  
Yu. G. Puzachenko ◽  
A. N. Krenke ◽  
M. Yu. Puzachenko ◽  
R. B. Sandlerskii ◽  
I. I. Shironya

The use of the apparatus of nonadditive statistical mechanics for evaluating thermodynamic ecosystem variables from multispectral measurements of reflected solar radiation is discussed. The parameter q is chosen to correspond to the conditions of Förster's maximum of organization. On the basis of remote sensing data (Landsat), the entropy, Kullback information, Förster measure of organization, free energy, exergy, bound and internal energy, and the energy costs of evapotranspiration and photosynthesis were calculated for q-index values measured for each pixel of the remote sensing scenes. It is shown that the seasonal dynamics of the q-index and the organization measures fully correspond to the consequences that follow from the theory of open nonequilibrium systems, and that the thermodynamic variables reflect the current state of ecosystems well.
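As a rough illustration of the quantities named in this abstract, the sketch below computes Shannon entropy, Kullback information relative to a uniform reference, and a nonadditive (Tsallis-type) q-entropy for one pixel's normalized band reflectances. The band values, the uniform reference, and all function names are illustrative assumptions, not the authors' actual Landsat processing chain.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log), zero bins ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def kullback_information(p, q):
    """Kullback information K(p||q) = sum p_i log(p_i / q_i)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def tsallis_entropy(p, q_index):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    return float((1.0 - np.sum(p ** q_index)) / (q_index - 1.0))

# Hypothetical per-pixel reflectances over six spectral bands, normalized to a distribution.
refl = np.array([0.12, 0.18, 0.25, 0.20, 0.15, 0.10])
p = refl / refl.sum()
u = np.full_like(p, 1.0 / p.size)   # uniform reference (maximum-entropy state)

H = shannon_entropy(p)
K = kullback_information(p, u)
# Against a uniform reference, K(p||u) = log(n) - H(p): the Kullback information
# measures organization as the departure from maximum entropy.
S_q = tsallis_entropy(p, 1.5)       # hypothetical q-index value
```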


2019 ◽  
Vol 126 ◽  
pp. 00049
Author(s):  
A. Skatkov ◽  
A. Bryukhovetskiy ◽  
D. Moiseev

The main features of developing an intelligent technology for detecting ecosystem anomalies in the waters of the city of Sevastopol are considered. The proposed approach provides continuous monitoring of key environmental indicators presented as heterogeneous information flows: hydrometeorological information, data on the level of pollution and air composition, soil and environmental monitoring, and monitoring of maximum permissible emissions of harmful substances, in order to detect changes in the state of the monitored data flows. The proposed method for detecting anomalies in water-area ecosystems is based on data clustering. We consider typical operations on clusters and the main metrics based on the Kullback information measure.
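A minimal sketch of the kind of symmetric Kullback measure such a clustering metric could use, comparing histograms of one indicator across two observation windows. The histograms, the smoothing constant, and the alarm threshold are hypothetical assumptions, not the authors' implementation.

```python
import numpy as np

def kullback_symmetric(p, q, eps=1e-12):
    """Symmetric Kullback measure J(p, q) = K(p||q) + K(q||p),
    computed as sum (p_i - q_i) log(p_i / q_i) after smoothing zero bins."""
    p = np.clip(p, eps, None); p = p / p.sum()
    q = np.clip(q, eps, None); q = q / q.sum()
    return float(np.sum((p - q) * np.log(p / q)))

# Hypothetical histograms of one monitored indicator in two time windows.
baseline = np.array([0.50, 0.30, 0.15, 0.05])
current  = np.array([0.20, 0.25, 0.30, 0.25])

divergence = kullback_symmetric(baseline, current)
is_anomaly = divergence > 0.1   # hypothetical alarm threshold, tuned per indicator
```

The symmetric form is a natural metric-like choice for clustering because, unlike a single directed divergence, it does not depend on which window is treated as the reference.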


Entropy ◽  
2018 ◽  
Vol 20 (8) ◽  
pp. 593 ◽  
Author(s):  
Frank Lad ◽  
Giuseppe Sanfilippo ◽  
Gianna Agrò

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as worthy of thought in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”. We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions. This is defined by the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters' scores that both are willing to engage in. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in source articles that are referenced. The focus here is on their content, unhindered. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti.
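The entropy/extropy pairing described above can be checked numerically straight from the definitions H(p) = -Σ p_i log p_i and J(p) = -Σ (1 - p_i) log(1 - p_i), together with the Kullback symmetric distance as a sum of two directed divergences. The example distributions below are arbitrary, and this sketch shows only the definitions, not the paper's full information complex.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    return float(-np.sum(p * np.log(p)))

def extropy(p):
    """Dual (complementary) measure J(p) = -sum (1 - p_i) log(1 - p_i)."""
    return float(-np.sum((1 - p) * np.log(1 - p)))

def kullback_symmetric(p, q):
    """Sum of the pair of directed divergences K(p||q) + K(q||p)."""
    return float(np.sum((p - q) * np.log(p / q)))

# Two arbitrary example forecasting distributions over three outcomes.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])

# By the definitions, H(p) + J(p) equals the sum of the binary entropies of the
# individual probabilities p_i, which is the sense in which the two measures
# act as complementary companions.
total = entropy(p) + extropy(p)
binary_sum = float(-np.sum(p * np.log(p) + (1 - p) * np.log(1 - p)))
```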


2006 ◽  
Vol DMTCS Proceedings vol. AG,... (Proceedings) ◽  
Author(s):  
Philippe Chassaing ◽  
Lucas Gerin

Giroire has recently proposed an algorithm which returns the approximate number of distinct elements in a large sequence of words, under strong constraints coming from the analysis of large databases. His estimation is based on statistical properties of uniform random variables in $[0,1]$. In this note we propose an optimal estimation, using Kullback information and estimation theory.
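For intuition only, here is a minimal order-statistic estimator in the spirit of such methods: hash each word to a pseudo-uniform value in (0, 1); since the minimum of n i.i.d. uniforms has expectation 1/(n+1), the observed minimum yields a crude estimate of n. This is not Giroire's bucketed algorithm or the note's optimal estimator, and the hashing scheme is an assumption.

```python
import hashlib

def to_unit(word):
    """Hash a word to a pseudo-uniform value in (0, 1) via SHA-256 (assumed scheme)."""
    h = int(hashlib.sha256(word.encode()).hexdigest(), 16)
    return (h % (2**53)) / 2**53

def estimate_distinct(words):
    """Crude order-statistic estimate: E[min of n uniforms] = 1/(n+1), so n ≈ 1/min - 1.
    Duplicates hash to the same value, so they never change the minimum."""
    m = min(to_unit(w) for w in words)
    return 1.0 / m - 1.0
```

A single minimum has high variance; practical algorithms average many such observables over hash buckets, which is where a refined estimation (e.g. via Kullback information) pays off.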

