Information Entropy Algorithms for Image, Video, and Signal Processing

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 926
Author(s):  
Gwanggil Jeon

Information entropy is a basic concept in information theory associated with any random variable [...]

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
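As a quick illustration of these three quantities (not taken from the chapter itself), the following Python sketch computes entropy, conditional entropy via the identity H(X|Y) = H(X,Y) - H(Y), and the Kullback–Leibler divergence for discrete distributions given as probability vectors; the function names are illustrative.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def conditional_entropy(joint):
    """H(X|Y) from a joint table joint[x, y] = P(X=x, Y=y)."""
    joint = np.asarray(joint, dtype=float)
    p_y = joint.sum(axis=0)           # marginal distribution of Y
    return entropy(joint.ravel()) - entropy(p_y)   # H(X|Y) = H(X,Y) - H(Y)

def kl_divergence(p, q):
    """Relative entropy D(p || q); requires q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

print(entropy([0.5, 0.5]))                    # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))                    # ~0.469: less uncertainty
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.531: the "disparity"
```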


Entropy ◽  
2020 ◽  
Vol 22 (6) ◽  
pp. 621
Author(s):  
Gwanggil Jeon ◽  
Abdellah Chehri

Entropy, a key concept in information theory, is one of the most important research areas in computer science [...]


2012 ◽  
Vol 433-440 ◽  
pp. 5073-5077
Author(s):  
Jing Yao Wang ◽  
Meng Jia Li ◽  
Mei Song ◽  
Ying Hai Zhang

Information theory has had a great impact on research into communication systems. However, the analysis and design of networks have not benefited as much from information theory. In this paper, we therefore propose an information-theoretic framework for context-aware networks to explore the relationship between information and network performance. We also analyze the information traffic process in a context-aware network. To illustrate our approach, we analyze the architecture of a context-aware network through the information entropy produced in the network, and discuss ways to improve the performance of context awareness from an information-theoretic perspective. The results in this paper may also be used to design other networks and to guide future network design.
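The abstract gives no formulas, but as a hedged sketch of what "the information entropy produced in the network" could look like in practice, one might estimate the empirical (plug-in) entropy of context events observed at a node; the event names and the helper below are hypothetical, not from the paper.

```python
from collections import Counter
from math import log2

def empirical_entropy(events):
    """Plug-in estimate of the Shannon entropy (bits) of a symbol stream."""
    counts = Counter(events)
    n = len(events)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Hypothetical trace of context events observed at a network node;
# higher entropy means more context information must be carried.
trace = ["handover", "idle", "idle", "congestion", "idle", "handover"]
print(empirical_entropy(trace))   # ~1.459 bits per event
```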


2020 ◽  
Vol 6 (1) ◽  
pp. 114
Author(s):  
Saeid Maadani ◽  
Gholam Reza Mohtashami Borzadaran ◽  
Abdol Hamid Rezaei Roknabadi

The variance of the Shannon information of a random variable \(X\), called the varentropy, measures how the information content of \(X\) is scattered around its entropy; it has various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we derive the Tsallis varentropy of order statistics, analyse this concept for residual (past) lifetime distributions, and then introduce two new classes of distributions based on it.
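For reference, a minimal sketch of the standard definitions involved (the paper's own Tsallis-based generalization is not reproduced here): the varentropy is the variance of the information content \(-\log p(X)\), and the Tsallis entropy of order \(q\) is the usual one-parameter generalization of Shannon entropy.

```latex
\[
  H(X) = \mathbb{E}\left[-\log p(X)\right], \qquad
  V(X) = \operatorname{Var}\left[-\log p(X)\right]
       = \mathbb{E}\left[(\log p(X))^{2}\right] - H(X)^{2},
\]
\[
  S_{q}(X) = \frac{1}{q-1}\left(1 - \sum_{x} p(x)^{q}\right),
  \quad q \neq 1, \qquad S_{q}(X) \to H(X) \text{ as } q \to 1.
\]
```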


2019 ◽  
Vol 1 ◽  
pp. 1-1
Author(s):  
Hong Zhang ◽  
Peichao Gao ◽  
Zhilin Li

Spatial information is fundamentally important to our daily life. Many scholars have estimated that 80 percent or more of all information in the world is spatially referenced and can be regarded as spatial information. Given such importance, a discipline called spatial information theory has formed since the late 20th century, and international conferences on spatial information are held frequently. For example, COSIT (Conference on Spatial Information Theory) was established in 1993 and has been held every two years around the world.

In spatial information theory, one fundamental question is how to measure the amount of information (i.e., the information content) of a spatial dataset. A widely used method is entropy, proposed by the American mathematician Claude Shannon in 1948 and usually referred to as Shannon entropy or information entropy. Information entropy was originally designed to measure the statistical information content of a telegraph message. However, a spatial dataset such as a map or a remote sensing image contains not only statistical information but also spatial information, which information entropy cannot measure.

As a consequence, considerable efforts have been made to improve the information entropy for spatial datasets in either a vector format or a raster format. There are two basic lines of thought. The first is to improve the information entropy by redefining how its probability parameters are calculated; the other is to introduce new parameters into the formula of the information entropy. The former results in a number of improved information entropies, while the latter leads to a series of variants of the information entropy. Both seem capable of distinguishing different spatial datasets, but there is a lack of comprehensive evaluation of their performance in measuring spatial information.

This study first presents a state-of-the-art review of improvements to the information entropy for the information content of spatial datasets in a raster format (i.e., raster spatial data, such as a grey image or a digital elevation model). It then presents a comprehensive evaluation of the resultant measures (either improved information entropies or variants of the information entropy) according to the Second Law of Thermodynamics. A set of evaluation criteria is proposed, along with corresponding measures, and all resultant measures are ranked accordingly.

The results reported in this study should be useful for entropic spatial data analysis. For example, in image fusion, a crucial question is how to evaluate the performance of a fusion algorithm. This evaluation is usually achieved by using the information entropy to measure the increase in information content during fusion; it can now be performed with the best-improved information entropy reported in this study.
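As a hedged sketch of the baseline these improvements start from, the following Python snippet computes the information entropy of a grey image from its histogram; shuffling the pixels leaves the value unchanged, which is precisely the limitation described above (statistical but not spatial information is captured). The helper name is illustrative, not one of the reviewed measures.

```python
import numpy as np

def histogram_entropy(image, levels=256):
    """Shannon entropy (bits per pixel) of a grey image, from its histogram."""
    counts = np.bincount(image.ravel(), minlength=levels)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

shuffled = img.ravel().copy()
rng.shuffle(shuffled)
shuffled = shuffled.reshape(img.shape)

# Same histogram, same entropy: the measure is blind to spatial arrangement.
print(histogram_entropy(img), histogram_entropy(shuffled))
```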

