BOLTZMANN–SHANNON ENTROPY: GENERALIZATION AND APPLICATION

2006 ◽  
Vol 20 (23) ◽  
pp. 1471-1479 ◽  
Author(s):  
C. G. CHAKRABARTI ◽  
I. CHAKRABARTY

The paper deals with the generalization of both the Boltzmann entropy and the Boltzmann distribution in the light of the most-probable interpretation of statistical equilibrium. The statistical analysis of the generalized entropy and distribution leads to new results of significant physical importance.
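For orientation, the most-probable interpretation referred to above is the standard textbook argument (not reproduced from the paper): the equilibrium distribution is the occupation pattern that maximizes the Boltzmann entropy under the usual constraints.

```latex
% Standard most-probable-distribution argument (textbook form):
% maximize S over occupation numbers {n_i} at fixed particle number N
% and energy E; the maximizer is the Boltzmann distribution.
S = k_B \ln W, \qquad W = \frac{N!}{\prod_i n_i!},
\qquad \sum_i n_i = N, \quad \sum_i n_i \varepsilon_i = E
\;\Longrightarrow\;
n_i^{*} = \frac{N\, e^{-\beta \varepsilon_i}}{\sum_j e^{-\beta \varepsilon_j}} .
```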

2020 ◽  
Vol 19 (04) ◽  
pp. 2050043 ◽  
Author(s):  
Hamidreza Namazi

In this paper, we employ information theory to analyze the development of the brain as the newborn ages. We compute the Shannon entropy of the electroencephalography (EEG) signal during sleep for 10 groups of newborns aged 36 weeks to 45 weeks (first to last group). Based on the obtained results, EEG signals for newborns at 36 weeks have the lowest information content, whereas EEG signals for newborns at 45 weeks show the greatest information content. We therefore conclude that the information content of the EEG signal increases with the age of the newborn. The statistical analysis demonstrated that the effect of increasing newborn age on the variation of the information content of their EEG signals was significant.
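A minimal sketch of the kind of computation described above, assuming a histogram-based estimator: the Shannon entropy of a sampled EEG epoch. The signal, sampling rate, and bin count are illustrative placeholders, not values from the paper.

```python
import numpy as np

def shannon_entropy(signal: np.ndarray, bins: int = 64) -> float:
    """Histogram-based Shannon entropy estimate of a 1-D signal, in bits."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()        # empirical probabilities per bin
    p = p[p > 0]                     # convention: 0 * log2(0) = 0
    return float(-np.sum(p * np.log2(p)))

# Synthetic stand-in for one sleep-EEG epoch (4 s at an assumed 256 Hz).
rng = np.random.default_rng(0)
eeg_epoch = rng.normal(size=4 * 256)
print(f"Shannon entropy: {shannon_entropy(eeg_epoch):.3f} bits")
```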


2019 ◽  
Vol 1 ◽  
pp. 1-1 ◽  
Author(s):  
Peichao Gao ◽  
Hong Zhang ◽  
Zhilin Li

Abstract. Entropy is an important concept that originated in thermodynamics. It is the subject of the famous Second Law of Thermodynamics, which states that "the entropy of a closed system increases continuously and irrevocably toward a maximum" (Huettner 1976, 102) or "the disorder in the universe always increases" (Framer and Cook 2013, 21). Accordingly, it has been widely regarded as an ideal measure of disorder. Its computation can in theory be performed according to the Boltzmann equation, proposed by the Austrian physicist Ludwig Boltzmann in 1872. In practice, however, the Boltzmann equation involves two problems that are difficult to solve: the definition of the macrostate of a system and the determination of the number of possible microstates in that macrostate. As noted by the American sociologist Kenneth Bailey, "when the notion of entropy is extended beyond physics, researchers may not be certain how to specify and measure the macrostate/microstate relations" (Bailey 2009, 151). As a result, this entropy (also referred to as Boltzmann entropy or thermodynamic entropy) has remained largely at a conceptual level.

In practice, the widely used entropy is the one proposed by the American mathematician, electrical engineer, and cryptographer Claude Elwood Shannon in 1948, hence the term Shannon entropy. Shannon entropy was proposed to quantify the statistical disorder of telegraph messages in the area of communications. The quantification result was interpreted as the information content of a telegraph message, hence also the term information entropy. This entropy has served as the cornerstone of information theory and has been introduced to various fields including chemistry, biology, and geography. It has been widely utilized to quantify the information content of geographic data (or spatial data) in either a vector format (i.e., vector data) or a raster format (i.e., raster data). However, only the statistical information of spatial data can be quantified by Shannon entropy; the spatial information is ignored. For example, a grey image and its corresponding error image share the same Shannon entropy.

Therefore, considerable efforts have been made to improve the suitability of Shannon entropy for spatial data, and a number of improved Shannon entropies have been put forward. Rather than further improving Shannon entropy, this study introduces a novel strategy, namely shifting back from Shannon entropy to Boltzmann entropy. There are two advantages of employing Boltzmann entropy. First, as previously mentioned, Boltzmann entropy is the ideal, standard measure of disorder or information. It is theoretically capable of quantifying not only the statistical information but also the spatial information of a data set. Second, Boltzmann entropy can serve as the bridge between spatial patterns and thermodynamic interpretations. In this sense, the Boltzmann entropy of spatial data may have wider applications. In this study, Boltzmann entropy is employed to quantify the spatial information of raster data, such as images, raster maps, digital elevation models, landscape mosaics, and landscape gradients. To this end, the macrostate of raster data is defined, and the number of all possible microstates in that macrostate is determined. To demonstrate the usefulness of Boltzmann entropy, it is applied to satellite remote sensing image processing, and a comparison is made between its performance and that of Shannon entropy.
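The limitation noted above, that Shannon entropy ignores spatial information, can be made concrete with a small sketch (ours, not the authors'): shuffling the pixels of a grey image preserves its histogram, and therefore its Shannon entropy, while destroying its spatial pattern.

```python
import numpy as np

def shannon_entropy(img: np.ndarray) -> float:
    """Shannon entropy (bits) of an 8-bit grey image's value histogram."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts[counts > 0] / img.size
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
image = np.tile(np.arange(64, dtype=np.uint8) * 4, (64, 1))  # smooth gradient

pixels = image.ravel().copy()
rng.shuffle(pixels)                    # destroy all spatial structure
scrambled = pixels.reshape(image.shape)

# Same histogram, hence exactly the same Shannon entropy.
assert shannon_entropy(image) == shannon_entropy(scrambled)
```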


2020 ◽  
pp. 58-77
Author(s):  
Sandip Tiwari

Chapter 2 brings out the links between entropy and energy through their intimate connection to information. Probabilities, the statistical tool for reasoning when there are unknowns, connect to information as well as to the various forms of entropy. Entropy is a variable introduced to characterize circumstances involving unknowns. Boltzmann entropy, von Neumann entropy, Shannon entropy and others can be viewed through this common viewpoint. The chapter broadens the discussion to include Fisher entropy, a measure that stresses locality, and the principle of minimum negentropy (or maximum entropy), to show how a variety of physical descriptions, represented by equations such as the Schrödinger equation, diffusion equations and Maxwell-Boltzmann distributions, can be seen through a probabilistic, information-centric perspective.
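For reference, the measures named above can be written in their standard textbook forms (these are not quoted from the chapter):

```latex
% Boltzmann (W microstates), Shannon (distribution p_i), von Neumann
% (density matrix \rho), and Fisher information (a local measure):
S_B = k_B \ln W, \qquad
H = -\sum_i p_i \log p_i, \qquad
S_{\mathrm{vN}} = -\operatorname{Tr}(\rho \ln \rho), \qquad
I(\theta) = \int \left( \frac{\partial \ln p(x;\theta)}{\partial \theta} \right)^{2} p(x;\theta)\, dx .
```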


2020 ◽  
Vol 9 (2) ◽  
pp. 103 ◽  
Author(s):  
Hong Zhang ◽  
Zhiwei Wu

Shannon entropy is the most popular method for quantifying the information in a system. However, Shannon entropy is considered incapable of quantifying spatial data, such as raster data, and hence it has not been applied to such datasets. Recently, a method for calculating the Boltzmann entropy of numerical raster data was proposed, but it is inefficient because it involves a series of numerical processes. We aimed to improve the computational efficiency of this method by borrowing the idea of head and tail breaks. This paper relaxes the condition of head and tail breaks and classifies data with a heavy-tailed distribution. The average of the data values in a given class is regarded as its representative value and substituted into a linear function to obtain the full expression of the relationship between classification level and Boltzmann entropy. This function is then used to estimate the absolute Boltzmann entropy of the data. Our experimental results show that the proposed method is both practical and efficient; computation time was reduced to about 1% of that of the original method when dealing with eight 600 × 600 pixel digital elevation models.
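For context, here is a minimal sketch of classic head and tail breaks, the classification scheme the paper builds on (the paper's relaxed condition is not reproduced here). Each pass splits the data at its mean and recurses into the head, the values above the mean, for as long as the head remains a minority.

```python
import numpy as np

def head_tail_breaks(values, head_ratio: float = 0.4) -> list:
    """Class breaks from head/tail breaks on heavy-tailed data."""
    breaks = []
    head = np.asarray(values, dtype=float)
    while head.size > 1:
        mean = head.mean()
        breaks.append(float(mean))
        new_head = head[head > mean]
        # Stop when the head stops being a clear minority of the data.
        if new_head.size == 0 or new_head.size / head.size > head_ratio:
            break
        head = new_head
    return breaks

# Heavy-tailed stand-in for raster cell values (e.g., a DEM).
rng = np.random.default_rng(2)
print(head_tail_breaks(rng.pareto(a=1.5, size=10_000)))
```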


1966 ◽  
Vol 24 ◽  
pp. 188-189
Author(s):  
T. J. Deeming

If we make a set of measurements, such as narrow-band or multicolour photo-electric measurements, which are designed to improve a scheme of classification, and in particular if they are designed to extend the number of dimensions of classification, i.e. the number of classification parameters, then some important problems of analytical procedure arise. First, it is important not to reproduce the errors of the classification scheme which we are trying to improve. Second, when trying to extend the number of dimensions of classification we have little or nothing with which to test the validity of the new parameters.

Problems similar to these have occurred in other areas of scientific research (notably psychology and education) and the branch of Statistics called Multivariate Analysis has been developed to deal with them. The techniques of this subject are largely unknown to astronomers, but, if carefully applied, they should at the very least ensure that the astronomer gets the maximum amount of information out of his data and does not waste his time looking for information which is not there. More optimistically, these techniques are potentially capable of indicating the number of classification parameters necessary and giving specific formulas for computing them, as well as pinpointing those particular measurements which are most crucial for determining the classification parameters.
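One standard multivariate technique of the kind described, principal component analysis, makes both points concrete: the eigenvalue spectrum suggests how many classification parameters the measurements actually support, and the eigenvectors give explicit linear formulas for computing them. A sketch on synthetic data (our illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in: 200 stars x 5 narrow-band indices, driven by two underlying
# parameters (say, temperature and luminosity) plus measurement noise.
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 5))
measurements = latent @ loadings + 0.05 * rng.normal(size=(200, 5))

centered = measurements - measurements.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
print("variance per component:", eigvals[::-1])   # two dominant values
derived = centered @ eigvecs[:, ::-1][:, :2]      # the two new parameters
```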


Author(s):  
Gianluigi Botton ◽  
Gilles L'espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, different approaches have been used in recent years by a few research groups to develop the technique of spectrum imaging, as reported in the literature. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers running commercially available programs that control the microscope and the spectrometer. Work on the limits of the technique in terms of quantitative performance was reported by the present author, in a systematic study of artifacts, detection limits, and statistical errors as a function of the desired spatial resolution and of the range of chemical elements to be studied in a map. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which statistical analysis is performed at each pixel, interpretation is carried out using criteria established from the statistical analysis, and variations in composition are analyzed with the help of information retrieved from t/λ maps so that artifacts are avoided.
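A minimal sketch of one per-pixel quantity mentioned above: a relative-thickness map t/λ computed from an EELS spectrum image by the standard log-ratio method, t/λ = ln(I_total / I_0), where I_0 is the zero-loss intensity. The data cube and the zero-loss window below are synthetic, illustrative assumptions, not the authors' data or code.

```python
import numpy as np

rng = np.random.default_rng(4)
# Spectrum image: 32 x 32 pixels x 1024 energy-loss channels (counts).
cube = rng.poisson(lam=50.0, size=(32, 32, 1024)).astype(float)

zero_loss = slice(0, 100)            # assumed channels of the zero-loss peak
i_total = cube.sum(axis=2)
i_zero = cube[:, :, zero_loss].sum(axis=2)

t_over_lambda = np.log(i_total / i_zero)   # one t/lambda value per pixel
print(t_over_lambda.shape, float(t_over_lambda.mean()))
```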

