On Entropy, Information Inequalities, and Groups
Author(s): Raymond W. Yeung

Entropy · 2019 · Vol 22 (1) · pp. 11
Author(s): Edward Bormashenko

Entropy is usually understood as a quantitative measure of "chaos" or "disorder". However, the notions of "chaos" and "disorder" are themselves obscure, which leads to numerous misinterpretations of entropy. We propose to view disorder as an absence of symmetry and to identify "ordering" with the symmetrization of a physical system, i.e., the introduction of symmetry elements into an initially disordered system. Using a binary system of elementary magnets, we demonstrate that introducing symmetry elements necessarily diminishes its entropy; this holds for both one-dimensional (1D) and two-dimensional (2D) systems of elementary magnets. Imposing symmetry does not affect the Landauer principle, which remains valid for the addressed systems. Imposing symmetry restrictions on a system of particles contained in a chamber divided by a permeable partition likewise diminishes its entropy.
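The entropy reduction described above can be illustrated with a minimal counting sketch. The example below assumes mirror reflection as the imposed symmetry element (the abstract does not specify which symmetry operations are used); the function names `count_states` and `entropy_bits` are illustrative, not from the paper. It counts microstates of a 1D chain of binary magnets, with and without the constraint that configurations be reflection-invariant, and compares the Boltzmann entropies S = log2(W).

```python
from math import log2

def entropy_bits(num_microstates: int) -> float:
    """Boltzmann entropy in bits: S = log2(W)."""
    return log2(num_microstates)

def count_states(n: int, symmetric_only: bool = False) -> int:
    """Count configurations of n binary magnets (spin up/down).

    With symmetric_only=True, keep only configurations invariant
    under mirror reflection (a palindrome constraint) -- i.e., a
    single imposed symmetry element.
    """
    count = 0
    for s in range(2 ** n):
        bits = format(s, f"0{n}b")
        if not symmetric_only or bits == bits[::-1]:
            count += 1
    return count

n = 8
w_all = count_states(n)                       # 2^8 = 256 microstates
w_sym = count_states(n, symmetric_only=True)  # palindromes: 2^4 = 16
print(entropy_bits(w_all), entropy_bits(w_sym))  # → 8.0 4.0
```

The symmetric subset is fixed by its first n/2 spins, so the microstate count drops from 2^n to 2^(n/2) and the entropy is halved, consistent with the claim that symmetrizing the system diminishes its entropy.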


2018 · Vol 776 · pp. 10-16
Author(s): Ana Alonso-Serrano, Matt Visser

2013 · Vol 2013 · pp. 1-3
Author(s): Pantelimon-George Popescu, Florin Pop, Alexandru Herişanu, Nicolae Ţăpuş

We refine a classical logarithmic inequality using a discrete case of the Bernoulli inequality, and then further refine two information inequalities between information measures for graphs based on information functionals, presented by Dehmer and Mowshowitz (2010) as Theorems 4.7 and 4.8. These inequalities concern entropy-based measures of network information content and bear directly on information processing in complex networks, a subarea of research in the modeling of complex systems.
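The entropy-based graph measures in question follow the Dehmer information-functional scheme: an information functional f assigns each vertex a positive value, these are normalized into a probability distribution, and the graph entropy is the Shannon entropy of that distribution. The sketch below is a minimal illustration of that scheme, not the refined bounds of the paper; the choice of vertex degree as the functional and the star-graph example are assumptions for demonstration.

```python
import math

def graph_entropy(f_values):
    """Dehmer-style graph entropy: given functional values f(v_i),
    form p_i = f(v_i) / sum_j f(v_j) and return
    I_f(G) = -sum_i p_i * log2(p_i)."""
    total = sum(f_values)
    probs = [f / total for f in f_values]
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: the star graph K_{1,3}, using vertex degree
# as the information functional f.
degrees = [3, 1, 1, 1]  # hub plus three leaves
print(round(graph_entropy(degrees), 4))

# The entropy is maximized, at log2(n), when f is uniform -- the
# kind of upper bound that inequalities on I_f(G) sharpen.
print(graph_entropy([1, 1, 1, 1]))  # → 2.0 = log2(4)
```

Inequalities such as Theorems 4.7 and 4.8 of Dehmer and Mowshowitz bound I_f(G) for classes of functionals; refining the underlying logarithmic inequality tightens those bounds without changing the measure itself.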

