TRANSITION AND EVOLUTION OF INFORMATION AT THE MICROSCOPIC LEVEL

T-Comm ◽  
2021 ◽  
Vol 15 (5) ◽  
pp. 62-66
Author(s):  
Aleksey V. Yudenkov ◽  
Aleksandr M. Volodchenkov ◽  
Liliya P. Rimskaya ◽  
...

Efficient progress in information technologies requires the simultaneous development of the fundamental research areas of information theory. It is known that, for complicated macroscopic systems, the evolution of information can be described on the basis of the principal laws of thermodynamics (the second law of thermodynamics, etc.). At the same time, it is not known whether these fundamentals of information theory for macroscopic systems are applicable to microscopic systems. The study develops a mathematical model of a discrete phase space adapted to describing the evolution of information (entropy) in microscopic systems. The discrete phase-space model rests on the uncertainty principle and on fundamental properties of discrete continuous-time Markov systems; the Kolmogorov equations are the main mathematical tool. The suggested model refers to the smallest metric scale at which external macroscopic observation is still possible, which can be viewed as the quasiclassical level. The research results are the following. The structure of the phase space of an elementary signal is revealed. It is demonstrated that the entropy of microscopic systems increases, i.e. the second law of thermodynamics holds for microscopic systems. The transition from the microscopic model to the macroscopic one is demonstrated, which confirms the adequacy of the former. The discrete phase-space model is promising for further development; for example, it can be applied to "particle – field" physical systems. The approach represented by the model will make it possible to study electromagnetic and gravitational fields at the quasiclassical level. The above model of the discrete phase space and its application to the study of the evolution of microscopic systems is an original contribution of the authors.
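As a rough illustration of the behaviour the abstract describes (an illustrative sketch, not the authors' model), the snippet below evolves a small discrete-state, continuous-time Markov chain with the Kolmogorov forward equation and prints its Shannon entropy, which grows toward the maximum for a symmetric generator:

```python
# Illustrative sketch only: a discrete-state, continuous-time Markov chain
# evolved with the Kolmogorov forward equation dp/dt = p Q, showing that
# Shannon entropy is non-decreasing for a doubly stochastic generator.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Symmetric generator Q on 4 states (rows sum to zero), so the uniform
# distribution is stationary and entropy grows toward ln(4).
Q = np.array([[-2.0,  1.0,  1.0,  0.0],
              [ 1.0, -3.0,  1.0,  1.0],
              [ 1.0,  1.0, -3.0,  1.0],
              [ 0.0,  1.0,  1.0, -2.0]])

p = np.array([1.0, 0.0, 0.0, 0.0])   # start fully localized in state 0
dt, steps = 0.01, 500
for k in range(steps + 1):
    if k % 100 == 0:
        print(f"t = {k*dt:4.2f}  H(p) = {shannon_entropy(p):.4f}")
    p = p + dt * (p @ Q)              # explicit Euler step of dp/dt = pQ
    p = np.clip(p, 0.0, None)
    p /= p.sum()                      # guard against numerical drift
```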

Entropy ◽  
2021 ◽  
Vol 23 (5) ◽  
pp. 581
Author(s):  
Jaromir Tosiek ◽  
Maciej Przanowski

We focus on several questions arising in the modelling of quantum systems on a phase space. First, we discuss the choice of phase space and its structure, including the interesting case of a discrete phase space. Then, we introduce the respective algebras of functions containing the quantum observables. We also consider the possibility of performing strict calculations and indicate the cases where only formal considerations are possible. We analyse alternative realisations of the strict and formal calculi, which are determined by different kernels. Finally, two classes of Wigner functions as representations of states are investigated.
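For reference, the standard Wigner function on the flat phase space (q, p) takes the familiar form below; the paper's discrete-phase-space and kernel-dependent constructions generalize this continuous case (the formula is the textbook definition, not taken from the paper itself):

```latex
% Standard Wigner function of a pure state \psi on flat phase space
W_\psi(q,p) \;=\; \frac{1}{2\pi\hbar}\int_{-\infty}^{\infty}
  \psi^{*}\!\left(q+\tfrac{\xi}{2}\right)\,
  \psi\!\left(q-\tfrac{\xi}{2}\right)\,
  e^{\,i p \xi/\hbar}\,\mathrm{d}\xi .
```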


2013 ◽  
Vol 43 (7) ◽  
pp. 829-839 ◽  
Author(s):  
Carlos Sánchez-Azqueta ◽  
Cecilia Gimeno ◽  
Concepción Aldea ◽  
Santiago Celma

2014 ◽  
Vol 22 (1) ◽  
pp. 338 ◽  
Author(s):  
Roarke Horstmeyer ◽  
Changhuei Yang

1990 ◽  
Vol 103 (2) ◽  
pp. 309-316 ◽  
Author(s):  
A. Bonasera ◽  
G. F. Burgio ◽  
F. Gulminelli ◽  
H. H. Wolter

Author(s):  
Constantin Bratianu

Abstract: The purpose of this paper is to present the evolution of the concept of entropy from engineering to knowledge management, passing through information theory, linguistic entropy, and economic entropy. The concept of entropy was introduced by Rudolf Clausius in thermodynamics in 1865 as a measure of heat transfer between two solid bodies at different temperatures. As a natural phenomenon, heat flows from the body with the higher temperature toward the body with the lower temperature. However, Rudolf Clausius defined only the change in entropy of a system, not its absolute entropy. Ludwig Boltzmann later defined the absolute entropy by studying the behavior of gas molecules in a thermal field. The computational formula defined by Boltzmann relates the microstates of a thermal system to its macrostates: the more uniform the probability distribution of the microstates, the higher the entropy. The second law of thermodynamics states that in isolated systems, where there is no intervention from outside, the entropy of the system increases continuously. The concept of entropy proved to be very powerful, and many researchers have tried to extend its semantic area and its domain of application. In 1948, Claude E. Shannon introduced the concept of information entropy, with the same computational formula as Boltzmann's but with a different interpretation. This concept solved many problems in engineering communications and is used extensively in information theory. Nicholas Georgescu-Roegen applied the concept of entropy and the second law of thermodynamics to economics and business. Today, many researchers in economics use the concept of entropy to analyze a variety of phenomena. The present paper explores the possibility of using the concept of knowledge entropy in knowledge management.
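For concreteness, the two formulas the abstract compares can be written as follows (standard textbook forms, not reproduced from the paper):

```latex
% Boltzmann entropy over W equiprobable microstates, its Gibbs-Boltzmann
% generalization over a distribution p_i, and Shannon's information entropy.
S \;=\; k_{B}\ln W, \qquad
S \;=\; -k_{B}\sum_{i} p_{i}\ln p_{i}, \qquad
H \;=\; -\sum_{i} p_{i}\log_{2} p_{i}\ \text{[bits]}.
```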


2021 ◽  
pp. 1-30
Author(s):  
Cara Murray

The Dictionary of National Biography, published between 1885 and 1900, was one of Britain's biggest cyclopedia projects. The rampant expansion of the nation's archives, private collections, and museums produced an abundance of materials that frustrated the dictionary's editors, Leslie Stephen and Sidney Lee, especially because methodologies for making order of such materials were underdeveloped. Adding to their frustration was the sense of impending doom felt generally in Britain after the discovery of the second law of thermodynamics in 1859. Entropy put an end to the presiding belief in the infinite energy that fueled Britain's economic development and therefore challenged Victorian biography's premise that the capacity for self-development was boundless. Like the physicists of the era, these dictionary makers searched for ways to circumvent entropy's deadening force and reenergize their world. This project would not actually be achieved, however, until the twentieth century when Claude Shannon published his “Information Theory” in 1948. I argue that in an attempt to get out from under the chaos of information overload, the editors of the DNB invented new methods to organize information that anticipated Shannon's revolutionary theory and changed the way that we think, write, and work.

