Landauer’s Principle a Consequence of Bit Flows, Given Stirling’s Approximation

Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1288
Author(s):  
Sean Devine

According to Landauer’s principle, at least $k_B \ln 2\, T$ Joules are needed to erase a bit that stores information in a thermodynamic system at temperature T. However, the arguments for the principle rely on a regime where the equipartition principle holds. This paper, by exploring a simple model of a thermodynamic system using algorithmic information theory, shows that the energy cost of transferring a bit, or restoring the original state, is $k_B \ln 2\, T$ Joules for a reversible system. The principle is a direct consequence of the statistics required to allocate energy between stored energy states and thermal states, and applies outside the validity of the equipartition principle. As the thermodynamic entropy of a system coincides with the algorithmic entropy of a typical state specifying the momentum degrees of freedom, it can quantify the thermodynamic requirements in terms of bit flows to maintain a system distant from the equilibrium set of states. The approach offers a simple conceptual understanding of entropy, while avoiding problems with the statistical mechanics approach to the second law of thermodynamics. Furthermore, the classical articulation of the principle can be used to derive the low-temperature heat capacities, and is consistent with the quantum version of the principle.
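As a quick numerical illustration of the bound discussed above (the room-temperature figure and the code are mine, not the paper’s), the minimum cost per bit is $k_B \ln 2\, T$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (CODATA 2018 value)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy in joules to erase (or transfer) one bit at temperature T."""
    return K_B * math.log(2) * temperature_kelvin

# At roughly room temperature the bound is on the order of 3e-21 J per bit.
print(f"Landauer bound at 300 K: {landauer_bound(300.0):.3e} J")
```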

Entropy ◽  
2018 ◽  
Vol 20 (10) ◽  
pp. 798
Author(s):  
Sean Devine

Algorithmic information theory in conjunction with Landauer’s principle can quantify the cost of maintaining a reversible real-world computational system distant from equilibrium. As computational bits are conserved in an isolated reversible system, bit flows can be used to track the way a highly improbable configuration trends toward a highly probable equilibrium configuration. In an isolated reversible system, all microstates within a thermodynamic macrostate have the same algorithmic entropy. However, from a thermodynamic perspective, when these bits primarily specify stored energy states, corresponding to a fluctuation from the most probable set of states, they represent “potential entropy”. These bits become “realised entropy” when, under the second law of thermodynamics, they become bits specifying the momentum degrees of freedom. The distance of a fluctuation from equilibrium is identified as the number of computational bits that move from stored energy states to momentum states to define a highly probable or typical equilibrium state. When reversibility applies, from Landauer’s principle, it costs $k_B \ln 2\, T$ Joules to move a bit within the system from stored energy states to the momentum states.
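A minimal toy sketch of the bit-accounting picture described above, assuming an illustrative system of 10,000 conserved bits (the numbers and variable names are mine, not the paper’s): each bit that flows from the stored-energy register to the momentum register costs $k_B \ln 2\, T$ Joules.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def bit_flow_cost(bits_moved: int, temperature_kelvin: float) -> float:
    """Energy (J) to move `bits_moved` bits from stored-energy states to momentum states."""
    return bits_moved * K_B * math.log(2) * temperature_kelvin

# Toy fluctuation: 1,000 of the 10,000 conserved bits specify stored energy states.
stored_bits, momentum_bits = 1_000, 9_000

# The fluctuation decays: all stored-energy bits become momentum bits (total bits conserved).
moved = stored_bits
stored_bits, momentum_bits = 0, momentum_bits + moved

print(f"Distance from equilibrium: {moved} bits")
print(f"Energy transferred at 300 K: {bit_flow_cost(moved, 300.0):.3e} J")
```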


2021 ◽  
Vol 52 (1) ◽  
Author(s):  
L. Gavassino

The standard argument for the Lorentz invariance of the thermodynamic entropy in equilibrium is based on the assumption that it is possible to perform an adiabatic transformation whose only outcome is to accelerate a macroscopic body, keeping its rest mass unchanged. The validity of this assumption constitutes the very foundation of relativistic thermodynamics and needs to be tested in greater detail. We show that, indeed, such a transformation is always possible, at least in principle. The only two assumptions invoked in the proof are that there is at least one inertial reference frame in which the second law of thermodynamics is valid and that the microscopic theory describing the internal dynamics of the body is a field theory, with Lorentz invariant Lagrangian density. The proof makes no reference to the connection between entropy and probabilities and is valid both within classical and quantum physics. To avoid any risk of circular reasoning, we do not postulate that the laws of thermodynamics are the same in every reference frame, but we obtain this fact as a direct consequence of the Lorentz invariance of the entropy.
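For orientation, a minimal sketch (my paraphrase, not the paper’s own derivation) of why the assumed adiabatic acceleration would make the entropy a Lorentz scalar:

```latex
% If a reversible adiabatic transformation takes the body from rest to velocity v,
% with no other outcome and no change of rest mass, its entropy is unchanged:
%   \Delta S = 0  \;\Rightarrow\;  S(\text{body moving at } v) = S(\text{body at rest}).
% Since the moving body is the same equilibrium state described from a boosted frame,
% the entropy assigned to that state in the two frames coincides,
\[
  S' = S ,
\]
% which is the Lorentz invariance the paper sets out to establish.
```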


2021 ◽  
pp. 2150111
Author(s):  
Fei-Quan Tu ◽  
Bin Sun ◽  
Meng Wan ◽  
Qi-Hong Huang

Entropy is a key concept widely used in physics and other fields. At the same time, the meanings attached to entropy under its different names, and the relationships among them, can be confusing. In this paper, we discuss the relationship among the Clausius entropy, the Boltzmann entropy and the information entropy, and further show that the three kinds of entropy are equivalent to each other to some extent. Moreover, we point out that the evolution of the universe is a process of entropy increase and that life originates from the initially low entropy of the universe. Finally, we discuss the evolution of the entire universe, composed of the cosmological horizon and the space it encloses, and interpret the entropy as a measure of the information of all microstates corresponding to a certain macrostate. Under this interpretation, the thermodynamic entropy and the information entropy are unified, and we can conclude that the sum of the entropy of the horizon and the entropy of the matter in the space it encloses does not decrease with time if the second law of thermodynamics holds for the entire universe.
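As a compact reminder of the standard relations the abstract appeals to (textbook forms, stated here for the special case of W equally probable microstates):

```latex
% Clausius (macroscopic):         dS = \delta Q_{\mathrm{rev}} / T
% Boltzmann (microscopic):        S  = k_B \ln W
% Shannon / information (bits):   H  = \log_2 W   \text{ for a uniform distribution}
% The two microscopic measures then differ only by a choice of units:
\[
  S = k_B \ln W = (k_B \ln 2)\,\log_2 W = (k_B \ln 2)\, H .
\]
```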


2020 ◽  
Vol 2 (1) ◽  
pp. 32-35
Author(s):  
Eric Holloway

Leonid Levin developed the first stochastic conservation of information law, describing it as “torturing an uninformed witness cannot give information about the crime.” Levin’s law unifies both the deterministic and stochastic cases of conservation of information. A proof of Levin’s law from algorithmic information theory is given, along with a discussion of its implications for evolutionary algorithms and fitness functions.
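As a rough, purely suggestive illustration of the stochastic case (not the proof given in the paper), one can replace the uncomputable Kolmogorov complexity with a compressor and check that randomized processing of a witness string generated independently of a “crime” string does not raise its estimated information about the crime; zlib and the byte lengths below are my own stand-ins:

```python
import os
import zlib

def c(data: bytes) -> int:
    """Compressed length as a crude, computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def mutual_info_estimate(x: bytes, y: bytes) -> int:
    """Rough proxy for algorithmic mutual information: C(x) + C(y) - C(x, y)."""
    return c(x) + c(y) - c(x + y)

crime = os.urandom(4096)      # what the interrogator wants to learn
witness = os.urandom(4096)    # generated independently of the crime

# "Torture" the witness: arbitrary randomized processing of the witness alone.
tortured = bytes(a ^ b for a, b in zip(witness, os.urandom(4096)))

print("I(witness : crime)  ~", mutual_info_estimate(witness, crime), "bytes")
print("I(tortured : crime) ~", mutual_info_estimate(tortured, crime), "bytes")
# Both estimates stay near zero (a few bytes of compressor overhead),
# in line with what Levin's law predicts for independent sources.
```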


Sci ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 35
Author(s):  
Peter Verheyen

How does the world around us work and what is real? This question has preoccupied humanity since its beginnings. From the 16th century onwards, it has periodically been necessary to revise the prevailing worldview—but things became very strange at the beginning of the 20th century with the advent of relativity theory and quantum physics. The current focus is on the role of information, there being a debate about whether this is ontological or epistemological. A theory has recently been formulated in which spacetime and gravity emerge from microscopic quantum information—more specifically from quantum entanglement via entanglement entropy. A recent theory describes the emergence of reality itself through first-person perspective experiences and algorithmic information theory. In quantum physics, perception and observation play a central role. Perception of and interaction with the environment require an exchange of information. Via biochemical projection, information is given an interpretation that is necessary to make life and consciousness possible. The world around us is not at all what it seems.


