On Entropy, Information, and Conservation of Information

Entropy
2021
Vol 23 (6)
pp. 779
Author(s):  
Yunus A. Çengel

The term entropy is used with different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstanding and confusion. The root cause of the problem is the close resemblance between the defining mathematical expressions of entropy in statistical thermodynamics and of information in the communications field (also called entropy), which differ only by a constant factor, with the unit J/K in thermodynamics and bits in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on the probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase conservation of entropy in the sense of conservation of information, which contradicts the fundamental increase-of-entropy principle in thermodynamics, an expression of the second law. The aim of this paper is to clarify matters and eliminate confusion by putting things into their rightful places within their domains. The notion of conservation of information is also put into a proper perspective.
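For reference, the two defining expressions in question are, in their standard textbook forms (supplied here, not quoted from the paper),

$$S = -k_B \sum_i p_i \ln p_i \quad [\mathrm{J/K}], \qquad H = -\sum_i p_i \log_2 p_i \quad [\mathrm{bits}],$$

so that $S = (k_B \ln 2)\, H$: identical in form, differing only by the constant factor $k_B \ln 2$ that carries the units.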

2021
pp. 1-30
Author(s):  
Cara Murray

The Dictionary of National Biography, published between 1885 and 1900, was one of Britain's biggest cyclopedia projects. The rampant expansion of the nation's archives, private collections, and museums produced an abundance of materials that frustrated the dictionary's editors, Leslie Stephen and Sidney Lee, especially because methodologies for bringing order to such materials were underdeveloped. Adding to their frustration was the sense of impending doom felt generally in Britain after the discovery of the second law of thermodynamics in 1859. Entropy put an end to the presiding belief in the infinite energy that fueled Britain's economic development and therefore challenged Victorian biography's premise that the capacity for self-development was boundless. Like the physicists of the era, these dictionary makers searched for ways to circumvent entropy's deadening force and reenergize their world. This project would not actually be achieved, however, until the twentieth century when Claude Shannon published his “Information Theory” in 1948. I argue that in an attempt to get out from under the chaos of information overload, the editors of the DNB invented new methods to organize information that anticipated Shannon's revolutionary theory and changed the way that we think, write, and work.


1928
Vol 18 (4)
pp. 602-627
Author(s):  
B. H. Balmukand

The study of the relation of plant-growth to environmental factors has led to much research directed to the elaboration of general formulae expressing the quantitative response of the experimental plant or crop to the quantity of the nutrients with which it is supplied. For variations of a single nutrient only, many different mathematical expressions will serve to describe the facts to the accuracy with which these are usually ascertained by experiment; the practical value of such formulae is, however, much impaired if the parameters or constants which they involve change their value from experiment to experiment. If, on the contrary, we can obtain formulae of a general character which represent satisfactorily not only the response to variation of a single factor, but the response to simultaneous variation of two or more different factors, then we have reason to believe that the parameters of such formulae will not depend upon the casual or non-essential conditions of the experiment, but will be capable of direct interpretation as physical quantities. We have shown that the Resistance Formula does fit the data of several two-factor experiments. The agreement of the three values of a determined from the three potato crops, as well as the agreement of the difference in the values of k on a dunged and an undunged plot with the potash expected to be available from the ten tons of dung, shows that this expectation is so far justified. The parameters of the Resistance Formula are capable of a direct and definite physical interpretation. For each nutrient there are two constants: one represents the importance of the nutrient considered to the crop concerned, and may be expected to vary from crop to crop and from variety to variety, so affording a direct comparison between varieties of their manurial needs, while the second represents the amount of nutrient available in the unmanured soil.
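The abstract does not reproduce the Resistance Formula itself. As a hedged reconstruction of its general shape (by analogy with electrical resistances in series, and using the constants a and k mentioned above; the exact notation should be checked against the paper):

$$\frac{1}{y} = \sum_i \frac{a_i}{k_i + x_i},$$

where $y$ is the yield, $x_i$ the quantity of nutrient $i$ supplied, $k_i$ the amount of that nutrient already available in the unmanured soil, and $a_i$ its importance to the crop.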


1987
Vol 1 (4)
pp. 225-242
Author(s):  
John L.R. Proops

The term “entropy” is now widely used in social science, although its origin is in physical science. There are three main ways in which the term may be used. The first invokes the original meaning, referring to the unidirectionality of heat flow, from hot bodies to cold ones. The second meaning can be derived from the first via statistical mechanics; this meaning is concerned with measures of ‘evenness’ or ‘similarity’. The third meaning derives from information theory. The three distinct meanings are carefully described and distinguished, and their relationships to each other are discussed. The various uses of the three concepts in the social sciences are then reviewed, including some uses which confuse the different meanings of the term. Finally, modern work in thermodynamics is examined, and its implications for economic analysis are briefly assessed.
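The three meanings correspond to three standard definitions (textbook forms, supplied here for reference rather than quoted from the paper): Clausius's thermodynamic entropy, Boltzmann's statistical entropy over $W$ equally probable microstates, and Shannon's information entropy,

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S = k_B \ln W, \qquad H = -\sum_i p_i \log_2 p_i .$$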


1972
Vol 25 (2)
pp. 141-152
Author(s):  
E. W. Anderson

With the generous permission of Smiths Industries, research is now being undertaken within the Department of Maritime Studies at the University of Wales Institute of Science and Technology, under the supervision of Captain C. H. Cotter. This has led to a study of information theory and its interesting links with entropy; information, as a result of work done, reduces randomness and introduces orderliness just as, in another context, housework reduces random dust and introduces cleanliness.
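The analogy has a precise counterpart in information theory (a standard identity, not one quoted from the paper): the information that an observation $Y$ carries about a quantity $X$ is exactly the expected reduction in the randomness of $X$,

$$I(X;Y) = H(X) - H(X \mid Y) \ge 0 .$$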


Author(s):  
Constantin Bratianu

The purpose of this paper is to present the evolution of the concept of entropy from engineering to knowledge management, going through information theory, linguistic entropy, and economic entropy. The concept of entropy was introduced by Rudolf Clausius in thermodynamics in 1865 as a measure of heat transfer between two solid bodies at different temperatures. As a natural phenomenon, heat flows from the body with the higher temperature toward the body with the lower temperature. However, Clausius defined only the change in entropy of the system, not its absolute entropy. Ludwig Boltzmann later defined the absolute entropy by studying the behavior of gas molecules in a thermal field. The computational formula defined by Boltzmann relates the microstates of a thermal system to its macrostates: the more uniform the probability distribution of the microstates, the higher the entropy. The second law of thermodynamics says that in an isolated system, with no intervention from outside, entropy increases continuously. The concept of entropy proved so powerful that many researchers tried to extend its semantic field and application domain. In 1948, Claude E. Shannon introduced the concept of information entropy, with the same computational formula as Boltzmann's but a different interpretation. This concept solved many problems in engineering communications and is used extensively in information theory. Nicholas Georgescu-Roegen used the concept of entropy and the second law of thermodynamics in economics and business, and today many researchers in economics use the concept of entropy to analyze different phenomena. The present paper explores the possibility of using the concept of knowledge entropy in knowledge management.
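As a concrete illustration of the information-entropy formula the abstract traces from Boltzmann to Shannon (a minimal sketch, not code from the paper), the linguistic entropy of a text can be estimated from its character frequencies:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy of the character distribution of text, in bits."""
    counts = Counter(text)
    n = len(text)
    # H = sum over symbols of p * log2(1/p); uniform distributions maximize H,
    # echoing "the more uniform the distribution, the higher the entropy".
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("abab"))  # 1.0 bit/char: two equiprobable symbols
print(shannon_entropy("aaaa"))  # 0.0 bits/char: no uncertainty at all
print(shannon_entropy("knowledge management"))
```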


Author(s):  
Yehuda Roth

In our previous paper, we showed that the so-called quantum entanglement also exists in classical mechanics. The inability to measure this classical entanglement was rationalized with the definition of a classical observer, who collapses all entanglement into distinguishable states. It was shown that evidence for this primary coherence is Newton’s third law. However, in reformulating a "classical entanglement theory" we assumed the existence of Newton’s second law in operator form, where a force operator was introduced through a Hilbert space of force states. In this paper, we derive all related physical quantities and laws from basic quantum principles. We not only define a force operator but also derive the laws of classical mechanics and prove the necessity of entanglement for obtaining Newton’s third law.


Author(s):  
Stefan Thurner
Rudolf Hanel
Peter Klimek

Most complex systems are statistical systems. Statistical mechanics and information theory usually do not apply to complex systems because the latter break the assumptions of ergodicity, independence, and multinomial statistics. We show that it is possible to generalize the frameworks of statistical mechanics and information theory in a meaningful way, such that they become useful for understanding the statistics of complex systems. We clarify that the notion of entropy for complex systems is strongly dependent on the context where it is used, and differs if it is used as an extensive quantity, a measure of information, or a tool for statistical inference. We show this explicitly for simple path-dependent complex processes such as Pólya urn processes and sample-space-reducing processes. We also show that it is possible to generalize the maximum entropy principle to path-dependent processes, and how this can be used to compute time-dependent distribution functions of history-dependent processes.
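A minimal sketch (an illustration, not the authors' code) of the Pólya urn process mentioned above, showing why path dependence breaks ergodicity: each ball drawn is returned together with an extra ball of the same colour, so early draws shape the whole future of the process.

```python
import random

def polya_urn(steps: int, seed: int) -> float:
    """Pólya urn: draw a ball at random, put it back together with one
    more ball of the same colour. Returns the final fraction of red."""
    rng = random.Random(seed)
    red, black = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red / (red + black)

# Each run converges to a *different* limiting fraction (uniform on (0, 1)
# for a 1+1 start), so time averages do not recover an ensemble average:
# the process is path dependent and non-ergodic.
for seed in range(5):
    print(f"run {seed}: fraction red = {polya_urn(10_000, seed):.3f}")
```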


2011
Vol 301-303
pp. 1444-1447
Author(s):  
Hong Qiang Sun
Mei Yang
Wen Zhuo Zhang
Jing Sheng Yu

A photoelectric sensor collects two physical quantities, the displacement and time of the experimental car. The signal is conditioned by an amplifier-filter circuit, processed by a single-chip microcomputer, and then transferred to a computer through an RS-232 serial port. Experiment software of the authors' own design calculates the magnitudes of, and relationships between, the physical quantities relevant to acceleration, and draws the relevant figures.
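A hedged sketch of the final computational step (the sample values below are hypothetical, and the acquisition chain of sensor, amplifier-filter, single-chip computer, and RS-232 link is not modeled): given (time, displacement) pairs, velocity and acceleration follow by finite differences.

```python
# Hypothetical (time, displacement) samples of the kind the sensor pair
# described above might produce.
samples = [(0.0, 0.000), (0.1, 0.005), (0.2, 0.020), (0.3, 0.045), (0.4, 0.080)]

def derivative(points):
    """Finite-difference derivative, reported at interval midpoints."""
    return [((t0 + t1) / 2, (x1 - x0) / (t1 - t0))
            for (t0, x0), (t1, x1) in zip(points, points[1:])]

velocity = derivative(samples)        # v ~ ds/dt
acceleration = derivative(velocity)   # a ~ dv/dt
print(velocity)      # rises linearly with t
print(acceleration)  # ~1.0 m/s^2 throughout: uniformly accelerated motion
```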

