Entropy, Information Theory

1987 ◽  
Vol 1 (4) ◽  
pp. 225-242
Author(s):  
John L.R. Proops

The term “entropy” is now widely used in social science, although its origin is in physical science. There are three main ways in which the term may be used. The first invokes the original meaning, referring to the unidirectionality of heat flow, from hot bodies to cold ones. The second meaning can be derived from the first via statistical mechanics; this meaning is concerned with measures of ‘evenness’ or ‘similarity’. The third meaning derives from information theory. The three distinct meanings are carefully described and distinguished, and their relationships to each other are discussed. The various uses of the three concepts in the social sciences are then reviewed, including some uses which confuse the different meanings of the term. Finally, modern work in thermodynamics is examined, and its implications for economic analysis are briefly assessed.


1972 ◽  
Vol 25 (2) ◽  
pp. 141-152
Author(s):  
E. W. Anderson

With the generous permission of Smiths Industries, research is now being undertaken within the Department of Maritime Studies at the University of Wales Institute of Science and Technology, under the supervision of Captain C. H. Cotter. This has led to a study of information theory and its interesting links with entropy: information reduces randomness and introduces orderliness just as, in another context, housework reduces random dust and introduces cleanliness.


2021 ◽  
pp. 1-30
Author(s):  
Cara Murray

The Dictionary of National Biography, published between 1885 and 1900, was one of Britain's biggest encyclopedia projects. The rampant expansion of the nation's archives, private collections, and museums produced an abundance of materials that frustrated the dictionary's editors, Leslie Stephen and Sidney Lee, especially because methodologies for imposing order on such materials were underdeveloped. Adding to their frustration was the sense of impending doom felt generally in Britain after the discovery of the second law of thermodynamics in 1859. Entropy put an end to the presiding belief in the infinite energy that fueled Britain's economic development and therefore challenged Victorian biography's premise that the capacity for self-development was boundless. Like the physicists of the era, these dictionary makers searched for ways to circumvent entropy's deadening force and reenergize their world. This project would not actually be achieved, however, until the twentieth century, when Claude Shannon published his “Information Theory” in 1948. I argue that, in an attempt to get out from under the chaos of information overload, the editors of the DNB invented new methods to organize information that anticipated Shannon's revolutionary theory and changed the way that we think, write, and work.


Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 779
Author(s):  
Yunus A. Çengel

The term entropy is used with different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstandings and confusion. The root cause of the problem is the close resemblance between the defining mathematical expressions of entropy in statistical thermodynamics and of information in the communications field, also called entropy; the two differ only by a constant factor, and carry the unit ‘J/K’ in thermodynamics and ‘bits’ in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase conservation of entropy in the sense of conservation of information, which contradicts the fundamental increase-of-entropy principle in thermodynamics as an expression of the second law. The aim of this paper is to clarify matters and eliminate confusion by putting things into their rightful places within their domains. The notion of conservation of information is also put into a proper perspective.
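The “constant factor” relating the two expressions can be made concrete with a short sketch (the probability distribution here is illustrative, not taken from the paper): Shannon's H = −Σ pᵢ log₂ pᵢ in bits, and the Gibbs statistical-mechanical form S = −k_B Σ pᵢ ln pᵢ in J/K, differ exactly by k_B ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy_jk(probs):
    """Statistical-thermodynamic entropy S = -k_B * sum p_i * ln(p_i), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]          # an illustrative message/microstate distribution
h = shannon_entropy_bits(probs)    # 1.5 bits
s = gibbs_entropy_jk(probs)        # same distribution, expressed in J/K

# The two expressions differ only by the constant factor k_B * ln(2):
assert abs(s - h * K_B * math.log(2)) < 1e-35
```

The identical functional form, rescaled by k_B ln 2, is exactly the resemblance the abstract identifies as the root of the confusion.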


1977 ◽  
Vol 9 (4) ◽  
pp. 395-417
Author(s):  
J A Walsh ◽  
M J Webber

The concepts of entropy and of information are increasingly used in spatial analysis. This paper analyses these ideas in order to show how measures of spatial distributions may be constructed from them. First, the information content of messages is examined and related to the notion of uncertainty. Then three information measures, due to Shannon, Brillouin, and Good, are derived and shown to be appropriate in analysing different spatial problems; in particular, the Shannon and Brillouin measures are extensively compared and the effects of sample size on them are investigated. The paper also develops appropriate multivariate analogues of the information measures. Finally, some comments are made on the relations between the concepts of entropy, information, and order.
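The contrast between the Shannon and Brillouin measures, and the sample-size effect the paper investigates, can be sketched as follows (the counts are illustrative, and this is a plain implementation of the two standard formulas, not the paper's notation): the Shannon measure works with proportions, while the Brillouin measure, H_B = (1/N) ln(N! / ∏ nᵢ!), is defined for a finite, fully enumerated sample and converges to the Shannon value as N grows.

```python
import math

def shannon(counts):
    """Shannon measure H = -sum (n_i/N) * ln(n_i/N), from observed proportions."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def brillouin(counts):
    """Brillouin measure H_B = (1/N) * ln(N! / prod n_i!), exact for a finite sample."""
    n = sum(counts)
    log_multinomial = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_multinomial / n

# Effect of sample size: same proportions (2:1:1), increasing total count N.
for scale in (1, 10, 100):
    counts = [2 * scale, 1 * scale, 1 * scale]
    print(f"N={4 * scale:4d}  Brillouin={brillouin(counts):.4f}  Shannon={shannon(counts):.4f}")
# The Brillouin value lies below the Shannon value and approaches it as N increases.
```

The loop makes the dependence on sample size visible directly: the gap between the two measures shrinks as the sample grows, which is why the choice between them matters for small spatial samples.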


2014 ◽  
Vol 33 ◽  
pp. 1460354 ◽  
Author(s):  
Neal G. Anderson

Landauer's Principle (LP) associates an entropy increase with the irreversible loss of information from a physical system. Clear statement, unambiguous interpretation, and proper application of LP require precise, mutually consistent, and sufficiently general definitions for a set of interlocking fundamental notions and quantities (entropy, information, irreversibility, erasure). In this work, we critically assess some common definitions and quantities used or implied in statements of LP, and reconsider their definition within an alternative “referential” approach to physical information theory that embodies an overtly relational conception of physical information. We prove an inequality on the entropic cost of irreversible information loss within this context, as well as “referential analogs” of LP and its more general restatement by Bennett. Advantages of the referential approach for establishing fundamental limits on the physical costs of irreversible information loss in communication and computing systems are discussed throughout.

