Change of information content due to natural selection in populations with and without recombination

2020 ◽  
Author(s):  
Wolfgang A. Tiefenbrunner

Abstract: Though evolution undoubtedly operates in accordance with the second law of thermodynamics, the law of disorder, organisms of incredible complexity have come into being over billions of years. Darwin described natural selection as a process that optimizes adaptation to the environment, but optimization does not necessarily lead to higher intricacy. Methods of thermodynamics, and thus of information theory, may be well suited to examining the increase of order and information due to evolution. Here I explain how to quantify the increase of information due to natural selection at the genotype and gene level, using the observable change of allele frequencies. In populations with recombination (no linkage), the change of information content can be computed by summing the contributions of all gene loci; gene loci can thus be treated as independent, no matter what the fitness landscape looks like. The pressure of deleterious mutations decreases information linearly, in proportion to fitness loss and mutation rate. The information-theoretical view of evolution might open new fields of research.
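The quantity described here can be illustrated with a minimal sketch, under the simplifying assumption that the information gain at a locus is the decrease in Shannon entropy of its allele frequencies after one round of selection; with free recombination the per-locus contributions are simply summed, as the abstract states. The allele frequencies and fitness values below are arbitrary, and the paper's own definition may differ in detail.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of allele frequencies at one locus."""
    return -sum(x * math.log2(x) for x in p if x > 0.0)

def select(p, w):
    """One generation of selection: new frequencies proportional to p_i * w_i."""
    mean_w = sum(pi * wi for pi, wi in zip(p, w))
    return [pi * wi / mean_w for pi, wi in zip(p, w)]

def information_gain(loci):
    """Total gain = sum over unlinked loci of the entropy decrease of allele
    frequencies caused by one round of selection."""
    total = 0.0
    for p, w in loci:
        total += shannon_entropy(p) - shannon_entropy(select(p, w))
    return total

# Two unlinked biallelic loci with hypothetical frequencies and relative fitnesses.
loci = [
    ([0.5, 0.5], [1.0, 0.8]),
    ([0.3, 0.7], [1.0, 1.1]),
]
print(f"information gain: {information_gain(loci):.4f} bits")
```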

Author(s):  
Constantin Bratianu

Abstract: The purpose of this paper is to present the evolution of the concept of entropy from engineering to knowledge management, passing through information theory, linguistic entropy, and economic entropy. The concept of entropy was introduced into thermodynamics by Rudolf Clausius in 1865 as a measure of heat transfer between two bodies at different temperatures. As a natural phenomenon, heat flows from the body with the higher temperature toward the body with the lower temperature. However, Clausius defined only the change in entropy of a system, not its absolute entropy. Ludwig Boltzmann later defined the absolute entropy by studying the behavior of gas molecules in a thermal field. The computational formula defined by Boltzmann relates the microstates of a thermal system to its macrostates: the more uniform the probability distribution over microstates, the higher the entropy. The second law of thermodynamics says that in an isolated system, one without any intervention from outside, entropy increases continuously. The concept of entropy proved so powerful that many researchers tried to extend both its semantic range and its application domain. In 1948, Claude E. Shannon introduced the concept of information entropy, with the same computational formula as Boltzmann's but a different interpretation. This concept solved many engineering communication problems and is used extensively in information theory. Nicholas Georgescu-Roegen brought the concept of entropy and the second law of thermodynamics into economics and business, and today many researchers in economics use the concept of entropy to analyze different phenomena. The present paper explores the possibility of using the concept of knowledge entropy in knowledge management.
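The statement that entropy is highest for a uniform distribution over (micro)states can be checked directly with Shannon's formula, H = -Σ p_i log2 p_i, which the abstract notes is computationally the same as Boltzmann's. A minimal sketch:

```python
import math

def shannon_entropy(p):
    """H = -sum p_i * log2(p_i): entropy of a discrete distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0.0)

# The more uniform the distribution over states, the higher the entropy.
for dist in ([1.0, 0.0, 0.0, 0.0],       # fully ordered: one state certain
             [0.7, 0.1, 0.1, 0.1],       # partly concentrated
             [0.25, 0.25, 0.25, 0.25]):  # uniform: maximum entropy = log2(4) = 2 bits
    print(dist, f"H = {shannon_entropy(dist):.3f} bits")
```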


2021 ◽  
pp. 1-30
Author(s):  
Cara Murray

The Dictionary of National Biography, published between 1885 and 1900, was one of Britain's biggest cyclopedia projects. The rampant expansion of the nation's archives, private collections, and museums produced an abundance of materials that frustrated the dictionary's editors, Leslie Stephen and Sidney Lee, especially because methodologies for making order of such materials were underdeveloped. Adding to their frustration was the sense of impending doom felt generally in Britain after the discovery of the second law of thermodynamics in 1859. Entropy put an end to the presiding belief in the infinite energy that fueled Britain's economic development and therefore challenged Victorian biography's premise that the capacity for self-development was boundless. Like the physicists of the era, these dictionary makers searched for ways to circumvent entropy's deadening force and reenergize their world. This project would not actually be achieved, however, until the twentieth century when Claude Shannon published his “Information Theory” in 1948. I argue that in an attempt to get out from under the chaos of information overload, the editors of the DNB invented new methods to organize information that anticipated Shannon's revolutionary theory and changed the way that we think, write, and work.


Author(s):  
Roman V. Belavkin ◽  
Panos M. Pardalos ◽  
Jose C. Principe ◽  
Ruslan L. Stratonovich

Entropy ◽  
2019 ◽  
Vol 21 (12) ◽  
pp. 1170 ◽  
Author(s):  
Arieh Ben-Naim

This article is about the profound misuses, misunderstandings, misinterpretations and misapplications of entropy, the Second Law of Thermodynamics and Information Theory. It is the story of the “Greatest Blunder Ever in the History of Science”. It is not about a single blunder admitted by a single person (e.g., Albert Einstein allegedly said, in connection with the cosmological constant, that it was his greatest blunder), but rather a blunder of gargantuan proportions whose claws have permeated all branches of science: thermodynamics, cosmology, biology, psychology, sociology and much more.


T-Comm ◽  
2021 ◽  
Vol 15 (5) ◽  
pp. 62-66
Author(s):  
Aleksey V. Yudenkov ◽  
Aleksandr M. Volodchenkov ◽  
Liliya P. Rimskaya ◽  
...  

Efficient progress in information technologies requires the simultaneous development of the fundamental research areas of information theory. It is known that, for complicated macroscopic systems, the evolution of information can be described on the basis of the principal laws of thermodynamics (the second law of thermodynamics, etc.). At the same time, it is not known whether the foundations of the information theory of macroscopic systems apply to microscopic systems. This study develops a mathematical model of a discrete phase space adapted to describing the evolution of information (entropy) in microscopic systems. The discrete phase-space model rests on the uncertainty principle and on fundamental properties of discrete continuous-time Markovian systems; the Kolmogorov equations are the main mathematical tool. The suggested model refers to the smallest metric scale at which external macroscopic observation is still possible, a scale that can be viewed as quasiclassical. The results are as follows. The structure of the phase space of an elementary signal is revealed. It is demonstrated that the entropy of microscopic systems increases, i.e., the second law of thermodynamics holds for them. The transition from the microscopic model to the macroscopic one is demonstrated, confirming the adequacy of the former. The discrete phase-space model is promising for further development: for example, it can be applied to "particle-field" physical systems, and the approach will allow electromagnetic and gravitational fields to be studied at the quasiclassical level. The above model of the discrete phase space and its application to the evolution of microscopic systems is the authors' own design.
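The abstract's central technical ingredients, the Kolmogorov forward equations of a discrete continuous-time Markov system and the growth of entropy, can be illustrated with a minimal numerical sketch. This is not the authors' model: it assumes a hypothetical three-state system with a symmetric rate matrix Q, for which the Shannon entropy of the state distribution is known to increase toward the uniform stationary distribution.

```python
import math

def entropy(p):
    """Shannon entropy of the state distribution."""
    return -sum(x * math.log(x) for x in p if x > 0.0)

def kolmogorov_forward_step(p, Q, dt):
    """One Euler step of the Kolmogorov forward equation dp_i/dt = sum_j p_j Q_ji."""
    n = len(p)
    return [p[i] + dt * sum(p[j] * Q[j][i] for j in range(n)) for i in range(n)]

# Hypothetical symmetric rate matrix (rows sum to zero), so the stationary
# distribution is uniform and the entropy grows monotonically.
Q = [[-0.3,  0.2,  0.1],
     [ 0.2, -0.5,  0.3],
     [ 0.1,  0.3, -0.4]]

p = [1.0, 0.0, 0.0]          # start fully concentrated in state 0 (zero entropy)
dt, block = 0.5, 10
for k in range(6):
    print(f"t = {k * block * dt:4.1f}  H = {entropy(p):.4f}")
    for _ in range(block):
        p = kolmogorov_forward_step(p, Q, dt)
```

With an asymmetric generator the entropy need not grow monotonically; in general it is the relative entropy to the stationary distribution that decreases.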


1988 ◽  
Vol 14 ◽  
pp. 187-207 ◽  
Author(s):  
Alexander Rosenberg

In The Structure of Biological Science (Rosenberg [1985]) I argued that the theory of natural selection is a statistical theory for reasons much like those which make thermodynamics a statistical theory. In particular, the theory claims that fitness differences are large enough, and the life span of species long enough, for increases in average fitness always to appear in the long run; and this claim, I held, is of the same form as the statistical version of the second law of thermodynamics. For the latter law also makes a claim about the long run, and its statistical character is due to this claim: thermodynamic systems must in the long run approach an equilibrium level of organization that maximizes entropy. Over finite times, given local boundary conditions, an isolated mechanical system, like the molecules in a container of gas, may sometimes interact so as to move the entropy of the system further from, instead of closer to, the equilibrium level. But given enough interacting bodies, and enough time, the system will always eventually move in the direction prescribed by the law. Thus, we can attach much higher probabilities to the prediction that non-equilibrium systems will reflect greater entropy in future periods than we can to predictions that they will move in the opposite direction. And as we increase the amount of time and the number of bodies interacting, the strength of the probability we can attach to the prediction becomes greater and greater.
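As an illustration of this statistical claim (not taken from the paper), a Monte-Carlo sketch of the classic Ehrenfest two-urn model shows how the probability that a system starting away from equilibrium ends up closer to it grows with the number of interacting bodies, given time proportional to the system size.

```python
import random

def ehrenfest_drift(n_particles, trials=1000):
    """Estimate the probability that a two-urn (Ehrenfest) system starting with
    ~75% of its particles on the left is closer to the 50/50 equilibrium after
    10 * n_particles random single-particle hops."""
    start_left = (3 * n_particles) // 4
    steps = 10 * n_particles
    closer = 0
    for _ in range(trials):
        left = start_left
        for _ in range(steps):
            # pick a particle uniformly at random; it hops to the other urn
            if random.random() < left / n_particles:
                left -= 1
            else:
                left += 1
        if abs(left - n_particles / 2) < abs(start_left - n_particles / 2):
            closer += 1
    return closer / trials

for n in (10, 100, 1000):
    print(f"N = {n:5d}  P(moved toward equilibrium) ≈ {ehrenfest_drift(n):.3f}")
```

Small systems fluctuate and often end up no closer to equilibrium; as N grows, the estimated probability approaches one.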


2013 ◽  
Vol 10 (85) ◽  
pp. 20130329 ◽  
Author(s):  
Philip J. Gerrish ◽  
Alexandre Colato ◽  
Paul D. Sniegowski

When mutation rates are low, natural selection remains effective, and increasing the mutation rate can give rise to an increase in adaptation rate. When mutation rates are high to begin with, however, increasing the mutation rate may have a detrimental effect because of the overwhelming presence of deleterious mutations. Indeed, if mutation rates are high enough: (i) adaptive evolution may be neutralized, resulting in a zero (or negative) adaptation rate despite the continued availability of adaptive and/or compensatory mutations, or (ii) natural selection may be neutralized, because the fitness of lineages bearing adaptive and/or compensatory mutations—whether established or newly arising—is eroded by excessive mutation, causing such lineages to decline in frequency. We apply these two criteria to a standard model of asexual adaptive evolution and derive mathematical expressions—some new, some old in new guise—delineating the mutation rates under which either adaptive evolution or natural selection is neutralized. The expressions are simple and require no a priori knowledge of organism- and/or environment-specific parameters. Our discussion connects these results to each other and to previous theory, showing convergence or equivalence of the different results in most cases.
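As a rough, generic illustration of the qualitative behavior this abstract describes (not the paper's model or its derived expressions), the sketch below runs a bare-bones Wright-Fisher simulation of an asexual population in which most new mutations are deleterious and a small fraction is beneficial; the measured adaptation rate first rises with the genomic mutation rate and then falls to zero or below. Population size, selection coefficients, and the beneficial fraction are arbitrary assumptions.

```python
import math
import random

def adaptation_rate(mutation_rate, pop_size=1000, generations=300,
                    s_ben=0.05, s_del=0.05, frac_beneficial=0.01):
    """Mean change in log fitness per generation for an asexual population:
    selection by fitness-proportional sampling, then mutation of offspring."""
    log_w = [0.0] * pop_size                     # log fitness of each individual
    for _ in range(generations):
        weights = [math.exp(lw) for lw in log_w]
        parents = random.choices(log_w, weights=weights, k=pop_size)
        offspring = []
        for lw in parents:
            if random.random() < mutation_rate:  # one mutation per mutated offspring
                if random.random() < frac_beneficial:
                    lw += s_ben                  # beneficial mutation
                else:
                    lw -= s_del                  # deleterious mutation
            offspring.append(lw)
        log_w = offspring
    return (sum(log_w) / pop_size) / generations

for u in (0.001, 0.01, 0.1, 0.5, 1.0):
    print(f"U = {u:5.3f}  adaptation rate ≈ {adaptation_rate(u):+.5f}")
```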


Author(s):  
Ville R.I Kaila ◽  
Arto Annila

The second law of thermodynamics is a powerful imperative that has acquired several expressions during the past centuries. Connections between two of its most prominent forms, i.e. the evolutionary principle by natural selection and the principle of least action, are examined. Although no fundamentally new findings are provided, it is illuminating to see how the two principles rationalizing natural motions reconcile to one law. The second law, when written as a differential equation of motion, describes evolution along the steepest descents in energy and, when it is given in its integral form, the motion is pictured to take place along the shortest paths in energy. In general, evolution is a non-Euclidean energy density landscape in flattening motion.
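As a loose numerical analogy (not the paper's formalism), "evolution along the steepest descents in energy" can be pictured as explicit gradient steps on a toy energy function; each step moves in the direction of fastest energy decrease.

```python
# Toy steepest-descent illustration on an assumed energy E(x, y) = x**2 + 2*y**2.
def grad_E(x, y):
    return (2.0 * x, 4.0 * y)   # gradient of the toy energy

x, y, step = 3.0, 2.0, 0.05
for i in range(6):
    print(f"step {i}: x = {x:+.3f}, y = {y:+.3f}, E = {x*x + 2*y*y:.3f}")
    gx, gy = grad_E(x, y)
    x, y = x - step * gx, y - step * gy   # move against the gradient
```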

