Morphological complexity and the minimum description length approach

Author(s):  
Östen Dahl

This discussion chapter focuses on some basic concepts used in the volume, relating them to ‘the minimum description length approach’. Implications of this approach for the organization of morphology are briefly discussed. Comparisons are drawn between Rescher's taxonomy of complexity and the distinctions made in Nichols’ chapter, as well as those in the work of Ackerman & Malouf. Several contributions build on the latter scholars’ distinction between ‘enumerative’ and ‘integrative’ complexity, motivating special attention to their approach. It is noted that their claim about the prevalence of low integrative complexity may be due to mathematical properties of the notion of ‘average conditional entropy’. Further topics are Nichols’ ‘canonical complexity’ and its relationship to other notions such as transparency and overspecification, and Meakins & Wilmoth's notion of ‘overabundance’. The concluding section notes that morphological complexity is not yet a consolidated area of study and that information theory is likely to retain its relevance to it.
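Ackerman & Malouf's integrative measure can be made concrete with a toy computation. The sketch below uses an invented two-cell paradigm (all lexemes and exponents are assumptions for illustration, not data from any contribution) and computes the average conditional entropy of one cell's exponent given another's:

```python
from collections import Counter
from itertools import permutations
from math import log2

# Hypothetical paradigm, for illustration only: rows are lexemes,
# columns are two paradigm cells (SG, PL), values are the exponents.
paradigm = {
    "lexeme1": {"SG": "-a", "PL": "-i"},
    "lexeme2": {"SG": "-a", "PL": "-i"},
    "lexeme3": {"SG": "-o", "PL": "-e"},
    "lexeme4": {"SG": "-o", "PL": "-i"},
}

def cond_entropy(cell_x, cell_y):
    """H(Y | X) in bits: uncertainty about cell_y's exponent
    once cell_x's exponent is known."""
    joint = Counter((row[cell_x], row[cell_y]) for row in paradigm.values())
    marg_x = Counter(x for x, _ in joint.elements())
    n = sum(joint.values())
    return -sum(c / n * log2(c / marg_x[x]) for (x, _), c in joint.items())

# Average conditional entropy over all ordered cell pairs.
cells = ["SG", "PL"]
avg = sum(cond_entropy(x, y) for x, y in permutations(cells, 2)) / 2
```

Low values of the average indicate that knowing one form of a lexeme largely determines the others, which is the sense in which a paradigm can be enumeratively large but integratively simple.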

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or, more precisely, with the probability distribution of that random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. A third concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
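The three quantities the chapter develops can be sketched in a few lines of code; the formulas (in bits) are the standard ones, and the example distributions are chosen only for illustration:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits of a probability distribution."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q); requires q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def conditional_entropy(joint):
    """H(Y | X) = H(X, Y) - H(X), for a joint distribution p(x, y)
    given as a dict {(x, y): probability}."""
    marg_x = {}
    for (x, _), p in joint.items():
        marg_x[x] = marg_x.get(x, 0.0) + p
    return entropy(list(joint.values())) - entropy(list(marg_x.values()))

h = entropy([0.5, 0.5])                    # 1 bit: a fair coin
d = kl_divergence([0.5, 0.5], [0.9, 0.1])  # > 0: the distributions differ
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
hc = conditional_entropy(joint)            # 1 bit: Y is independent of X
```

The chain-rule identity H(Y | X) = H(X, Y) − H(X) used above is one of the properties of the entropy function discussed in the chapter.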


Author(s):  
Kenneth M. Sayre

Information theory was established in 1948 by Claude Shannon as a statistical analysis of factors pertaining to the transmission of messages through communication channels. Among basic concepts defined within the theory are information (the amount of uncertainty removed by the occurrence of an event), entropy (the average amount of information represented by events at the source of a channel), and equivocation (the ‘noise’ that impedes faithful transmission of a message through a channel). Information theory has proved essential to the development of space probes, high-speed computing machinery and modern communication systems. The information studied by Shannon is sharply distinct from information in the sense of knowledge or of propositional content. It is also distinct from most uses of the term in the popular press (‘information retrieval’, ‘information processing’, ‘information highway’, and so on). While Shannon’s work has strongly influenced academic psychology and philosophy, its reception in these disciplines has been largely impressionistic. A major problem for contemporary philosophy is to relate the statistical conceptions of information theory to information in the semantic sense of knowledge and content.
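Shannon's quantities can be illustrated with the simplest noisy channel, a binary symmetric channel; the source distribution and crossover probability below are assumptions chosen for illustration:

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Binary symmetric channel: a fair-coin source, with noise that flips
# each transmitted bit with probability eps.
eps = 0.1
source_entropy = h2(0.5)        # H(X): 1 bit per symbol at the source
equivocation = h2(eps)          # H(X|Y): uncertainty the noise leaves behind
information = source_entropy - equivocation  # what the channel transmits
```

With a 10% flip probability, the equivocation is about 0.47 bits, so only about 0.53 bits of each transmitted bit survive the channel, which is exactly the sense in which noise impedes faithful transmission.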


2020, Vol 69 (6), pp. 1163-1179
Author(s):  
Kris V Parag ◽  
Christl A Donnelly

Abstract Estimating temporal changes in a target population from phylogenetic or count data is an important problem in ecology and epidemiology. Reliable estimates can provide key insights into the climatic and biological drivers influencing the diversity or structure of that population and provide evidence for hypotheses concerning its future growth or decline. In infectious disease applications, the individuals infected across an epidemic form the target population. The renewal model estimates the effective reproduction number, R, of the epidemic from counts of observed incident cases. The skyline model infers the effective population size, N, underlying a phylogeny of sequences sampled from that epidemic. Practically, R measures ongoing epidemic growth while N informs on historical caseload. While the two models solve distinct problems, the reliability of their estimates depends on p-dimensional piecewise-constant functions. If p is misspecified, the model might underfit significant changes or overfit noise and promote a spurious understanding of the epidemic, which might misguide intervention policies or misinform forecasts. Surprisingly, no transparent yet principled approach for optimizing p exists. Usually, p is heuristically set, or obscurely controlled via complex algorithms. We present a computable and interpretable p-selection method based on the minimum description length (MDL) formalism of information theory. Unlike many standard model selection techniques, MDL accounts for the additional statistical complexity induced by how parameters interact. As a result, our method optimizes p so that R and N estimates properly and meaningfully adapt to available data. It also outperforms comparable Akaike and Bayesian information criteria on several classification problems, given minimal knowledge of the parameter space, and exposes statistical similarities among renewal, skyline, and other models in biology.
Rigorous and interpretable model selection is necessary if trustworthy and justifiable conclusions are to be drawn from piecewise models. [Coalescent processes; epidemiology; information theory; model selection; phylodynamics; renewal models; skyline plots]
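The flavour of p-selection by description length can be conveyed with a schematic two-part code, although the MDL criterion in the paper is more refined (it accounts for how parameters interact, which a simple parameter count does not). The data and segmentation scheme below are invented for illustration:

```python
import math

def two_part_mdl(data, p):
    """Schematic two-part MDL score for a piecewise-constant Gaussian fit
    with p equal-width segments: a codelength for the residuals plus a
    (log n)/2 cost per free parameter. Illustrative only, not the
    criterion derived in the paper."""
    n = len(data)
    width = n // p
    rss = 0.0
    for i in range(p):
        seg = data[i * width : n if i == p - 1 else (i + 1) * width]
        mean = sum(seg) / len(seg)
        rss += sum((x - mean) ** 2 for x in seg)
    fit_cost = 0.5 * n * math.log(max(rss / n, 1e-12))  # data given model
    model_cost = 0.5 * p * math.log(n)                  # model parameters
    return fit_cost + model_cost

# A series with one true change point: p = 2 should minimize the score,
# since p = 1 underfits the jump and p > 2 pays for unneeded parameters.
data = [1.0] * 50 + [5.0] * 50
best_p = min(range(1, 6), key=lambda p: two_part_mdl(data, p))
```

The trade-off is the essential MDL idea: extra segments are worthwhile only when the bits they save in describing the data exceed the bits needed to describe them.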


Author(s):  
Erik Agrell ◽  
Alex Alvarado ◽  
Frank R. Kschischang

Recent decades have witnessed steady improvements in our ability to harness the information-carrying capability of optical fibres. Will this process continue, or will progress eventually stall? Information theory predicts that all channels have a limited capacity depending on the available transmission resources, and thus it is inevitable that the pace of improvements will slow. However, information theory also provides insights into how transmission resources should, in principle, best be exploited, and thus may serve as a guide for where to look for better ways to squeeze more out of a precious resource. This tutorial paper reviews the basic concepts of information theory and their application in fibre-optic communications.
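The capacity limit such a tutorial starts from is Shannon's formula for the linear additive white Gaussian noise channel; the bandwidth and SNR below are assumed round numbers, not figures from the paper (and an optical fibre is nonlinear, so this is only the baseline):

```python
from math import log2

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + SNR) of an AWGN channel, in bit/s."""
    return bandwidth_hz * log2(1 + snr_linear)

# Illustrative numbers: a 50 GHz channel at 20 dB SNR.
snr = 10 ** (20 / 10)          # 20 dB -> linear SNR of 100
c = awgn_capacity(50e9, snr)   # roughly 3.3e11 bit/s
```

The formula makes the paper's point concrete: capacity grows only linearly with bandwidth but logarithmically with signal power, which is why simply launching more power into a fibre yields diminishing (and, beyond the nonlinear threshold, negative) returns.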


Author(s):  
Anne Barton

Genetic factors are important in predisposing to nearly all of the conditions managed by rheumatologists; indeed, musculoskeletal diseases, like other complex diseases, are thought to be caused by environmental triggers in genetically susceptible individuals. Studying genetic susceptibility factors is more straightforward than environmental factors because, first, genetic changes are stable and do not vary throughout life; second, genetic changes exist before disease onset and so could be causative rather than occurring as a result of disease; and, third, genetic variation is easy to measure reliably using modern technologies. By comparison, environmental exposures can occur many years before disease onset, may vary during life, and are hard to accurately capture and measure. Enormous progress has been made in recent years in identifying susceptibility genes. This knowledge may allow better targeting of available therapies, the development of novel therapies, and an improved understanding of what determines disease severity in individual patients. In this chapter, the basic concepts in genetics are explained.


2020, Vol 10 (22), pp. 8266
Author(s):  
C. Aris Chatzidimitriou-Dreismann

During the last few decades, considerable advances in quantum information theory have shown deep connections between quantum correlation effects (like entanglement and quantum discord) and thermodynamics. Here the concept of conditional entropy plays a considerable role. In contrast to the classical case, quantum conditional entropy can take negative values. This counter-intuitive feature, already well understood in the context of information theory, was recently shown theoretically to also have a physical meaning in quantum thermodynamics [del Rio et al. Nature 2011, 474, 61]. Extending this existing work, here we provide evidence of the significance of negative conditional entropy in a concrete experimental context: Incoherent Neutron Scattering (INS) from protons of H2 in nano-scale environments. For example, in INS from H2 in C-nanotubes, the data on the translational motion of H2 along the nanotube axis show that the neutron apparently scatters from a fictitious particle with a mass of 0.64 atomic mass units (a.m.u.), instead of the conventionally expected value of 2 a.m.u. An independent second experiment confirms this finding. However, taking into account the possible negativity of conditional entropy, we explain that this effect has a natural interpretation in terms of quantum thermodynamics. Moreover, it is intrinsically related to the number of qubits capturing the interaction of the two quantum systems, H2 and C-nanotube. The considered effect may have technological applications (e.g., in H-storage materials and fuel cells).
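The sign of the quantum conditional entropy can be checked directly for a maximally entangled state: for a pure entangled two-qubit state, S(A|B) = S(AB) − S(B) = 0 − 1 = −1 bit. A minimal numerical check (a textbook illustration, not the paper's analysis of the INS data):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Maximally entangled two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# Reduced state of subsystem B: trace out subsystem A.
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

s_ab = von_neumann_entropy(rho_ab)  # 0: the joint state is pure
s_b = von_neumann_entropy(rho_b)    # 1: the marginal is maximally mixed
s_a_given_b = s_ab - s_b            # -1: negative conditional entropy
```

Classically H(A|B) ≥ 0 always holds, so a negative value is a purely quantum signature of entanglement between the two subsystems.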


2006, Vol 20 (11–13), pp. 1343-1362
Author(s):  
THOMAS F. GEORGE ◽  
PETER H. HANDEL

The Quantum Information Theory Approach (QIT) explains for the first time the apparent lack of unitarity caused by the entropy increase in the Quantum 1/f Effect (Q1/fE). This allows for a deeper understanding of the quantum 1/f effect, showing no resultant entropy increase and therefore no violation of unitarity in the quantum-mechanical dynamical evolution. This new interpretation involves the von Neumann quantum entropy, including the negative conditional entropy concept introduced by QIT for quantum entangled states. The Q1/fE has been applied to many high-tech systems, in particular to ultra-small electronic devices in nanotechnology. The present paper explains how the additional entropy implied by the observed 1/f noise arises in spite of the entropy-conserving evolution of the system. On this basis, a derivation of the conventional and coherent quantum 1/f effect is given. The latter is derived from a non-relativistic form of the branch-point propagator, obtained by excluding the long-range Coulomb interaction from the interaction Hamiltonian. The paper concludes with examples of practical applications in various devices and systems, allowing for a new characterization of high technology.


Author(s):  
Solomon W. Golomb ◽  
Robert E. Peile ◽  
Robert A. Scholtz
