An information-theoretic resolution of the ambiguity in the local hardness

2014 ◽  
Vol 16 (13) ◽  
pp. 6019-6026 ◽  
Author(s):  
Farnaz Heidar Zadeh ◽  
Patricio Fuentealba ◽  
Carlos Cárdenas ◽  
Paul W. Ayers

A definition of the local hardness, suitable for application in the local hard/soft acid/base principle, is derived by applying information theory.

Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 858 ◽
Author(s):  
Dongshan He ◽  
Qingyu Cai

In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space around a black hole allows objects to be imaged in the same way that a camera lens does. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears as a consequence of the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. Quantum corrections to the black hole entropy are also obtained from the Compton-wavelength limit on captured particles, which implies that the mass of a black hole is naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
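For reference, the area law the abstract refers to is the Bekenstein–Hawking relation, stated here in its standard form (not quoted from the paper itself):

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B c^{3} A}{4 G \hbar} \;=\; \frac{k_B A}{4\,\ell_P^{2}},
\qquad \ell_P = \sqrt{\frac{G\hbar}{c^{3}}},
```

where A is the horizon area and ℓ_P the Planck length; the paper argues that this S is information entropy, with quantum corrections entering once the Compton wavelength of the captured particles is taken into account.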


1980 ◽  
Vol 58 (3) ◽  
pp. 302-306 ◽  
Author(s):  
Margaret M. Kayser ◽  
Peter Morand

The regioselectivity of epoxide ring opening can be analyzed in terms of hard–soft acid–base (HSAB) theory. Coordination of the hard acid to the oxygen atom of the oxirane ring produces a "pulling effect" that determines the direction of the ring opening. In the absence of a strong "pulling effect", the "pushing effect" of the approaching base is examined, and the consequences of the relative softness or hardness of the nucleophile for the regioselectivity of the ring opening are discussed.


2017 ◽  
Vol 28 (7) ◽  
pp. 954-966 ◽  
Author(s):  
Colin Bannard ◽  
Marla Rosner ◽  
Danielle Matthews

Of all the things a person could say in a given situation, what determines what is worth saying? Greenfield's principle of informativeness states that, right from the onset of language, humans selectively comment on whatever they find unexpected. In this article, we quantify this tendency using information-theoretic measures and report on a study in which we tested the counterintuitive prediction that children will produce words that have a low frequency given the context, because these will be most informative. Using corpora of child-directed speech, we identified adjectives that varied in how informative (i.e., unexpected) they were given the noun they modified. In an initial experiment (N = 31) and in a replication (N = 13), 3-year-olds heard an experimenter use these adjectives to describe pictures. The children's task was then to describe the pictures to another person. As the information content of the experimenter's adjective increased, so did the children's tendency to comment on the feature that adjective had encoded. Furthermore, our analyses suggest that children balance informativeness with a competing drive to ease production.
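The information-theoretic measure underlying the study is surprisal: the negative log probability of a word given its context. A minimal sketch of how such a quantity could be estimated from corpus counts (the function and the toy counts are illustrative assumptions, not the study's actual pipeline):

```python
import math
from collections import Counter

def adjective_surprisal(counts, adjective, noun):
    """Surprisal (information content) of an adjective given the noun it
    modifies: IC = -log2 P(adjective | noun), estimated by maximum
    likelihood from (adjective, noun) co-occurrence counts."""
    noun_total = sum(c for (_, n), c in counts.items() if n == noun)
    pair_count = counts.get((adjective, noun), 0)
    if pair_count == 0 or noun_total == 0:
        return float("inf")  # unseen pair: infinitely surprising under MLE
    return -math.log2(pair_count / noun_total)

# Toy co-occurrence counts (hypothetical, not from child-directed-speech corpora).
counts = Counter({("big", "dog"): 40, ("small", "dog"): 39, ("purple", "dog"): 1})
print(adjective_surprisal(counts, "big", "dog"))     # 1.0 bit: expected adjective
print(adjective_surprisal(counts, "purple", "dog"))  # ~6.3 bits: unexpected adjective
```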


Entropy ◽  
2018 ◽  
Vol 20 (7) ◽  
pp. 534 ◽  
Author(s):  
Hector Zenil ◽  
Narsis Kiani ◽  
Jesper Tegnér

We introduce a definition of algorithmic symmetry in the context of geometric and spatial complexity that is able to capture mathematical aspects of different objects, using polyominoes and polyhedral graphs as case studies. We review, study, and apply a method for approximating the algorithmic complexity (also known as Kolmogorov–Chaitin complexity) of graphs and networks based on the concept of Algorithmic Probability (AP). AP is a concept (and method) capable of recursively enumerating all properties of a computable (causal) nature, beyond statistical regularities. We explore the connections of algorithmic complexity, both theoretical and numerical, with geometric properties, mainly symmetry and topology, from an (algorithmic) information-theoretic perspective. We show that approximations to algorithmic complexity by lossless compression and an Algorithmic Probability-based method can characterize spatial, geometric, symmetric, and topological properties of mathematical objects and graphs.
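For context, the standard definitions behind the AP-based approximation, which the abstract does not spell out, are:

```latex
m(s) \;=\; \sum_{p \,:\, U(p)\,=\,s} 2^{-|p|},
\qquad
K(s) \;\approx\; -\log_2 m(s),
```

where U is a prefix-free universal Turing machine, |p| is the length of program p, m(s) is the algorithmic probability of output s, and the second relation is the Coding Theorem linking m(s) to the Kolmogorov–Chaitin complexity K(s).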


2021 ◽  
Vol 9 ◽  
Author(s):  
Ted Sichelman

Many scholars have employed the term “entropy” in the context of law and legal systems to roughly refer to the amount of “uncertainty” present in a given law, doctrine, or legal system. Just a few of these scholars have attempted to formulate a quantitative definition of legal entropy, and none have provided a precise formula usable across a variety of legal contexts. Here, relying upon Claude Shannon's definition of entropy in the context of information theory, I provide a quantitative formalization of entropy in delineating, interpreting, and applying the law. In addition to offering a precise quantification of uncertainty and the information content of the law, the approach offered here provides other benefits. For example, it offers a more comprehensive account of the uses and limits of “modularity” in the law—namely, using the terminology of Henry Smith, the use of legal “boundaries” (be they spatial or intangible) that “economize on information costs” by “hiding” classes of information “behind” those boundaries. In general, much of the “work” performed by the legal system is to reduce legal entropy by delineating, interpreting, and applying the law, a process that can in principle be quantified.
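Since the paper builds on Shannon's definition, here is a minimal sketch of how legal entropy could be quantified over a distribution of possible interpretations (the probabilities and scenario are hypothetical, not drawn from the article):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits, over a probability
    distribution of mutually exclusive outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical doctrine admitting four readings, two of them far more likely.
print(shannon_entropy([0.4, 0.4, 0.1, 0.1]))  # ~1.72 bits of legal uncertainty
# A court ruling that eliminates the two unlikely readings reduces the entropy:
print(shannon_entropy([0.5, 0.5]))            # 1.0 bit
```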


In previous chapters, the authors provided a comprehensive framework that can be used in the formal probabilistic and information-theoretic analysis of a wide range of systems and protocols. In this chapter, they illustrate the usefulness of conducting this analysis using theorem proving by tackling a number of applications, including a data compression application, the formal analysis of an anonymity-based MIX channel, and the properties of the one-time pad encryption system.
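Of the applications named, the one-time pad is the simplest to illustrate. A minimal sketch of the scheme whose properties the chapter verifies (plain Python, not the theorem-prover formalization itself):

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with a key byte. With a uniformly random key of the
    same length, used only once, the ciphertext is statistically independent
    of the plaintext (perfect secrecy); XOR-ing again with the key decrypts."""
    assert len(key) == len(data), "key must match message length"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))          # fresh uniform random key
ciphertext = one_time_pad(message, key)          # encrypt
assert one_time_pad(ciphertext, key) == message  # XOR is its own inverse
```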


2019 ◽  
Vol 37 (2) ◽  
pp. 165-178 ◽
Author(s):  
Sarah A. Sauvé ◽  
Marcus T. Pearce

What makes a piece of music appear complex to a listener? This research extends previous work by Eerola (2016) by examining information content generated by a computational model of auditory expectation (IDyOM), based on statistical learning and probabilistic prediction, as an empirical definition of perceived musical complexity. We systematically manipulated the melody, rhythm, and harmony of short polyphonic musical excerpts, using the model to ensure that these manipulations varied information content in the intended direction. Complexity ratings collected from 28 participants correlated positively and most strongly with melodic and harmonic information content, which corresponded to descriptive musical features such as the proportion of out-of-key notes and tonal ambiguity. When individual differences were considered, these explained more variance than the manipulated predictors. Musical background was not a significant predictor of complexity ratings. The results support information content, as implemented by IDyOM, as an information-theoretic measure of perceived complexity and extend IDyOM's range of applications to perceived complexity.
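The complexity measure at issue is mean information content: the average surprisal of a piece's events under a predictive model. A minimal sketch of that computation (the per-note probabilities are invented for illustration, not IDyOM output):

```python
import math

def mean_information_content(event_probs):
    """Average information content -log2 p(event | context) over the events
    of an excerpt, in bits per event; higher values indicate that the model
    finds the excerpt less predictable, i.e., more complex."""
    return sum(-math.log2(p) for p in event_probs) / len(event_probs)

# Hypothetical per-note model probabilities for two short melodies.
predictable = [0.5, 0.6, 0.5, 0.7]
surprising = [0.10, 0.05, 0.20, 0.10]
print(mean_information_content(predictable))  # ~0.81 bits/event: low complexity
print(mean_information_content(surprising))   # ~3.32 bits/event: high complexity
```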


2019 ◽  
Vol 6 (9) ◽  
pp. 1883-1891 ◽  
Author(s):  
Shyamapada Nandi ◽  
Phil De Luna ◽  
Rahul Maity ◽  
Debanjan Chakraborty ◽  
Thomas Daff ◽  
...  

Using a simple hard–soft acid–base concept, we have deliberately designed gas-specific and pressure-dependent porosity into a non-porous solid via coordination flexibility.


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 444 ◽
Author(s):  
Stephen Fox ◽  
Adrian Kotelba

Amidst certainty, efficiency can improve sustainability by reducing resource consumption. However, flexibility is needed to survive when uncertainty increases. Accordingly, sustainable production cannot persist in the long term without both flexibility and efficiency. Referring to cognitive science to inform the development of production systems is well established. However, recent research in cognitive science encompassing flexibility and efficiency in brain functioning has not been considered previously; in particular, work by others that encompasses information (I), information entropy (H), relative entropy (D), transfer entropy (TE), and brain entropy. In this paper, by contrast, flexibility and efficiency for persistent sustainable production are analyzed in relation to these information-theoretic applications in cognitive science and are quantified in terms of information. The paper is thus consistent with the established practice of referring to cognitive science to inform the development of production systems, but it is novel in addressing the need to combine flexibility and efficiency for persistent sustainability in terms of cognitive functioning as modelled with information theory.
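Of the quantities listed, relative entropy most directly captures the efficiency/flexibility tension: it measures the cost, in bits, of operating with a model tuned to conditions that no longer hold. A minimal sketch (the distributions are hypothetical, not from the paper):

```python
import math

def relative_entropy(p, q):
    """Kullback–Leibler divergence D(P || Q) in bits: the extra information
    cost of assuming distribution Q when outcomes actually follow P."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical demand distributions: the efficient plan vs. a disrupted market.
planned_for = [0.7, 0.2, 0.1]
actual = [0.3, 0.4, 0.3]
print(relative_entropy(actual, planned_for))  # ~0.51 bits: efficiency mismatch
```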

