informational entropy
Recently Published Documents


TOTAL DOCUMENTS

92
(FIVE YEARS 32)

H-INDEX

13
(FIVE YEARS 2)

Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1647
Author(s):  
Anastasia A. Anashkina ◽  
Irina Yu. Petrushanko ◽  
Rustam H. Ziganshin ◽  
Yuriy L. Orlov ◽  
Alexei N. Nekrasov

Background: Analyzing the local sequence content in proteins, we found earlier that amino acid residue frequencies differ at various distances between amino acid positions in the sequence, suggesting the existence of structural units. Methods: We used the informational entropy of protein sequences to find that the structural unit of proteins is a block of adjacent amino acid residues, an "information unit". The ANIS (ANalysis of Informational Structure) method uses these information units to reveal hierarchically organized Elements of the Information Structure (ELIS) in amino acid sequences. Results: The developed mathematical apparatus gives stable descriptions of the structural unit even under significant variation of the parameters. The optimal length of the information unit is five, and the number of allowed substitutions is one. Examples are given of applying the method to the design of protein molecules, the analysis of intermolecular interactions, and the study of the mechanisms by which protein molecular machines function. Conclusions: The ANIS method makes it possible not only to analyze native proteins but also to design artificial polypeptide chains with a given spatial organization and, possibly, function.
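As a rough illustration of the block-entropy idea (not the ANIS implementation itself, which additionally allows one substitution within a unit), the Shannon entropy over length-5 blocks of adjacent residues can be sketched as:

```python
from collections import Counter
from math import log2

def block_entropy(seq: str, k: int = 5) -> float:
    """Shannon entropy (bits) of the distribution of length-k blocks
    of adjacent residues in a protein sequence."""
    blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A repetitive sequence carries no block-level information,
# while a fully varied one of the same length maximizes it.
low = block_entropy("AAAAAAAAAAAA")    # single block type -> 0 bits
high = block_entropy("ACDEFGHIKLMN")   # all 8 blocks distinct -> 3 bits
```

The window length of five follows the abstract's reported optimum; the substitution tolerance is omitted for brevity.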


2021 ◽  
Vol 2131 (2) ◽  
pp. 022082
Author(s):  
T R Abdullaev ◽  
G U Juraev

Abstract The limitations of binary logic for the further development of science and engineering are discussed. The effectiveness of the ternary number system at the current stage of information technology development is substantiated and demonstrated. A method is proposed for increasing the informational entropy of plaintext by adding random data using ternary logic during symmetric encryption. To reliably hide the added random data, keyed gamma encryption (keystream addition) is proposed as the first transforming function.
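The entropy-raising effect of random filler can be illustrated in a binary setting (the paper's actual scheme uses ternary logic and gamma encryption, which this sketch does not reproduce):

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per symbol of a byte string."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

random.seed(0)  # deterministic filler for the example
plaintext = b"attack at dawn attack at dawn"
# Appending uniformly random bytes flattens the symbol
# distribution, raising the per-symbol entropy of the message.
padded = plaintext + bytes(random.randrange(256) for _ in range(len(plaintext)))
```

Natural-language plaintext sits well below 8 bits per byte, so the padded message shows markedly higher entropy than the original.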


Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1396
Author(s):  
Emil Dinga ◽  
Camelia Oprean-Stan ◽  
Cristina-Roxana Tănăsescu ◽  
Vasile Brătian ◽  
Gabriela-Mariana Ionescu

The best-known and most widely used abstract model of the financial market is based on the concept of the informational efficiency (EMH) of that market. The paper proposes an alternative, which could be named the behavioural efficiency of the financial market, based on behavioural entropy instead of informational entropy. More specifically, the paper supports the idea that, in the financial market, the only measure (if any) of entropy is the set of available behaviours indicated by implicit information. Therefore, behavioural entropy is linked to the concept of behavioural efficiency. The paper argues that, in fact, in financial markets there is no (real) informational efficiency; rather, there exists a behavioural efficiency. The proposal rests both on a new typology of information in the financial market (which provides the concept of implicit information, that is, information "translated" by economic agents from observing actual behaviours) and on a non-linear (more exactly, logistic) curve linking the behavioural entropy to the behavioural efficiency of financial markets. Finally, the paper proposes a synergic overcoming of both EMH and AMH based on the new concept of behavioural entropy in the financial market.
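The abstract specifies only that the entropy-efficiency link is logistic; a generic sketch, with steepness k and midpoint h0 as illustrative parameters not taken from the paper, might look like:

```python
from math import exp

def behavioural_efficiency(h: float, k: float = 10.0, h0: float = 0.5) -> float:
    """Generic logistic map from behavioural entropy h (normalised to
    [0, 1]) to behavioural efficiency in (0, 1). The parameters k and
    h0 are illustrative assumptions, not values from the paper."""
    return 1.0 / (1.0 + exp(-k * (h - h0)))
```

The logistic form captures the qualitative claim: efficiency responds weakly at low and high entropy and steeply in between.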


2021 ◽  
Author(s):  
Jia Luo ◽  
Ri-Gui Zhou ◽  
Wen-Wen Hu ◽  
Yao-Chong Li ◽  
Gao-Feng Luo

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 965
Author(s):  
Yeon-Moon Choo ◽  
Ji-Min Kim ◽  
Ik-Tae An

Since the 1960s, many rivers have been degraded as a consequence of rapid urbanization. Because accurate figures are essential for river restoration, there has been much research on methods to obtain the exact river slope and elevation. Until now, most studies have analyzed rivers using measured topographic factors, but when the flow velocity changes rapidly, such as during a flood, surveying is not easy, and frequent measurements are difficult because of cost. Previous research has also focused on the river cross section, so information on the river longitudinal profile is insufficient. In this research, using informational entropy theory, equations are presented that can calculate the average river slope, the river slope, and the river longitudinal elevation for a river basin in real time. Their applicability was analyzed through comparison with measured river characteristic factors obtained from river plans. The parameters were calculated using informational entropy theory and nonlinear regression on actual data, after which the longitudinal elevation entropy equation and the average river slope were calculated for each river. When the applicability of the proposed equations was assessed by R2 and Root Mean Square Error (RMSE), all R2 values exceeded 0.80, while RMSE values ranged between 0.54 and 2.79. Valid results can thus be obtained by calculating river characteristic factors.
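The fit metrics reported in the applicability analysis (the entropy-based slope equations themselves are not reproduced here) are standard and can be sketched as:

```python
from math import sqrt

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def rmse(observed, predicted):
    """Root Mean Square Error between observed and predicted values."""
    return sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                / len(observed))
```

A perfect fit gives R2 = 1 and RMSE = 0; the study's thresholds (R2 > 0.80, RMSE between 0.54 and 2.79) would be checked against values computed this way.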


2021 ◽  
Vol 887 ◽  
pp. 597-602
Author(s):  
E.L. Kuleshov ◽  
Vladimir S. Plotnikov ◽  
Evgenii V. Pustovalov ◽  
T.S. Ostachenova

This paper presents a model of the thin-film formation process of an amorphous alloy as a sequential procedure in which a conditional unit of substance is randomly deposited onto a substrate at each step. Islands of precipitant are generated on the substrate as the number of steps increases (substance density defects). We determine the probability distribution of island area that maximizes the informational entropy. An algorithm for computing estimates of the parameters of this distribution is obtained, and results of processing experimental data are presented. We demonstrate that the proposed distribution is more consistent with the experimental data than the Pareto distribution.
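The abstract does not give the distribution's functional form, so as a stand-in the classic maximum-entropy result can illustrate the principle: with only the mean island area constrained, the entropy-maximizing distribution on [0, ∞) is the exponential, whose maximum-likelihood rate is the reciprocal of the sample mean.

```python
from math import log

def exponential_mle_rate(areas):
    """MLE of the exponential rate: 1 / (mean island area)."""
    return len(areas) / sum(areas)

def exponential_loglik(areas, lam):
    """Log-likelihood of the island areas under an exponential(lam) model;
    useful for comparing against alternatives such as a Pareto fit."""
    return sum(log(lam) - lam * a for a in areas)
```

Comparing log-likelihoods (or an information criterion) across candidate distributions is one standard way to show one model is "more consistent with the experimental data" than another.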


Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 148
Author(s):  
Maricel Agop ◽  
Stefan Andrei Irimiciuc ◽  
Adrian Ghenadi ◽  
Luminita Bibire ◽  
Stefan Toma ◽  
...  

In the framework of the multifractal hydrodynamic model, the correlation between informational entropy and cross-entropy governs attractive and repulsive interactions through a multifractal-specific potential. The classical dynamics associated with them imply Hubble-type effects, Galilei-type effects, and dependences of the interaction constants on the multifractal degrees at various scale resolutions, while inserting relativistic corrections into the same dynamics implies multifractal transformations of a generalized Lorentz type, multifractal metrics invariant to these transformations, and an estimate of the dimension of the multifractal Universe. In this context, some correspondences with standard cosmologies are analyzed. Since the same types of interactions can also be obtained as harmonic mappings between the usual space and the hyperbolic plane, two measures with uniform and non-uniform temporal flows become functional; these temporal measures are analogous, in a more general manner, to Milne's temporal measures. This work furthers the analysis recently published by our group in "Towards Interactions through Information in a Multifractal Paradigm".


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Nicholas Smaal ◽  
José Roberto C. Piqueira

This work discusses the application of the Kolmogorov; López-Ruiz, Mancini, and Calbet (LMC); and Shiner, Davison, and Landsberg (SDL) complexity measures to a common situation in physics described by the Maxwell–Boltzmann distribution. The first complexity measure originated in computer science and was proposed by Kolmogorov; it is calculated similarly to informational entropy. When applied to natural phenomena, the Kolmogorov measure yields higher values for disorder and lower values for order. However, it is widely held that high complexity should be associated with intermediate states between order and disorder. Consequently, the LMC and SDL measures were defined and used in attempts to model natural phenomena, but with the inconvenience of being defined only for discrete probability distributions over finite intervals. Here, adapting the definitions to a continuous variable, the three measures are applied to the Maxwell–Boltzmann distribution describing thermal neutron velocity in a power reactor, extending complexity measures to a continuous physical situation and opening possible discussions about the phenomenon.
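A discrete sketch of the LMC measure (entropy times disequilibrium) applied to a binned Maxwell–Boltzmann speed density; the dimensionless units (kT/m = 1) and bin count are illustrative assumptions, not the paper's continuous treatment:

```python
from math import exp, log

def lmc_complexity(p):
    """LMC complexity C = H * D for a discrete distribution p:
    H is the Shannon entropy normalised by log(n), and D is the
    disequilibrium sum((p_i - 1/n)^2). Both order (a delta) and
    disorder (the uniform distribution) give C = 0."""
    n = len(p)
    h = -sum(q * log(q) for q in p if q > 0) / log(n)
    d = sum((q - 1.0 / n) ** 2 for q in p)
    return h * d

def maxwell_boltzmann_pmf(n_bins=100, v_max=5.0):
    """Discretise the Maxwell-Boltzmann speed density
    f(v) ~ v^2 exp(-v^2 / 2) (units with kT/m = 1) onto
    n_bins equal-width bins on [0, v_max] and normalise."""
    dv = v_max / n_bins
    w = [((i + 0.5) * dv) ** 2 * exp(-((i + 0.5) * dv) ** 2 / 2)
         for i in range(n_bins)]
    s = sum(w)
    return [x / s for x in w]
```

The Maxwell–Boltzmann case lands strictly between the two zero-complexity extremes, matching the intuition that high complexity belongs to intermediate states.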


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Daniel Valente ◽  
Frederico Brito ◽  
Thiago Werlang

Abstract Dissipative adaptation is a general thermodynamic mechanism that explains self-organization in a broad class of driven classical many-body systems. It establishes how the most likely (adapted) states of a system subjected to a given drive tend to be those following trajectories of highest work absorption, followed by dissipated heat to the reservoir. Here, we extend the dissipative adaptation phenomenon to the quantum realm. We employ a fully-quantized exactly solvable model, where the source of work on a three-level system is a single-photon pulse added to a zero-temperature infinite environment, a scenario that cannot be treated by the classical framework. We find a set of equalities relating adaptation likelihood, absorbed work, heat dissipation and variation of the informational entropy of the environment. Our proof of principle provides the starting point towards a quantum thermodynamics of driven self-organization.


2021 ◽  
Vol 30 (4) ◽  
pp. 30-35
Author(s):  
Yulia Viktorovna Alekseevа ◽  
Natalia Alexandrovna Sazonova ◽  
Oksana Mikhailovna Bondarenko ◽  
Oksana Anatolyevna Anishenko ◽  
...  

The state of the system «primary kidney of the Syrian hamster» at the stages of embryonic development was studied. The informational characterization of the complexity and organization of the morphological system «primary kidney» was carried out by calculating the following indicators: the maximum informational entropy, Hmax = log2 n, where n is the number of classes; the informational entropy H, as a measure of information uncertainty according to Shannon's formula H = −∑ Pi log2 Pi, where Pi is the volume fraction of the i-th tissue component of the primary kidney; the information organization of the system (S); the relative entropy h = H/Hmax; and the coefficient of relative organization of the system (coefficient of redundancy) R = (Hmax − H)/Hmax × 100%. Based on the studies carried out, it was found that the entropy of the «primary kidney» system in embryos increases to 1.07 bits by the end of stage 1 of the organ's life cycle and decreases to 0.80 bits at stage 2, which is associated with an increase in the proportion of mesenchyme and its predominance over the other structural components. The redundancy coefficient increases in the middle of stage 1, decreases towards the beginning of stage 2, and then, from the middle of stage 2, increases until the end of stage 3, which is associated with increasing stability of the system. Over the observation period, the relative entropy decreases from 52.54% at the beginning of stage 1 of the organ's life cycle to 39.93% at stage 3.
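The indicators above follow directly from the formulas stated in the abstract and can be computed from the tissue-component volume fractions:

```python
from math import log2

def entropy_indicators(fractions):
    """For volume fractions Pi summing to 1, return:
    Hmax = log2(n); Shannon entropy H = -sum(Pi * log2(Pi));
    relative entropy h = H / Hmax; and the redundancy coefficient
    R = (Hmax - H) / Hmax * 100 (per cent)."""
    n = len(fractions)
    h_max = log2(n)
    h = -sum(p * log2(p) for p in fractions if p > 0)
    return h_max, h, h / h_max, (h_max - h) / h_max * 100.0
```

The example fractions below are hypothetical, for illustration only; the abstract does not report the underlying volume fractions.

```python
h_max, h, rel, r = entropy_indicators([0.5, 0.25, 0.25])
# h == 1.5 bits; a perfectly uniform composition would give R == 0%
```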

