Irreversible Physical Process and Information Height

Author(s):  
Qin Liu ◽  
Quan Tang ◽  
Yuling Yi ◽  
Yu Feng

The arrow of time in macroscopic physical phenomena is reflected in irreversible physical processes, which essentially evolve from low-probability states to high-probability states. In this paper, simplified models are proposed to understand the macroscopic physical process. To describe the information of a physical system, we define the full self-information, termed the "information height", to characterize the complexity or difficulty of a macrostate of the physical system. In these terms, the direction of a macroscopic physical process is from high information height to low information height, so the information height can be used to judge the direction of a physical process. If a macroscopic process is to evolve from a state of low information height to a state of high information height, extra information and the corresponding energy must be added to the system to raise its information height.
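As a rough quantitative illustration (not drawn from the paper's own models), the "information height" of a macrostate can be read as its self-information, -log2(P), where P is the macrostate's probability. The sketch below assumes a toy two-chamber gas with equally likely microstates and shows that an ordered macrostate has a much larger information height than a mixed one, so the spontaneous direction of evolution is downhill in information height.

```python
import math
from math import comb

def information_height(p: float) -> float:
    """Self-information -log2(p) of a macrostate with probability p, in bits."""
    return -math.log2(p)

# Toy model: N distinguishable particles distributed over two halves of a box.
# The macrostate "k particles in the left half" contains C(N, k) of the 2**N
# equally likely microstates.
N = 20

def macrostate_probability(k: int) -> float:
    return comb(N, k) / 2**N

ordered = macrostate_probability(0)       # all particles on one side (rare)
mixed = macrostate_probability(N // 2)    # evenly spread (typical)

print(f"ordered state: P = {ordered:.2e}, information height = {information_height(ordered):.1f} bits")
print(f"mixed state:   P = {mixed:.2e}, information height = {information_height(mixed):.1f} bits")
# The spontaneous direction runs from high information height (ordered)
# to low information height (mixed), as the abstract describes.
```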

Author(s):  
Jozsef Garai

In the earliest days of science, researchers argued philosophically about what the reasonable explanation for an observed phenomenon might be. The majority of the contemporary scientific community claims that such arguments are useless because they do not add anything to our understanding of nature. The current consensus on the aim of science is that science collects facts (data) and discerns the order that exists between and among the various facts (e.g., Feynman 1985). According to this approach, the mission of science is over once the phenomenon under investigation has been described; it is left to the philosophers to answer the question of what governing physical process lies behind the observed phenomenon. Quantum mechanics is a good example of this approach: "It works, so we just have to accept it." The consequence is that nearly 90 years after the development of quantum theory, there is still no consensus in the scientific community regarding the interpretation of the theory's foundational building blocks (Schlosshauer et al. 2013). I believe that identifying the physical process governing a natural phenomenon is the responsibility of science. Dutailly (2013) expressed this quite well: a "black box" in the "cloud" that answers our questions correctly is not a scientific theory if we have no knowledge of the basis upon which it has been designed; a scientific theory should provide a set of concepts and a formalism that can be easily and indisputably understood and used by the workers in the field. In this study, the main unifying principle in chemistry, the periodic system of the chemical elements (PSCE), is investigated. The aim of the study is not only the description of the periodicity but also the understanding of the underlying physics that gives rise to the PSCE. By 1860 about 60 elements had been identified, and this initiated a quest to find their systematic arrangement. Based on similarities, Dobereiner (1829) in Germany suggested grouping the elements into triads.


Author(s):  
Daouda Niang Diatta ◽  
Antonio Lerario

Abstract We prove that with “high probability” a random Kostlan polynomial in $n+1$ variables and of degree $d$ can be approximated by a polynomial of “low degree” without changing the topology of its zero set on the sphere $\mathbb{S}^n$. The dependence between the “low degree” of the approximation and the “high probability” is quantitative: for example, with overwhelming probability, the zero set of a Kostlan polynomial of degree $d$ is isotopic to the zero set of a polynomial of degree $O(\sqrt{d \log d})$. The proof is based on a probabilistic study of the size of $C^1$-stable neighborhoods of Kostlan polynomials. As a corollary, we prove that certain topological types (e.g., curves with deep nests of ovals or hypersurfaces with rich topology) have exponentially small probability of appearing as zero sets of random Kostlan polynomials.
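For readers unfamiliar with the ensemble, a Kostlan polynomial of degree $d$ has independent Gaussian coefficients whose variances are the multinomial coefficients. The sketch below samples such a polynomial in two homogeneous variables and counts sign changes on the circle $\mathbb{S}^1$; it only illustrates the ensemble, not the paper's low-degree approximation argument, which is more delicate than simple truncation.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def sample_kostlan_coeffs(d: int) -> np.ndarray:
    """Coefficients c_k of a degree-d Kostlan polynomial
    p(x, y) = sum_k c_k * x**k * y**(d - k) in two homogeneous variables,
    with c_k ~ N(0, binom(d, k)) drawn independently."""
    variances = np.array([math.comb(d, k) for k in range(d + 1)], dtype=float)
    return rng.normal(0.0, np.sqrt(variances))

def evaluate_on_circle(coeffs: np.ndarray, n_points: int = 4000) -> np.ndarray:
    """Evaluate p(cos t, sin t) on S^1; sign changes locate the zero set."""
    d = len(coeffs) - 1
    t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    x, y = np.cos(t), np.sin(t)
    monomials = np.array([x**k * y**(d - k) for k in range(d + 1)])
    return coeffs @ monomials

d = 50
values = evaluate_on_circle(sample_kostlan_coeffs(d))
sign_changes = int(np.count_nonzero(np.sign(values[:-1]) != np.sign(values[1:])))
print(f"degree {d}: about {sign_changes} zeros on the circle (expected count is of order sqrt(d))")
```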


Author(s):  
Diogo Poças ◽
Jeffery Zucker

Abstract Analog computation attempts to capture any type of computation that can be realized by any type of physical system or physical process, including but not limited to computation over continuous measurable quantities. A pioneering model is the General Purpose Analog Computer (GPAC), initially presented by Shannon in 1941. The GPAC is capable of manipulating real-valued data streams; however, it has been shown to be strictly less powerful than other models of computation on the reals, such as computable analysis. In previous work, we proposed an extension of the Shannon GPAC, denoted LGPAC, designed to overcome its limitations. Not only is the LGPAC model capable of expressing computation over general data spaces $\mathcal{X}$, but it also directly incorporates approximating computations by means of a limit module. An important feature of this work is the generalisation of the computation-theoretic framework from Banach to Fréchet spaces. In this paper, we compare the LGPAC with a digital model of computation based on effective representations (tracking computability). We establish general conditions under which LGPAC-generable functions are tracking computable.
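As background (a minimal sketch, not the paper's LGPAC), Shannon's GPAC builds functions by wiring together constants, adders, multipliers, and integrators, and the generable functions satisfy polynomial ordinary differential equations. The toy example below wires two integrators in feedback, y' = z and z' = -y, to generate sine and cosine; the LGPAC's limit module and the Fréchet-space setting are not modelled here.

```python
import math

# Shannon-style GPAC sketch: two integrator units in feedback, y' = z, z' = -y,
# with (y, z)(0) = (0, 1), generate sine and cosine.  A crude Euler scheme
# stands in for the continuous-time integrators.
def gpac_sine(t_end: float, dt: float = 1e-4) -> float:
    y, z = 0.0, 1.0          # outputs of the two integrators
    t = 0.0
    while t < t_end:
        y, z = y + dt * z, z + dt * (-y)   # each integrator accumulates its input
        t += dt
    return y

print(gpac_sine(math.pi / 2), "vs", math.sin(math.pi / 2))
```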


Synthese ◽  
2021 ◽  
Author(s):  
Nir Fresco

Abstract A single physical process may often be described equally well as computing several different mathematical functions, none of which is explanatorily privileged. How, then, should the computational identity of a physical system be determined? Some computational mechanists hold that computation is individuated only by either narrow physical or functional properties. Even if some individuative role is attributed to environmental factors, it is rather limited. The computational semanticist holds that computation is individuated, at least in part, by semantic properties. She claims that the mechanistic account lacks the resources to individuate the computations performed by some systems, thereby leaving interesting cases of computational indeterminacy unaddressed. This article examines some of these views, and claims that more cases of computational indeterminacy can be addressed, if the system-environment interaction plays a greater role in individuating computations. A new, long-arm functional strategy for individuating computation is advanced.
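A standard toy illustration of the indeterminacy at issue (hypothetical, not taken from the article): a single voltage-level device with one fixed physical input-output profile can be described as computing AND under one mapping of voltages to bits and as computing OR under the complementary mapping, so narrow physical properties alone do not fix its computational identity.

```python
# One fixed physical input-output profile (voltage levels), two equally good
# computational descriptions.  Hypothetical illustration, not from the article.
HIGH, LOW = 5.0, 0.0

def device(v1: float, v2: float) -> float:
    """Physical behaviour: the output is HIGH exactly when both inputs are HIGH."""
    return HIGH if (v1 == HIGH and v2 == HIGH) else LOW

and_reading = {HIGH: 1, LOW: 0}   # "HIGH means 1": the device computes AND
or_reading  = {HIGH: 0, LOW: 1}   # "HIGH means 0": the same device computes OR

for v1 in (LOW, HIGH):
    for v2 in (LOW, HIGH):
        out = device(v1, v2)
        print(f"AND reading: {and_reading[v1]} {and_reading[v2]} -> {and_reading[out]}   "
              f"OR reading: {or_reading[v1]} {or_reading[v2]} -> {or_reading[out]}")
```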


2016 ◽  
Vol 106 (7) ◽  
pp. 1601-1631 ◽  
Author(s):  
Jakub Steiner ◽  
Colin Stewart

When an agent chooses between prospects, noise in information processing generates an effect akin to the winner's curse. Statistically unbiased perception systematically overvalues the chosen action because it fails to account for the possibility that noise is responsible for making the preferred action appear to be optimal. The optimal perception pattern exhibits a key feature of prospect theory, namely, overweighting of small probability events (and corresponding underweighting of high probability events). This bias arises to correct for the winner's curse effect. (JEL D11, D81, D82, D83)
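A minimal simulation (an illustrative sketch, not the authors' model) of the mechanism described: when prospect values are perceived with unbiased noise and the agent chooses the prospect that looks best, the perceived value of the chosen prospect systematically exceeds its true value, because conditioning on being chosen selects for favourable noise.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 100_000, 1.0

true_values = rng.normal(0.0, 1.0, size=(n, 2))                 # two prospects per choice
perceived = true_values + rng.normal(0.0, sigma, size=(n, 2))   # unbiased perceptual noise

chosen = perceived.argmax(axis=1)       # pick the prospect that looks better
rows = np.arange(n)
bias = (perceived[rows, chosen] - true_values[rows, chosen]).mean()

print(f"mean perceived-minus-true value of the chosen prospect: {bias:.3f}")
# Positive on average: conditioning on being chosen selects for favourable noise,
# the winner's-curse effect that an optimal perception pattern must correct for.
```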


2014 ◽  
Vol 53 (03) ◽  
pp. 186-194 ◽  
Author(s):  
A. M. Thomas ◽  
J. M. Dean ◽  
L. M. Olson ◽  
L. J. Cook

Summary. Objective: To compare results from high probability matched sets versus imputed matched sets across differing levels of linkage information. Methods: A series of linkages with varying amounts of available information were performed on two simulated datasets derived from multiyear motor vehicle crash (MVC) and hospital databases, where true matches were known. Distributions of high probability and imputed matched sets were compared against the true match population for occupant age, MVC county, and MVC hour. Regression models were fit to simulated log hospital charges and hospitalization status. Results: In high information settings, high probability and imputed matched sets did not differ significantly from the true match population in occupant age, MVC county, or MVC hour (p > 0.999). In low information settings, high probability matched sets differed significantly from the true matches in occupant age and MVC county (p < 0.002), but imputed matched sets did not (p > 0.493). High information settings saw no significant differences between the two methods in inference on simulated log hospital charges and hospitalization status. In low information settings, both methods differed significantly from the true outcomes; however, imputed matched sets were more robust. Conclusions: The level of information available to a linkage is an important consideration. High probability matched sets are suitable for high to moderate information settings and for situations involving case-specific analysis. Conversely, imputed matched sets are preferable for low information settings when conducting population-based analyses.
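The two linkage strategies can be contrasted with a small, purely hypothetical sketch (the variable names and data are invented for illustration and do not reproduce the study's methods): a high-probability matched set keeps only near-certain links, while imputed matched sets repeatedly draw link status from the estimated match probabilities and pool the resulting estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented candidate MVC-hospital record pairs: an estimated match probability
# and an outcome (log hospital charges) observed on the hospital side.
match_prob = rng.uniform(0.1, 0.99, size=1_000)
log_charges = rng.normal(9.0, 1.0, size=1_000)

# Strategy 1: high-probability matched set -- keep only near-certain links.
estimate_hp = log_charges[match_prob > 0.9].mean()

# Strategy 2: imputed matched sets -- draw link status from the match
# probabilities several times and pool the estimates across imputations.
n_imputations = 20
imputed = []
for _ in range(n_imputations):
    linked = rng.random(match_prob.shape) < match_prob
    imputed.append(log_charges[linked].mean())
estimate_imp = float(np.mean(imputed))

print(f"high-probability estimate: {estimate_hp:.3f}")
print(f"pooled imputed estimate:   {estimate_imp:.3f}")
```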


2016 ◽  
Vol 8 (2) ◽  
pp. 57 ◽  
Author(s):  
Oleksandr D. Nikolenko

Any physical process has temporal extent and can be realized only if the time reserve necessary for it (a time resource) is available. Physical clocks are the only type of device by means of which we can measure and experimentally study the phenomenon of the flow of physical time. Measuring time intervals with clocks itself demands the expenditure of a corresponding time resource for the clocks' operation. This creates a vicious circle leading to a paradoxical result: we measure time by means of time. The paradox imposes fundamental restrictions on experimental studies of the flow of time as a physical phenomenon.


2002 ◽  
Vol 7 (3) ◽  
pp. 4-5

Abstract Different jurisdictions use the AMA Guides to the Evaluation of Permanent Impairment (AMA Guides) for different purposes, and this article reviews a specific jurisdictional definition in the Province of Ontario of catastrophic impairment that incorporates the AMA Guides. In Ontario, a whole person impairment (WPI) exceeding 54% or a mental or behavioral impairment of Class 4 or 5 qualifies the individual for catastrophic benefits, and individuals who do not meet the test receive a lesser benefit. By inference, this establishes a parity threshold among dissimilar injuries and dissimilar outcome assessment scales for benefits. In Ontario, the Glasgow Coma Scale (GCS) identifies patients who have a high probability of death or of severely disabled survival. The GCS recognizes gradations of vegetative state and disability, but translating the gradations for rating individual impairment on ordinal scales into a method of assessing percentage impairments cannot be done reliably, as explained in the AMA Guides, Fifth Edition. The AMA Guides also notes that mental and behavioral impairment in Class 4 (marked impairment) or 5 (extreme impairment) indicates “catastrophic impairment” by significantly impeding useful functioning (Class 4) or significantly impeding useful functioning and implying complete dependency on another person for care (Class 5). Translating the AMA Guides guidelines into ordinal scales cannot be done reliably.


Author(s):  
Hadar Ram ◽  
Dieter Struyf ◽  
Bram Vervliet ◽  
Gal Menahem ◽  
Nira Liberman

Abstract. People apply what they learn from experience not only to the experienced stimuli, but also to novel stimuli. But what determines how widely people generalize what they have learned? Using a predictive learning paradigm, we examined the hypothesis that a low (vs. high) probability of an outcome following a predicting stimulus would widen generalization. In three experiments, participants learned which stimulus predicted an outcome (S+) and which stimulus did not (S−) and then indicated how much they expected the outcome after each of eight novel stimuli ranging in perceptual similarity to S+ and S−. The stimuli were rings of different sizes and the outcome was a picture of a lightning bolt. As hypothesized, a lower probability of the outcome widened generalization. That is, novel stimuli that were similar to S+ (but not to S−) produced expectations for the outcome that were as high as those associated with S+.

