Physical Sciences Forum
Latest Publications

Total documents: 13 (five years: 13)
H-index: 0 (five years: 0)
Published by: MDPI AG
ISSN: 2673-9984

2021 · Vol. 3 (1) · pp. 13
Author(s): Ahmad Yousefi, Ariel Caticha

Classical Density Functional Theory (DFT) is introduced as an application of entropic inference to inhomogeneous fluids in thermal equilibrium. It is shown that entropic inference reproduces the variational principle of DFT when information about the expected particle density is imposed. This process introduces a family of trial, density-parametrized probability distributions and, consequently, a trial entropy from which the preferred one is selected using the method of Maximum Entropy (MaxEnt). As an application, the DFT model for slowly varying density is presented, and its approximation scheme is discussed.
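For orientation only, a schematic of the MaxEnt step summarized above (notation chosen here, not taken from the paper) is:

```latex
% Schematic entropic-inference setup: maximize a relative entropy over trial
% distributions \rho on phase space, subject to normalization and to the
% expected local particle density n(x) (illustrative notation).
\begin{aligned}
S[\rho] &= -\int dq\, \rho(q)\,\ln\frac{\rho(q)}{\mu(q)},\\
\text{subject to}\quad & \int dq\, \rho(q) = 1,
\qquad \langle \hat n(x)\rangle_\rho = n(x)\ \ \text{for all } x,\\
\Rightarrow\quad \rho_n(q) &\propto \mu(q)\,
\exp\!\Big[\int dx\, \lambda(x)\, \hat n(x; q)\Big],
\end{aligned}
```

with the preferred density n(x) then selected by maximizing the resulting trial entropy, which reproduces the variational principle of DFT.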


2021 · Vol. 3 (1) · pp. 12
Author(s): Ariel Caticha

The mathematical formalism of quantum mechanics is derived, or “reconstructed”, from more basic considerations of probability theory and information geometry. The starting point is the recognition that probabilities are central to QM; the formalism of QM is derived as a particular kind of flow on a finite-dimensional statistical manifold (a simplex). The cotangent bundle associated with the simplex has a natural symplectic structure, and it inherits its own natural metric structure from the information geometry of the underlying simplex. We seek flows that preserve (in the sense of vanishing Lie derivatives) both the symplectic structure (a Hamiltonian flow) and the metric structure (a Killing flow). The result is a formalism in which the Fubini–Study metric, the linearity of the Schrödinger equation, the emergence of complex numbers, Hilbert spaces, and the Born rule are derived rather than postulated.
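The two preservation requirements mentioned in the abstract can be stated compactly; the notation below is a schematic reading, not the paper's own:

```latex
% Flows generated by a vector field V on the cotangent bundle of the simplex,
% required to preserve both the symplectic form \Omega and the metric g:
\mathcal{L}_V\,\Omega = 0 \quad \text{(Hamiltonian flow)},
\qquad
\mathcal{L}_V\, g = 0 \quad \text{(Killing flow)}.
```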


2021 · Vol. 3 (1) · pp. 11
Author(s): Christopher G. Albert, Ulrich Callies, Udo von Toussaint

We present an approach to enhance the performance and flexibility of Bayesian inference of model parameters from measured data. Going beyond the usual surrogate-enhanced Monte Carlo or optimization methods that focus on a scalar loss, we place emphasis on a function-valued output of formally infinite dimension. For this purpose, the surrogate models are built on a combination of linear dimensionality reduction in an adaptive basis of principal components and Gaussian process regression for the map between the reduced feature spaces. Since the decoded surrogate provides the full model output rather than only the loss, it is reusable for multiple calibration measurements as well as different loss metrics and, consequently, allows for flexible marginalization over such quantities and applications to Bayesian hierarchical models. We evaluate the method’s performance on a case study of a toy model and of a simple riverine diatom model for the Elbe river. As input data, this model uses six tunable scalar parameters as well as silica concentrations in the upper reach of the river, together with continuous time series of temperature, radiation, and river discharge over a specific year. The output consists of continuous time-series data that are calibrated against corresponding measurements from the Geesthacht Weir station on the Elbe river. For this study, only two scalar inputs were considered together with the function-valued output, and the results were compared to an existing model calibration based on direct simulation runs without a surrogate.
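A minimal sketch of such a function-valued surrogate (illustrative code, not the authors' implementation; array shapes and the number of retained components are assumptions) might look like:

```python
# Minimal sketch (not the authors' code) of a function-valued surrogate:
# PCA compresses the time-series outputs, one GP per retained principal
# component maps parameters to PCA coefficients, and the decoder restores
# the full model output for use with any loss metric.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def train_surrogate(theta, Y, n_components=5):
    """theta: (n_samples, n_params) inputs; Y: (n_samples, n_times) outputs."""
    pca = PCA(n_components=n_components).fit(Y)
    Z = pca.transform(Y)  # reduced feature space
    gps = [GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(theta, Z[:, i])
           for i in range(n_components)]
    return pca, gps

def predict_surrogate(pca, gps, theta_new):
    """Decode GP predictions back to full time series for new parameter values."""
    Z_new = np.column_stack([gp.predict(theta_new) for gp in gps])
    return pca.inverse_transform(Z_new)
```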


2021 · Vol. 3 (1) · pp. 10
Author(s): Riko Kelter

The Full Bayesian Significance Test (FBST) has been proposed as a convenient method to replace frequentist p-values for testing a precise hypothesis. Although the FBST enjoys various appealing properties, the purpose of this paper is to investigate two aspects of the FBST which are sometimes perceived as measure-theoretic inconsistencies of the procedure and which have not been discussed rigorously in the literature. First, the FBST uses the posterior density as a reference for judging the Bayesian statistical evidence against a precise hypothesis. However, under absolutely continuous prior distributions, the posterior density is defined only up to Lebesgue null sets, which renders the reference criterion arbitrary. Second, the FBST statistical evidence seems to have no valid prior probability. It is shown that the former aspect can be circumvented by fixing a version of the posterior density before applying the FBST, and that the latter aspect rests on measure-theoretic premises which do not hold. An illustrative example demonstrates both aspects and their resolution. Together, the results in this paper show that neither of the two aspects sometimes perceived as measure-theoretic inconsistencies of the FBST is tenable. The FBST thus provides a measure-theoretically coherent Bayesian alternative for testing a precise hypothesis.
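As an illustration of the first point, a Monte Carlo sketch of the FBST e-value for a precise hypothesis, with one fixed version of the posterior density (a Beta posterior chosen here purely as an example), could read:

```python
# Illustrative Monte Carlo FBST sketch (notation and example chosen here):
# e-value for the precise hypothesis theta = theta0 under a fixed version of
# a Beta posterior for a binomial proportion.
import numpy as np
from scipy import stats

def fbst_evalue(theta0, a_post, b_post, n_draws=100_000, seed=0):
    rng = np.random.default_rng(seed)
    post = stats.beta(a_post, b_post)            # fixed version of the posterior density
    draws = post.rvs(size=n_draws, random_state=rng)
    # tangential set: parameter values whose posterior density exceeds that at theta0
    in_tangential = post.pdf(draws) > post.pdf(theta0)
    return 1.0 - in_tangential.mean()            # e-value in favour of theta = theta0

# Example: 7 successes in 20 trials with a flat prior, testing theta0 = 0.5
print(fbst_evalue(0.5, a_post=1 + 7, b_post=1 + 13))
```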


2021 · Vol. 3 (1) · pp. 9
Author(s): John Skilling, Kevin Knuth

Why quantum? Why spacetime? We find that the key idea underlying both is uncertainty. In a world lacking probes of unlimited delicacy, our knowledge of quantities is necessarily accompanied by uncertainty. Consequently, physics requires a calculus of number pairs, not only scalars for quantity alone. Basic symmetries of shuffling and sequencing dictate that pairs obey ordinary component-wise addition, but they can have three different multiplication rules, which we call A, B and C. Rule “A” shows that pairs behave as complex numbers, which is why quantum theory is complex. However, consistency with the ordinary scalar rules of probability shows that the fundamental object is not a particle on its Hilbert sphere but a stream represented by a Gaussian distribution. Rule “B” is then applied to pairs of complex numbers (qubits) and produces the Pauli matrices, whose operation defines the space of 4-vectors. Rule “C” then allows what can be recognised as energy-momentum to be integrated into time and space. The picture is entirely consistent: spacetime is a construct of quantum theory, not a container for it.
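A toy illustration of rule “A” (not taken from the paper): with component-wise addition and the multiplication (a, b)(c, d) = (ac - bd, ad + bc), number pairs reproduce complex arithmetic.

```python
# Toy illustration (not from the paper): pairs with component-wise addition
# and multiplication rule "A" behave exactly like complex numbers.
def pair_add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def pair_mul_A(u, v):
    a, b = u
    c, d = v
    return (a * c - b * d, a * d + b * c)   # rule "A"

u, v = (1.0, 2.0), (3.0, -1.0)
zu, zv = complex(*u), complex(*v)
assert pair_add(u, v) == ((zu + zv).real, (zu + zv).imag)
assert pair_mul_A(u, v) == ((zu * zv).real, (zu * zv).imag)
```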


2021 · Vol. 3 (1) · pp. 8
Author(s): Bruno Arderucio Costa, Pedro Pessoa

Motivated by applications of statistical mechanics in which the system of interest is spatially unconfined, we present an exact solution to the maximum entropy problem of assigning a stationary probability distribution on the phase space of an unconfined ideal gas in an anti-de Sitter background. Notwithstanding the gas’s freedom to move in an infinite volume, we establish necessary conditions for the stationary probability distribution solving a general maximum entropy problem to be normalizable, and we obtain the resulting distribution for a particular choice of constraints. As part of our analysis, we develop a novel method for identifying dynamical constraints based on local measurements. Because it makes no appeal to a priori information about globally defined conserved quantities, the method is applicable to a much wider range of problems.


2021 · Vol. 3 (1) · pp. 7
Author(s): Andreas Kvas, Torsten Mayer-Gürr

Earth’s gravitational field provides invaluable insights into the changing nature of our planet. It reflects mass change caused by geophysical processes such as continental hydrology, changes in the cryosphere, or mass flux in the ocean. Satellite missions such as the NASA/DLR-operated Gravity Recovery and Climate Experiment (GRACE) and its successor GRACE Follow-On (GRACE-FO) continuously monitor these temporal variations of the gravitational attraction. In contrast to other satellite remote sensing datasets, gravity field recovery is based on geophysical inversion, which requires global, homogeneous data coverage. GRACE and GRACE-FO typically reach this global coverage after about 30 days, so short-lived events such as floods, which occur on time scales of hours to weeks, require additional information to be properly resolved. In this contribution, we treat Earth’s gravitational field as a stationary random process and model its spatio-temporal correlations in the form of a vector autoregressive (VAR) model. The satellite measurements are combined with this prior information in a Kalman smoother framework to regularize the inversion process, which allows us to estimate daily, global gravity field snapshots. To derive the prior, we analyze geophysical model output which reflects the expected signal content and temporal evolution of the estimated gravity field solutions. The main challenges here are the high dimensionality of the process, with a state vector size on the order of 10³ to 10⁴, and the limited amount of model output from which to estimate such a high-dimensional VAR model. We introduce geophysically motivated constraints in the VAR model estimation process to ensure a positive-definite covariance function.
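As a rough sketch of how such a VAR(1) prior enters the estimation (hypothetical matrices and sizes, not the actual GRACE processing chain):

```python
# Rough sketch (hypothetical matrices, not the GRACE processing chain) of how
# a VAR(1) prior enters the Kalman prediction step used to regularize daily
# gravity field estimates.
import numpy as np

def var1_predict(x_prev, P_prev, Phi, Q):
    """Propagate the state (e.g. gravity field coefficients) one day ahead.
    Phi: VAR(1) transition matrix estimated from geophysical model output,
    Q: positive-definite covariance of the VAR innovation."""
    x_pred = Phi @ x_prev
    P_pred = Phi @ P_prev @ Phi.T + Q
    return x_pred, P_pred

def kalman_update(x_pred, P_pred, y, H, R):
    """Combine the prediction with the day's satellite observations y = H x + noise."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_upd = x_pred + K @ (y - H @ x_pred)
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_upd, P_upd
```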


2021 · Vol. 3 (1) · pp. 6
Author(s): Sascha Ranftl, Wolfgang von der Linden

The quantification of the uncertainties of computer simulations due to input parameter uncertainties is paramount for assessing a model’s credibility. For computationally expensive simulations, this is often feasible only via surrogate models that are learned from a small set of simulation samples. The surrogate models are commonly chosen and deemed trustworthy based on heuristic measures and substituted for the simulation in order to approximately propagate the simulation input uncertainties to the simulation output. In the process, the contribution of the uncertainties of the surrogate itself to the simulation output uncertainties is usually neglected. In this work, we specifically address the case of doubtful surrogate trustworthiness, i.e., non-negligible surrogate uncertainties. We find that Bayesian probability theory yields a natural measure of surrogate trustworthiness, and that surrogate uncertainties can easily be included in the simulation output uncertainties. For a Gaussian likelihood for the simulation data, with unknown surrogate variance and a generalized linear surrogate model, the resulting formulas reduce to simple matrix multiplications. The framework contains Polynomial Chaos Expansions as a special case and is easily extended to Gaussian Process Regression. Additionally, we show a simple way to implicitly include spatio-temporal correlations. Lastly, we demonstrate a numerical example in which the surrogate uncertainties are in part negligible and in part non-negligible.
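A toy sketch of the matrix algebra for a Bayesian linear surrogate follows (simplified here to a fixed, known noise variance rather than the unknown-variance case treated in the paper):

```python
# Toy sketch of the matrix algebra for a Bayesian linear surrogate; the noise
# variance sigma2 is fixed and known here, a simplification of the paper's
# unknown-variance treatment.
import numpy as np

def fit_bayesian_linear(Phi, y, alpha=1.0, sigma2=0.1):
    """Phi: (n, m) design matrix of basis functions (e.g. polynomial chaos terms),
    alpha: prior precision of the weights, sigma2: simulation noise variance."""
    A = alpha * np.eye(Phi.shape[1]) + Phi.T @ Phi / sigma2   # posterior precision
    Sigma_w = np.linalg.inv(A)                                # weight covariance
    mu_w = Sigma_w @ Phi.T @ y / sigma2                       # weight mean
    return mu_w, Sigma_w

def predict(Phi_new, mu_w, Sigma_w, sigma2=0.1):
    mean = Phi_new @ mu_w
    # predictive variance = noise + the surrogate's own (weight) uncertainty
    var = sigma2 + np.einsum('ij,jk,ik->i', Phi_new, Sigma_w, Phi_new)
    return mean, var
```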


2021 · Vol. 3 (1) · pp. 3
Author(s): Roland Preuss, Udo von Toussaint

A Gaussian-process surrogate model based on already acquired data is employed to approximate an unknown target surface. In order to optimally locate the next function evaluations in parameter space, a whole variety of utility functions is at one’s disposal. However, a good choice of a specific utility function, or of a certain combination of them, provides the fastest way to determine the best surrogate surface or its extremum with the lowest possible amount of additional data. In this paper, we propose to use the global (integrated) variance as a utility function, i.e., to integrate the variance of the surrogate over a finite volume in parameter space. It turns out that this utility not only complements the tool set for fine-tuning investigations in a region of interest but also expedites the optimization procedure as a whole.
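One possible reading of the proposed utility, sketched with a standard GP library (not the authors' code): rank candidate evaluation points by the surrogate variance integrated over a grid after a hypothetical evaluation.

```python
# One possible reading of the integrated-variance utility (not the authors'
# code): choose the candidate point that minimizes the GP surrogate variance
# averaged over a finite grid in parameter space. Since the GP predictive
# variance is independent of the observed value, a dummy observation suffices.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def integrated_variance(gp, X_grid):
    _, std = gp.predict(X_grid, return_std=True)
    return np.mean(std**2)                       # grid estimate of the integral

def next_point(gp, X_train, y_train, candidates, X_grid):
    """gp: an already fitted GaussianProcessRegressor on (X_train, y_train)."""
    scores = []
    for x in candidates:
        gp_aug = GaussianProcessRegressor(kernel=gp.kernel_, optimizer=None).fit(
            np.vstack([X_train, x]), np.append(y_train, 0.0))
        scores.append(integrated_variance(gp_aug, X_grid))
    return candidates[int(np.argmin(scores))]
```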


2021 · Vol. 2 (1) · pp. 5
Author(s): Katharina Rath, Christopher G. Albert, Bernd Bischl, Udo von Toussaint

The dynamics of many classical physical systems are described by Hamilton’s equations. Commonly, initial conditions are only imperfectly known, and the associated volume in phase space is preserved over time due to the symplecticity of the Hamiltonian flow. Here, we study the propagation of uncertain initial conditions through dynamical systems using symplectic surrogate models of Hamiltonian flow maps. This allows fast sensitivity analysis with respect to the distribution of initial conditions and an estimation of local Lyapunov exponents (LLEs) that give insight into the local predictability of a dynamical system. In Hamiltonian systems, LLEs permit a distinction between regular and chaotic orbits. Combined with Bayesian methods, this provides a statistical analysis of local stability and sensitivity in phase space for Hamiltonian systems. The intended application is the early classification of regular and chaotic orbits of fusion alpha particles in stellarator reactors. The degree of stochastization during a given time period is used as an estimate of the probability that orbits from a specific region in phase space are lost at the plasma boundary. The approach thus offers a promising way to accelerate the computation of fusion alpha particle losses.
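For illustration, a schematic estimate of a local Lyapunov exponent from an iterated flow map; the stand-in pendulum system and step size are assumptions, not the stellarator application:

```python
# Schematic estimate (pendulum stand-in, not the stellarator application) of a
# local Lyapunov exponent from an iterated symplectic flow map, the quantity a
# flow-map surrogate would make cheap to evaluate.
import numpy as np

def flow_map(z, dt=0.1):
    """One semi-implicit Euler (symplectic) step of a pendulum Hamiltonian."""
    q, p = z
    p = p - dt * np.sin(q)
    q = q + dt * p
    return np.array([q, p])

def local_lyapunov(z0, n_steps=200, eps=1e-8, dt=0.1):
    """Average growth rate of a small perturbation along the orbit (Benettin-style)."""
    z = np.array(z0, dtype=float)
    z_pert = z + np.array([eps, 0.0])
    log_growth = 0.0
    for _ in range(n_steps):
        z, z_pert = flow_map(z, dt), flow_map(z_pert, dt)
        d = np.linalg.norm(z_pert - z)
        log_growth += np.log(d / eps)
        z_pert = z + eps * (z_pert - z) / d      # renormalize the perturbation
    return log_growth / (n_steps * dt)

print(local_lyapunov([2.5, 0.0]))                # example orbit near the separatrix
```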

