Discrete Probability Distributions
Recently Published Documents

TOTAL DOCUMENTS: 149 (FIVE YEARS: 30)
H-INDEX: 14 (FIVE YEARS: 1)

Author(s): Wolfgang Hornfeck

Abstract: We present an illustrative analysis of the complexity of a crystal structure based on the application of Shannon’s entropy formula in the form of Krivovichev’s complexity measures, extended according to the contributions of distinct discrete probability distributions derived from the atomic numbers and from the Wyckoff multiplicities and arities of the atoms and sites constituting the crystal structure, respectively. The results of a full crystallographic complexity partition analysis for the intermetallic phase Mo3Al2C, a compound of intermediate structural complexity, are presented, with all calculations carried out in detail. In addition, a partial analysis is discussed for the crystal structures of α- and β-quartz.
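
As a rough illustration of the kind of calculation involved, the sketch below computes Krivovichev's information-based complexity measures (the information content per atom, I_G, and per unit cell, I_G,total) from crystallographic orbit multiplicities; it is not the paper's full complexity partition, and the Mo3Al2C multiplicities used (12, 8, 4) are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's full partition analysis): Krivovichev's
# information-based structural complexity from crystallographic orbit
# multiplicities. The Mo3Al2C orbit multiplicities below (Mo: 12, Al: 8, C: 4)
# are assumptions for illustration.
import math

def krivovichev_complexity(multiplicities):
    """Return (I_G in bits/atom, I_G,total in bits/cell)."""
    v = sum(multiplicities)                      # atoms in the unit cell
    probs = [m / v for m in multiplicities]      # discrete probability distribution
    i_g = -sum(p * math.log2(p) for p in probs)  # Shannon entropy, bits per atom
    return i_g, v * i_g                          # per atom, per unit cell

i_g, i_g_total = krivovichev_complexity([12, 8, 4])   # hypothetical Mo3Al2C orbits
print(f"I_G = {i_g:.3f} bit/atom, I_G,total = {i_g_total:.3f} bit/cell")
```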


2022, pp. 325-353
Author(s): María Carmen Carnero, Javier Cárcel-Carrasco

The number of studies that assess the level of maintenance in a country is still very small, despite the contribution of this area to national competitiveness. The literature analyses asset management based on key performance indicators, but not via a multicriteria model. This chapter describes a multicriteria model, constructed by means of the fuzzy analytic hierarchy process (FAHP). The weightings are converted into utility functions, allowing the final utility of an alternative to be calculated via a multi-attribute utility function. Data on the state of asset management in Spain, in 2005 and 2010, are used to produce discrete probability distributions. Finally, a Monte Carlo simulation is applied to estimate the uncertainty of a complex function. In this way, the level of excellence of asset management in small businesses in Spain, before and after the recession, could be determined. The results show that the economic crisis experienced in Spain since 2008 has had a negative effect on the level of asset management in most sectors.
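
The chapter's FAHP weightings and survey data are not reproduced here; the sketch below only illustrates, with invented weights and discrete rating distributions, how a Monte Carlo simulation can propagate discrete probability distributions of criterion scores through an additive multi-attribute utility function.

```python
# Hedged sketch with invented numbers: propagate discrete probability
# distributions of criterion ratings through an additive multi-attribute
# utility function via Monte Carlo. Weights and distributions are hypothetical,
# not the chapter's FAHP results.
import random

weights = {"planning": 0.4, "cmms": 0.35, "training": 0.25}   # hypothetical FAHP weights
# Hypothetical discrete distributions of utility scores (value: probability)
ratings = {
    "planning": {0.2: 0.3, 0.6: 0.5, 1.0: 0.2},
    "cmms":     {0.2: 0.5, 0.6: 0.4, 1.0: 0.1},
    "training": {0.2: 0.2, 0.6: 0.6, 1.0: 0.2},
}

def sample(dist):
    """Draw one value from a discrete probability distribution given as {value: prob}."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def simulate(n=20_000):
    totals = [sum(weights[c] * sample(ratings[c]) for c in weights) for _ in range(n)]
    mean = sum(totals) / n
    sd = (sum((t - mean) ** 2 for t in totals) / n) ** 0.5
    return mean, sd

mean, sd = simulate()
print(f"overall utility: {mean:.3f} +/- {sd:.3f}")
```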


2021, Vol 50 (1), pp. 69-76
Author(s): Martin Grohe, Benjamin Lucien Kaminski, Joost-Pieter Katoen, Peter Lindner

Statistical models of real world data typically involve continuous probability distributions such as normal, Laplace, or exponential distributions. Such distributions are supported by many probabilistic modelling formalisms, including probabilistic database systems. Yet, the traditional theoretical framework of probabilistic databases focuses entirely on finite probabilistic databases. Only recently have we set out to develop the mathematical theory of infinite probabilistic databases. The present paper is an exposition of two recent papers which are cornerstones of this theory. In (Grohe, Lindner; ICDT 2020) we propose a very general framework for probabilistic databases, possibly involving continuous probability distributions, and show that queries have a well-defined semantics in this framework. In (Grohe, Kaminski, Katoen, Lindner; PODS 2020) we extend the declarative probabilistic programming language Generative Datalog, proposed by (Bárány et al. 2017) for discrete probability distributions, to continuous probability distributions and show that such programs yield generative models of continuous probabilistic databases.


Author(s): Om Parkash, Mukesh

Information theory fundamentally deals with two types of theoretical models, commonly known as entropy and divergence measures. The literature on entropy models for discrete probability distributions already contains numerous standard models, yet there remains scope for constructing new ones with applications across the mathematical sciences. The present communication is a step in this direction: it derives two new parametric models of entropy and provides a detailed study of their most important properties to establish their validity.
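
The abstract does not state the two new models, so no attempt is made to reproduce them; as a generic illustration of evaluating parametric entropies on a discrete probability distribution, the sketch below computes the well-known Rényi and Tsallis entropies.

```python
# Generic illustration only: the paper's two new parametric entropy models are
# not given in the abstract, so the well-known Renyi and Tsallis entropies are
# used here as stand-ins for "parametric entropy of a discrete distribution".
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    assert alpha > 0 and alpha != 1
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of index q (q != 1)."""
    assert q != 1
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.125, 0.125]          # an example discrete probability distribution
print(renyi_entropy(p, 2.0))           # collision entropy
print(tsallis_entropy(p, 2.0))
```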


Quantum, 2021, Vol 5, pp. 417
Author(s): Ryan Sweke, Jean-Pierre Seifert, Dominik Hangleiter, Jens Eisert

Here we study the comparative power of classical and quantum learners for generative modelling within the Probably Approximately Correct (PAC) framework. More specifically we consider the following task: Given samples from some unknown discrete probability distribution, output with high probability an efficient algorithm for generating new samples from a good approximation of the original distribution. Our primary result is the explicit construction of a class of discrete probability distributions which, under the decisional Diffie-Hellman assumption, is provably not efficiently PAC learnable by a classical generative modelling algorithm, but for which we construct an efficient quantum learner. This class of distributions therefore provides a concrete example of a generative modelling problem for which quantum learners exhibit a provable advantage over classical learning algorithms. In addition, we discuss techniques for proving classical generative modelling hardness results, as well as the relationship between the PAC learnability of Boolean functions and the PAC learnability of discrete probability distributions.
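
To make the learning task concrete (this is not the paper's construction, nor its quantum learner), the sketch below shows the naive classical baseline: estimate the empirical distribution from the samples and return an efficient sampler for it. The paper's hardness result states that, for its cryptographically defined class of distributions, no classical learner can succeed efficiently in this sense.

```python
# A naive classical generative learner, sketched only to make the PAC task
# concrete: learn the empirical distribution of the samples and return a
# generator for it. This is *not* the paper's construction; its point is that
# for the Diffie-Hellman-based class no efficient classical learner exists.
import random
from collections import Counter

def learn_generator(samples):
    """Given samples from an unknown discrete distribution, return a sampler
    for the empirical approximation of that distribution."""
    counts = Counter(samples)
    values = list(counts)
    weights = [counts[v] for v in values]
    return lambda: random.choices(values, weights=weights)[0]

true_dist = {"a": 0.7, "b": 0.2, "c": 0.1}          # unknown to the learner
data = random.choices(list(true_dist), weights=list(true_dist.values()), k=10_000)
gen = learn_generator(data)
print(Counter(gen() for _ in range(10_000)))        # close to the true frequencies
```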


2021
Author(s): Uwe Ehret

In this contribution, I will suggest an approach to building models as ordered and connected collections of multivariate, discrete probability distributions (dpd's). This approach can be seen as a machine-learning (ML) approach, as it allows very flexible learning from data (almost) without prior constraints. Models can be built on dpd's only (fully data-based models), but dpd's can also be included in existing process-based models at places where relations among data are not well known (hybrid models). This provides flexibility for learning similar to including other ML approaches, e.g. neural networks, in process-based models, with the advantage that the dpd's can be investigated and interpreted by the modeller as long as their dimensionality remains low. Models based on dpd's are fundamentally probabilistic, and model responses for out-of-sample situations can be assured by dynamically coarse-graining the dpd's: the farther a predictive situation is from the learning situations, the coarser and more uncertain the prediction will be, and vice versa.

I will present the main elements and steps of such dpd-based modelling using the example of several systems, ranging from simple deterministic (ideal spring) to complex (hydrological system), and will discuss the influence of (i) the size of the available training data set, (ii) the choice of dpd priors, and (iii) binning choices on the models' predictive power.
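
A minimal sketch of the core idea as described in the abstract, with invented data and binning choices: learn a bivariate discrete probability distribution by binning paired observations, predict the conditional distribution of the target given an input bin, and fall back to a coarser (more uncertain) distribution when the input bin was never seen in training.

```python
# Minimal sketch of a dpd-based model (binning choices and data are invented):
# learn a bivariate discrete probability distribution from paired samples and
# predict P(target bin | input bin); for unseen input bins, fall back to the
# coarser marginal distribution of the target (more uncertain prediction).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 5000)
y = 2.0 * x + rng.normal(0, 1, 5000)              # synthetic "ideal spring"-like relation

x_edges = np.linspace(0, 10, 11)                  # binning choice (hypothetical)
y_edges = np.linspace(-2, 22, 25)
counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])

def predict(x_new):
    i = np.clip(np.digitize(x_new, x_edges) - 1, 0, counts.shape[0] - 1)
    row = counts[i]
    if row.sum() == 0:                            # out-of-sample: coarse-grain to the marginal
        row = counts.sum(axis=0)
    return row / row.sum()                        # discrete pdf over the y bins

pdf = predict(3.7)
print("most probable y bin:", y_edges[pdf.argmax()], "-", y_edges[pdf.argmax() + 1])
```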


2021, Vol 35 (3), pp. 748-795
Author(s): Arash Gholami Davoodi, Sean Chang, Hyun Gon Yoo, Anubhav Baweja, Mihir Mongia, ...

Complexity, 2021, Vol 2021, pp. 1-6
Author(s): Nicholas Smaal, José Roberto C. Piqueira

This work discusses the application of the Kolmogorov; López-Ruiz, Mancini, and Calbet (LMC); and Shiner, Davison, and Landsberg (SDL) complexity measures to a common situation in physics described by the Maxwell–Boltzmann distribution. The first notion of a complexity measure originated in computer science and was proposed by Kolmogorov; it is calculated similarly to the informational entropy. When applied to natural phenomena, the Kolmogorov measure yields higher values for disorder and lower values for order. However, it is widely considered that high complexity should be associated with intermediate states between order and disorder. Consequently, the LMC and SDL measures were defined and used to model natural phenomena, with the inconvenience of being defined only for discrete probability distributions over finite intervals. Here, adapting the definitions to a continuous variable, the three measures are applied to the Maxwell–Boltzmann distribution describing thermal neutron velocities in a power reactor, extending complexity measures to a continuous physical situation and opening a discussion of the phenomenon.
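
To make the definitions concrete, the sketch below evaluates the LMC and SDL measures on a discretized Maxwell–Boltzmann speed distribution; the reduced units, binning, and the choice α = β = 1 in the SDL measure are illustrative assumptions, not the paper's continuous-variable treatment.

```python
# Illustrative sketch: discretize a Maxwell-Boltzmann speed distribution
# (reduced units, arbitrary binning) and evaluate the LMC and SDL complexity
# measures on the resulting discrete probability distribution. This mirrors
# the standard discrete definitions, not the paper's continuous treatment.
import numpy as np

v = np.linspace(1e-6, 5.0, 200)                    # reduced speed grid
f = v**2 * np.exp(-v**2)                           # Maxwell-Boltzmann shape (unnormalized)
p = f / f.sum()                                    # discrete probability distribution
n = p.size

H = -np.sum(p * np.log(p))                         # Shannon entropy (nats)
H_max = np.log(n)                                  # entropy of the uniform distribution
D = np.sum((p - 1.0 / n) ** 2)                     # disequilibrium (distance from uniform)

C_lmc = H * D                                      # LMC complexity
delta = H / H_max                                  # normalized "disorder"
C_sdl = delta * (1.0 - delta)                      # SDL complexity with alpha = beta = 1

print(f"H = {H:.3f} nats, C_LMC = {C_lmc:.5f}, C_SDL = {C_sdl:.5f}")
```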

