a priori distribution
Recently Published Documents


TOTAL DOCUMENTS: 26 (five years: 8)
H-INDEX: 4 (five years: 1)

2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Christian Palmes ◽  
Tobias Bluhmki ◽  
Benedikt Funke ◽  
Erich Bluhmki

Abstract The two one-sided t-tests (TOST) procedure is the most popular statistical equivalence test, with many areas of application, e.g., in the pharmaceutical industry. Proper sample size calculation is needed in order to show equivalence with a given power. Here, the crucial problem of choosing a suitable mean-difference in TOST sample size calculations is addressed. As an alternative concept, the mean-difference is assumed to follow an a priori distribution. Special attention is given to the uniform and to some centered triangular a priori distributions. Using a newly developed asymptotic theory, a helpful analogy principle is found: every a priori distribution corresponds to a point mean-difference, which we call its Schuirmann constant. This constant does not depend on the standard deviation and is intended to support the investigator in finding a well-considered mean-difference for proper sample size calculations in complex data situations. In addition to the proposed concept, we demonstrate that well-known sample size approximation formulas in the literature are in fact biased, and we state their unbiased corrections. Moreover, an R package is provided for immediate application of the newly developed concepts.
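The abstract leaves the actual formulas to the paper; as a rough illustration of the calculation being corrected, here is the textbook normal-approximation TOST sample size. The margins, assumed mean-difference and standard deviation are illustrative inputs, not the authors' values, and this approximation is exactly the kind the paper shows to be biased:

```python
from math import ceil
from statistics import NormalDist

def tost_n_per_group(delta, theta, sigma, alpha=0.05, power=0.8):
    """Approximate per-group sample size for a two-sample TOST with
    symmetric equivalence margins (-theta, theta), assumed true
    mean-difference delta and common standard deviation sigma."""
    if abs(delta) >= theta:
        raise ValueError("assumed difference must lie inside the margins")
    z_a = NormalDist().inv_cdf(1 - alpha)
    # Convention: at delta = 0 both one-sided tests matter, so beta is split.
    z_b = NormalDist().inv_cdf(1 - (1 - power) / 2) if delta == 0 \
        else NormalDist().inv_cdf(power)
    return ceil(2 * (sigma * (z_a + z_b) / (theta - abs(delta))) ** 2)

n = tost_n_per_group(delta=0.0, theta=0.2, sigma=0.5)  # 108 per group
```

For a fixed margin, the required n grows rapidly as the assumed difference approaches the margin, which is why the choice of mean-difference, and hence a summary like the Schuirmann constant, matters so much in practice.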


2021 ◽  
Vol 37 (2) ◽  
pp. 298-317
Author(s):  
Maria Guseva ◽  
Andrey Silaev

In the present research, the features of applying two models for estimating macroeconomic dynamics in the USA are investigated: Bayesian vector autoregression and Bayesian vector autoregression with Markov switching. The research goal is to identify the periods, the structure of fluctuations, and the main directions of interaction of the variables (real US GDP and employment) using Bayesian vector autoregression models. Models with Markov chains comprise several equations (structures); the switching between these structures is governed by an unobservable variable that follows a first-order Markov process. The analyzed variables cover the first quarter of 1953 to the third quarter of 2015. The model parameters were estimated using an inverse Wishart prior for the multivariate normal distribution (a generalization of the Minnesota prior). Based on the results of estimating the two-dimensional model with Markov switching, the average GDP growth rate and the expected duration of each phase were calculated. The estimated model is acceptable for describing the US economy and describes with high accuracy the probability of being in a particular phase in different periods of time. On the basis of medium-term forecasts, root mean squared forecast errors are calculated and a conclusion is drawn about the most appropriate model. Within the framework of this paper, impulse response functions are built, allowing one to evaluate how the variables in the model respond to fluctuations and shocks.
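The switching mechanism described in the abstract can be sketched in a few lines; the two regimes, the transition probabilities and the random seed below are illustrative assumptions, not estimates from the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state transition matrix (rows sum to 1); in the paper the
# transition probabilities are estimated, not assumed.
P = np.array([[0.95, 0.05],   # state 0: expansion
              [0.20, 0.80]])  # state 1: recession

def simulate_states(P, T, s0=0):
    """Simulate a first-order Markov chain of regime labels."""
    states = [s0]
    for _ in range(T - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return np.array(states)

states = simulate_states(P, 500)
# Expected duration of regime i is 1 / (1 - P[i, i]), the standard
# quantity reported as the "expected duration of phases".
expected_duration = 1.0 / (1.0 - np.diag(P))
```

The last line reproduces the standard expected-duration formula used to report phase lengths in Markov-switching models.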


2020 ◽  
Vol 21 (3) ◽  
pp. 237-244
Author(s):  
Alexander Mayer ◽  
Stefan Napel

Abstract Executive Directors of the International Monetary Fund elect the Fund's Managing Director from a shortlist of three candidates; the financial quotas of IMF members define the respective numbers of votes. The implied a priori distribution of success (preference satisfaction) is compared across different electoral procedures. The USA's Executive Director can expect to come closer to its top preference under plurality rule than under pairwise majority comparisons or plurality with a runoff; the opposite applies to everybody else. Differences in US success across voting rules dominate the within-rule differences between most other Directors, as well as much of the effect of the latest reform of quotas.
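A minimal sketch of the plurality mechanism being compared, with made-up Directors and voting weights rather than actual IMF quotas:

```python
from collections import Counter

def weighted_plurality(ballots, weights):
    """Each voter casts its full voting weight for its top-ranked
    candidate; the candidate with the largest total weight wins."""
    tally = Counter()
    for voter, ranking in ballots.items():
        tally[ranking[0]] += weights[voter]
    return tally.most_common(1)[0][0]

# Hypothetical Directors and weights, not actual IMF quotas.
weights = {"US": 17, "D2": 6, "D3": 6, "D4": 5}
ballots = {"US": ["x", "y", "z"], "D2": ["y", "x", "z"],
           "D3": ["y", "z", "x"], "D4": ["z", "y", "x"]}
winner = weighted_plurality(ballots, weights)
```

Under plurality the largest single voting bloc can decide the outcome on its own, which is the mechanism behind the asymmetric result for the heavyweight US Director versus runoff or pairwise rules that aggregate the remaining weight.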


2019 ◽  
Vol 23 (4) ◽  
pp. 23-31 ◽  
Author(s):  
Aleksander A. Solodov

The aim of the study is to investigate the possibility of applying Bayesian adaptation algorithms to cognitive systems that perceive a Poisson process of external events. The method of research is the stochastic description and synthesis of cognitive systems, drawing on the theory of doubly stochastic Poisson processes and the theory of Bayesian adaptation.

A formal definition of cognitive systems in the state space is formulated, in the spirit of similar definitions in the theory of dynamic systems. This definition serves as a methodological basis for developing models of the sets and transformations that are characteristic of cognitive systems. In particular, to describe the stochastic properties of cognitive systems and to make an optimal algorithm possible, the Bayesian approach, recognized in a number of philosophical works, is applied. The estimate that is optimal by the minimum mean-square error criterion is, as is known, the a posteriori mathematical expectation of the random quantity being estimated, and this estimate is used in the present work. A well-known difficulty of Bayesian optimal estimation is the need to specify the a priori probabilities of the random variable in the system under consideration. An adaptive Bayesian estimation algorithm, also known as the empirical Bayesian approach, is used to overcome this problem.

Accordingly, it is assumed that at the input of the cognitive system, namely in the unconscious, events occur in continuous time and are modeled by random points. The intensity of the appearance of the points is determined by a random variable X, whose estimation is the task of the cognitive system as a whole. Up to some moment, random events accumulate in the field of the unconscious (in mathematical language, a classifying sample is formed). At some point, an attempt is made to estimate the value of X, i.e., to move information from the unconscious area of the cognitive system to the conscious one, which constitutes a mental act, an act of learning, and so on. From a mathematical point of view, such a model of cognitive functioning implements the adaptive Bayesian approach, which reduces the influence of the a priori distribution of an unknown quantity on its estimate.

The described model of the cognitive system is justified by the fact that the quantity X, which is not only random but also has an unknown a priori distribution, is not observed directly and must somehow be estimated by the cognitive system on the basis of the number of events already accumulated in the unconscious and the most recent event. The optimal estimate of the random parameter is used to solve the problem of classifying observations, i.e., the optimal testing of a one-sided hypothesis by the Bayesian criterion.

As a result of the undertaken consideration, the applicability of the developed formal definition of a cognitive system to the formulation of various problems of analysis and synthesis of systems is demonstrated. The advantage of the applied model is the minimal amount of a priori information about the processes occurring in the system: a single assumption about the Poisson nature of the events arriving at the input was sufficient. The results of a computational experiment on the adaptive estimation of a random parameter with an unknown a priori distribution are presented. In conclusion, it is noted that a further development of the study could be a detailed formulation of the mathematical properties of the elements of the cognitive system mentioned in the definition, and the formulation, solution, and interpretation of new mathematical problems of analysis and synthesis.
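The adaptive (empirical Bayes) estimation of a Poisson intensity can be illustrated with a conjugate gamma-Poisson sketch. The gamma family and all numbers are assumptions made for the example; only the Poisson arrival model is fixed by the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden intensities and observed event counts (hypothetical numbers).
lam = rng.gamma(shape=6.0, scale=0.5, size=200)  # unknown rates, mean 3
counts = rng.poisson(lam)                        # accumulated events

# Empirical-Bayes step: moment-match a gamma(a, b) prior to the counts,
# so the a priori distribution is learned from the sample itself.
# For a Poisson-gamma mixture, E[k] = a/b and Var[k] = a/b + a/b**2.
m, v = counts.mean(), counts.var()
b = m / max(v - m, 1e-9)
a = m * b

def posterior_mean(k, a, b):
    """Conjugate Bayes estimate of the rate after observing count k."""
    return (a + k) / (b + 1.0)

estimate = posterior_mean(counts[0], a, b)
```

The posterior mean is a weighted average of the empirically fitted prior mean a/b and the new observation, which is precisely the sense in which the a priori distribution's influence is reduced as data accumulate.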


2019 ◽  
Vol 16 (04) ◽  
pp. 1941005
Author(s):  
Chunsheng Guo ◽  
Hanwen Lin ◽  
Zhen He ◽  
Xiaohu Shu ◽  
Xuguang Zhang

Crowd feature perception is an essential step toward understanding crowd behavior. However, as individuals exhibit not only sociality but also randomness, it remains challenging to extract the sociality of an individual directly. In this paper, we propose a crowd feature perception algorithm based on a sparse linear model (SLM). It builds a statistical characterization of the sociality by assuming an a priori distribution for the SLM. First, we calculate the optical flow to extract the motion information of the crowd. Second, we feed the video motion features to sparse coding and generate the SLM; the super-Gaussian prior distributions in SLMs provide the statistical characterization of the sociality. In addition, we combine the infinite hidden Markov model (iHMM) to determine whether a detected event is abnormal. We validate our method for abnormality detection on the UMN dataset and on a simulated dataset, and the experiments show that the algorithm produces promising results compared with other state-of-the-art methods.
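A minimal sketch of sparse coding under a super-Gaussian prior, using the standard ISTA iteration on a random dictionary. The dictionary, data and penalty weight are illustrative stand-ins; in the paper the inputs are optical-flow motion features:

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (MAP estimation with a Laplace,
    i.e. super-Gaussian, prior on the codes)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, y, lam=0.1, n_iter=200):
    """Minimise 0.5 * ||y - D s||**2 + lam * ||s||_1 over codes s."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        s = soft_threshold(s + D.T @ (y - D @ s) / L, lam / L)
    return s

D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)      # unit-norm dictionary atoms
s_true = np.zeros(50)
s_true[[3, 17]] = [1.5, -2.0]       # a hypothetical sparse code
y = D @ s_true + 0.01 * rng.standard_normal(20)
s_hat = ista(D, y)
```

The soft-thresholding step is what makes most coefficients exactly zero, so each frame is summarized by a few active dictionary atoms rather than a dense response.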


Geophysics ◽  
2019 ◽  
Vol 84 (3) ◽  
pp. R355-R369 ◽  
Author(s):  
Leonardo Azevedo ◽  
Vasily Demyanov

Geostatistical seismic inversion is commonly used to infer the spatial distribution of the subsurface petroelastic properties by perturbing the model parameter space through iterative stochastic sequential simulations/co-simulations. The spatial uncertainty of the inferred petroelastic properties is represented with the updated a posteriori variance from an ensemble of the simulated realizations. Within this setting, petroelastic realizations are generated assuming stationary and known large-scale geologic parameters (metaparameters), such as the spatial correlation model and the global a priori distribution of the properties of interest, for the entire inversion domain. This assumption leads to underestimation of the uncertainty associated with the inverted models. We have developed a practical framework to quantify uncertainty of the large-scale geologic parameters in geostatistical seismic inversion. The framework couples geostatistical seismic inversion with a stochastic adaptive sampling and Bayesian inference of the metaparameters to provide a more accurate and realistic prediction of uncertainty not restricted by heavy assumptions on large-scale geologic parameters. The proposed framework is illustrated with synthetic and real case studies. The results indicate the ability to retrieve more reliable acoustic impedance models with a more adequate uncertainty spread when compared with conventional geostatistical seismic inversion techniques. The proposed approach accounts for geologic uncertainty at the large scale (metaparameters) and the local scale (trace-by-trace inversion).
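The coupling of stochastic sampling with Bayesian inference of a metaparameter can be caricatured with a one-dimensional random-walk Metropolis sampler. The quadratic "misfit" below is a deliberate stand-in for the real seismic forward model, and the bounds and step size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_likelihood(corr_len):
    """Stand-in for the seismic data misfit as a function of a
    correlation-length metaparameter; the real evaluation would run a
    geostatistical simulation and compare synthetic with observed data."""
    return -((corr_len - 2.0) ** 2)

def metropolis(n_steps=5000, step=0.3, lo=0.1, hi=10.0):
    """Random-walk Metropolis over one metaparameter, flat prior on [lo, hi]."""
    x, chain = 1.0, []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()
        if lo <= prop <= hi and np.log(rng.random()) < (
                log_likelihood(prop) - log_likelihood(x)):
            x = prop
        chain.append(x)
    return np.array(chain)

chain = metropolis()
```

The spread of the retained chain, rather than a single fixed value, is what carries the large-scale geologic uncertainty into the inversion.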


2019 ◽  
Vol 10 (2) ◽  
pp. 411
Author(s):  
Moacyr Machado Cardoso Junior

“Black swan” events represent a critical issue in risk analysis. Events with an extremely low probability of occurrence are generally discarded from the risk analysis process. This paper aims to identify and characterize four accidents that occurred in Brazil into the following classes: “not a black swan”, “black swan: unknown-unknown”, “black swan: unknown-known” and “black swan: not believed to occur”, by eliciting from experts the distribution of belief for the real probability of each class. Results showed that, across all cases analyzed, the class “black swan: unknown-unknown” was never reported, which means that none of the cases studied was a complete surprise to everyone. The method was able to assign all accident events to the remaining classes. The probability distributions elicited from the experts showed large disagreement among them, and the expected value was considered low. Nevertheless, the elicited distributions can be used in future risk analyses as a priori distributions in a Bayesian approach.
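How an elicited belief distribution can enter a later analysis as a prior, as the last sentence suggests, can be sketched with a conjugate Beta-binomial update. All numbers are illustrative, not the paper's elicited values:

```python
# Expert-elicited belief about the probability of one accident class,
# encoded as a Beta prior (illustrative parameters).
a_prior, b_prior = 2.0, 18.0                  # prior mean 2/20 = 0.10

# Conjugate update once new accident records are classified:
occurrences, non_occurrences = 3, 17
a_post = a_prior + occurrences
b_post = b_prior + non_occurrences

posterior_mean = a_post / (a_post + b_post)   # 5/40 = 0.125
```

The posterior mean sits between the experts' prior mean (0.10) and the observed rate (0.15), so the elicited belief tempers sparse accident data instead of being discarded.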


2019 ◽  
Vol 294 ◽  
pp. 03012 ◽  
Author(s):  
Alexander Trofimov ◽  
Albina Kuzmenko ◽  
Halyna Nesterenko ◽  
Svitlana Avramenko ◽  
Mykhailo Muzykin ◽  
...  

The evaluation of the parameters of multi-layered foundations (the railroad basis, the foundations of railway structures, etc.) plays an important role in ensuring the safe movement of trains. A method is proposed for estimating the mechanical and geometric parameters of such foundations based on the solutions of inverse problems for multi-layered elastic packets. The measured displacements of certain points on the package surface are used as input data for these problems. The method allows estimating the parameters of the a priori distribution of the unknown variable parameters, identifying and excluding outliers in the measured data from the created model, and constructing an a posteriori estimate of the probability density of the unknown parameters with acceptable resolution. The proposed method can be used to create a new generation of equipment intended for non-destructive monitoring and assessment of the condition of the railroad basis and the foundations of artificial structures. Appropriate software for such vehicles, based on the developed data-processing methods, can then be developed. The use of such equipment makes it possible to promptly analyze the state of individual sections of the railroad and to decide whether the railroad base or the foundations of other elements of the railroad infrastructure need repair or replacement.
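A toy version of the a posteriori density construction described in the abstract: a grid posterior over one stiffness-like parameter, with a deliberately simplified forward model (displacement inversely proportional to stiffness) standing in for the multi-layered elastic solution. All names and numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy forward model: surface displacement under each load, inversely
# proportional to an effective layer stiffness E.
def forward(E, loads):
    return loads / E

true_E = 50.0
loads = np.array([1.0, 2.0, 3.0, 4.0])
obs = forward(true_E, loads) + 0.001 * rng.standard_normal(4)

# Grid posterior over E with a flat prior on [10, 100].
grid = np.linspace(10.0, 100.0, 901)
sigma = 0.001                                  # measurement noise level
log_post = np.array([-0.5 * np.sum((obs - forward(E, loads)) ** 2) / sigma ** 2
                     for E in grid])
post = np.exp(log_post - log_post.max())
post /= post.sum()
E_map = grid[np.argmax(post)]
```

The width of `post` around its peak is the "resolution" of the estimate; outlier displacements would show up as observations the posterior cannot reconcile with any single parameter value.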


2016 ◽  
Vol 11 (1) ◽  
pp. 80-92 ◽  
Author(s):  
Jean-Luc Moriceau ◽  
Isabela Paes

Purpose – The purpose of this paper is to show what can be learned from an aesthetics perspective on organizational learning, and especially about power dynamics that remain invisible from other perspectives. Design/methodology/approach – An exploratory ethnographic study, based on the turn to affect, of a theatre play in which many of the bearings that usually guide theatrical creation were removed. Findings – The analysis highlights that an a priori distribution of the sensible, which locks in routines, representations and roles, is seldom questioned in organizational learning programs; that the motion enabling organizational learning is less likely to be brought about by a change in power distribution than by the removal of some elements of power that freeze situations; that the diffusion of organizational learning proceeds not only through norms, rules, values and repositories, but also through affects; and that learning runs through a fragile communication of movements, always under the threat of congealing into major knowledge and power distributions. Research limitations/implications – This paper is based on a single case. Practical implications – An overly tight and close management of organizational learning is likely to thwart and limit its very learning possibilities. Originality/value – Several findings contradict technological or overly managerial approaches to organizational learning. The study hopes to contribute by adding a measure of complexity to our analyses of organizational learning, notably by advocating for taking into account the role of affects, sensibility and the politics of aesthetics.


Tendencias ◽  
2015 ◽  
Vol 16 (1) ◽  
pp. 51
Author(s):  
Emilio José Chaves

The paper discusses the Laplace Rule (or Criterion) and grounds its use for constructing the Lorenz curve (LC) from data series. It presents examples and graphs of fitted models of the inferred LC and CDF, and comments on the limits of the model. The method separates the real mean, U, from the dimensionless distribution function (expressed in means), so that CDF(real) = U(real) * CDF(in means). The purpose is to provide a foundation for univariate statistical inference on positive datasets using the Laplace criterion, classical mathematics and set logic. This non-parametric method assumes identical 1/N frequencies for the N data points, without using a priori distribution functions. Given its simplicity, its use is proposed in statistical education and its application in research, as a theoretical element prior to the handling of multivariate analysis.
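The construction with identical 1/N frequencies can be sketched directly; this is the generic empirical Lorenz curve, not the paper's fitted model:

```python
import numpy as np

def lorenz_curve(data):
    """Empirical Lorenz curve with identical 1/N frequencies: returns
    (p, L) where L[i] is the share of the total held by the poorest
    fraction p[i] of the population."""
    x = np.sort(np.asarray(data, dtype=float))
    cum = np.cumsum(x)
    L = np.concatenate([[0.0], cum / cum[-1]])
    p = np.linspace(0.0, 1.0, len(x) + 1)
    return p, L

p, L = lorenz_curve([1, 2, 3, 4, 10])  # L = [0, 0.05, 0.15, 0.3, 0.5, 1.0]
```

Because each datum carries the same 1/N weight, the curve needs no parametric a priori distribution: it is fixed entirely by the sorted data.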

