A Novel Bayesian Framework for Uncertainty Management in Physics-Based Reliability Models

Author(s):  
M. Azarkhail ◽  
M. Modarres

The physics-of-failure (POF) modeling approach is a proven and powerful method for predicting the reliability of mechanical components and systems. Most POF models were originally developed from empirical data drawn from a wide range of applications (e.g., the fracture-mechanics approach to fatigue life). Conventional curve-fitting methods, such as least squares, calculate a best estimate of the parameters by minimizing a distance function. Such point-estimate approaches overlook other possible parameter values and fail to incorporate the real uncertainty of the empirical data into the process. Another important issue with traditional methods arises when new data points become available: the best-estimate methods must then be recalculated using the new and old data sets together, but the original data sets used to develop a POF model may no longer be available to combine with new data in a point-estimate framework. In this research, a powerful Bayesian framework is proposed for efficient uncertainty management in POF models. The Bayesian approach provides many practical features, such as fair coverage of uncertainty and an updating concept that offers a powerful means of knowledge management: Bayesian models allow the available information to be stored as a probability density over the model parameters. These distributions can serve as priors to be updated in the light of new data as they become available. The first part of this article presents a brief review of the classical and probabilistic approaches to regression, examines the accuracy of the traditional normal-distribution assumption for the error term, and proposes a new, flexible likelihood function. The Bayesian approach to regression and its connections with the classical and probabilistic methods are explained next.
The Bayesian section discusses how the likelihood functions introduced in the probabilistic approach can be combined with prior information using the concept of conditional probability. To highlight its advantages, the Bayesian approach is further clarified with case studies in which the results are compared with those of traditional methods such as least squares and maximum likelihood estimation (MLE). In this research, the mathematical complexity of the Bayesian inference equations is overcome using Markov chain Monte Carlo (MCMC) simulation.
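The contrast this abstract draws between point-estimate fitting and MCMC-based Bayesian regression can be sketched in a few lines. This is a minimal illustration, not the authors' model: the synthetic data, the Gaussian likelihood, the flat prior, and the random-walk Metropolis sampler are all assumptions of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "empirical" data (assumed for illustration): y = 1 + 2x + noise
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 1.0, size=x.size)

def log_likelihood(theta):
    """Gaussian log-likelihood for intercept a, slope b, and log noise scale."""
    a, b, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = y - (a + b * x)
    return -0.5 * np.sum((resid / sigma) ** 2) - x.size * np.log(sigma)

# Random-walk Metropolis with a flat (improper) prior: the posterior is
# proportional to the likelihood, so the acceptance ratio uses it directly.
theta = np.zeros(3)
ll = log_likelihood(theta)
samples = []
for i in range(30000):
    proposal = theta + rng.normal(0.0, 0.05, size=3)
    ll_prop = log_likelihood(proposal)
    if np.log(rng.random()) < ll_prop - ll:   # Metropolis acceptance step
        theta, ll = proposal, ll_prop
    if i >= 10000:                            # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# Point estimate (least squares) vs posterior summary: the Bayesian output
# is a whole distribution, whose spread quantifies parameter uncertainty,
# and the posterior can serve as the prior when new data arrive.
b_ls, a_ls = np.polyfit(x, y, 1)
b_mean, b_sd = samples[:, 1].mean(), samples[:, 1].std()
```

The posterior mean of the slope agrees closely with the least-squares point estimate here; the difference is that the sample spread `b_sd` carries the parameter uncertainty that a point estimate discards.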

Author(s):  
Daiane Aparecida Zuanetti ◽  
Luis Aparecido Milan

In this paper, we propose a new Bayesian approach for QTL mapping of family data. The main purpose is to model a phenotype as a function of QTL effects. The model accounts for the detailed familial dependence and does not rely on random effects: it combines the probability of Mendelian inheritance of the parents' genotypes with the correlation between flanking markers and QTLs. This is an advance over models that use only Mendelian segregation, or only the correlation between markers and QTLs, to estimate transmission probabilities. We use the Bayesian approach to estimate the number of QTLs, their locations, and their additive and dominance effects. We compare the performance of the proposed method with variance-component and LASSO models using simulated and GAW17 data sets. Under the tested conditions, the proposed method outperforms the others in estimating the number of QTLs, locating them accurately, and estimating their effects. The results of applying the proposed method to these data sets exceeded all of our expectations.
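The Mendelian-transmission component that the model combines with marker information can be illustrated by the basic calculation of a child's genotype probabilities given the parents' genotypes; the biallelic notation below is purely illustrative and not the paper's parameterization.

```python
from itertools import product

def mendelian_transmission(parent1, parent2):
    """P(child genotype | parents' genotypes) for one biallelic locus:
    each parent transmits one of its two alleles uniformly at random."""
    probs = {}
    for a, b in product(parent1, parent2):
        g = tuple(sorted((a, b)))            # unordered genotype
        probs[g] = probs.get(g, 0.0) + 0.25  # each allele pair has prob 1/4
    return probs

# Classic Aa x Aa cross: AA with prob 1/4, Aa with 1/2, aa with 1/4
probs = mendelian_transmission(("A", "a"), ("A", "a"))
```

In the paper's setting these segregation probabilities are further weighted by the correlation between flanking markers and the QTL; this sketch shows only the Mendelian factor.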


Data Mining ◽  
2011 ◽  
pp. 1-26 ◽  
Author(s):  
Stefan Arnborg

This chapter reviews the fundamentals of inference and motivates Bayesian analysis. The method is illustrated with dependency tests on data sets of categorical variables under Dirichlet prior distributions. Principles of, and problems with, deriving causal conclusions are reviewed and illustrated with Simpson's paradox. The selection of decomposable and directed graphical models illustrates the Bayesian approach, and Bayesian and EM classification are briefly described. The material is illustrated with two cases, one in the personalization of media distribution and one in schizophrenia research; these show how to approach problem types that arise in many other application areas.
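The dependency tests with Dirichlet priors that the chapter describes can be illustrated with the standard Bayes-factor computation for a 2x2 contingency table; the symmetric Dirichlet(1) prior and the example counts below are assumptions of this sketch, not the chapter's data.

```python
from math import lgamma

def log_dirichlet_multinomial(counts, alpha=1.0):
    """Log marginal likelihood of an observation sequence with the given
    category counts under a symmetric Dirichlet(alpha) prior."""
    n, k = sum(counts), len(counts)
    res = lgamma(k * alpha) - lgamma(k * alpha + n)
    for c in counts:
        res += lgamma(alpha + c) - lgamma(alpha)
    return res

def log_bayes_factor_dependence(table):
    """Log Bayes factor of 'dependent' vs 'independent' for a contingency
    table. Both marginals score the same ordered observation sequence, so
    multinomial coefficients cancel; under independence the likelihood
    factorizes into the row-margin and column-margin terms."""
    cells = [c for row in table for c in row]
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    log_dep = log_dirichlet_multinomial(cells)
    log_ind = log_dirichlet_multinomial(rows) + log_dirichlet_multinomial(cols)
    return log_dep - log_ind

bf_strong = log_bayes_factor_dependence([[40, 5], [5, 40]])    # clearly dependent
bf_uniform = log_bayes_factor_dependence([[20, 20], [20, 20]]) # consistent with independence
```

A strongly diagonal table yields a large positive log Bayes factor, while a perfectly balanced table slightly favors the simpler independence model, the Occam-factor behaviour that motivates the Bayesian treatment.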


2021 ◽  
Author(s):  
Camila Ferreira Azevedo ◽  
Cynthia Barreto ◽  
Matheus Suela ◽  
Moysés Nascimento ◽  
Antônio Carlos Júnior ◽  
...  

Abstract Among the multi-trait models used to study several traits and environments jointly, the Bayesian framework has been a preferred tool because it accommodates more complex and biologically realistic models. In most studies using the Bayesian approach, non-informative prior distributions are adopted; yet the Bayesian approach tends to produce more accurate estimates when informative priors are used. The present study evaluates the efficiency and applicability of multi-trait multi-environment (MTME) models in a Bayesian framework, using a strategy for eliciting informative prior distributions from previous rice data. The study involved data on rice genotypes in three environments over five agricultural years (2010/2011 to 2014/2015) for the following traits: grain yield (GY), flowering in days (FLOR), and plant height (PH). Variance components and genetic and non-genetic parameters were estimated by the Bayesian method. In general, informative priors in the Bayesian MTME models yielded higher estimates of heritability and variance components, as well as shorter highest posterior density (HPD) intervals, than the corresponding analyses with non-informative priors. Informative priors also made it possible to detect genetic correlations between traits, which could not be achieved with non-informative priors. The mechanism presented here for turning accumulated knowledge into an informative prior distribution can therefore be applied efficiently in rice genetic selection.
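The effect reported here, shorter HPD intervals under an informative prior, can be seen in the simplest conjugate setting. This sketch uses a normal mean with known variance and invented numbers, not the paper's MTME model.

```python
import numpy as np

def posterior_normal_mean(y, sigma2, mu0, tau2):
    """Posterior of a normal mean with known data variance sigma2 and a
    Normal(mu0, tau2) prior: precisions add, means are precision-weighted."""
    n = y.size
    post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
    post_mean = post_var * (n * y.mean() / sigma2 + mu0 / tau2)
    return post_mean, post_var

rng = np.random.default_rng(1)
y = rng.normal(5.0, 2.0, size=10)   # a small "current trial" sample, sigma2 = 4

# Non-informative (very diffuse) prior vs an informative prior elicited
# from previous data (hypothetical hyperparameter values).
m_vague, v_vague = posterior_normal_mean(y, 4.0, 0.0, 1e6)
m_info, v_info = posterior_normal_mean(y, 4.0, 5.0, 1.0)

# A central 95% credible interval is post_mean +/- 1.96*sqrt(post_var),
# so the informative prior yields a strictly shorter interval.
hpd_len_vague = 2 * 1.96 * np.sqrt(v_vague)
hpd_len_info = 2 * 1.96 * np.sqrt(v_info)
```

The informative prior contributes extra precision (here 1/tau2 = 1 on top of n/sigma2 = 2.5), which is exactly the mechanism behind the shorter HPD intervals the study reports.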


2019 ◽  
Author(s):  
Pavlin Mavrodiev ◽  
Daniela Fleischmann ◽  
Gerald Kerth ◽  
Frank Schweitzer

Abstract Leading-following behaviour in Bechstein's bats transfers information about suitable roost sites from experienced to inexperienced individuals and thus ensures communal roosting. We analyze nine empirical data sets of individualized leading-following (L/F) events to infer rules that likely determine the formation of L/F pairs. To test these rules, we propose five models that differ in the empirical information taken into account when forming L/F pairs: a bat's activity in exploring possible roosts, and its tendency to lead or to follow. The comparison with empirical data was done by constructing social networks from the observed L/F events and calculating centralities on them to quantify the importance of individuals in these L/F networks. The centralities from the empirical network are then tested for statistical differences against the model-generated centralities obtained from 10^5 model realizations. We find that two models perform well in comparison with the empirical data: one assumes an individual tendency to lead but chooses followers at random; the other assumes an individual tendency to follow and chooses leaders according to their overall activity. We note that neither individual preferences for specific individuals nor other influences such as kinship or reciprocity are needed to reproduce the empirical findings.


2011 ◽  
Vol 7 (S284) ◽  
pp. 46-48
Author(s):  
Gladis Magris C. ◽  
Cecilia Mateu ◽  
Gustavo Bruzual A.

Abstract We use a Bayesian formalism to quantify the uncertainties in the determination of the luminous mass and age of the dominant stellar population in a galaxy obtained from simple spectral fits. The analysis is performed on a sample of synthetic spectra covering a wide range of star formation histories and seen at different ages and redshifts. The Bayesian approach lets us establish quantitatively, in a straightforward manner, the uncertainties in the parameters derived from these fits, which is not possible with some simple algorithms, e.g. GASPEX, a non-negative least-squares fitting algorithm.


2015 ◽  
Vol 15 (08) ◽  
pp. 1540026 ◽  
Author(s):  
Q. Hu ◽  
H. F. Lam ◽  
S. A. Alabi

The identification of railway ballast damage under a concrete sleeper is investigated by following the Bayesian approach. Using a discrete modeling method to capture the distribution of ballast stiffness under the sleeper introduces artificial stiffness discontinuities between different ballast regions, which increases the effects of modeling errors and reduces the accuracy of the ballast damage detection results. In this paper, a continuous modeling method is developed to overcome this difficulty. The uncertainties induced by modeling error and measurement noise are the major difficulties for vibration-based damage detection methods, and in the proposed methodology the Bayesian probabilistic approach is adopted to address explicitly the uncertainties associated with the identified model parameters. In the model-updating process, the stiffness of the ballast foundation is assumed to be continuous along the sleeper and is represented by a polynomial of order N. One of the contributions of this paper is to select the order N, conditional on a given set of measurements, using the Bayesian model class selection method. The proposed ballast damage detection methodology was verified with vibration data obtained from a segment of full-scale ballasted track under laboratory conditions. The experimental results are very encouraging, showing that the Bayesian approach, together with the newly developed continuous modeling method, can be used for ballast damage detection.
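Selecting the polynomial order N conditional on measurements is the model-class-selection step. A common asymptotic approximation to the log evidence is the Bayesian information criterion (BIC), which this sketch applies to an invented stiffness profile; the paper's full Bayesian computation is more involved.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 40)            # normalized position along the sleeper
true_k = 1.0 + 0.5 * x - 2.0 * x**2      # hypothetical order-2 stiffness profile
k_meas = true_k + rng.normal(0.0, 0.05, size=x.size)   # noisy "measurements"

def bic(order):
    """BIC for a polynomial stiffness model of the given order: the penalty
    term p*log(n) plays the role of the Occam factor in the evidence."""
    coef = np.polyfit(x, k_meas, order)
    resid = k_meas - np.polyval(coef, x)
    n, p = x.size, order + 2             # polynomial coefficients + noise variance
    sigma2 = np.mean(resid**2)           # max-likelihood noise variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + p * np.log(n)

best = min(range(1, 7), key=bic)         # model class with the lowest BIC
```

Higher orders always fit the noisy measurements a little better, but the complexity penalty stops the selection from chasing noise, so a low-order model is preferred, the same trade-off the Bayesian model class selection in the paper formalizes exactly.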


2001 ◽  
Vol 58 (8) ◽  
pp. 1663-1671 ◽  
Author(s):  
Milo D Adkison ◽  
Zhenming Su

In this simulation study, we compared the performance of a hierarchical Bayesian approach for estimating salmon escapement from count data with that of separate maximum likelihood estimation of each year's escapement. We simulated several contrasting counting schedules, resulting in data sets that differed in information content. In particular, we were interested in the ability of the Bayesian approach to estimate escapement and timing in years where few or no counts are made after the peak of escapement. We found that the hierarchical Bayesian approach was much better able to estimate escapement and escapement timing in these situations; separate estimates for such years could be wildly inaccurate. However, even a single post-peak count could dramatically improve the estimability of the escapement parameters.
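The hierarchical model's advantage in poorly sampled years comes from partial pooling: years borrow strength from one another. A minimal empirical-Bayes caricature of this effect, with invented summary numbers in place of the actual count data, is:

```python
import numpy as np

# Hypothetical per-year escapement estimates (thousands) and standard errors.
# Year 3 has almost no post-peak counts, hence a wild estimate with a huge s.e.
est = np.array([10.0, 12.0, 50.0, 11.0, 9.0])
se = np.array([1.0, 1.2, 20.0, 1.1, 0.9])

# Between-year variance estimated from the well-measured years only.
tau2 = max(np.var(est[se < 5.0]), 1e-6)

# Precision-weighted grand mean and per-year shrinkage weights:
# poorly measured years are pulled strongly toward the other years.
mu = np.average(est, weights=1.0 / (se**2 + tau2))
w = tau2 / (tau2 + se**2)
pooled = w * est + (1.0 - w) * mu
```

The badly measured year is shrunk almost entirely to the level implied by the other years, while the well-measured years barely move, which mirrors the study's finding that the hierarchical approach rescues years with few or no post-peak counts.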


2019 ◽  
Vol 18 (4) ◽  
pp. 275-294 ◽  
Author(s):  
Christian Dahlman ◽  
Anne Ruth Mackor

Abstract The authors investigate to what extent an evaluation of legal evidence in terms of coherence (suggested by Thagard, Amaya, Van Koppen and others) is reconcilable with a probabilistic (Bayesian) approach to legal evidence. The article is written by one author (Dahlman) with a background in the Bayesian approach to legal evidence and one author (Mackor) with a background in scenario theory. The authors find common ground but partly diverge in their conclusions. Their findings support the claim (reductionism) that coherence can be translated into probability without loss. Dahlman therefore concludes that the probabilistic vocabulary is superior to the coherence vocabulary, since it is more precise. Mackor is more agnostic about reductionism; in her view, the findings of their joint investigation do not imply that the probabilistic approach is superior to the coherentist approach.


1978 ◽  
Vol 10 (10) ◽  
pp. 1101-1119 ◽  
Author(s):  
J O Huff ◽  
W A V Clark

A model of the probability of moving is formulated which incorporates aspects of the independent-trials process, the stage in the life cycle, and the concept of cumulative inertia. The model is based on the interaction of two forces: on the one hand, there is a certain resistance to moving (cumulative inertia); on the other, the household may be dissatisfied with certain attributes of the present dwelling and its surroundings (residential stress). The probability of moving is a function of the resultant of these two conflicting forces. The model is designed not only to predict who will move (those individuals whose residential stress is high relative to their resistance to moving), but also to predict how an individual's probability of moving is likely to change over time. Some simple and limited simulations suggest that the model captures rather well the different kinds of mobility rates observed in empirical data sets.
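The model's core idea, that the probability of moving depends on the resultant of residential stress and cumulative inertia, can be written in a minimal form. The logistic link and the linear growth of inertia with duration of residence are assumptions of this sketch, not the authors' specification.

```python
import math

def p_move(stress, inertia, beta=1.0):
    """Illustrative logistic link: moving becomes likely when residential
    stress outweighs the resistance to moving (cumulative inertia)."""
    return 1.0 / (1.0 + math.exp(-beta * (stress - inertia)))

def inertia_at(duration, rate=0.5):
    """Cumulative inertia assumed, for illustration, to grow linearly
    with duration of residence."""
    return rate * duration

# The same level of stress makes a recent arrival likely to move again,
# while a long-settled household stays put.
p_recent = p_move(stress=2.0, inertia=inertia_at(1))
p_settled = p_move(stress=2.0, inertia=inertia_at(10))
```

Because inertia accumulates over time, the sketch reproduces the qualitative prediction in the abstract: an individual's probability of moving declines with duration of residence unless stress rises to match.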


2014 ◽  
Vol 8 (2) ◽  
pp. 217-233 ◽  
Author(s):  
Weihong Ni ◽  
Corina Constantinescu ◽  
Athanasios A. Pantelous

Abstract One of the pricing strategies for Bonus–Malus (BM) systems relies on decomposing the randomness of the claims into one part accounting for claim frequency and another for claim severity. By mixing an exponential with a Lévy distribution, we focus on modelling the claim-severity component as a Weibull distribution. For a Negative Binomial number of claims, we employ the Bayesian approach to derive the BM premiums for Weibull severities. We conclude by comparing our explicit formulas and numerical results with those for Pareto severities introduced by Frangos & Vrontos.
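For the Negative Binomial frequency component, the Bayesian premium update has a simple closed form in the underlying Poisson-Gamma model. The Gamma hyperparameters below are invented, and the Weibull severity component derived in the paper is omitted from this sketch.

```python
def bm_frequency_factor(k, t, a=1.5, b=2.0):
    """Posterior mean claim rate after observing k claims in t years, under
    a Gamma(a, b) prior on the Poisson rate (a Negative Binomial claim
    count), expressed relative to the prior mean a/b. Values below 1 act
    as a bonus (discount); values above 1 as a malus (surcharge)."""
    return ((a + k) / (b + t)) / (a / b)

# A claim-free year earns a bonus; claims trigger a malus.
bonus = bm_frequency_factor(0, 1)
malus = bm_frequency_factor(2, 1)
```

In the paper this frequency factor is multiplied by a severity factor derived from the Weibull model; the sketch shows only the conjugate frequency update.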

