Design Curve Construction Based on Monte Carlo Simulation

Author(s):  
Zhigang Wei ◽  
Limin Luo ◽  
Burt Lin ◽  
Dmitri Konson ◽  
Kamran Nikbin

Good durability/reliability performance of products can be achieved by properly constructing and implementing design curves, which are usually obtained by analyzing test data such as fatigue S-N data. A good design curve construction approach should account for sample size, failure probability, and confidence level; these features are especially critical when the test sample size is small. The authors have previously developed a design S-N curve construction method based on the tolerance limit concept. However, recent studies have shown that the analytical solutions based on the tolerance limit approach may not be accurate for very small sample sizes because of the assumptions and approximations introduced in the analytical derivation. In this paper, a Monte Carlo simulation approach is used to construct design curves for test data with an assumed underlying normal (or lognormal) distribution. The factor K, which quantifies the confidence level associated with the test data, is compared between the analytical solution and the Monte Carlo simulation solutions. Finally, the design curves constructed with these methods are demonstrated and compared using fatigue S-N data with a small sample size.
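For readers unfamiliar with the factor K, the sketch below (not the authors' code) shows one way to obtain it by Monte Carlo simulation for a normal population and to compare it with the classical noncentral-t solution; the 95%/95% settings, sample sizes, and function names are illustrative assumptions.

```python
# Hedged sketch: Monte Carlo estimate of the one-sided tolerance factor K
# for a normal population, compared with the noncentral-t analytical value.
# The 95/95 settings and sample sizes below are illustrative assumptions.
import numpy as np
from scipy import stats

def k_factor_monte_carlo(n, p=0.95, gamma=0.95, n_trials=200_000, seed=0):
    """gamma-confidence, p-content one-sided K estimated by simulation."""
    rng = np.random.default_rng(seed)
    z_p = stats.norm.ppf(p)
    samples = rng.standard_normal((n_trials, n))
    xbar = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)
    # K must satisfy P(xbar - K*s <= -z_p) = gamma for a standard normal population,
    # so K is the gamma-quantile of (xbar + z_p) / s over repeated samples of size n.
    t = (xbar + z_p) / s
    return np.quantile(t, gamma)

def k_factor_analytical(n, p=0.95, gamma=0.95):
    """Classical noncentral-t solution for the same K."""
    delta = stats.norm.ppf(p) * np.sqrt(n)
    return stats.nct.ppf(gamma, df=n - 1, nc=delta) / np.sqrt(n)

if __name__ == "__main__":
    for n in (5, 10, 30):
        print(n, round(k_factor_monte_carlo(n), 3), round(k_factor_analytical(n), 3))
```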

2021 ◽  
Author(s):  
Andrea Madella ◽  
Christoph Glotzbach ◽  
Todd A. Ehlers

Abstract. Detrital tracer thermochronology exploits the relationship between bedrock thermochronometric age-elevation profiles and a distribution of detrital grain-ages collected from river, glacial, or other sediment to study spatial changes in the distribution of catchment erosion. If ages increase linearly with elevation, spatially uniform erosion is expected to yield a detrital age distribution that mirrors the catchment's hypsometric curve. Alternatively, a mismatch between detrital and hypsometric distributions may indicate non-uniform erosion within a catchment. For studies seeking to identify the pattern of erosion, measured grain-age populations rarely exceed 100 grains, due largely to the time and costs of the individual measurements. With sample sizes of this order, discerning between two detrital age distributions produced by different catchment erosion scenarios can be difficult at a high statistical confidence level. However, there is no established method to quantify the sample-size-dependent uncertainty inherent to detrital tracer thermochronology, and practitioners are often left wondering how many grains are enough. Here, we investigate how sample size affects the uncertainty of detrital age distributions and how such uncertainty affects the ability to uniquely infer the erosional pattern of the upstream area. We do this using the Kolmogorov-Smirnov statistic as a metric of dissimilarity between distributions, from which the statistical confidence of detecting an erosional pattern is determined through Monte Carlo sampling. The techniques are implemented in a new tool (ESD_thermotrace) to consistently report confidence levels as a function of sample size and application-specific variables. The proposed tool is made available as a new open-source Python-based script along with test data. Testing between different hypothesized erosion scenarios with this tool provides thermochronologists with the minimum sample size (i.e., the number of bedrock and detrital grain-ages) required to answer their specific scientific question at their desired level of statistical confidence. Furthermore, in cases of unavoidably small sample size (e.g., due to poor grain quality or low sample volume), we provide a means to calculate the confidence level of interpretations made from the data.
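ESD_thermotrace itself is not reproduced here; the following sketch only illustrates the general idea of combining the Kolmogorov-Smirnov statistic with Monte Carlo sampling to express detection confidence as a function of grain-age sample size. The two synthetic age populations and the 5% significance level are assumptions for illustration.

```python
# Hedged sketch (not the ESD_thermotrace code): Monte Carlo estimate of the
# confidence of telling two predicted detrital grain-age distributions apart
# with n grains, using the two-sample Kolmogorov-Smirnov statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative predicted grain-age populations (Ma) for two erosion scenarios
ages_uniform = rng.normal(10.0, 3.0, 10_000)   # e.g. spatially uniform catchment erosion
ages_focused = rng.normal(12.0, 3.0, 10_000)   # e.g. erosion focused at high elevation

def detection_confidence(n_grains, n_trials=2_000, alpha=0.05):
    """Fraction of trials in which n_grains drawn from the 'focused' scenario
    are rejected as coming from the 'uniform' prediction (two-sample KS test)."""
    hits = 0
    for _ in range(n_trials):
        sample = rng.choice(ages_focused, size=n_grains, replace=False)
        _, p_value = stats.ks_2samp(sample, ages_uniform)
        hits += p_value < alpha
    return hits / n_trials

if __name__ == "__main__":
    for n in (30, 60, 100, 150):
        print(f"n = {n:3d} grains -> detection confidence ~ {detection_confidence(n):.2f}")
```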


2011 ◽  
Vol 103 ◽  
pp. 366-371 ◽  
Author(s):  
Wei Hong Zhong ◽  
Xiu Shui Ma ◽  
Ying Dao Li ◽  
Yuan Li

In a contact measurement process, the coordinate measuring machine (CMM) probe introduces dynamic measurement error; therefore, dynamic calibration of the probe tip effective diameter should be performed at different probing speeds, and the calibration uncertainty should be reported. The Monte Carlo (MC) method suffers from slow and unstable convergence in this uncertainty evaluation. In this paper, a quasi-Monte Carlo (QMC) method is presented for evaluating the uncertainty of the probe tip effective diameter. At a given positioning speed and approach distance, experimental tests of the probe tip effective diameter are carried out at varying probing speeds. The MC and QMC methods are then applied to the uncertainty evaluation, and the results are compared and analyzed. The simulation shows that QMC can be used for the dynamic uncertainty evaluation of the CMM probe tip. Compared with MC, QMC achieves better stability and precision for small sample sizes and higher computing speed for large sample sizes.
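As a rough illustration of the MC/QMC comparison (not the calibration model of the paper), the sketch below propagates two assumed Gaussian input effects through a toy measurement model using plain Monte Carlo and scrambled Sobol quasi-Monte Carlo sampling; the model, standard deviations, and sample sizes are all illustrative assumptions.

```python
# Hedged sketch: plain Monte Carlo vs. quasi-Monte Carlo (scrambled Sobol) for
# propagating input uncertainties to a measurand. The toy measurement model and
# the input standard deviations are assumptions, not the paper's calibration model.
import numpy as np
from scipy import stats
from scipy.stats import qmc

def model(nominal_d, probe_error, speed_effect):
    # Toy model: effective tip diameter = nominal value + random probing effects
    return nominal_d + probe_error + speed_effect

def mc_uncertainty(n, rng):
    probe_error = rng.normal(0.0, 0.4e-3, n)    # mm, assumed
    speed_effect = rng.normal(0.0, 0.2e-3, n)   # mm, assumed
    return np.std(model(2.0, probe_error, speed_effect), ddof=1)

def qmc_uncertainty(n, seed):
    sampler = qmc.Sobol(d=2, scramble=True, seed=seed)
    u = sampler.random(n)                       # low-discrepancy points in [0, 1)^2
    probe_error = stats.norm.ppf(u[:, 0], 0.0, 0.4e-3)
    speed_effect = stats.norm.ppf(u[:, 1], 0.0, 0.2e-3)
    return np.std(model(2.0, probe_error, speed_effect), ddof=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    for n in (64, 256, 1024):                   # powers of 2 suit Sobol sequences
        print(n, f"MC u = {mc_uncertainty(n, rng):.6f} mm",
                 f"QMC u = {qmc_uncertainty(n, seed=1):.6f} mm")
```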


2021 ◽  
Vol 63 (4) ◽  
pp. 379-385
Author(s):  
Bin Wang ◽  
Faisal Islam ◽  
Georg W. Mair

Abstract The test data for the static burst strength and load cycle fatigue strength of pressure vessels can often be well described by Gaussian normal or Weibull distribution functions. Various approaches can be used to determine the parameters of the Weibull distribution function; however, the performance of these methods is uncertain. In this study, six methods are evaluated using the criterion of the OSL (observed significance level) from the Anderson-Darling (AD) goodness-of-fit (GoF) test. These are: a) the norm-log based method, b) least squares regression, c) weighted least squares regression, d) a linear approach based on good linear unbiased estimators, e) maximum likelihood estimation, and f) method of moments estimation. In addition, various ranking functions are considered. The results show that no clearly outperforming method can be identified, primarily due to the small sample size of the test data used for Weibull analysis. The randomness resulting from this sampling is further investigated using Monte Carlo simulations, leading to the conclusion that the sample size of the experimental data is more important than the exact method used to derive the Weibull parameters. Finally, it is recommended that the uncertainties arising from small sample sizes be taken into account in pressure vessel testing and in material testing more generally.
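The sketch below (not the study's code) illustrates two of the listed estimation routes, maximum likelihood and median-rank least squares regression, on a small synthetic Weibull sample, with an Anderson-Darling statistic as a rough goodness-of-fit check; the "true" parameters and the sample size of 10 are assumptions chosen only to show the scatter caused by small samples.

```python
# Hedged sketch: maximum likelihood vs. median-rank least squares estimation of
# Weibull parameters from a small synthetic burst-strength sample. All numbers
# are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_shape, true_scale = 8.0, 100.0        # assumed "true" Weibull parameters
data = stats.weibull_min.rvs(true_shape, scale=true_scale, size=10, random_state=rng)

# a) Maximum likelihood estimation (location fixed at zero for a 2-parameter fit)
shape_mle, _, scale_mle = stats.weibull_min.fit(data, floc=0)

# b) Least squares regression on the linearized CDF with Bernard's median ranks:
#    ln(-ln(1 - F)) = shape * ln(x) - shape * ln(scale)
x = np.sort(data)
n = x.size
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median rank approximation
slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - ranks)), 1)
shape_lsq = slope
scale_lsq = np.exp(-intercept / slope)

# Rough goodness-of-fit check: log of Weibull data follows a Gumbel (min) law
ad_stat = stats.anderson(np.log(data), dist="gumbel_l").statistic

print(f"MLE: shape {shape_mle:.2f}, scale {scale_mle:.1f}")
print(f"LSQ: shape {shape_lsq:.2f}, scale {scale_lsq:.1f}")
print(f"Anderson-Darling statistic of log-data: {ad_stat:.3f}")
```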


2021 ◽  
Vol 13 (2) ◽  
pp. 168781402199215
Author(s):  
Xiuhong Hao ◽  
Shuqiang Wang ◽  
Panqiang Huo ◽  
Deng Pan

To address the issues of long testing periods and small sample sizes in evaluating the service life of heavy-load self-lubricating liners, we propose a succinct method based on Monte Carlo simulation that is fast and requires only a small sample size. First, the support vector regression algorithm was applied to fit the degradation trajectories of the wear depth, and the first and second characteristic parameter vectors of the wear depth, as well as the corresponding distribution models, were obtained. Next, sample expansion was performed using Monte Carlo simulation and the inverse transform method. Finally, based on the failure criterion of the self-lubricating liner, the service lives and distribution models of the expanded samples were obtained; subsequently, the corresponding reliability life indices were provided. Our results indicate that when the expanded sample was large enough, the proposed prediction method exhibited relatively high prediction accuracy. Therefore, these results provide theoretical support for shortening the testing cycle used to evaluate the service life of self-lubricating liners and for accelerating the research and development of self-lubricating spherical plain bearing products.
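The following sketch (not the authors' implementation) walks through the general workflow described above: support vector regression fits of wear-depth trajectories, a distribution fit to a single characteristic degradation parameter, Monte Carlo sample expansion by the inverse transform method, and lives read off at an assumed 0.3 mm failure criterion; all data, the linear degradation form, and the parameters are synthetic assumptions.

```python
# Hedged sketch of the workflow: SVR trajectory fits -> characteristic parameter
# distribution -> inverse-transform Monte Carlo expansion -> reliability life.
# Numbers, the linear wear model and the 0.3 mm criterion are assumptions.
import numpy as np
from scipy import stats
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Illustrative test data: 5 liners, wear depth (mm) vs. load cycles (x1e4)
cycles = np.linspace(0, 50, 26)
wear_rates = []
for _ in range(5):
    true_rate = rng.normal(6e-3, 1e-3)                 # mm per 1e4 cycles, assumed
    depth = true_rate * cycles + rng.normal(0, 0.005, cycles.size)
    svr = SVR(kernel="linear", C=100.0, epsilon=0.002)
    svr.fit(cycles.reshape(-1, 1), depth)
    wear_rates.append(svr.coef_[0, 0])                 # fitted characteristic parameter

# Fit a normal distribution to the small set of characteristic parameters
mu, sigma = np.mean(wear_rates), np.std(wear_rates, ddof=1)

# Monte Carlo sample expansion by the inverse transform method
u = rng.uniform(size=100_000)
expanded_rates = stats.norm.ppf(u, loc=mu, scale=sigma)

# Service life (cycles x1e4) at which the wear depth reaches the failure criterion
failure_depth = 0.3
lives = failure_depth / expanded_rates
print(f"B10 life ~ {np.percentile(lives, 10):.1f} x1e4 cycles")
```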


2007 ◽  
Vol 129 (1) ◽  
pp. 159-168 ◽  
Author(s):  
Gerald T. Cashman

A methodology is presented for the preparation of a competing modes fatigue design curve. Examples using extruded and isothermally forged powder metallurgy René 95 are presented. Monte Carlo simulation results are given to identify sample size requirements appropriate for the generation of reliable design curves.
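A minimal sketch of the competing-modes idea (not the René 95 analysis of the paper): specimen life is taken as the minimum of two assumed lognormal failure-mode lives, and Monte Carlo repetition shows how the scatter of a lower-bound design-life estimate shrinks with sample size; the mode parameters and the mean-minus-three-sigma style quantile are illustrative assumptions.

```python
# Hedged sketch: competing failure modes with Monte Carlo assessment of how
# sample size affects the scatter of a lower-bound design-life estimate.
# The lognormal mode parameters are assumptions, not the paper's data.
import numpy as np

rng = np.random.default_rng(11)

def sample_lives(n):
    mode_a = rng.lognormal(mean=11.0, sigma=0.40, size=n)   # assumed surface-initiated mode
    mode_b = rng.lognormal(mean=11.5, sigma=0.80, size=n)   # assumed internally initiated mode
    return np.minimum(mode_a, mode_b)                       # competing modes: earliest failure wins

def design_life_estimate(n):
    """Lower-bound life from a single simulated test campaign of n specimens."""
    log_life = np.log(sample_lives(n))
    return np.exp(log_life.mean() - 3.0 * log_life.std(ddof=1))

if __name__ == "__main__":
    for n in (10, 30, 100):
        estimates = [design_life_estimate(n) for _ in range(5_000)]
        cv = np.std(estimates) / np.mean(estimates)
        print(f"n = {n:3d}: scatter (CV) of design life estimate ~ {cv:.2f}")
```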


2013 ◽  
Vol 2 (1) ◽  
pp. 97-113 ◽  
Author(s):  
Ahmed M. Fouad ◽  
Mohamed Saleh ◽  
Amir F. Atiya

In this paper, a novel algorithm is proposed for sampling from discrete probability distributions using the probability proportional to size sampling method, which is a special case of the quota sampling method. The motivation for this study is to devise an efficient sampling algorithm that can be used in stochastic optimization problems, where there is a need to minimize the sample size. Several experiments have been conducted to compare the proposed algorithm with two widely used sample generation methods: Monte Carlo using the inverse transform, and quasi-Monte Carlo algorithms. The proposed algorithm achieves better accuracy than these methods while being of nearly the same order in time complexity.
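The proposed algorithm itself is not detailed in the abstract and is not reproduced here; the sketch below only shows the plain inverse-transform baseline alongside a standard systematic probability-proportional-to-size scheme on an assumed small discrete distribution, so the two kinds of sample can be compared.

```python
# Hedged sketch: inverse-transform sampling of a discrete distribution vs. a
# standard systematic probability-proportional-to-size (PPS) scheme. The
# five-outcome distribution below is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(5)
probs = np.array([0.05, 0.10, 0.15, 0.20, 0.50])   # assumed discrete distribution
cdf = np.cumsum(probs)

def inverse_transform_sample(n):
    """Plain Monte Carlo: invert the discrete CDF at i.i.d. uniform variates."""
    return np.searchsorted(cdf, rng.uniform(size=n))

def systematic_pps_sample(n):
    """Systematic PPS: one random start, then equally spaced points through the CDF,
    so each outcome appears with frequency (almost) proportional to its probability."""
    points = (rng.uniform() + np.arange(n)) / n
    return np.searchsorted(cdf, points)

if __name__ == "__main__":
    n = 20
    for name, sample in (("inverse transform", inverse_transform_sample(n)),
                         ("systematic PPS   ", systematic_pps_sample(n))):
        freq = np.bincount(sample, minlength=probs.size) / n
        print(name, np.round(freq, 2), "target", probs)
```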

