A Decoupling Strategy for Reliability Analysis of Multidisciplinary System with Aleatory and Epistemic Uncertainties

2021 ◽  
Vol 11 (15) ◽  
pp. 7008
Author(s):  
Chao Fu ◽  
Jihong Liu ◽  
Wenting Xu

In reliability-based multidisciplinary design optimization, both aleatory and epistemic uncertainties may exist in multidisciplinary systems simultaneously. The uncertainty propagation through coupled subsystems makes multidisciplinary reliability analysis computationally expensive. In order to improve the efficiency of multidisciplinary reliability analysis under aleatory and epistemic uncertainties, a comprehensive reliability index with a clear geometric meaning under multisource uncertainties is proposed. Based on the comprehensive reliability index, a sequential multidisciplinary reliability analysis method is presented. The method provides a decoupling strategy based on the performance measure approach (PMA), probability theory, and the convex model. In this strategy, the probabilistic analysis and the convex analysis are decoupled from each other and performed sequentially. The probabilistic reliability analysis is implemented sequentially based on concurrent subspace optimization (CSSO) and PMA, and the non-probabilistic reliability analysis is replaced by a convex-model extreme value analysis, which improves the efficiency of multidisciplinary reliability analysis with aleatory and epistemic uncertainties. A mathematical example and an engineering application demonstrate the effectiveness of the proposed method.
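
As a rough illustration of the decoupling idea, the sketch below treats a single limit state g(u, d) with standard-normal aleatory variables u and convex-model (interval) epistemic variables d, and evaluates a PMA-style worst-case performance measure. The function g, the target index beta_t, and all numbers are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch (not the authors' code): a performance-measure-style inner loop
# for one limit state g(u, d), where u are standard-normal (aleatory) variables
# and d are convex-model (interval) variables normalized to [-1, 1].
import numpy as np
from scipy.optimize import minimize

def g(u, d):
    # toy limit state: safe when g > 0
    return 3.0 - u[0] - 0.5 * u[1] + 0.3 * d[0]

def pma_worst_performance(g, n_u, n_d, beta_t):
    """Minimize g over the sphere ||u|| = beta_t and the box |d_i| <= 1."""
    x0 = np.concatenate([np.full(n_u, beta_t / np.sqrt(n_u)), np.zeros(n_d)])
    obj = lambda x: g(x[:n_u], x[n_u:])
    cons = [{"type": "eq", "fun": lambda x: np.dot(x[:n_u], x[:n_u]) - beta_t**2}]
    bounds = [(None, None)] * n_u + [(-1.0, 1.0)] * n_d
    res = minimize(obj, x0, constraints=cons, bounds=bounds, method="SLSQP")
    return res.fun   # a positive value suggests the target reliability is met

print(pma_worst_performance(g, n_u=2, n_d=1, beta_t=3.0))
```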

2014 ◽  
Vol 136 (3) ◽  
Author(s):  
C. Jiang ◽  
G. Y. Lu ◽  
X. Han ◽  
R. G. Bi

Compared with the probability model, the convex model approach only requires bound information on the uncertainty, which makes it possible to conduct reliability analysis for many complex engineering problems with limited samples. By introducing well-established techniques from probability-based reliability analysis, several methods have been successfully developed for convex-model reliability. This paper aims to reveal some different phenomena, and furthermore some severe paradoxes, that arise when the widely used first-order reliability method (FORM) is extended to convex model problems, and thereby to provide some useful suggestions and guidelines for convex-model-based reliability analysis. Two FORM-type approximations, namely the mean-value method and the design-point method, are formulated to efficiently compute the nonprobabilistic reliability index. A comparison is then conducted between these two methods, and some important phenomena different from the traditional FORMs are summarized. The nonprobabilistic reliability index is also extended to treat system reliability, and some unexpected paradoxes are found through two numerical examples.
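
To make the two FORM-type approximations concrete, the sketch below computes a nonprobabilistic reliability index for an interval model X_i = X_i^c + X_i^r * delta_i, delta_i in [-1, 1], once by linearizing at the interval midpoint (mean-value method) and once by searching the limit state for the smallest infinity norm (design-point method). The limit state and interval data are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: two FORM-type estimates of the nonprobabilistic (convex/interval)
# reliability index eta for g(X), with X_i = xc_i + xr_i * delta_i, |delta_i| <= 1.
import numpy as np
from scipy.optimize import minimize, approx_fprime

def g(x):
    return x[0] * x[1] - 78.0    # toy limit state, safe when g > 0

xc = np.array([10.0, 10.0])      # interval midpoints
xr = np.array([1.0, 1.5])        # interval radii

def eta_mean_value(g, xc, xr):
    """Linearize g at the interval midpoint (mean-value method)."""
    grad = approx_fprime(xc, g, 1e-6)
    return g(xc) / np.sum(np.abs(grad) * xr)

def eta_design_point(g, xc, xr):
    """min ||delta||_inf subject to g = 0, written as: min t with -t <= delta_i <= t."""
    n = len(xc)
    obj = lambda z: z[-1]                                  # z = [delta, t]
    cons = [{"type": "eq", "fun": lambda z: g(xc + xr * z[:n])}]
    cons += [{"type": "ineq", "fun": lambda z, i=i: z[-1] - z[i]} for i in range(n)]
    cons += [{"type": "ineq", "fun": lambda z, i=i: z[-1] + z[i]} for i in range(n)]
    z0 = np.concatenate([np.full(n, -0.5), [0.5]])
    res = minimize(obj, z0, constraints=cons, method="SLSQP")
    return res.x[-1]

print(eta_mean_value(g, xc, xr), eta_design_point(g, xc, xr))
```

On this toy problem the two estimates differ noticeably, which mirrors the kind of discrepancy between the mean-value and design-point formulations that the paper examines.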


Author(s):  
Zhen Hu ◽  
Sankaran Mahadevan

Multidisciplinary systems remain in transient states when time-dependent interactions are present among the coupling variables. This brings significant challenges to time-dependent multidisciplinary system reliability analysis. This paper develops an adaptive surrogate modeling approach (ASMA) for multidisciplinary system reliability analysis under time-dependent uncertainty. The proposed framework consists of three modules, namely initialization, uncertainty propagation, and three-level global sensitivity analysis (GSA). The first two modules check the quality of the surrogate models and determine when and where the surrogate models should be refined. Approaches are then proposed to estimate the potential error of the failure probability estimate and to determine the location of the new training point. In the third module (i.e., three-level GSA), a method is developed to decide which surrogate model to refine, through GSA at three different levels. These three modules are integrated systematically, which allows the computational resources to be allocated adaptively to refine the different surrogate models in the system and thus achieve high accuracy and efficiency in time-dependent multidisciplinary system reliability analysis. Results of two numerical examples demonstrate the effectiveness of the proposed framework.
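
The sketch below illustrates one adaptive-refinement cycle of the general kind described above, using a single Gaussian-process surrogate, a U-type learning criterion, and a simple stopping rule. It is an assumption-laden stand-in: the paper works with several coupled disciplinary surrogates under time-dependent uncertainty and uses three-level GSA to choose which surrogate to refine.

```python
# Hedged sketch of one adaptive-refinement cycle (not the authors' ASMA code):
# a Gaussian-process surrogate of a limit state is refined at the Monte Carlo
# sample where the surrogate is least certain about the sign of g.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
g_true = lambda x: 5.0 - x[:, 0] ** 2 - x[:, 1]      # toy limit state (failure: g < 0)

X_train = rng.normal(size=(10, 2))                   # small initial design
y_train = g_true(X_train)
X_mc = rng.normal(size=(20000, 2))                   # MC population (standard normal inputs)

for it in range(20):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X_train, y_train)
    mu, sd = gp.predict(X_mc, return_std=True)
    pf = np.mean(mu < 0.0)                           # failure-probability estimate
    U = np.abs(mu) / np.maximum(sd, 1e-12)           # low U = uncertain sign of g
    if U.min() > 2.0:                                # stopping rule on potential error
        break
    x_new = X_mc[np.argmin(U)][None, :]              # location of the new training point
    X_train = np.vstack([X_train, x_new])
    y_train = np.append(y_train, g_true(x_new))

print("estimated Pf:", pf, "after", len(X_train), "model evaluations")
```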


2016 ◽  
Vol 138 (7) ◽  
Author(s):  
Po Ting Lin ◽  
Shu-Ping Lin

Reliability-based design optimization (RBDO) algorithms have been developed to solve design optimization problems in the presence of uncertainties. Traditionally, the original random design space is transformed to the standard normal design space, where the reliability index can be measured in a standardized unit. In the standard normal design space, the modified reliability index approach (MRIA) measures the minimum distance from the design point to the failure region to represent the reliability index; on the other hand, the performance measure approach (PMA) performs inverse reliability analysis to evaluate the target function performance at a distance of the reliability index from the design point. MRIA provides stable and accurate reliability analysis, while PMA shows greater efficiency and is widely used in various engineering applications. However, the existing methods cannot properly perform reliability analysis in the standard normal design space if the transformation to the standard normal space does not exist or is difficult to determine. To this end, a new algorithm, ensemble of Gaussian reliability analyses (EoGRA), was developed to estimate the failure probability using Gaussian-based kernel density estimation (KDE) in the original design space. The probabilistic constraints are formulated based on each kernel reliability analysis for the optimization processes. This paper proposes an efficient way to estimate the constraint gradient and linearly approximate the probabilistic constraints with fewer function evaluations (FEs). Several numerical examples with various random distributions are studied to investigate the numerical performance of the proposed method. The results show that EoGRA is capable of finding correct solutions in some problems that cannot be solved by traditional methods. Furthermore, image-processing experiments with arbitrarily distributed photo pixels are performed, in which the lighting of the image pixels is maximized subject to an acceptable limit. The implementation shows that approximating the pixel data by a normal distribution is inaccurate, while the proposed method finds the optimal solution with acceptable accuracy.
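
A minimal sketch of the kernel-ensemble idea is given below: the input density is represented by a Gaussian KDE built directly in the original space, and a linearized (FORM-like) failure-probability contribution is computed per kernel and averaged. The bandwidth, data distribution, and limit state are illustrative assumptions rather than the EoGRA formulation itself.

```python
# Hedged sketch (not the authors' EoGRA code): arbitrary input data is represented
# by a Gaussian KDE, and the failure probability of g(x) < 0 is approximated by
# averaging a linearized Gaussian reliability analysis over the kernels.
import numpy as np
from scipy.stats import norm
from scipy.optimize import approx_fprime

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.5, sigma=0.4, size=(500, 2))   # non-normal data in the original space
h = 0.3                                                       # kernel bandwidth (assumed)

def g(x):
    return 6.0 - x[0] - x[1]                                  # toy limit state, failure when g < 0

def kde_failure_probability(g, samples, h):
    """Average, over kernels N(x_k, h^2 I), of a linearized per-kernel failure probability."""
    pf_k = []
    for xk in samples:
        grad = approx_fprime(xk, g, 1e-6)
        beta_k = g(xk) / (h * np.linalg.norm(grad))           # per-kernel reliability index
        pf_k.append(norm.cdf(-beta_k))
    return float(np.mean(pf_k))

print(kde_failure_probability(g, samples, h))
```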


2011 ◽  
Vol 243-249 ◽  
pp. 5717-5726
Author(s):  
Ping Yi

In a reliability-based design optimization (RBDO) problem, most of the computation is spent on probabilistic constraint assessment, i.e., reliability analysis. Therefore, the effectiveness, and especially the correctness, of the reliability analysis is very important. If a probabilistic constraint is misjudged, the optimization iteration may have convergence problems or arrive at erratic solutions. The probabilistic constraint assessment can be carried out using either the conventional reliability index approach (RIA) or the performance measure approach (PMA). In this paper, the mathematical models used to calculate the reliability index in RIA and the probabilistic performance measure (PPM) in PMA are discussed. In RIA, one should first determine whether the mean-value point lies in the safe domain and then use a positive or a negative reliability index, respectively. In PMA, one should always minimize the performance measure to compute the PPM, whether the performance measure at the mean-value point is positive or negative; this corrects the erroneous mathematical model found in some of the literature and makes it possible to develop effective and efficient approaches for RBDO.
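
The sign conventions discussed above can be sketched as follows: the RIA index is signed according to whether the mean-value point (u = 0) is safe, while the PMA performance measure is always taken as the minimum of g on the sphere ||u|| = beta_t. The limit state and target index below are illustrative assumptions, not the paper's examples.

```python
# Hedged sketch of the RIA/PMA sign conventions (illustrative, not the paper's code).
import numpy as np
from scipy.optimize import minimize

def signed_reliability_index(g, n, beta0=1.0):
    """RIA: beta = sign(g(0)) * min ||u|| subject to g(u) = 0."""
    res = minimize(lambda u: np.dot(u, u),
                   np.full(n, beta0),
                   constraints=[{"type": "eq", "fun": g}],
                   method="SLSQP")
    return np.sign(g(np.zeros(n))) * np.sqrt(res.fun)

def probabilistic_performance_measure(g, n, beta_t):
    """PMA: always minimize g on the sphere ||u|| = beta_t, regardless of g(0)."""
    res = minimize(g,
                   np.full(n, beta_t / np.sqrt(n)),
                   constraints=[{"type": "eq",
                                 "fun": lambda u: np.dot(u, u) - beta_t**2}],
                   method="SLSQP")
    return res.fun

g = lambda u: 2.0 - u[0] - 0.8 * u[1]     # toy limit state in standard normal space
print(signed_reliability_index(g, n=2))
print(probabilistic_performance_measure(g, n=2, beta_t=3.0))
```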


Author(s):  
Andrew J. Grime ◽  
R. S. Langley

Current design codes for floating offshore structures are based on measures of short-term reliability. That is, a design storm is selected via an extreme value analysis of the environmental conditions and the reliability of the vessel in that design storm is computed. Although this approach yields valuable information on the vessel motions, it does not produce a statistically rigorous assessment of the lifetime probability of failure. An alternative approach is to perform a long-term reliability analysis in which consideration is taken of all sea states potentially encountered by the vessel during the design life. Although permitted as a design approach in current design codes, the associated computational expense generally prevents its use in practice. A new efficient approach to long-term reliability analysis is presented here, the results of which are compared with a traditional short-term analysis for the surge motion of a representative moored FPSO in head seas. This serves to illustrate the failure probabilities actually embedded within current design code methods, and the way in which design methods might be adapted to achieve a specified target safety level.
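
A toy version of the long-term calculation is sketched below: short-term exceedance probabilities conditional on each sea state are weighted by scatter-diagram probabilities and accumulated over the number of sea states in the design life. The scatter diagram, response model, and threshold are illustrative assumptions, not the paper's FPSO model.

```python
# Hedged sketch of a long-term exceedance calculation (illustrative numbers only).
import numpy as np

# toy scatter diagram: (significant wave height Hs [m], probability of occurrence)
scatter = [(2.0, 0.55), (4.0, 0.30), (6.0, 0.12), (8.0, 0.03)]

def p_exceed_short_term(threshold, hs, n_cycles=1000):
    """Prob. the short-term extreme surge exceeds the threshold, assuming Rayleigh peaks."""
    sigma = 0.8 * hs                                   # assumed response std per Hs
    p_peak = np.exp(-threshold**2 / (2.0 * sigma**2))  # Rayleigh peak exceedance
    return 1.0 - (1.0 - p_peak) ** n_cycles            # extreme of n_cycles peaks

threshold = 40.0                                       # surge limit [m], assumed
p_short = sum(p * p_exceed_short_term(threshold, hs) for hs, p in scatter)

n_seastates = 20 * 365.25 * 8                          # 3-hour sea states in a 20-year life
p_life = 1.0 - (1.0 - p_short) ** n_seastates          # lifetime failure probability
print(p_short, p_life)
```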


2021 ◽  
Vol 12 (1) ◽  
pp. 2
Author(s):  
Xiaoya Bian ◽  
Jiawei Chen ◽  
Xixuan Bai ◽  
Kunpeng Zheng

Driven-pile setup refers to a phenomenon in which the bearing capacity of driven piles increases with time after the end of driving (EOD). The setup effect can significantly improve the bearing capacity (ultimate resistance) of driven piles after initial installation, especially the ultimate shaft resistance. Based on reliability theory and considering the setup effect of driven piles, this article introduces an increase factor (Msetup) for the ultimate resistance of driven piles into the reliability index calculation formula. At the same time, the correlation between R0 and Rsetup is comprehensively considered in the reliability index calculation. Next, an uncertainty analysis of load and resistance is conducted to determine the ranges of the relevant parameters. Meanwhile, the influence of four critical parameters (the factor of safety FOS, the ratio of dead load to live load ρ = QD/QL, Msetup, and the correlation coefficient between R0 and Rsetup, ρR0,Rsetup) on the reliability index is analyzed. This parametric study indicates that ρ has only a slight influence on the reliability index, whereas the reliability index is significantly influenced by FOS, Msetup, and ρR0,Rsetup. Finally, comparison with existing results shows that the formula proposed in this study is reasonable and that considering more uncertainties brings the calculated reliability index closer to practical engineering application. The presented formula explicitly incorporates the pile setup effect into the reliability index calculation and helps improve the prediction accuracy of the design capacity of driven piles. Therefore, the reliability analysis of driven piles considering setup effects provides a theoretical basis for the application of driven piles in engineering practice.
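
The sketch below shows one way such a modified reliability index could be evaluated, using a lognormal-format first-order formula in which R = R0 + Rsetup with correlation ρR0,Rsetup and the setup gain expressed through Msetup. The bias factors, COVs, and numbers are assumed placeholders, not the article's calibrated values.

```python
# Hedged sketch (illustrative, not the article's calibrated formula): lognormal-format
# first-order reliability index for a driven pile whose resistance gains
# (M_setup - 1) * R0 from setup, with correlation between R0 and Rsetup.
import numpy as np

def beta_with_setup(FOS, rho_load, M_setup, rho_R0_Rsetup,
                    bias_R0=1.0, cov_R0=0.30, bias_Rs=1.0, cov_Rs=0.40,
                    bias_QD=1.05, cov_QD=0.10, bias_QL=1.15, cov_QL=0.20):
    QD_n, QL_n = rho_load, 1.0                   # nominal loads per unit live load
    R0_n = FOS * (QD_n + QL_n)                   # nominal end-of-driving resistance (assumed design rule)
    Rs_n = (M_setup - 1.0) * R0_n                # nominal setup resistance

    mu_R0, sd_R0 = bias_R0 * R0_n, cov_R0 * bias_R0 * R0_n
    mu_Rs, sd_Rs = bias_Rs * Rs_n, cov_Rs * bias_Rs * Rs_n
    mu_R = mu_R0 + mu_Rs
    sd_R = np.sqrt(sd_R0**2 + sd_Rs**2 + 2.0 * rho_R0_Rsetup * sd_R0 * sd_Rs)

    mu_Q = bias_QD * QD_n + bias_QL * QL_n
    sd_Q = np.sqrt((cov_QD * bias_QD * QD_n)**2 + (cov_QL * bias_QL * QL_n)**2)

    cov_R, cov_Q = sd_R / mu_R, sd_Q / mu_Q
    num = np.log((mu_R / mu_Q) * np.sqrt((1 + cov_Q**2) / (1 + cov_R**2)))
    den = np.sqrt(np.log((1 + cov_R**2) * (1 + cov_Q**2)))
    return num / den

print(beta_with_setup(FOS=2.0, rho_load=2.0, M_setup=1.4, rho_R0_Rsetup=0.5))
```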


2012 ◽  
Vol 446-449 ◽  
pp. 667-671
Author(s):  
Lei Lei Liu

To describe the nonlinear behavior of long concrete-filled tubes (CFT) more accurately and to make nonlinear reliability analysis more valuable for engineering application, a reliability analysis model for the nonlinear carrying capacity of concrete-filled steel tube structures is established based on the response surface and nonlinear finite element methods, with the consistent mode imperfection method used to account for the initial geometric defects of the structure. After a parametric analysis of the diameter-thickness ratio and the slenderness ratio, the scope of application is examined. The numerical results show that as the slenderness ratio increases, the influence of the initial geometric defects on the reliability of the carrying capacity increases gradually. It is suggested that when the slenderness ratio is greater than 15, the effect of initial geometric defects on the reliability index should be included. Moreover, for smaller diameter-thickness ratios, the influence of geometric nonlinearity on the reliability index of the carrying capacity is pronounced.
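
A compact sketch of the response-surface strategy is given below: a quadratic surface is fitted to a small design of "finite element" evaluations, and the reliability is then estimated on the cheap surface by Monte Carlo. The stand-in limit state, design points, and statistics are illustrative assumptions, not the article's FE model.

```python
# Hedged sketch of the response-surface idea (not the article's FE model): an
# expensive nonlinear analysis is replaced by a quadratic response surface fitted
# at a few design points, and reliability is estimated on the surface alone.
import itertools
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def g_fe(x):                                     # toy stand-in for a nonlinear FE run
    return 1.2 * x[:, 0] * x[:, 1] - 0.05 * x[:, 1] ** 2 - x[:, 2]

mu = np.array([30.0, 1.0, 20.0])                 # illustrative means (material, geometry, load)
sd = np.array([3.0, 0.05, 4.0])

# central-composite-style design: center + axial + corner points
axial = [mu + d * sd * e for d in (-2.0, 2.0) for e in np.eye(3)]
corners = [mu + sd * np.array(s) for s in itertools.product((-1.0, 1.0), repeat=3)]
X = np.vstack([mu] + axial + corners)
y = g_fe(X)

def quad_features(X):                            # 1, x_i, x_i * x_j (i <= j)
    cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
    cols += [X[:, i] * X[:, j] for i in range(3) for j in range(i, 3)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Monte Carlo on the fitted surface only -- no further "FE" calls are needed
Xs = rng.normal(mu, sd, size=(200_000, 3))
pf = float(np.mean(quad_features(Xs) @ coef < 0.0))
print("Pf ~", pf, " beta ~", -norm.ppf(max(pf, 1e-12)))
```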

