Tunnel Probabilistic Structural Analysis Using the FORM

2015 ◽  
Vol 2015 ◽  
pp. 1-9
Author(s):  
Yousef Mirzaeian ◽  
Kourosh Shahriar ◽  
Mostafa Sharifzadeh

In this paper, tunnel probabilistic structural analysis (TuPSA) is performed using the first-order reliability method (FORM). In TuPSA, a tunnel performance function is defined according to the boundary between structural stability and instability. The performance function is then transformed from the original space into the standard normal variable space to obtain the design point, the reliability index, and the probability of tunnel failure. In this method, the design factors can be treated as dependent or independent random parameters with arbitrary probability distributions. A software code is developed to perform TuPSA using the FORM. For validation and verification of TuPSA, a typical tunnel example with random joint orientations as well as random mechanical properties has been studied, and the results of TuPSA were compared with those obtained from Monte Carlo simulation. The results show that, although deterministic analysis indicates that the rock blocks are stable, TuPSA predicts key-block failure with certain probabilities. Comparison between the probabilistic and deterministic results indicates that the probabilistic outputs, including the design point and the probability of failure, are more rational than the deterministic factor of safety.
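The FORM workflow described here, transforming to standard normal space, locating the design point, and reading off the reliability index and failure probability, can be sketched with the classic Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration. The linear margin, its distribution parameters, and the gradient below are illustrative assumptions, not values from the paper:

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_hlrf(g, grad_g, u0, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space:
    returns the design point u* and the reliability index beta = |u*|."""
    u = list(u0)
    for _ in range(max_iter):
        gu = g(u)
        dg = grad_g(u)
        norm_sq = sum(d * d for d in dg)
        # HL-RF update: u_new = ((grad.u - g(u)) / |grad|^2) * grad
        factor = (sum(d * ui for d, ui in zip(dg, u)) - gu) / norm_sq
        u_new = [factor * d for d in dg]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))
    return u, beta

# hypothetical margin expressed in standard normal variables u = (u_R, u_S):
# resistance R ~ N(5, 0.8) minus load S ~ N(3, 0.6)
g = lambda u: (5.0 + 0.8 * u[0]) - (3.0 + 0.6 * u[1])
grad = lambda u: [0.8, -0.6]

u_star, beta = form_hlrf(g, grad, [0.0, 0.0])
pf = norm_cdf(-beta)  # first-order probability of failure
```

For this linear margin the iteration converges to beta = 2, i.e. a failure probability of about 2.3%; a nonlinear tunnel performance function would simply supply different `g` and `grad_g` callables.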

Author(s):  
Qian Wang ◽  
Erica Jarosch ◽  
Hongbing Fang

In practical engineering problems, numerical analyses using the finite element (FE) method or other methods are generally required to evaluate system responses including stresses and deformations. For problems involving expensive FE analyses, it is not efficient or straightforward to directly apply conventional sampling-based or gradient-based reliability analysis approaches. To reduce computational effort, it is useful to develop efficient and accurate metamodeling techniques to replace the original FE analyses. In this work, an adaptive metamodeling technique and a First-Order Reliability Method (FORM) were integrated. In each adaptive iteration, a compactly supported radial basis function (RBF) was adopted and a metamodel was created to explicitly express a performance function. An alternate FORM was implemented to calculate the reliability index for the current iteration. Based on the design point, additional samples were generated and added to the existing sample points to regenerate the metamodel. The accuracy of the RBF metamodel could thus be improved in the neighborhood of the design point at each iteration. This procedure continued until the reliability analysis results converged. A numerical example was studied; the proposed adaptive approach worked well, and the reliability analysis results were found within a reasonable number of iterations.
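The metamodeling step can be sketched with a compactly supported RBF interpolant. The Wendland C2 kernel below is one common choice of compactly supported RBF (the abstract does not specify which kernel is used), and the sampled function is an invented stand-in for an expensive FE response:

```python
import math

def wendland_c2(r):
    # compactly supported RBF: nonzero only for r < 1
    return (1.0 - r) ** 4 * (4.0 * r + 1.0) if r < 1.0 else 0.0

def solve(A, b):
    # naive Gaussian elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, support=2.0):
    # interpolation weights: A w = y with A_ij = phi(|x_i - x_j| / support)
    A = [[wendland_c2(abs(xi - xj) / support) for xj in xs] for xi in xs]
    w = solve(A, ys)
    return lambda x: sum(wi * wendland_c2(abs(x - xi) / support)
                         for wi, xi in zip(w, xs))

# hypothetical performance function g(x) = x^2 - 2, sampled at a few points
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [x * x - 2.0 for x in xs]
g_hat = rbf_fit(xs, ys)
```

In the adaptive loop described above, new samples near the design point would be appended to `xs`/`ys` and the weights recomputed each iteration.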


2012 ◽  
Vol 134 (9) ◽  
Author(s):  
Xiaoping Du ◽  
Zhen Hu

In many engineering applications, the probability distributions of some random variables are truncated; these truncated distributions result from restricting the domain of other probability distributions. If the first-order reliability method (FORM) is directly used, the truncated random variables are transformed into unbounded standard normal distributions, and this treatment may result in large errors in reliability analysis. In this work, we modify FORM so that the truncated random variables are transformed into truncated standard normal variables. After the first-order approximation and variable transformation, a saddlepoint approximation is then used to estimate the reliability. Without increasing the computational cost, the proposed method is generally more accurate than the original FORM for problems with truncated random variables.
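The error mechanism criticized here can be seen in a few lines: the conventional CDF-matching transform sends a truncated variable to an unbounded standard normal, stretching values near the truncation bounds far into the tails. The distribution parameters below are illustrative:

```python
import math

def phi_cdf(x):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi_inv(p, lo=-10.0, hi=10.0):
    # inverse standard normal CDF by bisection (adequate for a sketch)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if phi_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def truncated_normal_cdf(x, mu, sigma, a, b):
    # CDF of N(mu, sigma) restricted to [a, b]
    za, zb = (a - mu) / sigma, (b - mu) / sigma
    z = (x - mu) / sigma
    return (phi_cdf(z) - phi_cdf(za)) / (phi_cdf(zb) - phi_cdf(za))

def to_unbounded_standard_normal(x, mu, sigma, a, b):
    # conventional FORM mapping: u = Phi^{-1}(F_trunc(x)); values near the
    # bounds a, b are pushed toward +/- infinity -- the source of the error
    return phi_inv(truncated_normal_cdf(x, mu, sigma, a, b))

# a N(0, 1) variable truncated to [-2, 2] (illustrative numbers)
u_mid = to_unbounded_standard_normal(0.0, 0.0, 1.0, -2.0, 2.0)
u_edge = to_unbounded_standard_normal(1.9, 0.0, 1.0, -2.0, 2.0)
```

The median maps to zero, but a point just inside the bound (here 1.9) is mapped well beyond its original value; the modified method keeps the variable truncated in standard space instead.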


Author(s):  
Jan Mathisen ◽  
Siril Okkenhaug ◽  
Kjell Larsen

A joint probabilistic model of the metocean environment is assembled, taking account of wind, wave and current and their respective heading angles. Mooring line tensions are computed in the time domain, for a large set of short-term stationary conditions, intended to span the domain of metocean conditions that contribute significantly to the probabilities of high tensions. Weibull probability distributions are fitted to local tension maxima extracted from each time series. Long time series of 30 hours duration are used to reduce statistical uncertainty. Short-term, Gumbel extreme value distributions of line tension are derived from the maxima distributions. A response surface is fitted to the distribution parameters for line tension, to allow interpolation between the metocean conditions that have been explicitly analysed. A second order reliability method is applied to integrate the short-term tension distributions over the probability of the metocean conditions and obtain the annual extreme value distribution of line tension. Results are given for the most heavily loaded mooring line in two mooring systems: a mobile drilling unit and a production platform. The effects of different assumptions concerning the distribution of wave heading angles in simplified analysis for mooring line design are quantified by comparison with the detailed calculations.
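The Weibull-to-Gumbel step for the short-term extremes can be sketched as follows. For the largest of n independent Weibull-distributed maxima, the asymptotic Gumbel location and scale follow from inverting the Weibull CDF; the tension scale, shape, and number of maxima below are invented, not values from the study:

```python
import math

def weibull_cdf(x, lam, k):
    # CDF of the local tension maxima: F(x) = 1 - exp(-(x/lam)^k)
    return 1.0 - math.exp(-((x / lam) ** k))

def gumbel_params_from_weibull(lam, k, n):
    """Asymptotic Gumbel location/scale for the largest of n Weibull maxima:
    location b = lam * (ln n)^(1/k), scale a = lam * (ln n)^(1/k - 1) / k."""
    b = lam * math.log(n) ** (1.0 / k)
    a = lam * math.log(n) ** (1.0 / k - 1.0) / k
    return b, a

def gumbel_cdf(x, b, a):
    # short-term extreme value distribution of the line tension
    return math.exp(-math.exp(-(x - b) / a))

# hypothetical short-term state: tension maxima ~ Weibull(lam = 100 kN,
# k = 1.5), with n = 500 maxima in the stationary period
b, a = gumbel_params_from_weibull(100.0, 1.5, 500)
```

At the Gumbel location the exact extreme CDF, F(x)^n, is close to exp(-1), which the asymptotic form reproduces; in the study this step is repeated per metocean condition before the long-term integration.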


2015 ◽  
Vol 15 (01) ◽  
pp. 1450034 ◽  
Author(s):  
Xin-Dang He ◽  
Wen-Xuan Gou ◽  
Yong-Shou Liu ◽  
Zong-Zhan Gao

In the convex model approach, only the bounds of the uncertain variables are required rather than their precise probability distributions, which makes it possible to conduct reliability analysis for many complex engineering problems with limited information. This paper aims to develop a novel nonprobabilistic reliability solution method for structures with interval uncertainty variables. In order to explore the entire domain represented by the interval variables, an enhanced optimal Latin hypercube sampling (EOLHS) is used to reduce the computational effort considerably. Through the proposed method, the safety degree of a structure with convex model uncertainty can be quantitatively evaluated. More importantly, this method can be used to deal with general problems with nonlinear and black-box performance functions. By introducing the suggested reliability method, a convex-model-based system reliability method is also formulated. Three numerical examples are investigated to demonstrate the efficiency and accuracy of the method.
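An optimal LHS such as EOLHS adds an extra optimality criterion (e.g. maximin distance) on top of the basic stratified design. The basic Latin hypercube over interval variables, which such schemes refine, looks like this (the bounds are illustrative):

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Basic LHS over interval variables: each dimension is split into
    n_samples equal strata and each stratum is sampled exactly once."""
    rng = random.Random(seed)
    d = len(bounds)
    samples = [[0.0] * d for _ in range(n_samples)]
    for j, (lo, hi) in enumerate(bounds):
        # one random point per stratum, then shuffle the strata order
        perm = list(range(n_samples))
        rng.shuffle(perm)
        for i in range(n_samples):
            u = (perm[i] + rng.random()) / n_samples  # in (0, 1)
            samples[i][j] = lo + u * (hi - lo)
    return samples

# two hypothetical interval variables, e.g. E in [190, 210] GPa
# and a load in [8, 12] kN
bounds = [(190.0, 210.0), (8.0, 12.0)]
pts = latin_hypercube(10, bounds)
```

Each point would then be fed to the (possibly black-box) performance function, and the extreme responses over the design define the nonprobabilistic safety measure.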


2015 ◽  
Vol 36 (1) ◽  
pp. 91-104
Author(s):  
Dariusz Ampuła

Abstract The article presents an analysis of the correct operation of the evaluation module proposed in the functional research methodology for selected features of artillery fuse elements. Probability distributions of the serviceability time of the tested ammunition elements were applied to verify the post-diagnostic decisions taken. The test results for the selected fuse elements were analyzed on the basis of standard normal distributions, and graphic interpretations of these distributions were produced. A measurement of the decline in the resistance force of the side-bolt spring was chosen for the analysis, as a means of checking the features of the MG-37 and MG-57 fuse types. Furthermore, the author presents an illustrative comparison of normal distributions, which confirms that the post-diagnostic decisions had been taken correctly. For comparison, graphic interpretations of the analyzed test results for MG-37 fuse elements were also produced by means of the two-parameter gamma distribution. Concise conclusions confirming the correct functioning of the evaluation module in the research methodology are given at the end of the article.


Author(s):  
Franck Massa ◽  
Karine Mourier-Ruffin ◽  
Bertrand Lallemand ◽  
Thierry Tison

Finite element simulations are well established in industry and are an essential part of the design phase for mechanical structures. Although numerical models have become more and more complex and realistic, the results can still be relatively far from observed reality. Nowadays, use of deterministic analysis is limited due to the existence of several kinds of imperfections in the different steps of the structural design process. This paper presents a general non-probabilistic methodology that uses interval sets to propagate the imperfections. This methodology incorporates sensitivity analysis and reanalysis techniques. Numerical interval results for a test case were compared to experimental interval results to demonstrate the capabilities of the proposed methodology.
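A non-probabilistic propagation of this kind can be sketched with elementary interval arithmetic; the first-order reanalysis form and the sensitivity values below are assumptions for illustration, not the paper's model:

```python
class Interval:
    """Minimal interval arithmetic for propagating bounded imperfections."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

# hypothetical first-order reanalysis:
# response = nominal + sum_i(sensitivity_i * delta_i)
nominal = 2.5
sensitivities = [0.4, -1.2]  # assumed gradients from a sensitivity analysis
deltas = [Interval(-0.1, 0.1), Interval(-0.05, 0.05)]  # imperfection bounds

resp = Interval(nominal, nominal)
for s, d in zip(sensitivities, deltas):
    resp = resp + Interval(s, s) * d
```

The resulting interval [resp.lo, resp.hi] is what would be compared against the experimentally observed interval to assess the methodology.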


2018 ◽  
Vol 12 (1) ◽  
pp. 96-107 ◽  
Author(s):  
Hongbo Zhao ◽  
Changxing Zhu ◽  
Zhongliang Ru

Introduction: Reliability analysis is a useful tool for dealing with uncertainty and has been widely used in engineering systems. The first-order reliability method (FORM) is generally used to calculate the reliability index, but FORM is time-consuming and requires derivative computations. Methods: The Artificial Bee Colony (ABC) algorithm is a simple, robust, population-based stochastic optimization algorithm. In this study, an ABC-based reliability analysis was proposed to calculate the reliability index of an engineering system by combining the ABC algorithm with FORM. FORM was adopted to calculate the reliability index and the design point, and ABC was used to solve the constrained optimization problem arising in FORM. The procedure of the ABC-based reliability analysis is presented in detail. Results and Conclusion: The proposed method was verified on two classic examples and then applied to geotechnical engineering. The results show that the ABC algorithm can effectively solve the global optimization problem in FORM, and that ABC-based reliability analysis obtains the reliability index and design point with good accuracy, so it can be applied to the reliability analysis of complex engineering systems.
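A compact sketch of the combination: FORM is posed as minimizing |u| subject to g(u) = 0, the constraint is folded into a penalty, and a bare-bones ABC (employed, onlooker, and scout phases) searches for the design point without derivatives. The linear limit state and all tuning constants are illustrative:

```python
import math
import random

def abc_minimize(f, bounds, n_food=20, limit=30, cycles=500, seed=1):
    """Bare-bones Artificial Bee Colony minimizer."""
    rng = random.Random(seed)
    d = len(bounds)
    rand_source = lambda: [lo + rng.random() * (hi - lo) for lo, hi in bounds]
    foods = [rand_source() for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food
    best_x, best_f = None, float("inf")

    def try_update(i):
        # perturb one coordinate toward/away from a random partner source
        k = rng.randrange(n_food)
        while k == i:
            k = rng.randrange(n_food)
        j = rng.randrange(d)
        x = foods[i][:]
        x[j] += rng.uniform(-1.0, 1.0) * (foods[i][j] - foods[k][j])
        x[j] = min(max(x[j], bounds[j][0]), bounds[j][1])
        fx = f(x)
        if fx < fits[i]:
            foods[i], fits[i], trials[i] = x, fx, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_food):              # employed bees
            try_update(i)
        total = sum(1.0 / (1.0 + ft) for ft in fits)
        for _ in range(n_food):              # onlookers: fitness-weighted roulette
            r, acc, pick = rng.random() * total, 0.0, n_food - 1
            for idx, ft in enumerate(fits):
                acc += 1.0 / (1.0 + ft)
                if acc >= r:
                    pick = idx
                    break
            try_update(pick)
        i = min(range(n_food), key=lambda k: fits[k])
        if fits[i] < best_f:                 # memorize the best source found
            best_f, best_x = fits[i], foods[i][:]
        for i in range(n_food):              # scouts abandon exhausted sources
            if trials[i] > limit:
                foods[i] = rand_source()
                fits[i] = f(foods[i])
                trials[i] = 0
    return best_x, best_f

# FORM as constrained optimization: beta = min |u| s.t. g(u) = 0, with the
# constraint enforced by a simple penalty (illustrative linear limit state)
g = lambda u: 3.0 - 0.6 * u[0] - 0.8 * u[1]
obj = lambda u: math.hypot(u[0], u[1]) + 50.0 * abs(g(u))
u_star, f_best = abc_minimize(obj, [(-5.0, 5.0), (-5.0, 5.0)])
beta_est = math.hypot(u_star[0], u_star[1])  # reliability index estimate
```

For this limit state the exact reliability index is 3 (the distance from the origin to the line g(u) = 0), which the colony approaches without any gradient information.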


Author(s):  
Ungki Lee ◽  
Ikjin Lee

Abstract Reliability analysis that evaluates a probabilistic constraint is an important part of reliability-based design optimization (RBDO). Inverse reliability analysis evaluates the percentile value of the performance function that satisfies the target reliability. To compute the percentile value, analytical methods, surrogate-model-based methods, and sampling-based methods are commonly used. When the dimension or nonlinearity of the performance function is high, sampling-based methods such as Monte Carlo simulation, Latin hypercube sampling, and importance sampling can be applied directly, since they require no analytical formulation or surrogate model. The sampling-based methods have high accuracy but require a large number of samples, which can be very time-consuming. Therefore, this paper proposes methods that improve the accuracy of reliability analysis when the number of samples is insufficient and sampling-based methods are the better candidates. This study starts with the idea of training the relationship between the realization of the performance function at a small sample size and the corresponding true percentile value of the performance function. A deep feedforward neural network (DFNN), one of the promising artificial neural network models that approximates high-dimensional models using deep layered structures, is trained using the realizations of various performance functions at a small sample size as input training data and the corresponding true percentile values as target training data. Various polynomial functions and random variables are used to create the training data sets. A method is also presented that approximates the realization of the performance function through kernel density estimation and trains the DFNN with discrete points representing the shape of the kernel distribution, thereby reducing the dimension of the training input data. Along with the proposed reliability analysis methods, a strategy that reuses samples from the previous design point to enhance the efficiency of the percentile value estimation is explained. The results show that reliability analysis using the DFNN is more accurate than the method using only samples. In addition, compared to training the DFNN on the raw realization of the performance function, training it on the discrete points representing the shape of the kernel distribution improves the accuracy of the reliability analysis and reduces the training time. It is also verified that the proposed sample reuse strategy reduces the burden of function evaluation at a new design point by reusing samples from the previous design point when the design point changes during RBDO.
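The kernel-density input encoding can be sketched as follows: a small-sample realization is smoothed with a Gaussian KDE and resampled at a fixed number of grid points, giving a fixed-length vector suitable as network input. The bandwidth, grid size, and sample below are illustrative; the paper's settings are not given:

```python
import math
import random

def gaussian_kde_pdf(samples, bandwidth):
    """Gaussian kernel density estimate built from a small sample."""
    n = len(samples)
    c = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return lambda x: c * sum(
        math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in samples)

def kde_shape_points(samples, bandwidth, n_points=32):
    """Fixed-length vector of KDE ordinates on a regular grid, describing
    the shape of the realization regardless of the raw sample size."""
    lo = min(samples) - 3.0 * bandwidth
    hi = max(samples) + 3.0 * bandwidth
    pdf = gaussian_kde_pdf(samples, bandwidth)
    xs = [lo + (hi - lo) * i / (n_points - 1) for i in range(n_points)]
    return [pdf(x) for x in xs]

# illustrative small-sample realization of a performance function
rng = random.Random(0)
realization = [rng.gauss(0.0, 1.0) for _ in range(50)]
features = kde_shape_points(realization, bandwidth=0.5)
```

The 32-point vector `features`, rather than the 50 raw samples, would be fed to the DFNN together with the known true percentile as the training target.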


Author(s):  
Robert G. Tryon ◽  
Animesh Dey ◽  
Richard A. Holmes ◽  
Ganapathi Krishnan

Three case studies are presented in which computational-based methodologies have been used to assess structural reliability in the aerospace industry. The studies involve hot section turbine disks of a helicopter engine, fan blades of a commercial airline engine and bearings in an auxiliary power unit. In all cases, the results of the computational models were used to support the certification process for design and application changes. The statistical variation in design and usage parameters including geometry, materials, speed, temperature and other environmental factors are considered. The response surface approach was used to construct a durability performance function. This performance function is used with the first order reliability method (FORM) to determine the probability of failure and the sensitivity of the failure to the design and usage parameters. A hybrid combination of perturbation analysis and Monte Carlo simulation is used to incorporate time dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system durability to the design and usage parameters.
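The parameter sensitivities mentioned here drop out of FORM as the direction cosines at the design point, whose squares rank each standardized variable's contribution to the failure probability. A minimal sketch with an assumed gradient:

```python
import math

def form_alpha(grad_at_design_point):
    """FORM direction cosines: alpha_i = (dg/du_i) / |grad g| at the design
    point; alpha_i**2 ranks each standardized variable's importance."""
    norm = math.sqrt(sum(d * d for d in grad_at_design_point))
    return [d / norm for d in grad_at_design_point]

# hypothetical gradient of a durability performance function in standard
# normal space (e.g. with respect to speed and temperature)
alpha = form_alpha([0.8, -0.6])
importance = [a * a for a in alpha]  # importance factors, summing to 1
```

In the case studies, such factors are what identify which design and usage parameters (geometry, materials, speed, temperature) drive the probability of failure.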

