Adaptive Virtual Support Vector Machine for the Reliability Analysis of High-Dimensional Problems

Author(s):  
Hyeongjin Song ◽  
K. K. Choi ◽  
Ikjin Lee ◽  
Liang Zhao ◽  
David Lamb

In this study, an efficient classification methodology is developed for reliability analysis while maintaining an accuracy level similar to or better than that of existing response surface methods. Sampling-based reliability analysis requires only classification information, a success or a failure, whereas response surface methods provide real function values as their output, which requires more computational effort. The task becomes even more challenging for high-dimensional problems due to the curse of dimensionality. In the newly proposed virtual support vector machine (VSVM), virtual samples are generated near the limit state function by using linear or Kriging-based approximations. Exact function values are used in these approximations so that the virtual samples improve the accuracy of the resulting VSVM decision function. By introducing virtual samples, VSVM can overcome the deficiency of existing classification methods, in which only classification information is used as input. The universal Kriging method is used to obtain virtual samples that improve the accuracy of the decision function for highly nonlinear problems. A sequential sampling strategy that chooses new samples near the true limit state function is integrated with VSVM to maximize accuracy. Examples show that the proposed adaptive VSVM yields better efficiency in terms of modeling time and the number of required samples, while maintaining a similar or better level of accuracy, especially for high-dimensional problems.
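
As a rough illustration of the virtual-sample idea (not the authors' implementation: the limit state g, the local linear approximation, and the offset delta below are placeholder assumptions), the following sketch fits a linear approximation to exactly evaluated samples, places paired virtual points just on either side of its zero contour, and trains an ordinary SVM on the enlarged labeled set.

```python
# Minimal sketch of the virtual-sample idea behind VSVM (illustrative only).
# Assumptions: a cheap placeholder limit state g(x), a local linear fit as the
# approximation, and a small offset delta for placing virtual points.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def g(x):                                     # placeholder limit state: g < 0 means failure
    return x[:, 0] + x[:, 1] - 1.5

X = rng.uniform(0.0, 2.0, size=(30, 2))       # initial design of experiments
y = g(X)                                      # exact responses at the real samples
labels = np.where(y < 0.0, -1, 1)             # classification info: fail / safe

# Local linear approximation built from the exactly evaluated samples.
lin = LinearRegression().fit(X, y)
w = lin.coef_
n_hat = w / np.linalg.norm(w)                 # unit normal of the approximated boundary

# Project existing samples onto the approximated limit state g_hat = 0 and
# place a pair of virtual samples a small distance delta on either side.
delta = 0.05
X_on_boundary = X - (lin.predict(X) / np.linalg.norm(w))[:, None] * n_hat
X_virtual = np.vstack([X_on_boundary + delta * n_hat,     # safe side of g_hat
                       X_on_boundary - delta * n_hat])    # failure side of g_hat
labels_virtual = np.hstack([np.ones(len(X_on_boundary), dtype=int),
                            -np.ones(len(X_on_boundary), dtype=int)])

# Train the decision function on real plus virtual samples.
svm = SVC(kernel="rbf", C=100.0, gamma="scale")
svm.fit(np.vstack([X, X_virtual]), np.hstack([labels, labels_virtual]))
print("predicted failure fraction:",
      np.mean(svm.predict(rng.uniform(0.0, 2.0, size=(10000, 2))) == -1))
```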

2007 ◽  
Vol 353-358 ◽  
pp. 1009-1012
Author(s):  
Chao Ma ◽  
Zhen Zhou Lu

For the reliability analysis of structures with implicit limit state functions, an iterative algorithm is presented on the basis of the support vector classification machine. In the present method, the support vector classification machine is employed to construct a surrogate of the implicit limit state function. By use of the proposed iteration and sampling procedure, the constructed support vector classification machine converges to the actual limit state function in the important region, which contributes significantly to the failure probability, and the precision of the reliability analysis is thereby improved. The implementation of the presented method is given in detail, and its feasibility and efficiency are demonstrated by illustrative examples.
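
A minimal sketch of an iterative scheme of this general kind (the limit state, candidate pool, and selection rule below are illustrative placeholders, not the paper's procedure): at each step the candidate closest to the current SVM boundary is evaluated exactly, added to the training set, and the classifier is refit.

```python
# Illustrative iterative SVM refinement near the predicted limit state.
# Placeholders: limit state g, uniform candidate pool, fixed iteration budget.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def g(x):                                    # placeholder limit state (failure: g < 0)
    return 0.25 * x[:, 0] ** 2 + x[:, 1] - 0.5

X = rng.uniform(-3.0, 3.0, size=(16, 2))     # small initial design, evaluated exactly
labels = np.where(g(X) < 0.0, -1, 1)

candidates = rng.uniform(-3.0, 3.0, size=(5000, 2))   # cheap candidate pool

for _ in range(20):
    svm = SVC(kernel="rbf", C=100.0, gamma="scale").fit(X, labels)
    scores = np.abs(svm.decision_function(candidates))
    new = candidates[np.argmin(scores)]      # candidate closest to the predicted boundary
    X = np.vstack([X, new])                  # evaluate the true model only there
    labels = np.append(labels, -1 if g(new[None, :])[0] < 0.0 else 1)

svm = SVC(kernel="rbf", C=100.0, gamma="scale").fit(X, labels)
print("estimated failure fraction over the pool:",
      np.mean(svm.predict(candidates) == -1))
```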


Author(s):  
Hyeongjin Song ◽  
K. K. Choi ◽  
Ikjin Lee ◽  
Liang Zhao ◽  
David Lamb

In this paper, a sampling-based reliability-based design optimization (RBDO) method using a classification method is presented. Probabilistic sensitivity analysis is used to compute the sensitivities of the probabilistic constraints with respect to random variables. Since the probabilistic sensitivity analysis requires only the limit state function, and not the response surface or the sensitivity of the response, an efficient classification method can be used for sampling-based RBDO. The proposed virtual support vector machine (VSVM), a classification method, is a support vector machine (SVM) augmented with virtual samples. By introducing virtual samples, VSVM overcomes the deficiency of the existing SVM, which uses only classification information as its input. In this paper, the universal Kriging method is used to obtain locations of virtual samples to improve the accuracy of the limit state function for highly nonlinear problems. A sequential sampling strategy effectively inserts new samples near the limit state function. In sampling-based RBDO, Monte Carlo simulation (MCS) is used for the reliability analysis and the probabilistic sensitivity analysis. Since SVM is an explicit classification method, unlike implicit methods, the computational cost of evaluating a large number of MCS samples can be significantly reduced. Several efficiency strategies, such as the hyper-spherical local window for generation of the limit state function and the Transformations/Gibbs sampling method for generating uniform samples in the hyper-sphere, are also applied. Examples show that the proposed sampling-based RBDO using VSVM yields better efficiency in terms of the number of required samples and the computational cost of evaluating MCS samples, while maintaining accuracy similar to that of sampling-based RBDO using the implicit dynamic Kriging (D-Kriging) method.
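
Because the trained classifier is explicit, evaluating it on a large MCS population is inexpensive. A minimal sketch of that step (assuming an already-fitted scikit-learn SVC, independent standard normal inputs, and failure labeled -1; all of these are placeholder choices, not the paper's setup):

```python
# Sketch of the MCS step once an explicit SVM decision function is available.
# Assumptions: `svm` is an already-fitted sklearn SVC, the two random inputs are
# independent standard normals, and class -1 denotes failure.
import numpy as np
from sklearn.svm import SVC

def estimate_pf(svm: SVC, n_mcs: int = 1_000_000, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal((n_mcs, 2))        # MCS population in input space
    # Classifying a million points needs no calls to the expensive simulation.
    failed = svm.predict(samples) == -1
    return failed.mean()

# Example usage with a toy classifier trained on a cheap placeholder limit state.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
labels = np.where(X[:, 0] + X[:, 1] < -2.0, -1, 1)   # placeholder limit state
svm = SVC(kernel="rbf", gamma="scale").fit(X, labels)
print("P_f estimate:", estimate_pf(svm))
```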


2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Yu Wang ◽  
Xiongqing Yu ◽  
Xiaoping Du

A new reliability-based design optimization (RBDO) method based on support vector machines (SVM) and the Most Probable Point (MPP) is proposed in this work. SVM is used to create a surrogate model of the limit-state function at the MPP using the gradient information from the reliability analysis. This guarantees that the surrogate model not only passes through the MPP but is also tangent to the limit-state function at the MPP. Then, importance sampling (IS) is used to calculate the probability of failure based on the surrogate model. This treatment significantly improves the accuracy of the reliability analysis. For RBDO, the Sequential Optimization and Reliability Assessment (SORA) method is employed, which decouples the deterministic optimization from the reliability analysis. The improved SVM-based reliability analysis is used to correct the error introduced by the linear approximation of the limit-state function in SORA. A mathematical example and a simplified aircraft wing design demonstrate that the improved SVM-based reliability analysis is more accurate than FORM, that it needs fewer training points than Monte Carlo simulation, and that the proposed optimization strategy is efficient.
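
A minimal sketch of the importance-sampling step (illustrative only: the limit state, its MPP, and the sample size are placeholders, and in the actual method the SVM surrogate would be evaluated instead of g): samples are drawn from a standard normal density shifted to the MPP, and each is weighted by the ratio of the original density to the shifted one.

```python
# Importance sampling centered at the MPP (illustrative sketch).
# Placeholders: linear limit state g, its MPP, and the sample size.
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)

def g(u):                                    # placeholder limit state in standard normal space
    return 3.0 - (u[:, 0] + u[:, 1]) / np.sqrt(2.0)

u_mpp = np.array([3.0, 3.0]) / np.sqrt(2.0)  # MPP of this linear g (reliability index 3)

n = 100_000
cov = np.eye(2)
u = rng.multivariate_normal(u_mpp, cov, size=n)           # sample around the MPP

weights = (multivariate_normal(np.zeros(2), cov).pdf(u)   # original density
           / multivariate_normal(u_mpp, cov).pdf(u))      # importance density
pf = np.mean((g(u) < 0.0) * weights)
print("IS estimate of P_f:", pf, " exact:", float(norm.cdf(-3.0)))
```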


Author(s):  
Zhaoyin Shi ◽  
Zhenzhou Lu ◽  
Xiaobo Zhang ◽  
Luyi Li

For structural reliability analysis, although many methods have been proposed, they still suffer from substantial computational cost or slow convergence for complex structures whose limit state functions are highly nonlinear, high-dimensional, or implicit. In this paper, a novel adaptive surrogate model method combining the support vector machine (SVM) and Monte Carlo simulation (MCS) is proposed to improve the computational efficiency of estimating the structural failure probability. In the proposed method, a new adaptive learning method is established based on the kernel function of the SVM, and a new stop criterion is constructed by measuring the relative position between sample points and the margin of the SVM. Then, MCS is employed to estimate the failure probability based on the converged SVM model instead of the actual limit state function. Owing to the adaptive learning function, the proposed method is significantly more effective than methods that employ a random training set to construct the SVM model only once. Compared with the existing adaptive SVM combined with MCS, the proposed method avoids the information loss caused by inconsistent distance scales and the normalization of the learning function, and the proposed convergence criterion is also more concise than that employed in the existing method. The examples in the paper show that the proposed method is more efficient and has broader applicability than other similar surrogate methods.
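
A rough sketch of a margin-based selection and stop rule in this general spirit (not the paper's exact learning function or convergence criterion; the limit state and candidate pool are placeholders): the MCS candidate with the smallest absolute decision-function value is evaluated next, and the loop stops once no candidate remains strictly inside the SVM margin.

```python
# Sketch of an adaptive SVM + MCS loop with a margin-based stop rule.
# Placeholders: limit state g, MCS pool, kernel settings, iteration budget.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def g(x):                                    # placeholder limit state (failure: g < 0)
    return x[:, 0] ** 3 + x[:, 1] + 4.0

pool = rng.normal(0.0, 2.0, size=(20_000, 2))           # MCS population
X = np.vstack([[0.0, 0.0], [-3.0, -3.0],                # two anchors of known class
               rng.normal(0.0, 2.0, size=(18, 2))])
labels = np.where(g(X) < 0.0, -1, 1)

for it in range(200):
    svm = SVC(kernel="rbf", C=1e3, gamma="scale").fit(X, labels)
    f = svm.decision_function(pool)
    inside = np.abs(f) < 1.0                 # pool points lying inside the SVM margin
    if not inside.any():                     # stop rule: boundary locally resolved
        break
    new = pool[np.argmin(np.abs(f))]         # most ambiguous pool point
    X = np.vstack([X, new])
    labels = np.append(labels, -1 if g(new[None, :])[0] < 0.0 else 1)

print("iterations:", it, " P_f estimate:", np.mean(svm.predict(pool) == -1))
```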


Author(s):  
Zequn Wang ◽  
Mingyang Li

Conventional uncertainty quantification methods usually lack the capability of dealing with high-dimensional problems due to the curse of dimensionality. This paper presents a semi-supervised learning framework for dimension reduction and reliability analysis. An autoencoder is first adopted to map the high-dimensional space into a low-dimensional latent space that contains a distinguishable failure surface. Then a deep feedforward neural network (DFN) is utilized to learn the mapping relationship and reconstruct the latent space, while the Gaussian process (GP) modeling technique is used to build the surrogate model of the transformed limit state function. During the training of the DFN, the discrepancy between the actual and reconstructed latent spaces is minimized through semi-supervised learning to ensure accuracy. Both labeled and unlabeled samples are utilized in defining the loss function of the DFN. An evolutionary algorithm is adopted to train the DFN, and the Monte Carlo simulation method is then used for uncertainty quantification and reliability analysis based on the proposed framework. The effectiveness is demonstrated through a mathematical example.
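
A heavily simplified sketch of the latent-space pipeline (a linear PCA projection stands in for the nonlinear autoencoder/DFN pair, the high-dimensional limit state is a placeholder, and the GP settings are library defaults): high-dimensional MCS samples are compressed to two latent coordinates, a GP surrogate of the limit state is fitted on a small labeled subset in that latent space, and the surrogate classifies the remaining population.

```python
# Simplified latent-space reliability sketch: PCA stands in for the autoencoder,
# and a GP surrogate of the limit state is fitted in the 2-D latent space.
# Placeholders: the 40-D limit state g, the latent dimension, training-set size.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
dim = 40
scales = np.r_[2.0, 2.0, 0.1 * np.ones(dim - 2)]       # most variance in two directions

def g(x):                                    # placeholder high-dimensional limit state
    return 5.0 - x[:, 0] - x[:, 1] + 0.05 * x[:, 2:].sum(axis=1)

X_mcs = rng.standard_normal((50_000, dim)) * scales    # MCS population
pca = PCA(n_components=2).fit(X_mcs)                   # linear stand-in for the encoder
Z_mcs = pca.transform(X_mcs)                           # 2-D latent coordinates

idx = rng.choice(len(X_mcs), size=150, replace=False)  # only these get exact labels
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(Z_mcs[idx], g(X_mcs[idx]))                      # surrogate of g in latent space

pf_hat = np.mean(gp.predict(Z_mcs) < 0.0)              # cheap MCS on the surrogate
pf_ref = np.mean(g(X_mcs) < 0.0)                       # reference using all exact values
print("P_f from latent surrogate:", pf_hat, " reference:", pf_ref)
```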


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Bin Hu ◽  
Guo-shao Su ◽  
Jianqing Jiang ◽  
Yilong Xiao

A new response surface method (RSM) for slope reliability analysis was proposed based on Gaussian process (GP) machine learning technology. The method involves approximation of the limit state function by the trained GP model and estimation of the failure probability using the first-order reliability method (FORM). A small number of training samples were first generated by the limit equilibrium method to train the GP model. The implicit limit state function of the slope and its derivatives were then approximated by the trained GP model in an explicit formulation. Furthermore, an iterative algorithm was presented to improve the precision of the approximation of the limit state function in the region near the design point, which contributes significantly to the failure probability. Results of four case studies, including one non-slope and three slope problems, indicate that the proposed method achieves reasonable accuracy for slope reliability analysis more efficiently than the traditional RSM.
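
As a generic illustration of coupling a trained GP surrogate with FORM (the limit state, training design, and HL-RF settings below are placeholders; in the slope problem the training responses would come from limit equilibrium analyses): the surrogate supplies cheap function values and finite-difference gradients to the HL-RF iteration, which returns the reliability index beta and P_f = Phi(-beta).

```python
# Generic sketch: FORM (HL-RF iteration) driven by a GP surrogate of the limit state.
# Placeholders: the true limit state g, the training design, step sizes, tolerances.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(u):                                    # placeholder limit state in standard normal space
    return 2.5 + 0.1 * u[:, 0] ** 2 - u[:, 0] - u[:, 1]

U_train = rng.uniform(-4.0, 4.0, size=(60, 2))          # training design (would be LEM runs)
gp = GaussianProcessRegressor(kernel=RBF(2.0), normalize_y=True).fit(U_train, g(U_train))

def g_hat(u):                                # surrogate value at a single point
    return gp.predict(u.reshape(1, -1))[0]

def grad_hat(u, h=1e-4):                     # finite-difference gradient of the surrogate
    return np.array([(g_hat(u + h * e) - g_hat(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

u = np.zeros(2)                              # HL-RF iteration starting from the origin
for _ in range(50):
    val, grad = g_hat(u), grad_hat(u)
    u_new = (grad @ u - val) / (grad @ grad) * grad
    if np.linalg.norm(u_new - u) < 1e-6:
        break
    u = u_new

beta = np.linalg.norm(u)
print("beta:", beta, " P_f:", norm.cdf(-beta))
```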


2017 ◽  
Vol 2017 ◽  
pp. 1-9 ◽  
Author(s):  
Jianguo Zhang ◽  
Jiwei Qiu ◽  
Pidong Wang

This paper presents a novel procedure based on the first-order reliability method (FORM) for structural reliability analysis with hybrid variables, that is, both random and interval variables. The method can significantly improve the computational efficiency of such hybrid reliability analysis (HRA) while generally providing sufficient precision. In the proposed procedure, the hybrid problem is reduced to a standard reliability problem in polar coordinates, where the n-dimensional limit-state function is expressed in terms of only two random variables. First, the linear Taylor series is used to approximate the limit-state function around the design point. Subsequently, from this approximation of the n-dimensional limit-state function, a new two-dimensional limit state is established by the polar coordinate transformation, and the probability density functions (PDFs) of the two variables are obtained from the PDFs of the random variables and the bounds of the interval variables. The interval of the failure probability is then efficiently calculated by the integral method. Finally, a simple problem with explicit expressions and an engineering application of a spacecraft docking lock are employed to demonstrate the effectiveness of the proposed method.
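
A rough numerical illustration of the underlying idea, not the paper's polar-coordinate formulation: for a first-order (linearized) limit state, each fixed value of the interval variable yields a FORM estimate Phi(-beta), and sweeping the interval bounds produces an interval of failure probabilities. The limit state and all parameters below are placeholders.

```python
# Hybrid (random + interval) first-order illustration: for each value of the
# interval variable y, FORM gives P_f(y) = Phi(-beta(y)); sweeping y over its
# bounds yields an interval of failure probabilities.
# Placeholders: the linearized limit state and all distribution parameters.
import numpy as np
from scipy.stats import norm

# Linearized limit state: g = y - (0.6*x1 + 0.8*x2), with x1, x2 iid standard
# normal and the interval variable y in [2.0, 3.0].
a = np.array([0.6, 0.8])                      # gradient w.r.t. the random part (unit norm)

def beta(y):                                  # reliability index for a fixed y
    return y / np.linalg.norm(a)

y_grid = np.linspace(2.0, 3.0, 101)           # sweep the interval variable
pf = norm.cdf(-beta(y_grid))
print("P_f interval: [%.3e, %.3e]" % (pf.min(), pf.max()))
```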


2012 ◽  
Vol 544 ◽  
pp. 212-217 ◽  
Author(s):  
Hong Yan Hao ◽  
Hao Bo Qiu ◽  
Zhen Zhong Chen ◽  
Hua Di Xiong

For probabilistic design problems with implicit limit state functions encountered in practical applications, reliability analysis is difficult to perform because of the expensive computational cost. In this paper, a new reliability analysis method that applies support vector machine classification (SVM-C) and an adaptive sampling strategy is proposed to improve efficiency. SVM-C constructs a model that defines the boundary of the failure region and classifies samples as safe or failed; this model is then used to replace the true limit state function, thus reducing the computational cost. The adaptive sampling strategy is applied to select samples along the constraint boundaries, which further improves the efficiency of the proposed method. Finally, a probability analysis example is presented to demonstrate the feasibility and efficiency of the proposed method.


Author(s):  
Zhen Hu ◽  
Xiaoping Du

Interval variables are commonly encountered in design, especially in the early design stages when data are limited. Thus, reliability analysis (RA) should deal with both interval and random variables and then predict the lower and upper bounds of reliability. The analysis is computationally intensive, because the global extreme values of a limit-state function with respect to interval variables must be obtained during the RA. In this work, a random field approach is proposed to reduce the computational cost with two major developments. The first development is the treatment of a response variable as a random field, which is spatially correlated at different locations of the interval variables. Equivalent reliability bounds are defined from a random field perspective. The definitions can avoid the direct use of the extreme values of the response. The second development is the employment of the first-order reliability method (FORM) to verify the feasibility of the random field modeling. This development results in a new random field method based on FORM. The new method converts a general response variable into a Gaussian field at its limit state and then builds surrogate models for the autocorrelation function and reliability index function with respect to interval variables. Then, Monte Carlo simulation is employed to estimate the reliability bounds without calling the original limit-state function. Good efficiency and accuracy are demonstrated through three examples.
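
A flavor-of-the-idea sketch rather than the paper's random field construction: a Kriging (GP) surrogate of a FORM-type reliability index is built over the interval variables, and reliability bounds are read off from the surrogate without further limit-state calls. The limit state, interval box, and the closed-form beta used to train the surrogate are placeholders.

```python
# Sketch: surrogate of the reliability index beta(y) over the interval variables,
# then failure-probability bounds from the surrogate alone.
# Placeholders: the limit state, the interval box, and the closed-form beta
# that stands in for an expensive FORM run at each training point.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Limit state g = y1*x1 + x2 + y2 with x1, x2 iid standard normal and interval
# variables y1 in [0.5, 1.5], y2 in [2.0, 3.0]; for this linear case
# beta(y) = y2 / sqrt(y1**2 + 1).
def beta_exact(y):
    return y[:, 1] / np.sqrt(y[:, 0] ** 2 + 1.0)

rng = np.random.default_rng(0)
Y_train = rng.uniform([0.5, 2.0], [1.5, 3.0], size=(25, 2))     # a few "FORM calls"
gp = GaussianProcessRegressor(kernel=RBF([0.5, 0.5]), normalize_y=True)
gp.fit(Y_train, beta_exact(Y_train))

Y_grid = rng.uniform([0.5, 2.0], [1.5, 3.0], size=(20_000, 2))  # dense sweep of the box
beta_hat = gp.predict(Y_grid)
print("P_f bounds: [%.3e, %.3e]"
      % (norm.cdf(-beta_hat.max()), norm.cdf(-beta_hat.min())))
```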

