Confidence-Based Design Optimization Under Data Uncertainty Using Most Probable Point-Based Approach

Author(s):  
Yongsu Jung ◽  
Hyunkyoo Cho ◽  
Ikjin Lee

Abstract An accurate input statistical model has been assumed in most reliability-based design optimization (RBDO) studies so that attention can be concentrated on the variability of the random variables. In practical engineering applications, however, only a limited number of data points are available to quantify the input statistical model. In other words, irreducible variability and reducible uncertainty due to lack of knowledge exist simultaneously in random design variables. Therefore, the uncertainty in reliability induced by insufficient data has to be accounted for in RBDO to guarantee confidence in the reliability. The uncertainty of the input distributions can be propagated to a cumulative distribution function (CDF) of reliability under normality assumptions, but doing so requires a large number of function evaluations in a double-loop Monte Carlo simulation (MCS). To tackle this challenge, a reliability measure approach (RMA) in confidence-based design optimization (CBDO) is proposed to handle the randomness of reliability, following the idea of the performance measure approach (PMA) in RBDO. The input distribution parameters are transformed to the standard normal space for a most probable point (MPP) search with respect to reliability, and the reliability is then approximated at the MPP with respect to the input distribution parameters. The proposed CBDO treats confidence constraints by employing the reliability value at the target confidence level, approximated by the MPP in P-space. In conclusion, the proposed method significantly reduces the number of function evaluations by eliminating the outer-loop MCS while maintaining acceptable accuracy.
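As a concrete illustration of the double-loop MCS baseline the abstract refers to (not the paper's own MPP-based method), the sketch below propagates epistemic uncertainty in an estimated mean to a distribution of reliability values; the limit state, sample sizes, and all distribution parameters are assumptions made for illustration.

```python
import random

random.seed(0)

def g(x):
    # hypothetical limit state: the design fails when g < 0
    return 1.5 - x

def reliability(mu, sigma, n=2000):
    # inner loop: plain MCS for a given input distribution N(mu, sigma)
    return sum(g(random.gauss(mu, sigma)) > 0 for _ in range(n)) / n

# outer loop: epistemic uncertainty in the estimated mean (here, a mean
# estimated from only 10 data points), propagated to a CDF of reliability
rels = sorted(reliability(random.gauss(1.0, 0.3 / 10 ** 0.5), 0.3)
              for _ in range(200))
conf_90 = rels[int(0.10 * len(rels))]  # reliability at the 90% confidence level
```

The nested loops are exactly why the cost explodes: 200 outer samples times 2000 inner samples already means 400,000 limit-state evaluations for a single design point.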

2008 ◽  
Vol 131 (1) ◽  
Author(s):  
Gordon J. Savage ◽  
Young Kap Son

In this paper, we present a methodology that helps select the distribution parameters in degrading multiresponse systems to improve dependability at the lowest lifetime cost. The dependability measures include both quality (soft failures) and reliability (hard failures). Associated costs of scrap, rework, and warranty work are included. The key to the approach is the fast and efficient creation of the system cumulative distribution function through a series of time-variant limit-state functions. Probabilities are evaluated by Monte Carlo simulation, although the first-order reliability method is a viable alternative. The cost objective function that is common in reliability-based design optimization is expanded to include a lifetime loss-of-performance cost, herein based on present worth theory (also called present value theory). An optimum design in terms of the distribution parameters of the design variables is found via a methodology that minimizes cost under performance policy constraints over the lifetime as the system degrades. A case study of an overrun clutch shows the insights and potential of the proposed methodology.
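The core computation described above, Monte Carlo failure probabilities through a time-variant limit state combined with a present-worth lifetime cost, can be sketched as follows; the degradation model, discount rate, and loss values are illustrative assumptions, not the paper's case-study numbers.

```python
import random

random.seed(1)

def p_fail(t, n=5000):
    # time-variant limit state: degrading strength vs. fixed demand,
    # strength ~ N(10 - 0.3 t, 1), demand = 7 (hypothetical numbers)
    return sum(random.gauss(10 - 0.3 * t, 1.0) < 7.0 for _ in range(n)) / n

rate, loss, years = 0.05, 1000.0, 10
# present worth of expected warranty losses over the design lifetime:
# each year's expected loss is discounted back to today
lifetime_cost = sum(loss * p_fail(t) / (1 + rate) ** t
                    for t in range(1, years + 1))
```

An optimizer would wrap this evaluation and move the distribution parameters (the mean strength, its variance) to trade production cost against this discounted lifetime loss.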


2005 ◽  
Vol 297-300 ◽  
pp. 1901-1906 ◽  
Author(s):  
Seung Jae Min ◽  
Seung Hyun Bang

In the design optimization process, design variables are selected deterministically even though they are uncertain in nature. To account for variances in the design variables, a reliability-based design optimization problem is formulated by introducing probability distribution functions. The concept of reliability has been applied to topology optimization through a reliability index approach or a performance measure approach. Since these approaches, called double-loop single-variable approaches, require a nested optimization problem to obtain the most probable point in the probabilistic design domain, the time for the entire process makes practical use infeasible. In this work, a new reliability-based topology optimization method is proposed that uses a single-loop single-variable approach, which approximates the search for the most probable point analytically, to reduce the computational cost while handling the several constraints imposed by practical design requirements. The density method in topology optimization, including an SLP (sequential linear programming) algorithm, is implemented with object-oriented programming. To examine uncertainties in the topology design of a structure, the modulus of elasticity of the material and the applied loadings are treated as probabilistic design variables. The results of a design example show that the proposed method is efficient, curtailing the time for the optimization process, and accurate, satisfying the specified reliability.
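A minimal sketch of the single-loop idea, assuming a linear limit state and independent normal variables: the most probable point is approximated analytically by shifting the means against the constraint gradient, so no nested optimization is needed. The limit state, statistics, and target index below are hypothetical.

```python
import math

def g(x1, x2):            # hypothetical limit state, safe when g > 0
    return x1 + 2 * x2 - 5

def grad_g(x1, x2):       # its (constant) gradient
    return (1.0, 2.0)

def approx_mpp(mu, sigma, beta_t):
    # single-loop approximation: shift each variable against the gradient
    # by beta_t standard deviations in standard normal space
    dg = grad_g(*mu)
    norm = math.sqrt(sum((s * d) ** 2 for s, d in zip(sigma, dg)))
    return tuple(m - beta_t * s * s * d / norm
                 for m, s, d in zip(mu, sigma, dg))

mpp = approx_mpp((4.0, 3.0), (0.5, 0.5), 3.0)
# evaluating g at the shifted point checks the reliability constraint directly
feasible = g(*mpp) >= 0
```

For nonlinear limit states the gradient is simply re-evaluated at the current design each outer iteration, which is what collapses the double loop into a single one.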


Author(s):  
Turuna S. Seecharan ◽  
Gordon J. Savage

In design, much research deals with cases where the design variables are deterministic, thus ignoring uncertainties present in manufacturing or environmental conditions. When uncertainty is considered, the design variables follow particular distributions whose parameters are defined. Probabilistic design aims to reduce the probability of failure of a system by moving the distribution parameters of the design variables. The most popular method to estimate the probability of failure is Monte Carlo simulation, in which many runs are generated from the distribution parameters and the number of times the system fails to meet specifications is counted. This method, however, becomes time-consuming as the mechanistic model of the physical system grows increasingly complex. From structural reliability theory, the first-order reliability method (FORM) is an efficient way to estimate the probability of failure and to move the parameters to reduce it. However, if the mechanistic model is too complex, FORM becomes difficult to use. This paper presents a methodology that uses approximating functions, called 'metamodels', with FORM to search for a design that minimizes the probability of failure. The method is applied to three examples, and the accuracy and speed of this metamodel-based probabilistic design method are discussed. The speed and accuracy of three popular metamodels, the response surface model, the radial basis function, and the Kriging model, are compared. Finally, some theory is presented on how the method can be applied to systems with a dynamic performance measure, where the response lifetime is required to compute another performance measure.
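A hedged sketch of the metamodel-plus-FORM workflow: the HL-RF iteration is run on a cheap surrogate `ghat`, assumed here to be already fitted to the expensive mechanistic model; the surrogate coefficients and input statistics are invented for illustration.

```python
import math

def ghat(x1, x2):
    # assume this quadratic metamodel was fitted to the expensive model
    return 5.0 - x1 - x2 + 0.1 * x1 * x2

def grad(x1, x2, h=1e-6):
    # central-difference gradient of the (cheap) surrogate
    return ((ghat(x1 + h, x2) - ghat(x1 - h, x2)) / (2 * h),
            (ghat(x1, x2 + h) - ghat(x1, x2 - h)) / (2 * h))

def form_beta(mu, sigma, iters=20):
    # HL-RF iteration in standard normal space (independent normals assumed)
    u = [0.0, 0.0]
    for _ in range(iters):
        x = [m + s * ui for m, s, ui in zip(mu, sigma, u)]
        gval = ghat(*x)
        gs = [d * s for d, s in zip(grad(*x), sigma)]  # chain rule to u-space
        norm2 = sum(c * c for c in gs)
        lam = (sum(c * ui for c, ui in zip(gs, u)) - gval) / norm2
        u = [lam * c for c in gs]
    return math.sqrt(sum(ui * ui for ui in u))

beta = form_beta((1.0, 1.0), (0.5, 0.5))
pf = 0.5 * math.erfc(beta / math.sqrt(2))  # FORM failure probability estimate
```

The point of the metamodel is that every call inside the HL-RF loop costs microseconds instead of a full mechanistic-model run.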


Author(s):  
Po Ting Lin ◽  
Wei-Hao Lu ◽  
Shu-Ping Lin

In the past few years, researchers have begun to investigate the existence of arbitrary uncertainties in design optimization problems. Most traditional reliability-based design optimization (RBDO) methods transform the design space to the standard normal space for reliability analysis but may not work well when the random variables are arbitrarily distributed, because the transformation to the standard normal space cannot be determined or the distribution type is unknown. The methods of Ensemble of Gaussian-based Reliability Analyses (EoGRA) and Ensemble of Gradient-based Transformed Reliability Analyses (EGTRA) have been developed to estimate the joint probability density function using an ensemble of kernel functions. EoGRA performs a series of Gaussian-based kernel reliability analyses and merges them to compute the reliability of the design point. EGTRA transforms the design space to a single-variate design space along the constraint gradient, where the kernel reliability analyses become much less costly. In this paper, a series of comprehensive investigations were performed to study the similarities and differences between EoGRA and EGTRA. The results showed that EGTRA performs accurate and effective reliability analyses for both linear and nonlinear problems. When the constraints are highly nonlinear, EGTRA may encounter some difficulty but can still be effective when started from deterministic optimal points. On the other hand, the sensitivity analyses of EoGRA may be ineffective when the random distribution lies completely inside the feasible or infeasible space. However, EoGRA can find acceptable design points when started from deterministic optimal points. Moreover, EoGRA delivers an estimated failure probability for each constraint during the optimization process, which may be convenient for some applications.
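The ensemble-of-kernels idea behind EoGRA can be sketched for a linear limit state, where the failure probability under each Gaussian kernel has a closed form; the samples, bandwidth, and constraint below are assumptions for illustration, not the authors' formulation.

```python
import math

def phi(z):
    # standard normal CDF
    return 0.5 * math.erfc(-z / math.sqrt(2))

# arbitrarily distributed data represented only by samples
samples = [(0.8, 1.1), (1.2, 0.9), (1.0, 1.4), (0.7, 0.8), (1.3, 1.2)]
h = 0.3                      # kernel bandwidth (assumed)

def g(x1, x2):               # linear limit state, failure when g < 0
    return 3.0 - x1 - x2

# ensemble of Gaussian-based reliability analyses: for linear g, the failure
# probability under each kernel N(s, h^2 I) is Phi(-g(s) / (h * ||grad g||))
a = (-1.0, -1.0)             # gradient of g
scale = math.sqrt(sum(ai * ai for ai in a)) * h
pf = sum(phi(-g(*s) / scale) for s in samples) / len(samples)
```

Averaging the per-kernel results is what lets the method report a failure probability without ever knowing the true (arbitrary) distribution type.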


2018 ◽  
Vol 12 (3) ◽  
pp. 181-187
Author(s):  
M. Erkan Kütük ◽  
L. Canan Dülger

An optimization study with kinetostatic analysis is performed on a hybrid seven-bar press mechanism. This study builds on previous work on the planar hybrid seven-bar linkage. Dimensional synthesis is performed, and optimum link lengths for the mechanism are found. The optimization is carried out using a genetic algorithm (GA); the Genetic Algorithm Toolbox is used together with the Optimization Toolbox in MATLAB®. Design variables and constraints are defined for the design optimization, the objective function is determined, and eight precision points are used. A seven-bar linkage system with two degrees of freedom is chosen as an example, with a metal stamping operation with a dwell as the case study. After the optimization is completed, the kinetostatic analysis is performed: all forces on the links and the crank torques are calculated for the hybrid system with the optimized link lengths.
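A toy, mutation-only GA sketch of the dimensional-synthesis workflow: minimize the squared error at eight precision points. The linear "mechanism output" below stands in for the real loop-closure kinematics, and every number is hypothetical; the paper itself uses the MATLAB® toolboxes.

```python
import random

random.seed(3)

# eight precision points the mechanism output should pass through (hypothetical)
targets = [(t / 7.0, 2.0 + 0.5 * (t / 7.0)) for t in range(8)]

def error(lengths):
    # toy stand-in for the structural error of a linkage: output modeled as
    # a + b*t; a real synthesis would evaluate loop-closure equations here
    a, b = lengths
    return sum((a + b * t - y) ** 2 for t, y in targets)

pop = [[random.uniform(0, 4), random.uniform(0, 4)] for _ in range(30)]
for gen in range(60):
    pop.sort(key=error)
    parents = pop[:10]                        # elitist selection
    children = [[p[i] + random.gauss(0, 0.1) for i in range(2)]
                for p in random.choices(parents, k=20)]  # Gaussian mutation
    pop = parents + children
best = min(pop, key=error)
```

Crossover is omitted to keep the sketch short; the elitism-plus-mutation loop is enough to show how the precision-point error drives the link dimensions.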


Author(s):  
Rama Subba Reddy Gorla

Heat transfer from a nuclear fuel rod bumper support was computationally simulated by a finite element method and probabilistically evaluated in view of the several uncertainties in the performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall heat transfer rates due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost-effective. The analysis leads to the selection of the appropriate measurements to be used in heat transfer and to the identification of both the most critical measurements and parameters.


2005 ◽  
Vol 297-300 ◽  
pp. 1882-1887
Author(s):  
Tae Hee Lee ◽  
Jung Hun Yoo

In practical design applications, most design variables, such as thickness, diameter, and material properties, are not deterministic but stochastic quantities that can be represented by mean values with variances because of various uncertainties. When the uncertainties related to the design variables and the manufacturing process are considered in engineering design, a specified reliability of the design can be achieved by using so-called reliability-based design optimization, which accounts for the uncertainties in the design in order to meet the specified reliability while seeking the optimal solution. Reliability-based design optimization of real systems is now an emerging technique for achieving reliability, robustness, and safety of the design. It is, however, well known that reliability-based design optimization can have multiple local optima and thus fail to converge to the specified reliability. To overcome this difficulty, a barrier function approach to reliability-based design optimization is proposed in this research; it always provides a feasible solution with the specified reliability index whenever one is available. To illustrate the proposed formulation, reliability-based design optimization of a bracket design is performed. The advanced mean value method and the first-order reliability method are employed for the reliability analysis, and their optimization results are compared with the reliability index approach in terms of accuracy and efficiency.
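A minimal sketch of a barrier-function treatment of the reliability constraint, assuming a linear limit state with unit-variance normal variables so the reliability index has a closed form; the cost function, target index, and grid-search "optimizer" are illustrative stand-ins, not the paper's formulation.

```python
import math

beta_t = 3.0  # target reliability index (assumed)

def beta(d1, d2):
    # reliability index for the linear limit state g = d1 + d2 - 5
    # with independent unit-variance normal variables
    return (d1 + d2 - 5.0) / math.sqrt(2.0)

def cost(d1, d2):
    return d1 ** 2 + d2 ** 2

def barrier_obj(d1, d2, mu=0.01):
    m = beta(d1, d2) - beta_t
    if m <= 0:
        return float("inf")        # barrier keeps iterates strictly feasible
    return cost(d1, d2) - mu * math.log(m)

# crude grid minimization stands in for the optimizer's inner solver
grid = [i * 0.02 for i in range(100, 400)]
best = min((barrier_obj(a, b), a, b) for a in grid for b in grid)
```

The log barrier blows up as the reliability margin approaches zero, which is what forces every iterate to keep the specified reliability index, the behavior the abstract highlights.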


2011 ◽  
Vol 18 (2) ◽  
pp. 223-234 ◽  
Author(s):  
R. Haas ◽  
K. Born

Abstract. In this study, a two-step probabilistic downscaling approach is introduced and evaluated. As an example, the method is applied to precipitation observations in the subtropical mountain environment of the High Atlas in Morocco. The challenge is to deal with complex terrain, heavily skewed precipitation distributions, and data that are sparse both spatially and temporally. In the first step of the approach, a transfer function between distributions of large-scale predictors and local observations is derived, with the aim of forecasting cumulative distribution functions with parameters estimated from known data. In order to interpolate between sites, the second step applies multiple linear regression to the distribution parameters of the observed data using local topographic information. By combining both steps, a prediction at every point of the investigation area is achieved. Both steps and their combination are assessed by cross-validation and by splitting the available dataset into a training and a validation subset. Because it estimates quantiles and the probability of zero daily precipitation, this approach is found to be adequate even in areas with difficult topographic circumstances and low data availability.


Author(s):  
Kisun Song ◽  
Kyung Hak Choo ◽  
Jung-Hyun Kim ◽  
Dimitri N. Mavris

In the modern automotive industry, many state-of-the-art methodologies exist for the conceptual design of a car; functional methods and 3D scanning technology are widely used. The issues frequently boil down to a trade-off between quality and cost. Moreover, to couple the design method with advanced optimization methodologies such as design of experiments (DOE) and surrogate modeling, how efficiently a method can morph or recreate a vehicle's shape is crucial. This paper accomplishes an aerodynamic design optimization of the rear shape of a sedan by combining a reverse shape design method (RSDM) with the aforementioned methodologies, based on CFD analysis, for aerodynamic drag reduction. RSDM reversely recovers the 3D geometry of a car from several 2D schematics: the backbone boundary lines of each 2D schematic are identified and regressed by an appropriate interpolation function, and the 3D shape is obtained by a series of simple arithmetic calculations without losing detailed geometric features. In addition, RSDM can parametrize every geometric entity to efficiently manipulate the shape in design optimization studies. As the baseline, an Audi A6 is modeled by RSDM and explored through CFD analysis for model validation. Choosing six design variables around the rear shape, 77 design points are created to build neural networks. Finally, a significant reduction of the drag coefficient (CD) is obtained, and the corresponding configuration is validated via CFD.
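A hedged sketch of the surrogate-based drag-minimization step, with a one-variable quadratic fit standing in for the paper's six-variable neural networks; the DOE points and CD values are invented for illustration.

```python
# surrogate-based drag minimization sketch: fit a quadratic CD model to a
# few DOE points of one rear-shape parameter (values purely illustrative)
doe_x = [0.0, 10.0, 20.0]      # e.g. a decklid angle in degrees (assumed)
doe_cd = [0.30, 0.27, 0.29]    # CFD drag coefficients at those points

def cd_hat(x):
    # exact quadratic through the three DOE points (Lagrange form)
    x0, x1, x2 = doe_x
    y0, y1, y2 = doe_cd
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# minimize the surrogate over the design range instead of re-running CFD
best_x = min((i * 0.1 for i in range(201)), key=cd_hat)
```

The surrogate predicts a minimum between the sampled angles, which is the whole value of the DOE-plus-surrogate loop: candidate optima are found cheaply and only the winner goes back to CFD for validation.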


2014 ◽  
Vol 984-985 ◽  
pp. 419-424
Author(s):  
P. Sabarinath ◽  
M.R. Thansekhar ◽  
R. Saravanan

Arriving at optimal solutions is one of the important tasks in engineering design. Many real-world design optimization problems involve multiple conflicting objectives, with design variables that are continuous or discrete in nature. Traditionally, the weighted-sum method is preferred for solving multi-objective optimization problems: all the objective functions are converted into a single objective function by assigning suitable weights to each of them. Its main drawback lies in the selection of proper weights. More recently, evolutionary algorithms have been used to find the non-dominated optimal solutions, called the Pareto optimal front, in a single run. In recent years, the Non-dominated Sorting Genetic Algorithm II (NSGA-II) has found increasing application to multi-objective problems with conflicting objectives because of its low computational requirements, elitism, and parameter-less sharing approach. In this work, we propose a methodology that integrates NSGA-II and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) for solving a two-bar truss problem. NSGA-II searches for the Pareto set, where the two-bar truss is evaluated in terms of minimizing the weight of the truss and minimizing the total displacement of the joint under the given load. Subsequently, TOPSIS selects the best compromise solution.
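TOPSIS selection of the best compromise from a Pareto set is a fully specified procedure, so it can be sketched directly; the Pareto points and criterion weights below are hypothetical, not the paper's truss results.

```python
import math

# hypothetical Pareto front for the two-bar truss: (weight, displacement),
# both to be minimized
pareto = [(10.0, 0.9), (14.0, 0.5), (20.0, 0.3), (30.0, 0.2)]
weights = (0.5, 0.5)

# vector-normalize each criterion column, then apply the weights
norms = [math.sqrt(sum(row[j] ** 2 for row in pareto)) for j in range(2)]
scaled = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
          for row in pareto]

ideal = [min(col) for col in zip(*scaled)]  # both criteria are costs
anti = [max(col) for col in zip(*scaled)]   # anti-ideal (worst) point

def closeness(r):
    # relative closeness to the ideal solution
    d_pos = math.dist(r, ideal)
    d_neg = math.dist(r, anti)
    return d_neg / (d_pos + d_neg)

best = max(range(len(pareto)), key=lambda i: closeness(scaled[i]))
```

The selected index is the compromise design: neither the lightest nor the stiffest truss, but the point closest (in the TOPSIS sense) to the ideal of both.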

