A Quadrature-Based Sampling Technique for Robust Design With Computer Models

Author(s):  
Daniel D. Frey ◽  
Geoff Reber ◽  
Yiben Lin

Several methods have been proposed for estimating transmitted variance to enable robust parameter design using computer models. This paper presents an alternative technique based on Gaussian quadrature which requires only 2n+1 or 4n+1 samples (depending on the accuracy desired), where n is the number of randomly varying inputs. The quadrature-based technique is assessed using a hierarchical probability model. The 4n+1 quadrature-based technique can estimate transmitted standard deviation within 5% in over 95% of systems, which is much better than the accuracy of Hammersley Sequence Sampling, Latin Hypercube Sampling, and the Quadrature Factorial Method under similar resource constraints. If the most accurate existing method, Hammersley Sequence Sampling, is afforded ten times the number of samples, it provides approximately the same degree of accuracy as the quadrature-based method. Two case studies on robust design confirm the main conclusions and also suggest that the quadrature-based method becomes more accurate as robustness improvements are made.
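As a rough illustration of the 2n+1 scheme, the transmitted variance of a model with n independent normal inputs can be estimated from one nominal run plus two one-at-a-time runs per input at three-point Gauss-Hermite nodes. The sketch below is an assumed simplification (additive per-input variance contributions), not the paper's exact formulation.

```python
import math

def quadrature_variance(f, mu, sigma):
    """Estimate the variance transmitted through f by independent normal
    inputs using 2n+1 runs: one shared nominal run plus two one-at-a-time
    runs per input at the 3-point Gauss-Hermite nodes mu_i +/- sqrt(3)*sigma_i."""
    n = len(mu)
    f0 = f(list(mu))                 # nominal run, reused for every input
    w0, w1 = 2.0 / 3.0, 1.0 / 6.0    # 3-point Gauss-Hermite weights
    var = 0.0
    for i in range(n):
        hi = list(mu); hi[i] += math.sqrt(3.0) * sigma[i]
        lo = list(mu); lo[i] -= math.sqrt(3.0) * sigma[i]
        fp, fm = f(hi), f(lo)
        mean_i = w0 * f0 + w1 * (fp + fm)       # per-input quadrature mean
        var += (w0 * (f0 - mean_i) ** 2
                + w1 * ((fp - mean_i) ** 2 + (fm - mean_i) ** 2))
    return var
```

For an additive model the scheme is exact: with f(x) = x1 + x2 + x3 and standard deviations 1, 2 and 3, the estimate recovers the analytic transmitted variance 1 + 4 + 9 = 14.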

2000 ◽  
Vol 123 (1) ◽  
pp. 11-17 ◽  
Author(s):  
Jianmin Zhu ◽  
Kwun-Lon Ting

The paper presents the theory of performance sensitivity distribution and a novel robust parameter design technique. In the theory, a Jacobian matrix describes the effect of component tolerances on system performance, and the performance distribution is characterized in the variation space by a set of eigenvalues and eigenvectors. Thus, the feasible performance space is depicted as an ellipsoid. The size, shape, and orientation of the ellipsoid describe the quantity as well as the quality of the feasible space and, therefore, the performance sensitivity distribution against tolerance variation. The robustness of a design is evaluated by comparing the fit between the ellipsoidal feasible space and the tolerance space, which is a rectangular block, through a set of quantitative and qualitative indexes. The robust design can then be determined. The design approach is demonstrated on a mechanism design problem. Because of the generality of the analysis theory, the method can be used in any design situation as long as the relationship between performance and design variables can be expressed analytically.
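The ellipsoid construction can be sketched numerically: the image of the unit tolerance ball under the Jacobian is an ellipsoid whose axis directions and semi-axis lengths come from a singular value (equivalently, eigen-) decomposition. The Jacobian values below are invented for illustration.

```python
import numpy as np

# Hypothetical 2x2 Jacobian: rows are performance measures,
# columns are component tolerances (values invented for illustration).
J = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# The unit tolerance ball maps under J to an ellipsoid whose axes are
# the left singular vectors and whose semi-axis lengths are the
# singular values, i.e. the square roots of the eigenvalues of J J^T.
U, s, Vt = np.linalg.svd(J)
semi_axes = s        # ellipsoid semi-axis lengths, largest first
axes = U.T           # each row is one ellipsoid axis direction
```

A large ratio between the singular values indicates an elongated ellipsoid, i.e. a strongly direction-dependent sensitivity to tolerance variation.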


2016 ◽  
Vol 18 (6) ◽  
pp. 1007-1018
Author(s):  
M. A. Aziz ◽  
M. A. Imteaz ◽  
H. M. Rasel ◽  
M. Samsuzzoha

A novel ‘Comb Separator’ was developed and tested with the aim of improving sewer solids capture efficiency and reducing blockages on the screen. Experimental results were compared against the industry-standard ‘Hydro-Jet™’ screen. Analysing the parameter sensitivity of a hydraulic screen is standard practice for gaining a better understanding of device performance. To understand the uncertainties of the Comb Separator's input parameters, it is necessary to undertake sensitivity analysis; this will assist in making informed decisions regarding the use of the device and validate its performance in urban sewerage overflow scenarios. The methodology combines multiple linear regression with the standard Latin hypercube sampling technique to perform sensitivity analysis on the experimental parameters: flowrate, effective comb spacing, device runtime, weir opening and comb layers. The input parameters ‘weir opening’ and ‘comb layers’ have an insignificant influence on capture efficiency and were therefore omitted from further analysis. Among the remaining input parameters, ‘effective spacing’ was the most influential, followed by ‘inflow’ and ‘runtime’. These analyses provide better insight into the parameter sensitivities for practical application and will assist device managers and operators in making informed decisions.
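The workflow (Latin hypercube sampling followed by regression-based sensitivity ranking) can be sketched as below. The capture-efficiency surrogate and its coefficients are invented for illustration, with the three inputs standing in for effective spacing, inflow and runtime.

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified sample per equal-probability bin in each dimension,
    with the bin order shuffled independently per dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])
    return u  # values in (0, 1)

# Hypothetical capture-efficiency surrogate in three inputs
# (effective spacing, inflow, runtime); the functional form and
# coefficients are invented for illustration.
X = latin_hypercube(200, 3, rng)
y = 0.9 - 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]
y += rng.normal(0.0, 0.01, size=len(y))        # measurement noise

# Standardized regression coefficients rank input influence.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
beta, *_ = np.linalg.lstsq(np.c_[np.ones(len(ys)), Xs], ys, rcond=None)
ranking = np.argsort(-np.abs(beta[1:]))        # most influential input first
```

With these assumed coefficients the ranking reproduces the abstract's ordering: the first input (effective spacing) dominates, followed by the second and third.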


2012 ◽  
Vol 184-185 ◽  
pp. 316-319
Author(s):  
Liang Bo Ao ◽  
Lei Li ◽  
Yuan Sheng Li ◽  
Zhi Xun Wen ◽  
Zhu Feng Yue

The multi-objective design optimization of a cooling turbine blade is studied using a Kriging model. The optimization model takes the pin-fin diameter at the trailing edge of the blade and the location, width, and height of the rib as design variables, and the blade body temperature, flow resistance loss, and aerodynamic efficiency as optimization objectives. Sample points are selected using the Latin hypercube sampling technique, the approximate model is built with the Kriging method, and the set of Pareto-optimal solutions is obtained by applying the elitist non-dominated sorting genetic algorithm (NSGA-II) to the multi-objective optimization model based on the approximate model. The results show that the conflicts among the optimization objectives are resolved effectively, demonstrating the feasibility of the optimization method.
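A Kriging surrogate of this kind can be sketched as Gaussian-process interpolation. The 1-D test function below stands in for an expensive blade simulation, and all names and values are illustrative assumptions (a real study would sample the design points by Latin hypercube and hand the surrogate to NSGA-II).

```python
import numpy as np

def rbf(a, b, length=0.2):
    """Squared-exponential correlation between two point sets."""
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-0.5 * np.sum(d * d, axis=2) / length ** 2)

f = lambda x: np.sin(6.0 * x[:, 0])       # stand-in for an expensive simulation
X = np.linspace(0.0, 1.0, 15)[:, None]    # design points (LHS in a real study)
y = f(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))     # small nugget for conditioning
alpha = np.linalg.solve(K, y)             # fit: weights of the interpolant

def predict(Xnew):
    """Kriging (Gaussian-process) posterior mean at new points."""
    return rbf(Xnew, X) @ alpha

Xt = np.linspace(0.0, 1.0, 101)[:, None]
err = np.max(np.abs(predict(Xt) - f(Xt))) # surrogate accuracy on a dense grid
```

The surrogate interpolates the training runs almost exactly and remains accurate between them, which is what makes it usable as a cheap stand-in objective inside a genetic algorithm.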


Author(s):  
Hiroto Itoh ◽  
Xiaoyu Zheng ◽  
Hitoshi Tamaki ◽  
Yu Maruyama

The influence of in-vessel melt progression on the uncertainty of source terms was examined in an uncertainty analysis with the integral severe accident analysis code MELCOR (Ver. 1.8.5), taking the accident at Unit 2 of the Fukushima Daiichi nuclear power plant as an example. The 32 parameters selected from a rough screening analysis were sampled by the Latin hypercube sampling technique in accordance with the uncertainty distributions specified for each parameter. The uncertainty distributions of the outputs, including the source terms of the representative radioactive materials (Cs, CsI, Te and Ba), the total mass of in-vessel H2 generation and the total debris mass released from the reactor pressure vessel to the drywell, were obtained through the uncertainty analysis under the assumption of drywell failure. Based on various types of correlation coefficients for each parameter, 9 significant uncertain parameters potentially dominating the source terms were identified. These 9 parameters were carried forward to the subsequent sensitivity and uncertainty analyses, in which the influence of the transportation of radioactive materials was taken into account.


1996 ◽  
Vol 465 ◽  
Author(s):  
Christian Ekberg ◽  
Allan T. Emrén ◽  
Anders Samuelsson

ABSTRACT The use of computer simulations in the performance assessment of a repository for spent nuclear fuel is in many cases the only method of obtaining information on how the rock-repository system will work. One important factor is the solubility of the elements released if the repository is breached. This solubility may be determined experimentally or simulated. If it is simulated, several factors such as thermodynamic uncertainties will affect the reliability of the results. If these uncertainties are assumed to be small, the composition of the water used in the calculations may play a major part in the uncertainties in solubility. The water composition, in turn, is either determined experimentally or calculated through water-rock interactions. Thus, if the mineral composition of the rock is known, it is possible to foresee the water composition. However, in most cases the rock composition is determined from drilling cores and is thus quite uncertain. Therefore, if solubility calculations are to be based on water properties calculated from rock-water interactions, another uncertainty is introduced. This paper focuses on uncertainty and sensitivity analysis of rock-water interaction simulations; the uncertainties thus obtained are propagated through a program performing uncertainty and sensitivity analysis of the solubility calculations. In both cases the Latin hypercube sampling technique has been used. The results show that the solubilities are in most cases log-normally distributed, while the different elements in the simulated groundwater in some cases diverge significantly from such a distribution. The numerical results are comforting in that the uncertainty intervals of the solubilities are rather small, i.e. up to 30%.


2017 ◽  
Vol 28 (1) ◽  
pp. 30-31
Author(s):  
Abu Tarek Iqbal ◽  
Jalal Uddin ◽  
Dhiman Banik ◽  
Salehuddin ◽  
Hasan Mamun ◽  
...  

Many studies have been conducted on this topic worldwide, but none in Chittagong, Bangladesh. We conducted this study to determine the pattern of coronary artery stenosis in Chittagong, as this is important for effective case management. It was an observational study. A convenience sampling technique was used, and the sample size was fixed at 110 considering resource constraints. All cases were diagnosed on the basis of history, clinical features and laboratory investigations. Coronary artery angiograms were methodically conducted. All relevant data were recorded and managed manually. The findings were validated statistically, discussed against an updated literature review, and conclusions were drawn. In total, 110 cases were studied. Stenosis was found in 77 (70%) cases; 83% were male and 17% were female. The age range was 30-80 years, with 76% of cases in the 40-60 years age group. Among the stenosed cases, SVD accounted for 29%, DVD 20% and TVD 20%, and only 1% involved the LMCA alone. The commonest stenosed vessel was the LAD (71%), followed by the RCA (60%), LCX (58%) and LMCA (6%). Normal ECGs were found in 47% of stenosed cases, and the ejection fraction of 57% of stenosed cases was >55%. The results do not differ significantly from studies at home and abroad. The limitation is the small sample size, so a large-scale multicenter study is advocated to reach a conclusive opinion. Medicine Today 2016 Vol.28(1): 30-31




2014 ◽  
Vol 635-637 ◽  
pp. 274-280
Author(s):  
Wei Zhao ◽  
Na Zhou ◽  
Yi Min Zhang

Based on Sobol’ theory and the Homma and Saltelli indicator system, the path parameters were divided into different subsets for vibration transfer path systems with random paths in this paper. An objective function was constructed and decomposed, yielding variance-based global sensitivity measures. The first-order and total sensitivities of the various parameter subsets were calculated using the Monte Carlo simulation method combined with the Latin hypercube sampling technique. The analysis example shows that this approach is feasible for analyzing the interaction of paths or path parameters in vibration transfer path systems with uncertainty.
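First-order and total variance-based sensitivities of this kind can be estimated by the pick-and-freeze Monte Carlo scheme sketched below (plain random draws here; Latin hypercube draws could replace them). The additive test function is invented for illustration and has analytic indices of 0.2 and 0.8.

```python
import numpy as np

rng = np.random.default_rng(0)

def sobol_indices(f, n_dims, n_samples, rng):
    """First-order (Saltelli) and total (Jansen) Sobol' indices by the
    pick-and-freeze Monte Carlo estimator, uniform inputs on [0, 1)."""
    A = rng.random((n_samples, n_dims))
    B = rng.random((n_samples, n_dims))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S1, ST = np.empty(n_dims), np.empty(n_dims)
    for i in range(n_dims):
        ABi = A.copy()
        ABi[:, i] = B[:, i]              # freeze all inputs except x_i
        fABi = f(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / var        # first-order effect
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total effect
    return S1, ST

# Additive test function x0 + 2*x1: analytic indices are 0.2 and 0.8,
# and with no interactions the first-order and total indices coincide.
S1, ST = sobol_indices(lambda x: x[:, 0] + 2.0 * x[:, 1], 2, 20000, rng)
```

A gap between a parameter's total and first-order index signals interaction effects, which is exactly what the abstract's subset analysis probes.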


2018 ◽  
Vol 2 (4) ◽  
pp. 59 ◽  
Author(s):  
Rafael Rosa ◽  
Maria Loja ◽  
Alda Carvalho

Functionally graded composite materials may constitute an advantageous alternative for engineering applications, combining a customized tailoring capability with their inherently continuous property transitions. However, these attractive characteristics must account for the uncertainty that affects the physical quantities of these materials and their structures. It is therefore important to analyze how this uncertainty modifies the predicted deterministic response of a structure built with these materials, identifying which parameters are responsible for the greater impact. To pursue this objective, the material and geometrical parameters that characterize a plate made of an exponentially graded material are generated according to a random multivariate normal distribution, using the Latin hypercube sampling technique. A set of finite element analyses based on the first-order shear deformation theory is then performed to characterize the linear static responses of these plates, which are further correlated to the input parameters. This work also builds statistical models to allow their use as alternative prediction models. The results show that, for the plates analyzed, the uncertainty associated with the elasticity modulus of both phases is mainly responsible for the variability of the maximum transverse deflection. The effectiveness of the statistical models that are built is also shown.
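Generating normally distributed parameters by Latin hypercube sampling can be sketched by mapping stratified uniforms through the standard-normal quantile function and coloring with a Cholesky factor of the covariance. The mean and covariance values below (elasticity moduli of the two phases) are invented for illustration.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)

def lhs_multivariate_normal(mean, cov, n_samples, rng):
    """Latin hypercube draws mapped to a multivariate normal:
    stratified uniforms -> standard-normal quantiles -> Cholesky coloring."""
    n_dims = len(mean)
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])            # decorrelate the strata across dimensions
    u = np.clip(u, 1e-12, 1.0 - 1e-12)  # guard the quantile function's domain
    z = np.vectorize(NormalDist().inv_cdf)(u)   # standard-normal scores
    L = np.linalg.cholesky(cov)
    return np.asarray(mean) + z @ L.T

# Hypothetical plate parameters: elasticity moduli of the two phases
# (means, spreads and independence are invented for illustration).
mean = np.array([70e9, 200e9])
cov = np.diag([7e9, 20e9]) ** 2     # independent phases: diagonal covariance
samples = lhs_multivariate_normal(mean, cov, 500, rng)
```

Stratifying each marginal makes the sample moments converge much faster than plain random draws, which is why LHS is a common front end to finite element uncertainty studies like this one.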

