Centered $L_2$-discrepancy of random sampling and Latin hypercube design, and construction of uniform designs

2000 ◽  
Vol 71 (237) ◽  
pp. 275-297 ◽  
Author(s):  
Kai-Tai Fang ◽  
Chang-Xing Ma ◽  
Peter Winker
2003 ◽  
Vol 125 (2) ◽  
pp. 210-220 ◽  
Author(s):  
G. Gary Wang

This paper addresses the difficulty of the previously developed Adaptive Response Surface Method (ARSM) with high-dimensional design problems. ARSM was developed to search for the global design optimum in computation-intensive design problems. The method uses Central Composite Design (CCD), which requires a number of design experiments that grows exponentially with the problem dimension. In addition, ARSM generates a completely new set of CCD points in each gradually reduced design space. These two factors greatly undermine the efficiency of ARSM. In this work, Latin Hypercube Design (LHD) is used to generate saturated design experiments. Because of the use of LHD, historical design experiments can be inherited in later iterations; as a result, ARSM requires only a limited number of design experiments even for high-dimensional design problems. The improved ARSM is tested on a group of standard test problems and then applied to an engineering design problem. In both the tests and the design application, a significant improvement in the efficiency of ARSM is realized. The improved ARSM shows strong potential as a practical global optimization tool for computation-intensive design problems. Inheriting LHD points, as a general sampling strategy, can be integrated into other approximation-based design optimization methodologies.
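As a rough illustration of the two ingredients the abstract relies on, the sketch below shows a basic Latin hypercube sample and a simple way to reuse previously evaluated points when the design space shrinks. This is not the paper's ARSM implementation; the helpers `latin_hypercube` and `inherit_points`, the unit-hypercube domain, and the scalar bounds are assumptions made purely for illustration.

```python
# Minimal sketch of Latin hypercube sampling and point inheritance
# (illustrative only; not the ARSM code described in the paper).
import numpy as np

def latin_hypercube(n, d, rng=None):
    """Return an (n, d) Latin hypercube sample in the unit hypercube [0, 1]^d.
    Each dimension is split into n equal strata; one point is drawn per
    stratum, with strata independently permuted across dimensions."""
    rng = np.random.default_rng(rng)
    u = rng.random((n, d))                        # random offset within each stratum
    perms = np.argsort(rng.random((n, d)), axis=0)  # per-dimension stratum permutations
    return (perms + u) / n

def inherit_points(old_points, lower, upper):
    """Keep previously evaluated points that fall inside a reduced design
    space [lower, upper], so their (expensive) responses can be reused and
    only the remainder of the design needs new experiments."""
    mask = np.all((old_points >= lower) & (old_points <= upper), axis=1)
    return old_points[mask]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = latin_hypercube(20, 3, rng)          # initial 20-point design in 3-D
    kept = inherit_points(X, 0.2, 0.8)       # points surviving a space reduction
    print(X.shape, kept.shape)
```

In an ARSM-style loop, the points returned by `inherit_points` would count toward the saturated design in the reduced space, so only the shortfall needs to be filled with new experiments.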


2012 ◽  
Vol 44 (5) ◽  
pp. 551-564 ◽  
Author(s):  
Huaguang Zhu ◽  
Li Liu ◽  
Teng Long ◽  
Lei Peng

Author(s):  
Augusto Hernandez-Solis ◽  
Christian Ekberg ◽  
Arvid Ödegård Jensen ◽  
Christophe Demaziere ◽  
Ulf Bredolt

In recent years, more realistic safety analyses of nuclear reactors have been based on best estimate (BE) computer codes. Because their predictions are unavoidably affected by conceptual, aleatory, and experimental sources of uncertainty, an uncertainty analysis is needed if useful conclusions are to be obtained from BE codes. In this paper, statistical uncertainty analyses of cross-section averaged void fraction calculations with the POLCA-T system code, based on the BWR Full-Size Fine-Mesh Bundle Test (BFBT) benchmark, are presented for two different sampling strategies: Latin Hypercube Sampling (LHS) and Simple Random Sampling (SRS). LHS densely stratifies the range of each input probability distribution, giving much better coverage of the input uncertainties than SRS. The aim here is to compare the two uncertainty analyses for the steady-state BWR assembly axial void fraction profile and for the transient void fraction at a given axial level in a simulated recirculation pump trip scenario. It is shown that the replicated void fraction mean (in both steady-state and transient conditions) has less variability with LHS than with SRS for the same number of calculations (i.e., the same input-space sample size), even when the resulting void fraction axial profiles are non-monotonic. It is also shown that the void fraction uncertainty limits obtained with SRS from 458 calculations (the sample size required to cover 95% of the 8 uncertain input parameters with 95% confidence) are matched by LHS with only 100 calculations. These results are clear indications of the advantages of using LHS.
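A toy numerical experiment can make the key claim concrete: with the same sample size, the mean estimated from an LHS design fluctuates less across replicates than one estimated from SRS. The sketch below is not the POLCA-T/BFBT analysis; the stand-in response function, the 8 uniform inputs, and the sample sizes are assumptions chosen only to illustrate the variance-reduction effect.

```python
# Toy comparison of replicated-mean variability under SRS vs. LHS
# (illustrative only; not the uncertainty analysis described in the paper).
import numpy as np

rng = np.random.default_rng(1)
D, N, REPLICATES = 8, 100, 200   # 8 uncertain inputs, 100 runs per replicate

def model(x):
    # Hypothetical smooth response standing in for an expensive code run.
    return np.sum(np.sin(np.pi * x), axis=1) + 0.1 * np.prod(x, axis=1)

def srs(n, d):
    # Simple random sampling on the unit hypercube.
    return rng.random((n, d))

def lhs(n, d):
    # Latin hypercube sampling: one point per stratum, strata permuted per dimension.
    perms = np.argsort(rng.random((n, d)), axis=0)
    return (perms + rng.random((n, d))) / n

means_srs = [model(srs(N, D)).mean() for _ in range(REPLICATES)]
means_lhs = [model(lhs(N, D)).mean() for _ in range(REPLICATES)]

print(f"std of replicated mean, SRS: {np.std(means_srs):.4f}")
print(f"std of replicated mean, LHS: {np.std(means_lhs):.4f}")
```

Running this typically shows a markedly smaller spread of the replicated mean under LHS, which is the same qualitative behaviour the abstract reports for the void fraction predictions.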

