Monte Carlo simulation of air sampling methods for the measurement of radon decay products

2017 ◽  
Vol 126 ◽  
pp. 4-8 ◽  
Author(s):  
Octavian Sima ◽  
Aurelian Luca ◽  
Maria Sahagia
2017 ◽  
Author(s):  
Στυλιανός Λιοδάκης

Uncertainty is endemic in geospatial data due to the imperfect means of recording, processing, and representing spatial information. Propagating the inherent uncertainty of geospatial model inputs to uncertainty in model predictions is a critical requirement in any model-based impact assessment and in risk-conscious policy decision-making. It is still extremely difficult, however, to perform uncertainty analysis of model outputs in practice, particularly for complex spatially distributed environmental models, partly because of computational constraints.

In the field of groundwater hydrology, the "stochastic revolution" has produced an enormous number of theoretical publications and greatly influenced our perspective on uncertainty and heterogeneity; it has had relatively little impact, however, on practical modeling. Monte Carlo simulation using simple random (SR) sampling from a multivariate distribution is one of the most widely used families of methods for uncertainty propagation in hydrogeological flow and transport model predictions, the other being analytical propagation. Real-life hydrogeological problems, however, involve complex, non-linear, three-dimensional groundwater models with millions of nodes and irregular boundary conditions. The number of Monte Carlo runs required in these cases depends on the number of uncertain parameters and on the relative accuracy required for the distribution of model predictions. In the context of sensitivity studies, inverse modelling, or Monte Carlo analyses, the ensuing computational burden is usually overwhelming and computationally impractical. These tough computational constraints have to be relaxed or removed before meaningful stochastic groundwater modeling applications are possible.

A computationally efficient alternative to classical Monte Carlo simulation based on SR sampling is Latin hypercube (LH) sampling, a form of stratified random sampling. The latter yields a more representative distribution of model outputs (in terms of smaller sampling variability of their statistics) for the same number of simulated input realizations. The ability to generate unbiased LH realizations becomes critical in a spatial context, where random variables are geo-referenced and exhibit spatial correlation, in order to ensure unbiased outputs of complex models. In this regard, this dissertation offers a detailed analysis of LH sampling and compares it with SR sampling in a hydrogeological context. Additionally, two alternative stratified sampling methods, here named stratified likelihood (SL) sampling and minimum energy (ME) sampling, are proposed in a spatial context and their efficiency is compared with that of SR and LH sampling in a hydrogeological setting, also accounting for the uncertainty related to the particular model at hand via a two-step sampling method. All three stratified sampling methods (accounting for model sensitivity in the second case study) were found in this work to be more efficient than simple random sampling.

Additionally, this thesis proposes a novel method for expanding the application domain of LH sampling to very large regular grids, which is the common case in environmental (hydrogeological or not) models. More specifically, a novel combination of Stein's Latin hypercube sampling with a Monte Carlo simulation method applicable to highly discretized domains is proposed, and its performance is validated in 2D and 3D hydrogeological flow and transport problems in a moderately heterogeneous porous medium, both consisting of about 1 million nodes. Last, an additional novel extension of the proposed LH sampling on large grids is adopted for conditional, highly discretized problems. In this case too, the performance of the proposed approach is evaluated in a 3D hydrogeological model of flow and transport. Results indicate that both extensions of LH sampling on large grids (conditional and unconditional) facilitate efficient uncertainty propagation with fewer model runs, owing to more representative model inputs. Overall, it could be argued that all the proposed methodological approaches could reduce the time and computer resources required to perform uncertainty analysis in hydrogeological flow and transport problems. Additionally, since this is the first time that stratified sampling has been performed over highly discretized domains, the proposed extensions of LH sampling on large grids could be considered a milestone for future uncertainty analysis efforts. Moreover, all the proposed stratified methods could contribute to a wider application of uncertainty analysis in a Monte Carlo framework to any spatially distributed impact assessment study.
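The core idea contrasted in this abstract can be illustrated with a minimal sketch (not the thesis code): propagating two uncertain inputs through a toy scalar model using simple random sampling versus Latin hypercube sampling, and comparing the sampling variability of the estimated mean output. The model `toy_model` and the parameter ranges are hypothetical stand-ins, not the groundwater flow and transport models used in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(k, q):
    # Illustrative stand-in for a groundwater flow/transport model output.
    return np.exp(-k) * q

def sr_sample(n, d=2):
    # Simple random (SR) sampling: independent uniform draws for each input.
    return rng.random((n, d))

def lh_sample(n, d=2):
    # Latin hypercube (LH) sampling: stratify [0, 1] into n equal-probability
    # intervals per dimension, draw one point per interval, then shuffle the
    # strata independently in each dimension so the pairings are random.
    pts = (np.arange(n) + rng.random((d, n))) / n
    for row in pts:
        rng.shuffle(row)
    return pts.T

def propagate(u):
    # Map uniforms to illustrative parameter ranges, then run the toy model.
    k = 0.5 + 2.0 * u[:, 0]   # hypothetical decay-rate range
    q = 1.0 + 4.0 * u[:, 1]   # hypothetical source-strength range
    return toy_model(k, q)

# Compare the spread of the estimated mean output over repeated designs:
# LH typically shows a smaller spread for the same number of model runs.
n, reps = 50, 200
sr_means = [propagate(sr_sample(n)).mean() for _ in range(reps)]
lh_means = [propagate(lh_sample(n)).mean() for _ in range(reps)]
print("SR std of estimated mean:", np.std(sr_means))
print("LH std of estimated mean:", np.std(lh_means))
```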


Author(s):  
H. R. Millwater ◽  
A. J. Smalley ◽  
Y.-T. Wu ◽  
T. Y. Torng ◽  
B. F. Evans

This paper reports on advanced computational techniques for probabilistic analysis of turbomachinery. The requirements for probabilistic analysis are described and several solution methods are summarized. The traditional probabilistic analysis method, Monte Carlo simulation, and two advanced techniques, the Advanced Mean Value (AMV) method and importance sampling, are discussed. The performance of the Monte Carlo, AMV, and importance sampling methods is explored through a forced-response analysis of a two-degree-of-freedom Jeffcott rotor model. Variations in rotor weight, shaft length, shaft diameter, Young’s modulus, foundation stiffness, bearing clearance, viscosity, and length are considered. The cumulative distribution function of transmitted force is computed using Monte Carlo simulation and AMV at several rotational speeds. Importance sampling is also used to compute the probability of the transmitted force exceeding a specified limit at several rotational speeds. In both cases, the AMV and importance sampling methods are shown to give accurate solutions with far fewer simulations than the Monte Carlo method. These methods enable the engineer to perform accurate and efficient probabilistic analysis of realistic, complex rotordynamic models.
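A minimal sketch (not the paper's rotor code) of the exceedance-probability comparison described above: crude Monte Carlo versus importance sampling with a proposal density shifted toward the failure region. The scalar `response` function, the input distribution, and the limit value are hypothetical surrogates, not the Jeffcott rotor model.

```python
import numpy as np

rng = np.random.default_rng(0)

def response(x):
    # Hypothetical transmitted-force surrogate of one standard-normal input.
    return 10.0 + 2.5 * x + 0.4 * x**2

limit = 20.0
n = 20_000

# Crude Monte Carlo: sample directly from the true input density.
x = rng.standard_normal(n)
p_mc = np.mean(response(x) > limit)

# Importance sampling: sample from a normal shifted toward the failure
# region and reweight each sample by the density ratio f(x) / h(x).
shift = 3.0
y = rng.normal(loc=shift, size=n)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - shift) ** 2)  # ratio of normal pdfs
p_is = np.mean((response(y) > limit) * w)

print(f"crude Monte Carlo estimate:  {p_mc:.5f}")
print(f"importance sampling estimate: {p_is:.5f}")
```

For small exceedance probabilities, the reweighted estimator places most samples where failures occur, which is why it converges with far fewer simulations than crude Monte Carlo.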


2018 ◽  
Vol 24 (3) ◽  
pp. 165-178 ◽  
Author(s):  
Kenza Tamiti ◽  
Megdouda Ourbih-Tari ◽  
Abdelouhab Aloui ◽  
Khelidja Idjis

Abstract This paper deals with Monte Carlo simulation and focuses on the use of the concepts of variance reduction, relative error, and bias in testing the performance of estimators for stationary M/G/1 retrial queues, using either Random Sampling (RS) or Refined Descriptive Sampling (RDS) to generate the input samples. For this purpose, software was designed and implemented in C under Linux, providing the performance measures of such a system together with the statistical measures of bias, relative error, and accuracy under both sampling methods. In conclusion, it is shown that the estimators of stationary M/G/1 retrial queues perform better under RDS than under RS, sometimes by a substantial variance reduction factor.
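A minimal sketch (not the paper's software) of the variance-reduction idea behind descriptive input generation: a single-server FIFO queue simulated via Lindley's recursion, fed with uniforms drawn either randomly or from a simple descriptive (stratified) scheme. This is not the Refined Descriptive Sampling algorithm itself, the retrial mechanism is omitted, exponential service is used for simplicity (the M/M/1 special case of M/G/1), and the traffic parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n = 0.8, 1.0, 500          # arrival rate, service rate, customers

def descriptive_uniforms(n):
    # One point at the centre of each of n equal strata, in shuffled order.
    u = (np.arange(n) + 0.5) / n
    rng.shuffle(u)
    return u

def mean_wait(u_arr, u_srv):
    # Lindley's recursion: W_{i+1} = max(0, W_i + S_i - A_{i+1}).
    inter = -np.log(1.0 - u_arr) / lam   # exponential inter-arrival times
    serv = -np.log(1.0 - u_srv) / mu     # exponential service times
    w, waits = 0.0, []
    for a, s in zip(inter[1:], serv[:-1]):
        w = max(0.0, w + s - a)
        waits.append(w)
    return np.mean(waits)

reps = 100
rs = [mean_wait(rng.random(n), rng.random(n)) for _ in range(reps)]
ds = [mean_wait(descriptive_uniforms(n), descriptive_uniforms(n)) for _ in range(reps)]
print("RS estimator mean/std:", np.mean(rs), np.std(rs))
print("DS estimator mean/std:", np.mean(ds), np.std(ds))
```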


Author(s):  
Ryuichi Shimizu ◽  
Ze-Jun Ding

Monte Carlo simulation has become one of the most powerful tools for describing electron scattering in solids, leading to a more comprehensive understanding of the complicated mechanisms by which the various types of signals used in microbeam analysis are generated. The present paper proposes a practical model for the Monte Carlo simulation of the scattering processes of a penetrating electron and the generation of slow secondary electrons in solids. The model is based on the combined use of Gryzinski’s inner-shell electron excitation function and the dielectric function, which accounts for the valence-electron contribution to inelastic scattering, while cross-sections derived by the partial-wave expansion method are used to describe elastic scattering. One improvement gained from this elastic scattering cross-section is the successful description of the anisotropy of the angular distribution of electrons elastically backscattered from Au in the low-energy region, shown in Fig. 1. Fig. 1(a) shows the elastic cross-section of 600 eV electrons for a single Au atom, clearly indicating that the angular distribution is no longer smooth, as would be expected from the Rutherford scattering formula, but exhibits so-called lobes at large scattering angles.
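A minimal sketch (not the authors' code) of one building block of such a simulation: sampling the polar elastic-scattering angle by inverse-transform sampling on a tabulated differential cross-section. A screened-Rutherford shape with an illustrative screening parameter is used here only as a stand-in; in the model described above the table would come from a partial-wave calculation, which is what produces the large-angle lobes.

```python
import numpy as np

rng = np.random.default_rng(7)

theta = np.linspace(0.0, np.pi, 2000)    # polar angle grid
eta = 0.05                               # illustrative screening parameter
# dsigma/dtheta ∝ sin(theta) * dsigma/dOmega for a screened-Rutherford shape
dcs = np.sin(theta) / (1.0 - np.cos(theta) + 2.0 * eta) ** 2

cdf = np.cumsum(dcs)
cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])  # normalise the cumulative table

def sample_theta(n):
    # Inverse-transform sampling on the tabulated cumulative distribution.
    return np.interp(rng.random(n), cdf, theta)

angles = sample_theta(100_000)
print("mean polar scattering angle (rad):", angles.mean())
```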


Author(s):  
D. R. Liu ◽  
S. S. Shinozaki ◽  
R. J. Baird

The epitaxially grown (GaAs)Ge thin film has attracted much interest because it is one of the metastable alloys of III-V compound semiconductors with germanium and a possible candidate for optoelectronic applications. It is important to be able to determine the composition of the film accurately, particularly whether or not the GaAs component is stoichiometric, but x-ray energy dispersive analysis (EDS) cannot readily meet this need. The thickness of the film is usually about 0.5-1.5 μm. If the Kα peaks are used for quantification, the accelerating voltage must be more than 10 kV for these peaks to be excited. At this voltage, the generation depth of the x-ray photons approaches 1 μm, as evidenced by a Monte Carlo simulation and actual x-ray intensity measurements discussed below. If a lower voltage is used to reduce the generation depth, the L peaks have to be used instead. But these L peaks merge into one broad hump, simply because the atomic numbers of these three elements are relatively small and close together, and the EDS energy resolution is limited.
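As a rough analytical cross-check on the ~1 μm generation depth quoted above (not the authors' Monte Carlo simulation), the Kanaya-Okayama formula gives the total electron range in a material; the x-ray generation depth is somewhat smaller than this range. The sketch below evaluates it for pure Ge at 10 kV using standard handbook constants.

```python
def kanaya_okayama_range_um(E_keV, A, Z, rho):
    # R [um] = 0.0276 * A * E^1.67 / (Z^0.89 * rho), E in keV, rho in g/cm^3.
    return 0.0276 * A * E_keV**1.67 / (Z**0.89 * rho)

# Ge: A = 72.6 g/mol, Z = 32, density = 5.32 g/cm^3, beam energy 10 keV.
print(f"Ge at 10 kV: ~{kanaya_okayama_range_um(10.0, 72.6, 32, 5.32):.2f} um")
```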

