Variance Reduction for Simulating Transient GI/G/1 Behavior

1996 ◽  
Vol 10 (2) ◽  
pp. 197-205 ◽  
Author(s):  
Søren Asmussen ◽  
Chia-Li Wang

A variety of methods for reducing the variance of Monte Carlo estimators of the expected waiting time W_n of the nth customer in a GI/G/1 queue are studied. The ideas involve Spitzer's identity, importance sampling, and sums with stratified or controlled randomized length.
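For orientation, the sketch below shows the crude Monte Carlo baseline that such variance-reduction ideas aim to improve: E[W_n] estimated by simulating the Lindley recursion for an M/M/1 special case of the GI/G/1 queue. The arrival and service rates (lam, mu) and the sample sizes are hypothetical, and this is not the paper's method, only the plain estimator it starts from.

```python
import numpy as np

rng = np.random.default_rng(0)

def waiting_time(n, lam=1.0, mu=1.25):
    """One realization of the waiting time W_n of the nth customer in an
    M/M/1 queue (a GI/G/1 special case) via the Lindley recursion
    W_{k+1} = max(0, W_k + S_k - A_{k+1}), starting from W_1 = 0."""
    w = 0.0
    for _ in range(n - 1):
        s = rng.exponential(1.0 / mu)    # service time of customer k
        a = rng.exponential(1.0 / lam)   # interarrival time to customer k+1
        w = max(0.0, w + s - a)
    return w

# Crude Monte Carlo estimate of E[W_n]; this is the baseline whose variance
# the techniques above (Spitzer's identity, importance sampling, sums with
# stratified or controlled randomized length) are designed to reduce.
n, reps = 50, 20_000
samples = np.array([waiting_time(n) for _ in range(reps)])
half_width = 1.96 * samples.std(ddof=1) / np.sqrt(reps)
print(f"E[W_{n}] ~= {samples.mean():.4f} +/- {half_width:.4f}")
```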

Author(s):  
Ximing Li ◽  
Changchun Li ◽  
Jinjin Chi ◽  
Jihong Ouyang

Overdispersed black-box variational inference employs importance sampling to reduce the variance of the Monte Carlo gradient in black-box variational inference, but it relies on a simple overdispersed proposal distribution. This paper investigates how to adaptively obtain a better proposal distribution for lower variance. To this end, we directly approximate the theoretically optimal proposal using a Monte Carlo moment-matching step at each variational iteration, and we call the resulting adaptive proposal the moment-matching proposal (MMP). Experimental results on two Bayesian models show that MMP effectively reduces variance in black-box learning and performs better than baseline inference algorithms.
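A minimal sketch of the moment-matching idea on a toy scalar expectation, not the paper's full black-box variational inference setup: the proposal is a Gaussian whose mean and variance are refit each round to a Monte Carlo approximation of the optimal importance distribution (proportional to the absolute integrand times the base density). The function f and all parameters below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Expectation E_q[f(z)] with q = N(0, 1); f concentrates away from q's mode,
# a stand-in for one noisy term of a black-box variational gradient.
q_mu, q_sigma = 0.0, 1.0
def f(z):
    return np.exp(-0.5 * (z - 3.0) ** 2)

def is_round(r_mu, r_sigma, n=2_000):
    """One importance-sampling round with proposal r = N(r_mu, r_sigma^2).
    Returns the estimate, its standard error, and moment-matched parameters
    for the next proposal (matched to the weighted |integrand|, a Monte Carlo
    stand-in for the optimal proposal)."""
    z = rng.normal(r_mu, r_sigma, size=n)
    w = np.exp(norm.logpdf(z, q_mu, q_sigma) - norm.logpdf(z, r_mu, r_sigma))
    g = w * f(z)
    a = np.abs(g) + 1e-300
    a /= a.sum()
    new_mu = np.sum(a * z)
    new_sigma = np.sqrt(np.sum(a * (z - new_mu) ** 2)) + 1e-3
    return g.mean(), g.std(ddof=1) / np.sqrt(n), new_mu, new_sigma

r_mu, r_sigma = 0.0, 2.0       # start from a simple overdispersed proposal
for step in range(4):
    est, err, r_mu, r_sigma = is_round(r_mu, r_sigma)
    print(f"round {step}: estimate {est:.5f} +/- {err:.5f}, "
          f"next proposal N({r_mu:.2f}, {r_sigma:.2f}^2)")
```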


Author(s):  
Song Xing ◽  
Bernd-Peter Paris ◽  
Xiannong Meng

The Internet's complexity makes it difficult to assess its parameters by analysis or simulation; instead, actual measurements provide a reality check. Many statistical measurements of the Internet amount to estimating rare-event probabilities, so sampling methods become the primary means of collecting such statistics. In this chapter, we first present the conventional Monte Carlo approach to estimating the probability of an Internet event. We then introduce Importance Sampling as a variance reduction technique: a modified Monte Carlo approach that significantly reduces the effort required to obtain an accurate estimate and works particularly well for rare events, which makes it an appealing and efficient sampling scheme for estimating the density of information servers on the Internet. We propose Importance Sampling approaches for tracking the prevalence and growth of Web services, introduce an improved Importance Sampling scheme, and present a thorough analysis of the sampling approaches. Based on periodic measurements of the number of active Web servers conducted over the past five years, exponential growth of the Web is observed and modeled. The chapter also discusses increasing security concerns regarding Web servers.
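A minimal illustration of this kind of rare-event estimation, under invented numbers: a host is an active Web server with small probability, the address space is split into two hypothetical blocks with different server densities, and Importance Sampling oversamples the server-rich block and reweights. The block shares, densities, and biasing probabilities are purely illustrative, not measured values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical address space: block 1 is tiny but server-rich, block 0 is the rest.
block_share = np.array([0.99, 0.01])     # fraction of addresses in each block
p_server    = np.array([1e-5, 5e-3])     # P(active server | block), invented values
p_true      = float(block_share @ p_server)

def crude_mc(n):
    """Sample addresses according to their natural block shares."""
    block = rng.choice(2, size=n, p=block_share)
    hit = (rng.random(n) < p_server[block]).astype(float)
    return hit.mean(), hit.std(ddof=1) / np.sqrt(n)

def importance_sampling(n, bias=(0.5, 0.5)):
    """Oversample the server-rich block and reweight by block_share / bias."""
    bias = np.asarray(bias)
    block = rng.choice(2, size=n, p=bias)
    weight = block_share[block] / bias[block]
    score = (rng.random(n) < p_server[block]) * weight
    return score.mean(), score.std(ddof=1) / np.sqrt(n)

n = 500_000
est, se = crude_mc(n)
print(f"true probability   : {p_true:.2e}")
print(f"crude Monte Carlo  : {est:.2e} +/- {se:.2e}")
est, se = importance_sampling(n)
print(f"importance sampling: {est:.2e} +/- {se:.2e}")
```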


Author(s):  
Utkarsh A. Mishra ◽  
Ankit Bansal

Abstract Radiative heat transfer is a complex process involving absorption, emission, and scattering of photon rays, and the presence of a participating medium adds to the complexity. Existing analytical methods fail to produce accurate results when all of these phenomena are present. In such cases, brute-force algorithms such as Monte Carlo Ray Tracing (MCRT) or the Photon Monte Carlo (PMC) method have gained importance. However, while these methods yield smaller errors than analytical approaches, they are computationally expensive, and traditional PMC has shortcomings in effectively representing the nature of the participating medium and suffers from high variance in its results. In this study, a modified PMC is simulated for a one-dimensional medium-surface radiation exchange problem. The medium is taken to be the CO (4+) band system, and Importance Sampling (IS) of the spectral data is used for variance reduction. Furthermore, PMC with low-discrepancy sequences such as the Halton, Sobol, and Faure sequences, known as Quasi-Monte Carlo (QMC), was simulated. QMC proved more efficient in reducing variance and computation time, and combining effective IS with QMC yields a much smaller variance and runs faster than traditional PMC.
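A small sketch of the plain Monte Carlo vs. Quasi-Monte Carlo comparison on a toy spectral integral, assuming a synthetic two-band spectrum rather than the CO (4+) data, and using SciPy's scrambled Sobol sequence as the low-discrepancy generator.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(3)

# Toy "spectrum" on a normalized wavelength axis [0, 1]: two narrow bands
# (illustrative only, not the CO (4+) band system used in the paper).
def band(x):
    return np.exp(-((x - 0.3) / 0.02) ** 2) + 0.5 * np.exp(-((x - 0.7) / 0.05) ** 2)

reference = band(np.linspace(0.0, 1.0, 200_001)).mean()   # dense-grid reference value

m = 13                                        # 2**13 = 8192 samples for both estimators
mc_x = rng.random(2 ** m)                                               # pseudo-random points
qmc_x = qmc.Sobol(d=1, scramble=True, seed=3).random_base2(m).ravel()   # Sobol points

print(f"dense-grid reference : {reference:.6f}")
print(f"plain Monte Carlo    : {band(mc_x).mean():.6f}")
print(f"Sobol quasi-MC       : {band(qmc_x).mean():.6f}")
```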


2020 ◽  
Vol 225 ◽  
pp. 02003
Author(s):  
Bor Kos ◽  
Theodora Vasilopoulou ◽  
Scott W. Mosher ◽  
Ivan A. Kodeli ◽  
Robert E. Grove ◽  
...  

The paper presents an analysis of DD, TT and DT neutron streaming benchmark experiments with the recently released hybrid transport code ADVANTG (AutomateD VAriaNce reducTion Generator). ADVANTG couples the deterministic neutron transport solver Denovo with the Monte Carlo transport code MCNP for the purpose of variance reduction: it automatically produces weight-window and source biasing variance reduction parameters based on the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. With this hybrid methodology, Monte Carlo simulations of realistic, complex fusion streaming geometries have become possible. In this paper, the experimental results from the 2016 DD campaign, using measurements with TLDs and activation foils up to 40 m from the plasma source, are analyzed. New detailed models of the detector assemblies were incorporated into the JET 360° MCNP model for this analysis. In preparation for the TT and DTE2 campaigns at JET, a pre-analysis of these campaigns is also presented.
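The toy sketch below illustrates the spirit of weight-window-style variance reduction (cell-wise splitting and Russian roulette keyed to an importance map) on a 1-D slab transmission problem. It is a hand-rolled illustration with invented cross sections and geometry, not ADVANTG, CADIS, Denovo, or MCNP.

```python
import numpy as np

rng = np.random.default_rng(4)

L, P_SCATTER = 10.0, 0.7   # slab thickness (mean free paths) and scattering probability

def transmission(n_source, use_splitting):
    """Per-history leakage scores through the slab.  With use_splitting, a
    collision landing in a deeper unit cell splits the particle 2-for-1
    (weight halved), and one falling back to a shallower cell plays
    Russian roulette (survival probability 1/2, weight doubled)."""
    scores = np.zeros(n_source)
    for i in range(n_source):
        stack = [(0.0, 1.0, 1.0, 0)]              # (position, direction, weight, cell)
        while stack:
            x, mu, w, cell = stack.pop()
            while True:
                x += mu * rng.exponential(1.0)     # free flight to the next collision
                if x >= L:                          # transmitted: score and stop
                    scores[i] += w
                    break
                if x < 0.0:                         # leaked backwards: lost
                    break
                if rng.random() > P_SCATTER:        # absorbed at the collision
                    break
                mu = 1.0 if rng.random() < 0.5 else -1.0   # isotropic 1-D re-scatter
                new_cell = int(x)
                if use_splitting and new_cell > cell:       # deeper cell: split
                    w /= 2.0
                    stack.append((x, mu, w, new_cell))      # identical second copy
                elif use_splitting and new_cell < cell:     # shallower cell: roulette
                    if rng.random() < 0.5:
                        w *= 2.0
                    else:
                        break
                cell = new_cell
    return scores

for label, split in [("analog            ", False), ("splitting/roulette", True)]:
    s = transmission(50_000, split)
    print(f"{label}: {s.mean():.3e} +/- {s.std(ddof=1) / np.sqrt(s.size):.3e}")
```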


Author(s):  
Richard Dawson ◽  
Jim Hall

Complex civil infrastructure systems are typically exposed to random loadings and have a large number of possible failure modes, which often exhibit spatially and temporally variable consequences. Monte Carlo (level III) reliability methods are attractive because of their flexibility and robustness, yet computational expense may be prohibitive, in which case variance reduction methods are required. In the importance sampling methodology presented here, the joint probability distribution of the loading variables is sampled according to the contribution that a given region in the joint space makes to risk, rather than according to probability of failure, which is the conventional importance sampling criterion in structural reliability analysis. Results from simulations are used to intermittently update the importance sampling density function based on the evaluations of the (initially unknown) risk function. The methodology is demonstrated on a propped cantilever beam system and then on a real coastal dike infrastructure system in the UK. The case study demonstrates that risk can be a complex function of loadings, the resistance and interactions of system components and the spatially variable damage associated with different modes of system failure. The methodology is applicable in general to Monte Carlo risk analysis of systems, but it is likely to be most beneficial where consequences of failure are a nonlinear function of load and where system simulation requires significant computational resources.
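A minimal scalar sketch of the central idea of sampling according to risk contribution rather than failure probability alone: the importance density is refit after each round to the observed per-sample contributions p(x)·C(x)·1{failure}. The load distribution, resistance threshold, and damage curve below are hypothetical, and the sketch omits the multivariate loading and system simulation of the case studies.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Toy risk integral R = E[ C(X) 1{X > r} ] for a scalar load X: the damage
# grows nonlinearly with overload, so the region that dominates risk differs
# from the region that dominates failure probability alone.
load_dist = norm(loc=0.0, scale=1.0)               # load distribution p(x), invented
r_resist = 3.0                                     # resistance threshold, invented
def damage(x):
    return np.maximum(x - r_resist, 0.0) ** 2      # consequence given failure, invented

def adaptive_is(rounds=4, n=5_000, q_mu=3.0, q_sigma=1.0):
    """Importance sampling of the risk integral; after each round the Gaussian
    sampling density is refit to the risk-weighted sample, i.e. weighted by
    p(x) * C(x) * 1{x > r}, not by the failure indicator alone."""
    for k in range(rounds):
        x = rng.normal(q_mu, q_sigma, size=n)
        w = np.exp(load_dist.logpdf(x) - norm.logpdf(x, q_mu, q_sigma))
        contrib = w * damage(x) * (x > r_resist)   # per-sample risk contribution
        est, err = contrib.mean(), contrib.std(ddof=1) / np.sqrt(n)
        print(f"round {k}: risk ~ {est:.3e} +/- {err:.3e}, "
              f"sampling density N({q_mu:.2f}, {q_sigma:.2f}^2)")
        a = contrib + 1e-300
        a /= a.sum()
        q_mu = float(np.sum(a * x))                # refit toward the risk density
        q_sigma = float(np.sqrt(np.sum(a * (x - q_mu) ** 2))) + 1e-3

adaptive_is()
```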


2018 ◽  
Vol 482 (6) ◽  
pp. 627-630
Author(s):  
D. Belomestny ◽  
L. Iosipoi ◽  
N. Zhivotovskiy ◽  
...  
