Large-Scale Monte Carlo Simulation over Distributed Computing Infrastructures using HTCaaS

2013 ◽  
Author(s):  
Seoyoung Kim ◽  
Soo-hyeon Nam


2021 ◽  
Vol 36 (10) ◽  
pp. 2150070 ◽  
Author(s):  
Maria Grigorieva ◽  
Dmitry Grin

Large-scale distributed computing infrastructures ensure the operation and maintenance of scientific experiments at the LHC: more than 160 computing centers all over the world execute tens of millions of computing jobs per day. ATLAS, the largest experiment at the LHC, creates an enormous flow of data which has to be recorded and analyzed by a complex, heterogeneous, and distributed computing environment. Statistically, about 10–12% of computing jobs end in failure: network faults, service failures, authorization failures, and other error conditions trigger error messages which provide detailed information about the issue and can be used for diagnosis and proactive fault handling. However, this analysis is complicated by the sheer scale of the textual log data and is often exacerbated by the lack of a well-defined structure: human experts have to interpret the detected messages and create parsing rules manually, which is time-consuming and does not allow previously unknown error conditions to be identified without further human intervention. This paper describes a pipeline of methods for the unsupervised clustering of multi-source error messages. The pipeline is data-driven, based on machine learning algorithms, and executed fully automatically, categorizing error messages according to textual patterns and meaning.
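As a toy illustration of clustering error messages by textual pattern (an assumption-laden sketch, not the paper's actual pipeline; the masking rules and function names are invented for illustration), one can mask the variable fields of each message so that messages differing only in numbers, hex identifiers, or paths collapse into a shared template:

```python
import re
from collections import defaultdict

# Hypothetical masking rules: replace variable fields so messages that
# differ only in ids, paths, or counts share one textual template.
def to_template(message):
    msg = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', message)   # hex ids first
    msg = re.sub(r'/[\w./-]+', '<PATH>', msg)           # file paths
    msg = re.sub(r'\d+', '<NUM>', msg)                  # remaining numbers
    return msg

def cluster_by_template(messages):
    """Group raw messages under their shared template."""
    clusters = defaultdict(list)
    for m in messages:
        clusters[to_template(m)].append(m)
    return clusters

logs = [
    "Transfer to node 17 failed after 300 s",
    "Transfer to node 42 failed after 120 s",
    "Auth token expired for user 9001",
]
clusters = cluster_by_template(logs)
# The two transfer failures collapse into one template cluster.
```

A real pipeline would follow such normalization with vectorization and unsupervised clustering to also capture similarity in meaning, not just identical templates.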


2016 ◽  
Vol 34 (4) ◽  
pp. 637-644 ◽  
Author(s):  
I.A. Artyukov ◽  
E.G. Bessonov ◽  
M.V. Gorbunkov ◽  
Y.Y. Maslova ◽  
N.L. Popov ◽  
...  

Abstract
The paper presents a general theoretical framework and a related Monte Carlo simulation of a novel type of X-ray source based on relativistic Thomson scattering of powerful laser radiation. Special attention is paid to linac X-ray generators by way of two examples: a conceptual design for the production of 12.4 keV photons and a presently operating X-ray source of 29.4 keV photons. Our analysis shows that state-of-the-art laser and accelerator technologies make it possible to build a compact linac-based Thomson source for the same X-ray imaging and diffraction experiments as those performed at a large-scale X-ray radiation facility such as a synchrotron or a Thomson generator based on an electron storage ring.
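As a quick sanity check on the quoted photon energy (using the textbook head-on Thomson backscattering relation E_x ≈ 4γ²E_laser, not the paper's full Monte Carlo; the 1.2 eV laser photon energy is an assumed value for a ~1 µm laser):

```python
import math

M_E = 0.511e6  # electron rest energy in eV

def electron_energy_for(e_x_ev, e_laser_ev):
    """Lorentz factor and beam energy (MeV) needed so that head-on
    Thomson backscattering yields photons of energy e_x_ev,
    via E_x ~ 4 * gamma**2 * E_laser."""
    gamma = math.sqrt(e_x_ev / (4.0 * e_laser_ev))
    return gamma, gamma * M_E / 1e6

# 12.4 keV X-rays from an assumed 1.2 eV (~1 um) laser photon:
gamma, e_mev = electron_energy_for(12.4e3, 1.2)
# gamma of roughly 50, i.e. an electron beam of a few tens of MeV --
# consistent with a compact linac rather than a storage ring.
```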


2010 ◽  
Vol 219 (7) ◽  
pp. 072040 ◽  
Author(s):  
B Lobodzinski ◽  
E Bystritskaya ◽  
T M Karbach ◽  
S Mitsyn ◽  
M Mudrinic ◽  
...  

2005 ◽  
Vol 19 (24) ◽  
pp. 3731-3743 ◽  
Author(s):  
Q. L. ZHANG

The phase diagram of the single-orbital double exchange model for manganites, with ferromagnetic Hund coupling between mobile e_g electrons and the spins of localized t_2g electrons as well as antiferromagnetic superexchange coupling between the t_2g electrons, is investigated with large-scale Monte Carlo simulation in one dimension. The phase boundary is determined from the internal energy, the electron density, and the structure factor. In particular, the low-temperature properties at quarter filling are studied in detail.
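To illustrate the Monte Carlo ingredient in miniature, here is a Metropolis sketch for a classical 1D Ising chain, a deliberately simplified stand-in for the full double exchange Hamiltonian (the chain length, inverse temperature, and sweep count are arbitrary illustration values):

```python
import math
import random

def metropolis_1d_ising(n=64, beta=3.0, sweeps=2000, seed=1):
    """Metropolis sampling of a ferromagnetic 1D Ising chain
    (J = 1, periodic boundary conditions)."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        # Energy change for flipping spin i under H = -J sum s_i s_{i+1}.
        dE = 2 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

spins = metropolis_1d_ising()
m = abs(sum(spins)) / len(spins)
# At low temperature (large beta) the chain orders and |m| grows toward 1.
```

A study like the one above would additionally sample the internal energy, electron density, and structure factor across a grid of couplings and temperatures to trace out the phase boundary.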


2001 ◽  
Vol 38 (A) ◽  
pp. 176-187 ◽  
Author(s):  
Mark Bebbington ◽  
David S. Harte

The paper reviews the formulation of the linked stress release model for large-scale seismicity, together with aspects of its application. Using data from Taiwan for illustrative purposes, models can be selected and verified using tools that include Akaike's information criterion (AIC), numerical analysis, residual point processes, and Monte Carlo simulation.


2013 ◽  
Vol 135 (9) ◽  
Author(s):  
Liang Zhao ◽  
K. K. Choi ◽  
Ikjin Lee ◽  
David Gorsich

In sampling-based reliability-based design optimization (RBDO) of large-scale engineering applications, Monte Carlo simulation (MCS) is often used for the probability-of-failure calculation and the probabilistic sensitivity analysis, with the performance function evaluated through predictions from a surrogate model. When the number of samples used to construct the surrogate model is insufficient, the prediction from the surrogate model becomes inaccurate, and so do the Monte Carlo simulation results. Therefore, to account for the prediction error of the surrogate model and to ensure that the optimum design obtained from sampling-based RBDO satisfies the probabilistic constraints, a conservative surrogate model, which is nevertheless not overly conservative, needs to be developed. In this paper, a conservative surrogate model is constructed using the weighted Kriging variance, where the weight is determined by the relative change in the corrected Akaike information criterion (AICc) of the dynamic Kriging model. The proposed conservative surrogate model performs better than the traditional Kriging prediction interval approach because it reduces fluctuation in the Kriging prediction bound, and better than the constant safety margin approach because it adaptively accounts for the large uncertainty of the surrogate model in regions where samples are sparse. Numerical examples show that using the proposed conservative surrogate model in sampling-based RBDO is necessary to have confidence that the optimum design satisfies the probabilistic constraints when the number of samples is limited, while it does not lead to the overly conservative designs produced by the constant safety margin approach.
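The weighted-variance idea can be sketched in miniature. The following toy is not the paper's dynamic Kriging: the nearest-sample "surrogate", the distance-based uncertainty proxy `sigma`, and the fixed weight `w` are all illustrative stand-ins. It only shows how adding a weighted uncertainty term makes the prediction conservative where samples are sparse:

```python
def nearest(x, xs):
    """Sample point closest to x."""
    return min(xs, key=lambda xi: abs(xi - x))

def surrogate(x, xs, ys):
    """Toy 'prediction': value at the nearest sample."""
    return ys[xs.index(nearest(x, xs))]

def sigma(x, xs):
    """Toy uncertainty proxy: grows with distance to the nearest sample
    (a Kriging model would supply a proper prediction variance here)."""
    return abs(x - nearest(x, xs))

def conservative(x, xs, ys, w=1.0):
    """Conservative prediction y_hat(x) + w * sigma(x); in the paper the
    weight is tied to the relative change in AICc of the surrogate."""
    return surrogate(x, xs, ys) + w * sigma(x, xs)

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]  # sparse samples of y = x**2
# At a sample point the two predictions agree; between samples the
# conservative one sits above, which is what keeps a probabilistic
# constraint on the safe side when the surrogate is uncertain.
```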


Author(s):  
Vincent Perry ◽  
Wendy Gao ◽  
Michael Chen ◽  
J. Michael Barton ◽  
Simon Su
