Fast Calorimeter Simulation in ATLAS

2020 ◽ Vol 245 ◽ pp. 02002
Author(s): Sean Gasiorowski, Heather Gray

The ATLAS physics program at the LHC relies on very large samples of simulated events. Most of these samples are produced with Geant4, which provides a highly detailed and accurate simulation of the ATLAS detector. However, this accuracy comes at a high price in CPU, and the sensitivity of many physics analyses is already limited by the available Monte Carlo statistics and will be even more so in the future as datasets grow. To address this problem, sophisticated fast simulation tools have been developed, and they will become the default tools in ATLAS production in Run 3 and beyond. The slowest component is the simulation of the calorimeter showers. These are replaced by a new parametrised description of the longitudinal and lateral energy deposits, including machine learning approaches, achieving a fast but accurate description. In this talk we describe the new tool for fast calorimeter simulation developed by ATLAS, review its technical and physics performance, and demonstrate its potential to transform physics analyses.
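The core idea of a parametrised shower description can be illustrated with a toy sketch: the longitudinal energy deposition of an electromagnetic shower is well described by a gamma-distribution profile, so a fast simulation can distribute a particle's energy over calorimeter layers by sampling that profile rather than tracking every secondary. The parameter values and the smearing below are illustrative, not the ATLAS parametrisation.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

def longitudinal_profile(t, a=4.0, b=0.5):
    """Gamma-distribution shower profile dE/dt as a function of depth t
    in radiation lengths; a and b are illustrative shape parameters."""
    return b * (b * t) ** (a - 1) * np.exp(-b * t) / gamma(a)

def sample_shower(energy_gev, n_layers=20, layer_x0=1.0, a=4.0, b=0.5):
    """Distribute a shower's energy over calorimeter layers according to
    the parametrised profile, with a small stochastic smearing per layer."""
    depths = (np.arange(n_layers) + 0.5) * layer_x0
    weights = longitudinal_profile(depths, a, b)
    weights = weights / weights.sum()
    deposits = energy_gev * weights * rng.normal(1.0, 0.05, n_layers)
    return np.clip(deposits, 0.0, None)

shower = sample_shower(50.0)   # 50 GeV shower over 20 layers
print(shower.sum())            # close to 50 GeV, up to smearing
```

Sampling from a closed-form profile like this is what makes the parametrised approach orders of magnitude faster than full Geant4 tracking.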

2020 ◽ Vol 245 ◽ pp. 02035
Author(s): John Chapman, Kyle Cranmer, Stefan Gadatsch, Tobias Golling, Aishik Ghosh, ...

The ATLAS physics program relies on very large samples of Geant4 simulated events, which provide a highly detailed and accurate simulation of the ATLAS detector. However, this accuracy comes at a high price in CPU, and the sensitivity of many physics analyses is already limited by the available Monte Carlo statistics and will be even more so in the future. Therefore, sophisticated fast simulation tools have been developed. In Run 3 we aim to replace the calorimeter shower simulation for most samples with a new parametrised description of longitudinal and lateral energy deposits, including machine learning approaches, to achieve a fast and accurate description. Looking further ahead, prototypes are being developed using cutting-edge machine learning approaches to learn the appropriate calorimeter response, which are expected to improve the modeling of correlations within showers. Two different approaches, using Variational Auto-Encoders (VAEs) or Generative Adversarial Networks (GANs), are trained to model the shower simulation. Additional fast simulation tools will replace the inner detector simulation, as well as digitization and reconstruction algorithms, achieving up to two orders of magnitude improvement in speed. In this talk, we describe the new tools for fast production of simulated events and an exploratory analysis of the deep learning methods.
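The generation step of a VAE-based fast simulation is conceptually simple: sample a latent vector from a standard normal prior and pass it through the trained decoder to obtain cell energies. The sketch below shows only that sampling step, with untrained, randomly initialised decoder weights standing in for a network that would in reality be trained on Geant4 showers; dimensions and activations are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative decoder weights; a real VAE would learn these from
# Geant4 showers during training.
LATENT_DIM, HIDDEN, N_CELLS = 8, 32, 100
W1 = rng.normal(0, 0.3, (LATENT_DIM, HIDDEN))
W2 = rng.normal(0, 0.3, (HIDDEN, N_CELLS))

def decode(z):
    """Map latent vectors to cell energies; the softplus output
    keeps energy deposits non-negative."""
    h = np.tanh(z @ W1)
    return np.log1p(np.exp(h @ W2))   # softplus

def generate_showers(n):
    """VAE generation step: sample z ~ N(0, I), then decode."""
    z = rng.standard_normal((n, LATENT_DIM))
    return decode(z)

showers = generate_showers(5)
print(showers.shape)   # (5, 100)
```

A GAN generator is used the same way at inference time; the two approaches differ in how the decoder/generator is trained, not in how events are produced.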


2019 ◽ Vol 214 ◽ pp. 02010
Author(s): Sofia Vallecorsa, Federico Carminati, Gulrukh Khattak

Machine Learning techniques have been used in different applications by the HEP community: in this talk, we discuss the case of detector simulation. The need for simulated events, expected in the future for LHC experiments and their High Luminosity upgrades, is increasing dramatically and requires new fast simulation solutions. We describe an R&D activity aimed at providing a configurable tool capable of training a neural network to reproduce the detector response and speed up standard Monte Carlo simulation. This represents a generic approach in the sense that such a network could be designed and trained to simulate any kind of detector and, eventually, the whole data processing chain in order to obtain, directly in one step, the final reconstructed quantities, in just a small fraction of the time. We present the first application of three-dimensional convolutional Generative Adversarial Networks to the simulation of high granularity electromagnetic calorimeters. We describe detailed validation studies comparing our results to Geant4 Monte Carlo simulation. Finally, we show how this tool could be generalized to describe a whole class of calorimeters, opening the way to a generic machine learning based fast simulation approach.
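Validation of a generative calorimeter model typically compares physics observables derived from the 3D energy grids, such as the longitudinal shower profile, between generated and Geant4 samples. A minimal sketch of that comparison, using random stand-in grids in place of real Geant4 and GAN output:

```python
import numpy as np

rng = np.random.default_rng(2)

def longitudinal_profile(grid):
    """Sum a (z, x, y) calorimeter energy grid over the transverse
    axes to obtain the energy per longitudinal layer."""
    return grid.sum(axis=(1, 2))

# Stand-in 25x25x25 energy grids; in a real study these would come
# from Geant4 and from the trained 3D GAN respectively.
geant4_like = rng.exponential(1.0, (25, 25, 25))
gan_like = rng.exponential(1.1, (25, 25, 25))

ref = longitudinal_profile(geant4_like)
gen = longitudinal_profile(gan_like)

# A simple per-layer relative difference used as a validation metric.
rel_diff = np.abs(gen - ref) / ref
print(rel_diff.mean())
```

The same reduction works for transverse profiles (summing over the longitudinal axis instead), giving a small set of 1D distributions that are easy to compare statistically.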


Energies
2020 ◽ Vol 13 (18) ◽ pp. 4868
Author(s): Raghuram Kalyanam, Sabine Hoffmann

Solar radiation data is essential for the development of many solar energy applications ranging from thermal collectors to building simulation tools, but its availability is limited, especially for the diffuse radiation component. There are several studies aimed at predicting this value, but very few cover the generalizability of such models across varying climates. Our study investigates how well these models generalize and also shows how to enhance their generalizability on different climates. Since machine learning approaches are known to generalize well, we apply them to understand how well they perform on climates other than those they were originally trained on. Therefore, we trained them on datasets from the U.S. and tested them on several European climates. The machine learning model developed for U.S. climates not only showed a low mean absolute error (MAE) of 23 W/m2, but also generalized very well to European climates, with MAE in the range of 20 to 27 W/m2. Further investigation into the factors influencing the generalizability revealed that careful selection of the training data can improve the results significantly.
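The train-on-one-climate, test-on-another evaluation described above can be sketched with a simple baseline: fit a model on synthetic "U.S." data, evaluate on a distribution-shifted "European" test set, and report the MAE in W/m2. The linear least-squares model and the synthetic features (e.g. global radiation, solar elevation, clearness index) are illustrative stand-ins, not the paper's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(3)

def mae(y_true, y_pred):
    """Mean absolute error, in the same units as y (here W/m^2)."""
    return np.abs(y_true - y_pred).mean()

# Synthetic stand-ins for training (U.S.) and test (European) data.
X_train = rng.normal(0, 1, (500, 3))
true_w = np.array([30.0, -12.0, 8.0])
y_train = X_train @ true_w + rng.normal(0, 20, 500)

X_test = rng.normal(0.2, 1.1, (200, 3))   # slightly shifted climate
y_test = X_test @ true_w + rng.normal(0, 20, 200)

# Fit a linear baseline by least squares and evaluate the transfer.
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
test_mae = mae(y_test, X_test @ w)
print(test_mae)
```

Comparing the test MAE against the training MAE under such a covariate shift is the essence of the generalizability analysis the study performs with more capable models.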


Particles
2021 ◽ Vol 4 (2) ◽ pp. 227-235
Author(s): Aleksandr Svetlichnyi, Roman Nepeyvoda, Igor Pshenichnov

One of the common methods to measure the centrality of nucleus-nucleus collision events consists of detecting forward spectator neutrons. Because of the non-monotonic dependence of neutron numbers on centrality, other characteristics of spectator matter in 197Au–197Au collisions at NICA must be considered to improve the centrality determination. The numbers of spectator deuterons and α-particles and the forward–backward asymmetry of the numbers of free spectator nucleons were calculated with the Abrasion–Ablation Monte Carlo for Colliders (AAMCC) model as functions of event centrality. It was shown that the number of charged fragments per spectator nucleon decreases monotonically with an increase of the impact parameter, and thus can be used to estimate the collision centrality. The conditional probabilities that a given event with specific spectator characteristics belongs to a certain centrality class were calculated by means of AAMCC. Such probabilities can be used as an input to Bayesian or other machine-learning approaches to centrality determination in 197Au–197Au collisions.
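The Bayesian use of such conditional probabilities is a direct application of Bayes' rule: given tabulated P(observable | centrality class), the posterior over classes for one event is proportional to likelihood times prior. The tables and prior below are hypothetical placeholders for the AAMCC-derived probabilities.

```python
import numpy as np

# Hypothetical conditional probabilities P(observable bin | centrality
# class), playing the role of the AAMCC-derived tables; rows are
# centrality classes, columns are binned spectator observables.
p_obs_given_class = np.array([
    [0.70, 0.25, 0.05],   # central
    [0.30, 0.50, 0.20],   # mid-central
    [0.05, 0.25, 0.70],   # peripheral
])
prior = np.array([0.2, 0.3, 0.5])   # assumed class prior

def posterior(obs_bin):
    """Bayes' rule: P(class | observed bin) for a single event."""
    unnorm = p_obs_given_class[:, obs_bin] * prior
    return unnorm / unnorm.sum()

post = posterior(0)
print(post)   # posterior over the three centrality classes
```

With several independent spectator observables, the per-observable likelihoods would simply be multiplied before normalising, which is exactly the kind of input a machine-learning classifier can also consume.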


Foods
2021 ◽ Vol 10 (10) ◽ pp. 2472
Author(s): Shogo Okamoto

In the last decade, temporal dominance of sensations (TDS) methods have proven to be potent approaches in the field of food sciences. Accordingly, methods for analyzing TDS curves, which are the major outputs of TDS methods, have been developed. This study proposes a method of bootstrap resampling for TDS tasks. The proposed method enables the production of random TDS curves to estimate the uncertainties of the curves, that is, their 95% confidence interval and standard error. Based on Monte Carlo simulation studies, the estimated uncertainties are considered valid and match those estimated by approximated normal distributions when the number of independent TDS tasks or samples is 50–100 or greater. The proposed resampling method enables researchers to apply statistical analyses and machine-learning approaches that require a large sample size of TDS curves.
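The bootstrap idea can be sketched as follows: resample whole TDS tasks (panelist responses) with replacement, recompute the dominance-rate curve for each resample, and take percentiles across resamples as the confidence band. The binary dominance matrix below is synthetic stand-in data, not from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in data: 60 tasks x 50 time points; an entry is 1 when the
# attribute of interest was dominant at that instant, else 0.
n_tasks, n_times = 60, 50
tasks = (rng.random((n_tasks, n_times)) < 0.4).astype(float)

def bootstrap_ci(tasks, n_boot=2000, alpha=0.05):
    """Resample whole TDS tasks with replacement and return the
    percentile 95% confidence band of the dominance-rate curve."""
    n = tasks.shape[0]
    curves = np.empty((n_boot, tasks.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)          # resample task indices
        curves[b] = tasks[idx].mean(axis=0)  # dominance rate vs time
    lo = np.quantile(curves, alpha / 2, axis=0)
    hi = np.quantile(curves, 1 - alpha / 2, axis=0)
    return lo, hi

lo, hi = bootstrap_ci(tasks)
print(lo.shape, hi.shape)   # one band value per time point
```

Resampling at the task level, rather than at individual time points, preserves the within-task temporal correlation, which is why the resulting curves are valid draws for uncertainty estimation.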


2019 ◽ Vol 70 (3) ◽ pp. 214-224
Author(s): Bui Ngoc Dung, Manh Dzung Lai, Tran Vu Hieu, Nguyen Binh T. H.

Video surveillance is an emerging research field in intelligent transport systems. This paper presents techniques that use machine learning and computer vision for vehicle detection and tracking. First, machine learning approaches using Haar-like features and the AdaBoost algorithm for vehicle detection are presented. Second, approaches to detect vehicles using background subtraction based on a Gaussian Mixture Model and to track them using optical flow and multiple Kalman filters are given. The method has the advantage of distinguishing and tracking multiple vehicles individually. The experimental results demonstrate the high accuracy of the method.
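The per-vehicle Kalman tracking step can be sketched with a standard constant-velocity filter: the state is position and velocity, the detector supplies noisy position measurements, and each frame runs one predict/update cycle. Noise settings and the synthetic detections are illustrative, not from the paper.

```python
import numpy as np

dt = 1.0  # frame interval

# Constant-velocity model: state is (x, y, vx, vy); in the paper's
# setup one such filter would be maintained per tracked vehicle.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)      # only position is measured
Q = np.eye(4) * 0.01                     # process noise (assumed)
R = np.eye(2) * 1.0                      # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle for a detected vehicle position z."""
    x = F @ x                            # predict state
    P = F @ P @ F.T + Q
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y                        # update state
    P = (np.eye(4) - K @ H) @ P
    return x, P

x = np.array([0.0, 0.0, 1.0, 0.5])       # initial track state
P = np.eye(4)
for t in range(1, 6):                     # noisy detections along a line
    z = np.array([t * 1.0 + 0.1, t * 0.5 - 0.1])
    x, P = kalman_step(x, P, z)
print(x[:2])                              # estimated position after 5 frames
```

Running one filter per detection, with data association between frames, is what lets the system keep multiple vehicles' tracks separate.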


2017
Author(s): Sabrina Jaeger, Simone Fulle, Samo Turk

Inspired by natural language processing techniques, we here introduce Mol2vec, an unsupervised machine learning approach to learn vector representations of molecular substructures. Similarly to the Word2vec models, where vectors of closely related words are in close proximity in the vector space, Mol2vec learns vector representations of molecular substructures that point in similar directions for chemically related substructures. Compounds can finally be encoded as vectors by summing up the vectors of the individual substructures and, for instance, fed into supervised machine learning approaches to predict compound properties. The underlying substructure vector embeddings are obtained by training an unsupervised machine learning approach on a so-called corpus of compounds that consists of all available chemical matter. The resulting Mol2vec model is pre-trained once, yields dense vector representations, and overcomes drawbacks of common compound feature representations such as sparseness and bit collisions. The prediction capabilities are demonstrated on several compound property and bioactivity data sets and compared with results obtained for Morgan fingerprints as the reference compound representation. Mol2vec can be easily combined with ProtVec, which employs the same Word2vec concept on protein sequences, resulting in a proteochemometric approach that is alignment independent and can thus also be easily used for proteins with low sequence similarities.
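The compound-encoding step described above, summing the embedding vectors of a compound's substructures, can be sketched as follows. The embedding table and substructure identifiers are hypothetical placeholders; a real Mol2vec model learns 300-dimensional embeddings from a corpus of compounds.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical pre-trained substructure embeddings (8 dimensions here
# for brevity); keys are placeholder substructure identifiers.
DIM = 8
embeddings = {f"sub{i}": rng.normal(0, 1, DIM) for i in range(20)}

def compound_vector(substructures):
    """Encode a compound by summing the vectors of its substructures,
    skipping any substructure missing from the vocabulary."""
    vecs = [embeddings[s] for s in substructures if s in embeddings]
    if not vecs:
        return np.zeros(DIM)
    return np.sum(vecs, axis=0)

v = compound_vector(["sub1", "sub4", "sub4", "unknown"])
print(v.shape)   # a dense fixed-length vector per compound
```

Because the result is a dense fixed-length vector regardless of compound size, it can be passed directly to any standard supervised learner, which is the workflow the paper benchmarks against Morgan fingerprints.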

