Evolution of ATLAS analysis workflows and tools for the HL-LHC era

2021, Vol. 251, pp. 02002
Author(s): David Cameron, Alessandra Forti, Alexei Klimentov, Andrés Pacheco Pages, David South

The High Luminosity LHC project at CERN, which is expected to deliver a ten-fold increase in the luminosity of proton-proton collisions over the LHC, will start operation towards the end of this decade and will deliver an unprecedented scientific data volume at the multi-exabyte scale. This vast amount of data has to be processed and analysed, and the corresponding computing facilities must ensure fast and reliable data processing for physics analyses by scientific groups distributed all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected evolution in hardware technology. To address this challenge, several novel approaches to how end-user analysis will be conducted are under evaluation by the ATLAS Collaboration. State-of-the-art workflow management technologies and tools to handle these methods within the existing distributed computing system are now being evaluated and developed. In addition, the evolution of computing facilities and how it impacts ATLAS analysis workflows is being closely followed.
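As a rough illustration of what an end-user analysis looks like to such workflow tools, the Python sketch below models an analysis as a small task graph and derives a valid execution order. All names here (Task, Workflow, the example stages) are hypothetical illustrations, not the ATLAS production interfaces.

from dataclasses import dataclass, field

# Hypothetical illustration only: a toy task-graph description of an end-user
# analysis, the kind of object a distributed workflow manager schedules.
@dataclass
class Task:
    name: str
    inputs: list                    # logical dataset names this task reads
    outputs: list                   # logical dataset names this task produces
    deps: list = field(default_factory=list)   # names of upstream tasks

@dataclass
class Workflow:
    tasks: dict = field(default_factory=dict)

    def add(self, task):
        self.tasks[task.name] = task

    def execution_order(self):
        # Kahn-style topological sort: a task may run once all of its
        # declared dependencies have completed.
        order, done, pending = [], set(), dict(self.tasks)
        while pending:
            ready = [t for t in pending.values() if set(t.deps) <= done]
            if not ready:
                raise ValueError("cyclic dependency in workflow")
            for t in ready:
                order.append(t.name)
                done.add(t.name)
                del pending[t.name]
        return order

wf = Workflow()
wf.add(Task("derivation", ["AOD"], ["DAOD_PHYS"]))
wf.add(Task("ntuple", ["DAOD_PHYS"], ["ntuples"], deps=["derivation"]))
wf.add(Task("fit", ["ntuples"], ["results"], deps=["ntuple"]))
print(wf.execution_order())         # ['derivation', 'ntuple', 'fit']

The point of such a declarative representation is that the scheduler, not the user, decides where and when each stage runs once the data dependencies are stated.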

2021, Vol. 251, pp. 02031
Author(s): Aleksandr Alekseev, Xavier Espinal, Stephane Jezequel, Andrey Kiryanov, Alexei Klimentov, ...

The High Luminosity phase of the LHC, which aims for a tenfold increase in the luminosity of proton-proton collisions, is expected to start operation in eight years. An unprecedented scientific data volume at the multi-exabyte scale will be delivered to particle physics experiments at CERN. This amount of data has to be stored, and the corresponding technology must ensure fast and reliable data delivery for processing by the scientific community all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected hardware evolution. To address this challenge, the Data Lake R&D project was launched by the DOMA community in the fall of 2019. State-of-the-art data handling technologies are under active development, and their current status for the Russian Scientific Data Lake prototype is presented here.
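As a rough sketch of the access pattern a data lake is meant to optimise, the Python fragment below implements a cache-first read with fallback to the federated store. The class names and replica layout are hypothetical, not the actual DOMA or Russian Scientific Data Lake interfaces.

# Hypothetical illustration only: serve reads from a nearby cache when
# possible, otherwise fetch a replica from the distributed lake.
class Cache:
    def __init__(self):
        self._data = {}
    def get(self, lfn):
        return self._data.get(lfn)
    def put(self, lfn, payload):
        self._data[lfn] = payload

class DataLake:
    # Stand-in for the federated store: logical file name -> list of replicas.
    def __init__(self, replicas):
        self._replicas = replicas
    def fetch(self, lfn):
        for site, payload in self._replicas.get(lfn, []):
            return site, payload    # in reality, chosen by cost and latency
        raise FileNotFoundError(lfn)

def read(lfn, cache, lake):
    hit = cache.get(lfn)
    if hit is not None:
        return hit                  # served from the nearby cache
    site, payload = lake.fetch(lfn) # remote read from the lake
    cache.put(lfn, payload)         # warm the cache for the next access
    return payload

lake = DataLake({"mc16:evnt.0001": [("T1-RU", b"event data")]})
cache = Cache()
read("mc16:evnt.0001", cache, lake)   # first access: remote fetch, then cached
read("mc16:evnt.0001", cache, lake)   # second access: cache hit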


2021, Vol. 251, pp. 02006
Author(s): Mikhail Borodin, Alessandro Di Girolamo, Edward Karavakis, Alexei Klimentov, Tatiana Korchuganova, ...

The High Luminosity upgrade to the LHC, which aims for a tenfold increase in the luminosity of proton-proton collisions at an energy of 14 TeV, is expected to start operation in 2028/29 and will deliver an unprecedented volume of scientific data at the multi-exabyte scale. This amount of data has to be stored, and the corresponding storage system must ensure fast and reliable data delivery for processing by scientific groups distributed all over the world. The present LHC computing and data management model will not be able to provide the required infrastructure growth, even taking into account the expected hardware technology evolution. To address this challenge, the Data Carousel R&D project was launched by the ATLAS experiment in the fall of 2018. State-of-the-art data and workflow management technologies are under active development, and their current status is presented here.
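The core idea of a data carousel, processing a tape-resident dataset through a disk buffer of bounded size in a sliding window, can be sketched in a few lines of Python. The scheduler below is deliberately simplified and is not the ATLAS implementation.

from collections import deque

# Hypothetical illustration only: cycle a tape-resident dataset through a
# disk buffer that holds at most `window` files at a time.
def carousel(files, window=3):
    tape = deque(files)              # files still resident only on tape
    staged = deque()                 # files currently on the disk buffer
    while tape or staged:
        # Stage from tape until the bounded buffer window is full.
        while tape and len(staged) < window:
            f = tape.popleft()
            staged.append(f)
            print(f"stage   {f}")
        # Process the oldest staged file, then release its buffer space.
        f = staged.popleft()
        print(f"process {f} and release its buffer space")

carousel([f"file{i:02d}" for i in range(6)], window=2)

Because staging of upcoming files overlaps with processing of already-staged ones, the dataset is consumed with a disk footprint bounded by the window size rather than by the dataset size.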


Author(s): G. Aad, B. Abbott, D. C. Abbott, A. Abed Abud, ...

The algorithms used by the ATLAS Collaboration during Run 2 of the Large Hadron Collider to identify jets containing b-hadrons are presented. The performance of the algorithms is evaluated in the simulation and the efficiency with which these algorithms identify jets containing b-hadrons is measured in collision data. The measurement uses a likelihood-based method in a sample highly enriched in $$t\bar{t}$$ events. The topology of the $$t \rightarrow Wb$$ decays is exploited to simultaneously measure both the jet flavour composition of the sample and the efficiency in a transverse momentum range from 20 to 600 GeV. The efficiency measurement is subsequently compared with that predicted by the simulation. The data used in this measurement, corresponding to a total integrated luminosity of 80.5 $$\hbox{fb}^{-1}$$, were collected in proton–proton collisions during the years 2015–2017 at a centre-of-mass energy of $$\sqrt{s} = 13$$ TeV. By simultaneously extracting both the efficiency and jet flavour composition, this measurement significantly improves the precision compared to previous results, with uncertainties ranging from 1 to 8% depending on the jet transverse momentum.
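Schematically, the simultaneous extraction works because the number of tagged jets in each transverse-momentum bin ties the unknown flavour fractions and the tagging efficiencies together. In the simplest binned form (a sketch of the idea, not the exact per-event likelihood used in the paper):

$$N_{\text{tag}}(p_{\mathrm{T}}) = N_{\text{jets}}(p_{\mathrm{T}}) \sum_{f\,\in\,\{b,\,c,\,\text{light}\}} f_{f}(p_{\mathrm{T}})\,\varepsilon_{f}(p_{\mathrm{T}}), \qquad \sum_{f} f_{f}(p_{\mathrm{T}}) = 1,$$

so that maximising a likelihood built from tagged and untagged jet counts over many bins determines both the flavour fractions $$f_{f}$$ and the b-tagging efficiency $$\varepsilon_{b}$$ at once, rather than taking the composition from simulation.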


2017, Vol. 45, pp. 1760062
Author(s): Mairon M. Machado, Magno V. T. Machado

In this contribution we provide predictions for the total, elastic and single-diffractive cross sections in proton-proton collisions at the LHC, at centre-of-mass energies of 0.9, 7, 8 and 14 TeV. We work in the framework of the Miettinen-Pumplin model, which correctly describes the lower-energy data from the Fermilab Tevatron. Our predictions are based on the parameters of the model fitted to the Tevatron measurements and to the TOTEM-LHC measurements at 7 TeV. We extrapolate the results to the higher-energy LHC runs and provide predictions for them, and we verify that these predictions are in agreement with recent results from the ATLAS Collaboration at CERN.
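For reference, the Miettinen-Pumplin model is built on the Good-Walker eikonal decomposition, in which the total, elastic and diffractive cross sections follow from averages over fluctuating hadronic states (quoted schematically here; normalisation conventions vary between papers):

$$\sigma_{\text{tot}} = 2\int d^{2}b\,\big\langle\, 1 - e^{-\Omega(b)} \,\big\rangle, \qquad \sigma_{\text{el}} = \int d^{2}b\,\big\langle\, 1 - e^{-\Omega(b)} \,\big\rangle^{2},$$

$$\sigma_{\text{diff}} = \int d^{2}b\,\Big[\big\langle\, \big(1 - e^{-\Omega(b)}\big)^{2} \,\big\rangle - \big\langle\, 1 - e^{-\Omega(b)} \,\big\rangle^{2}\Big],$$

where $$\Omega(b)$$ is the eikonal at impact parameter $$b$$ and the brackets denote the average over the fluctuation spectrum fitted to the data.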


2019, Vol. 214, pp. 03049
Author(s): Fernando Barreiro Magino, David Cameron, Alessandro Di Girolamo, Andrej Filipcic, Ivan Glushkov, ...

ATLAS is one of the four experiments collecting data from proton-proton collisions at the Large Hadron Collider. The offline processing and storage of the data are handled by a custom heterogeneous distributed computing system. This paper summarizes some of the challenges and the operations-driven solutions introduced in the system.


2020, pp. 2141004
Author(s): Jinheung Kim, Taegyu Lee, Jeongwoo Kim, Ho Jang

We present the MadAnalysis 5 implementation and validation of the ATLAS-SUSY-2018-06 analysis. This analysis documents a search for electroweakinos with mass splittings larger than the $$Z$$ boson mass. The targeted decay chain consists of electroweakinos decaying via on-shell $$W$$ and $$Z$$ bosons to a three-lepton final state. The results are based on a dataset of 139 $$\hbox{fb}^{-1}$$ of proton-proton collisions, recorded by the ATLAS experiment at a center-of-mass energy of $$\sqrt{s} = 13$$ TeV between 2015 and 2018. The validation of our implementation relies on a comparison of our results against official cutflows provided by the ATLAS collaboration. The validation material provided by the ATLAS collaboration is based on well-defined benchmarks which feature chargino and neutralino associated production.
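The validation step itself amounts to a per-cut comparison of event yields between the reimplementation and the official ATLAS cutflow. A minimal Python sketch of that bookkeeping is given below; the cut names and counts are placeholders, not the published benchmark numbers.

# Placeholders only: the cut names and event counts below are illustrative,
# not the published ATLAS-SUSY-2018-06 benchmark numbers.
official = {"initial": 10000, "3 leptons": 4200, "on-shell Z": 3100, "MET": 900}
recast   = {"initial": 10000, "3 leptons": 4150, "on-shell Z": 3060, "MET": 885}

for cut in official:
    rel = abs(recast[cut] - official[cut]) / official[cut]
    status = "ok" if rel < 0.10 else "CHECK"   # e.g. require agreement to 10%
    print(f"{cut:10s} official={official[cut]:6d} recast={recast[cut]:6d} "
          f"rel.diff={rel:6.2%} {status}")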

