Prototype of the Russian Scientific Data Lake

2021, Vol. 251, pp. 02031
Author(s):  
Aleksandr Alekseev ◽  
Xavier Espinal ◽  
Stephane Jezequel ◽  
Andrey Kiryanov ◽  
Alexei Klimentov ◽  
...  

The High Luminosity phase of the LHC, which aims for a tenfold increase in the luminosity of proton-proton collisions, is expected to start operation in eight years. An unprecedented scientific data volume at the multi-exabyte scale will be delivered to particle physics experiments at CERN. This amount of data has to be stored, and the corresponding technology must ensure fast and reliable data delivery for processing by the scientific community all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected hardware evolution. To address this challenge, the Data Lake R&D project was launched by the DOMA community in the fall of 2019. State-of-the-art data handling technologies are under active development, and their current status for the Russian Scientific Data Lake prototype is presented here.

2021, Vol. 251, pp. 02006
Author(s):  
Mikhail Borodin ◽  
Alessandro Di Girolamo ◽  
Edward Karavakis ◽  
Alexei Klimentov ◽  
Tatiana Korchuganova ◽  
...  

The High Luminosity upgrade to the LHC, which aims for a tenfold increase in the luminosity of proton-proton collisions at an energy of 14 TeV, is expected to start operation in 2028/29 and will deliver an unprecedented volume of scientific data at the multi-exabyte scale. This amount of data has to be stored, and the corresponding storage system must ensure fast and reliable data delivery for processing by scientific groups distributed all over the world. The present LHC computing and data management model will not be able to provide the required infrastructure growth, even taking into account the expected hardware technology evolution. To address this challenge, the Data Carousel R&D project was launched by the ATLAS experiment in the fall of 2018. State-of-the-art data and workflow management technologies are under active development, and their current status is presented here.


2021, Vol. 251, pp. 02002
Author(s):  
David Cameron ◽  
Alessandra Forti ◽  
Alexei Klimentov ◽  
Andrés Pacheco Pages ◽  
David South

The High Luminosity LHC project at CERN, which is expected to deliver a tenfold increase in the luminosity of proton-proton collisions over the present LHC, will start operation towards the end of this decade and will deliver an unprecedented scientific data volume at the multi-exabyte scale. This vast amount of data has to be processed and analysed, and the corresponding computing facilities must ensure fast and reliable data processing for physics analyses by scientific groups distributed all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected evolution in hardware technology. To address this challenge, several novel methods for conducting end-user analysis are under evaluation by the ATLAS Collaboration. State-of-the-art workflow management technologies and tools to handle these methods within the existing distributed computing system are now being evaluated and developed. In addition, the evolution of computing facilities and how this impacts ATLAS analysis workflows is being closely followed.


2014, Vol. 29 (23), pp. 1430041
Author(s):  
Andrew Askew ◽  
Sushil Chauhan ◽  
Björn Penning ◽  
William Shepherd ◽  
Mani Tripathi

Theoretical and experimental techniques employed in dedicated searches for dark matter at hadron colliders are reviewed. Bounds from the 7 TeV and 8 TeV proton–proton collisions at the Large Hadron Collider (LHC) on dark matter interactions have been collected and the results interpreted. We review the current status of the Effective Field Theory picture of dark matter interactions with the Standard Model. Currently, LHC experiments have stronger bounds on operators leading to spin-dependent scattering than direct detection experiments, while direct detection probes are more constraining for spin-independent scattering for WIMP masses above a few GeV.


Author(s):  
S. Acharya ◽  
D. Adamová ◽  
S. P. Adhya ◽  
A. Adler ◽  
...  

Abstract The production rates and the transverse momentum distribution of strange hadrons at mid-rapidity ($\left| y\right| < 0.5$) are measured in proton-proton collisions at $\sqrt{s} = 13$ TeV as a function of the charged particle multiplicity, using the ALICE detector at the LHC. The production rates of $\mathrm{K}^{0}_{S}$, $\Lambda$, $\Xi$, and $\Omega$ increase with the multiplicity faster than what is reported for inclusive charged particles. The increase is found to be more pronounced for hadrons with a larger strangeness content. Possible auto-correlations between the charged particles and the strange hadrons are evaluated by measuring the event-activity with charged particle multiplicity estimators covering different pseudorapidity regions. When comparing to lower energy results, the yields of strange hadrons are found to depend only on the mid-rapidity charged particle multiplicity. Several features of the data are reproduced qualitatively by general purpose QCD Monte Carlo models that take into account the effect of densely-packed QCD strings in high multiplicity collisions. However, none of the tested models reproduce the data quantitatively. This work corroborates and extends the ALICE findings on strangeness production in proton-proton collisions at 7 TeV.

