TECHNOLOGIES FOR LARGE DATA MANAGEMENT IN SCIENTIFIC COMPUTING

2014
Vol 25 (02)
pp. 1430001
Author(s):  
ALBERTO PACE

In recent years, intensive use of computing has been a central strategy of investigation in many scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis, which were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required by a large scientific research project. The paper also presents several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that must be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
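As a purely illustrative aside, the scale quoted above already fixes the order of magnitude of the sustained ingest throughput and motivates streaming integrity checks for long-term preservation. The Python sketch below is a back-of-envelope estimate derived only from the 30,000 TB/year figure; the checksum helper is a generic illustration, not a component described in the paper.

# Back-of-envelope estimate of the sustained throughput implied by
# an experiment producing 30,000 TB of new data per year.
TB = 1e12                                # bytes per terabyte (decimal convention)
data_per_year = 30_000 * TB              # ~3.0e16 bytes per year
seconds_per_year = 365.25 * 24 * 3600
avg_rate = data_per_year / seconds_per_year
print(f"average ingest rate: {avg_rate / 1e9:.2f} GB/s")   # roughly 0.95 GB/s

# Long-term preservation typically relies on end-to-end checksums so that
# silent corruption can be detected when files migrate between media.
import hashlib

def file_checksum(path, algo="sha256", chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its hex digest."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()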

2018
Vol 68 (1)
pp. 291-312
Author(s):
Celine Degrande
Valentin Hirschi
Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges for hadron collider phenomenology: They are needed for simulations including next-to-leading-order corrections, which can be large at hadron colliders. They also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as computation of the rational terms.
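For orientation, the decomposition referred to above is conventionally written (in standard notation, which need not match the authors' conventions) as

\mathcal{A}^{\text{1-loop}}
  = \sum_{i} d_i\, I_4^{(i)}
  + \sum_{i} c_i\, I_3^{(i)}
  + \sum_{i} b_i\, I_2^{(i)}
  + \sum_{i} a_i\, I_1^{(i)}
  + R ,

where the I_n^{(i)} are the scalar box (n = 4), triangle (n = 3), bubble (n = 2) and tadpole (n = 1) integrals, the coefficients a_i, ..., d_i are rational functions of the external kinematics, and R is the rational term. Automated tools compute the coefficients and R, while the scalar integrals are evaluated by dedicated libraries.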


2021
Vol 9
Author(s):  
N. Demaria

The High Luminosity Large Hadron Collider (HL-LHC) at CERN will open a new frontier for particle physics after the year 2027. Experiments will undergo major upgrades to meet this challenge, and innovative sensors and electronics will play a leading role in them. This paper describes recent developments in 65 nm CMOS technology for readout ASIC chips in future High Energy Physics (HEP) experiments. These developments enable unprecedented performance in terms of speed, noise, power consumption, and granularity of the tracking detectors.


2019
Vol 214
pp. 02019
Author(s):  
V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments played a major role in enabling them to produce physics measurements of unprecedented quality and precision, with a faster turnaround from data taking to journal submission than any previous hadron collider experiment. The material presented here contains highlights of a recent review, published in Ref. [1], on the impact of detector simulation in particle physics collider experiments. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment is also presented. A discussion of future detector simulation needs, challenges, and potential solutions to address them concludes the contribution.
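As a purely pedagogical aside, even the simplest parametrized detector model captures the basic idea of turning generated quantities into reconstructed-like ones. The toy Python sketch below is invented for illustration, with assumed resolution parameters; it bears no relation to the Geant4-based applications discussed in the text.

# Toy illustration: smear generator-level energies with a parametrized
# calorimeter resolution to mimic detector response (not Geant4).
import random

def smeared_energy(e_true, stochastic=0.10, constant=0.01):
    # Toy resolution model: sigma/E = stochastic/sqrt(E) (+) constant, added in quadrature.
    rel_sigma = ((stochastic / e_true ** 0.5) ** 2 + constant ** 2) ** 0.5
    return random.gauss(e_true, rel_sigma * e_true)

true_e = 50.0                                       # GeV, hypothetical photon energy
sample = [smeared_energy(true_e) for _ in range(100_000)]
mean = sum(sample) / len(sample)
print(f"mean reconstructed energy: {mean:.2f} GeV")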


2008
Vol 01 (01)
pp. 259-302
Author(s):  
Stanley Wojcicki

This article describes the beginnings of the Superconducting Super Collider (SSC). The narrative starts in the early 1980s with the discussion of the process that led to the recommendation by the US high energy physics community to initiate work on a multi-TeV hadron collider. The article then describes the formation in 1984 of the Central Design Group (CDG) charged with directing and coordinating the SSC R&D and subsequent activities which led in early 1987 to the SSC endorsement by President Reagan. The last part of the article deals with the site selection process, steps leading to the initial Congressional appropriation of the SSC construction funds and the creation of the management structure for the SSC Laboratory.


2013
Vol 28 (02)
pp. 1330003
Author(s):  
DANIEL GREEN

The Higgs field was first proposed almost 50 years ago. Twenty years ago, the tools needed to discover the Higgs boson, the Large Hadron Collider and the CMS and ATLAS experiments, were initiated. Data taking began in 2010 and culminated in the announcement of the discovery of a "Higgs-like" boson on 4 July 2012. This discovery completes the Standard Model (SM) of high energy physics, if the new particle is indeed the hypothesized SM Higgs boson. Future data taking will explore the properties of the new 125 GeV particle to see whether it has all the attributes of an SM Higgs and to probe the mechanism that maintains its "low" mass.


2021
Vol 81 (7)
Author(s):
Suyong Choi
Hayoung Oh

Data-driven methods of background estimation are often used to obtain more reliable descriptions of backgrounds. In hadron collider experiments, data-driven techniques are used to estimate backgrounds due to multi-jet events, which are difficult to model accurately. In this article, we propose an improvement on one of the most widely used data-driven methods in the hadron collision environment, the "ABCD" method of extrapolation. We describe the mathematical background behind these data-driven methods and extend the idea to propose improved, more general methods.
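For context, the baseline technique being improved upon partitions events into four regions A, B, C, D using two variables that are (approximately) uncorrelated for the background, and extrapolates the background yield in the signal region A from the three control regions. The Python sketch below, with hypothetical counts, illustrates only this standard method, not the improvement proposed in the article.

def abcd_estimate(n_b, n_c, n_d):
    # Standard ABCD extrapolation: if the two discriminating variables are
    # uncorrelated for the background, N_A(bkg) ~= N_B * N_C / N_D.
    return n_b * n_c / n_d

n_b, n_c, n_d = 480.0, 310.0, 1200.0    # hypothetical control-region counts from data
print(f"predicted background in signal region A: {abcd_estimate(n_b, n_c, n_d):.1f}")
# -> predicted background in signal region A: 124.0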

