Improved extrapolation methods of data-driven background estimations in high energy physics

2021 ◽  
Vol 81 (7) ◽  
Author(s):  
Suyong Choi ◽  
Hayoung Oh

Data-driven methods of background estimations are often used to obtain more reliable descriptions of backgrounds. In hadron collider experiments, data-driven techniques are used to estimate backgrounds due to multi-jet events, which are difficult to model accurately. In this article, we propose an improvement on one of the most widely used data-driven methods in the hadron collision environment, the “ABCD” method of extrapolation. We describe the mathematical background behind the data-driven methods and extend the idea to propose improved general methods.
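
As background for the improvement proposed above, a minimal sketch of the standard ABCD estimate (a generic illustration, not the authors' improved method): two discriminating variables, assumed uncorrelated for the background, partition the data into a signal region A and three control regions B, C, and D, and the background in A is extrapolated as N_A ≈ N_B · N_C / N_D.

    # Minimal sketch of the standard ABCD background extrapolation.
    # Region labels and counts are hypothetical, for illustration only.
    # Two variables, assumed uncorrelated for the background, split the data into
    # a signal region A and three background-dominated control regions B, C, D;
    # the background in A is then N_A ~= N_B * N_C / N_D.

    def abcd_estimate(n_b, n_c, n_d):
        """Return the estimated background yield in region A and its
        statistical uncertainty, assuming Poisson-distributed counts and
        negligible signal contamination in the control regions."""
        n_a = n_b * n_c / n_d
        rel_err = (1.0 / n_b + 1.0 / n_c + 1.0 / n_d) ** 0.5  # relative errors in quadrature
        return n_a, n_a * rel_err

    # Example with made-up control-region counts:
    est, err = abcd_estimate(n_b=480, n_c=120, n_d=960)
    print(f"Estimated background in region A: {est:.1f} +/- {err:.1f}")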

2018 ◽  
Vol 68 (1) ◽  
pp. 291-312 ◽  
Author(s):  
Celine Degrande ◽  
Valentin Hirschi ◽  
Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges for hadron collider phenomenology: They are needed for simulations including next-to-leading-order corrections, which can be large at hadron colliders. They also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as computation of the rational terms.
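
For orientation, the decomposition referred to above is conventionally written as an expansion of the one-loop amplitude onto scalar box, triangle, bubble, and tadpole integrals plus a rational remainder; schematically (a standard textbook form, not tied to any particular code reviewed here):

    % Standard scalar-integral basis decomposition of a one-loop amplitude
    % (schematic; coefficients and integrals depend on the process).
    \mathcal{A}^{\text{1-loop}}
      = \sum_i d_i\, I_4^{(i)}   % boxes
      + \sum_i c_i\, I_3^{(i)}   % triangles
      + \sum_i b_i\, I_2^{(i)}   % bubbles
      + \sum_i a_i\, I_1^{(i)}   % tadpoles
      + R ,                      % rational terms

where the I_n^{(i)} are the scalar n-point integrals and R collects the rational terms mentioned in the abstract.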


2021 ◽  
Vol 9 ◽  
Author(s):  
N. Demaria

The High Luminosity Large Hadron Collider (HL-LHC) at CERN will constitute a new frontier for particle physics after the year 2027. The experiments will undergo major upgrades to meet this challenge, and innovative sensors and electronics will play a central role in them. This paper describes recent developments in 65 nm CMOS technology for readout ASICs in future High Energy Physics (HEP) experiments, which allow unprecedented performance in terms of speed, noise, power consumption, and granularity of the tracking detectors.


2019 ◽  
Vol 214 ◽  
pp. 02019
Author(s):  
V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments played a major role in enabling them to produce physics measurements of unprecedented quality and precision, with a faster turnaround from data taking to journal submission than any previous hadron collider experiment. The material presented here contains highlights of a recent review, published in Ref. [1], on the impact of detector simulation in particle physics collider experiments. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment are also presented. A discussion of future detector simulation needs, challenges, and potential solutions to address them concludes the contribution.


2008 ◽  
Vol 01 (01) ◽  
pp. 259-302 ◽  
Author(s):  
Stanley Wojcicki

This article describes the beginnings of the Superconducting Super Collider (SSC). The narrative starts in the early 1980s with the discussion of the process that led to the recommendation by the US high energy physics community to initiate work on a multi-TeV hadron collider. The article then describes the formation in 1984 of the Central Design Group (CDG) charged with directing and coordinating the SSC R&D and subsequent activities which led in early 1987 to the SSC endorsement by President Reagan. The last part of the article deals with the site selection process, steps leading to the initial Congressional appropriation of the SSC construction funds and the creation of the management structure for the SSC Laboratory.


2013 ◽  
Vol 28 (02) ◽  
pp. 1330003 ◽  
Author(s):  
DANIEL GREEN

The Higgs field was first proposed almost 50 years ago. Twenty years ago, the tools needed to discover the Higgs boson, the Large Hadron Collider and the CMS and ATLAS experiments, were initiated. Data taking began in 2010 and culminated in the announcement of the discovery of a "Higgs-like" boson on 4 July 2012. This discovery completes the Standard Model (SM) of high energy physics, if the new particle is indeed the hypothesized SM Higgs boson. Future data taking will explore the properties of the new 125 GeV particle to see whether it has all the attributes of an SM Higgs and to probe the mechanism that maintains its "low" mass.


2014 ◽  
Vol 25 (02) ◽  
pp. 1430001 ◽  
Author(s):  
ALBERTO PACE

In recent years, intensive use of computing has become a central strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data, and for the associated analyses, that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required in a large scientific research project. The paper also presents several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
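
To put the quoted data volume in perspective, a back-of-the-envelope conversion (only the 30,000 TB/year figure comes from the text; the replica count is an illustrative assumption):

    # Rough scale of the LHC data-management problem described above.
    # Only the ~30,000 TB/year figure is taken from the text; the number of
    # distributed replicas is an illustrative assumption.
    ANNUAL_VOLUME_TB = 30_000
    SECONDS_PER_YEAR = 365 * 24 * 3600
    REPLICAS = 2  # assumed number of geographically distributed copies

    avg_ingest_gb_per_s = ANNUAL_VOLUME_TB * 1000 / SECONDS_PER_YEAR  # TB -> GB
    stored_pb_per_year = ANNUAL_VOLUME_TB * REPLICAS / 1000           # TB -> PB

    print(f"Average ingest rate: ~{avg_ingest_gb_per_s:.1f} GB/s")
    print(f"Stored per year with {REPLICAS} replicas: ~{stored_pb_per_year:.0f} PB")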


Physics ◽  
2020 ◽  
Vol 2 (3) ◽  
pp. 455-480
Author(s):  
Airton Deppman ◽  
Eugenio Megías ◽  
Débora P. P. Menezes

In this work, we provide an overview of recent investigations of non-extensive Tsallis statistics and its applications to high energy physics and astrophysics, including physics at the Large Hadron Collider (LHC), hadron physics, and neutron stars. We review some recent investigations of the power-law distributions arising in high energy physics experiments, focusing on a thermodynamic description of the system formed, which could explain the power-law behavior. The possible connections with a fractal structure of hadrons are also discussed. The main objective of the present work is to delineate the state of the art of those studies and to point out some open issues that deserve more careful investigation. We propose several possibilities to test the theory through analyses of experimental data.
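
For reference, the power-law (Tsallis-like) form usually fitted to transverse-momentum spectra is the q-exponential generalization of the Boltzmann factor; in one common convention (normalizations and sign conventions vary across the literature):

    % q-exponential and a Tsallis-like transverse-momentum spectrum
    % (one common convention; prefactors and exponents differ between authors).
    e_q(x) \equiv \bigl[\,1 + (1-q)\,x\,\bigr]^{\frac{1}{1-q}}
      \;\xrightarrow{\;q\to 1\;}\; e^{x},
    \qquad
    \frac{1}{p_T}\frac{dN}{dp_T} \;\propto\;
      \Bigl[\,1 + (q-1)\,\frac{m_T}{T}\,\Bigr]^{-\frac{1}{q-1}},
    \qquad m_T = \sqrt{p_T^2 + m^2},

where T is an effective temperature and the entropic index q quantifies the departure from Boltzmann–Gibbs statistics (the exponential spectrum is recovered as q → 1).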


2020 ◽  
Vol 226 ◽  
pp. 01007
Author(s):  
Alexei Klimentov ◽  
Douglas Benjamin ◽  
Alessandro Di Girolamo ◽  
Kaushik De ◽  
Johannes Elmsheuser ◽  
...  

The ATLAS experiment at CERN’s Large Hadron Collider uses the Worldwide LHC Computing Grid (WLCG) for its distributed computing infrastructure. Through the workload management system PanDA and the distributed data management system Rucio, ATLAS provides thousands of physicists with seamless access to hundreds of WLCG grid and cloud resources distributed worldwide. PanDA annually processes more than an exabyte of data using an average of 350,000 distributed batch slots, enabling hundreds of new scientific results from ATLAS. However, the resources available to the experiment have been insufficient to meet ATLAS simulation needs over the past few years as the volume of data from the LHC has grown, and the problem will be even more severe in the next LHC phases. The High-Luminosity LHC will be a multi-exabyte challenge in which the envisaged storage and compute needs are a factor of 10 to 100 above the expected technology evolution. The High Energy Physics (HEP) community therefore needs to evolve its current computing and data organization models, changing the way it uses and manages the infrastructure, with a focus on optimizations that improve performance and efficiency while also simplifying operations. In this paper we highlight recent HEP R&D projects on a data lake prototype, federated data storage, and the data carousel.
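
A rough compounding estimate makes the quoted factor of 10 to 100 concrete (the growth rate and time span below are illustrative assumptions, not figures from the paper):

    # Why "a factor of 10 to 100 above the expected technology evolution" is a problem.
    # The annual capacity growth at constant cost and the time span are assumptions
    # chosen only to illustrate the size of the gap.
    YEARS = 10            # roughly from Run 3 to the HL-LHC era (assumed)
    ANNUAL_GROWTH = 0.15  # ~15%/year more capacity per unit cost (assumed)

    technology_gain = (1 + ANNUAL_GROWTH) ** YEARS
    print(f"Gain from technology alone over {YEARS} years: ~x{technology_gain:.1f}")
    print("Needs quoted in the text: x10 to x100, so the rest must come from new "
          "computing and data models (data lake, federated storage, data carousel).")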


2011 ◽  
Vol 26 (05) ◽  
pp. 309-317
Author(s):  
DAN GREEN

The Large Hadron Collider (LHC) began operation at a center-of-mass energy of 7 TeV in April 2010. The CMS experiment immediately analyzed the earliest data taken in order to "rediscover" the Standard Model (SM) of high energy physics. By late summer, all SM particles had been observed, and CMS began to search for physics beyond the SM and beyond the limits set at the Fermilab Tevatron. The first LHC run ended in December 2010 with a total integrated luminosity of about 45 pb⁻¹ delivered to the experiments.
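
For readers unfamiliar with the units, an integrated luminosity translates into an expected event count through N = σ ∫L dt; a worked example with an illustrative cross section (not a result from the article):

    % Expected event yield from integrated luminosity (illustrative numbers).
    N = \sigma \int \mathcal{L}\,dt,
    \qquad
    \sigma = 1~\text{nb} = 10^{3}~\text{pb}
    \;\Longrightarrow\;
    N \simeq 10^{3}~\text{pb} \times 45~\text{pb}^{-1}
      = 4.5\times 10^{4}\ \text{events (before acceptance and efficiencies).}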

