Alternate methods for anomaly detection in high-energy physics via semi-supervised learning

2020 ◽  
Vol 35 (23) ◽  
pp. 2050131
Author(s):  
Mohd Adli Md Ali ◽  
Nu’man Badrud’din ◽  
Hafidzul Abdullah ◽  
Faiz Kemi

Recently, the concept of weakly supervised learning has gained popularity in the high-energy physics community due to its ability to learn even from noisy and impure datasets. This method is valuable in the quest to discover elusive beyond-Standard-Model (BSM) particles. Nevertheless, weakly supervised learning still requires a learning sample that truthfully describes the features of the BSM particle to the classification model. Even with various theoretical frameworks such as supersymmetry and quantum black holes, creating a BSM sample is not a trivial task, since the exact features of the particle are unknown. Due to these difficulties, we propose an alternative approach called one-class classification (OCC). OCC algorithms require only background or noise samples in their training dataset, and such samples are already abundant in the high-energy physics community. The algorithm flags any sample that does not fit the background features as an anomaly. In this paper, we introduce two new algorithms, EHRA and C-EHRA, which use machine learning regression and clustering to detect anomalies in samples. We tested the algorithms' capability to produce distinct anomalous patterns in the presence of BSM samples and compared their classification output metrics to those of the Isolation Forest (ISF), a well-known anomaly detection algorithm. Five Monte Carlo supersymmetry datasets with signal-to-noise ratios of 1, 0.1, 0.01, 0.001, and 0.0001 were used to test the EHRA, C-EHRA, and ISF algorithms. In our study, we found that EHRA with an artificial neural network regression has the highest ROC-AUC score, 0.7882, on the balanced dataset, while C-EHRA has the highest precision-sensitivity score on the majority of the imbalanced datasets. These findings highlight the potential use of EHRA, C-EHRA, and other OCC algorithms in the quest to discover BSM particles.
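
To make the one-class setup concrete, the following is a minimal sketch of the workflow the abstract describes, using scikit-learn's IsolationForest as the reference OCC algorithm (the EHRA and C-EHRA algorithms themselves are not shown here); the feature matrices X_background and X_candidates are hypothetical stand-ins for per-event kinematic features, not data from the paper.

```python
# One-class anomaly detection sketch: train on background-only events,
# then flag events that do not fit the background features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
X_background = rng.normal(0.0, 1.0, size=(10_000, 4))  # SM-like training events
X_candidates = rng.normal(0.5, 1.5, size=(1_000, 4))   # events to screen

isf = IsolationForest(n_estimators=200, random_state=0)
isf.fit(X_background)  # training uses background samples alone

scores = isf.decision_function(X_candidates)  # lower score = more anomalous
flags = isf.predict(X_candidates)             # -1 = anomaly, +1 = background-like
print(f"flagged {np.sum(flags == -1)} of {len(flags)} events")
```

Ranking candidate events by `scores` and computing the ROC-AUC against known labels is how comparisons like the ones reported above are typically made.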

2018 ◽  
Vol 68 (1) ◽  
pp. 291-312 ◽  
Author(s):  
Celine Degrande ◽  
Valentin Hirschi ◽  
Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges for hadron collider phenomenology: They are needed for simulations including next-to-leading-order corrections, which can be large at hadron colliders. They also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as computation of the rational terms.
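
For reference, the decomposition step mentioned in the abstract is conventionally written as an expansion of the one-loop amplitude onto the basis of scalar box, triangle, bubble, and tadpole integrals plus a rational remainder; the schematic form below is the standard one, with d_i, c_i, b_i, a_i the reduction coefficients.

```latex
% Schematic one-loop decomposition onto the scalar-integral basis;
% R collects the rational terms produced by the reduction.
\mathcal{A}^{\text{1-loop}}
  = \sum_i d_i\, I_4^{(i)}  % scalar boxes
  + \sum_i c_i\, I_3^{(i)}  % scalar triangles
  + \sum_i b_i\, I_2^{(i)}  % scalar bubbles
  + \sum_i a_i\, I_1^{(i)}  % scalar tadpoles
  + R
```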


1995 ◽  
Vol 06 (04) ◽  
pp. 531-540 ◽  
Author(s):  
D. PERRET-GALLIX

Complete automatic Feynman diagram computation systems are now coming of age after many years of development. They are made available to the high-energy physics community through user-friendly interfaces. Theorists and experimentalists can benefit from these powerful packages for speeding up time-consuming calculations and for preparing event generators. The general architecture of these packages is presented, and the current development of the one-loop diagram extension is discussed. A brief description of the prominent packages and tools is then given. Finally, the necessity of defining a standardization scheme is strongly stressed, for the benefit of both developers and users.


2019 ◽  
Vol 214 ◽  
pp. 06037
Author(s):  
Moritz Kiehn ◽  
Sabrina Amrouche ◽  
Paolo Calafiura ◽  
Victor Estrade ◽  
Steven Farrell ◽  
...  

The High-Luminosity LHC (HL-LHC) is expected to reach unprecedented collision intensities, which in turn will greatly increase the complexity of tracking within event reconstruction. To reach out to computer science specialists, a tracking machine learning challenge (TrackML) was set up on Kaggle by a team of ATLAS, CMS, and LHCb tracking experts and computer scientists, building on the experience of the successful Higgs Machine Learning challenge in 2014. A training dataset based on a simulation of the tracker of a generic HL-LHC experiment has been created, listing for each event the measured 3D points and the list of 3D points associated with each true track. Participants in the challenge must find the tracks in the test dataset, that is, build the list of 3D points belonging to each track. The emphasis is on exposing innovative approaches rather than hyper-optimizing known ones. A metric reflecting how accurately a model finds the associations that matter most to physics analysis will make it possible to select good candidates to augment or replace existing algorithms.
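
As a toy illustration of the hit-to-track association task (not the official TrackML metric), the sketch below credits each reconstructed track with the hits of its dominant true particle and reports the matched fraction; the column names hit_id, track_id, and particle_id are hypothetical.

```python
# Toy purity-style score for hit-to-track association, in the spirit
# of the TrackML challenge (illustrative only, not the official metric).
import pandas as pd

def toy_association_score(pred: pd.DataFrame, truth: pd.DataFrame) -> float:
    """pred maps hit_id -> track_id; truth maps hit_id -> particle_id."""
    merged = pred.merge(truth, on="hit_id")
    matched = 0
    for _, track in merged.groupby("track_id"):
        # count the hits agreeing with the track's dominant true particle
        matched += track["particle_id"].value_counts().iloc[0]
    return matched / len(truth)

# Tiny made-up event: two true particles, two reconstructed tracks.
truth = pd.DataFrame({"hit_id": [1, 2, 3, 4, 5, 6],
                      "particle_id": [10, 10, 10, 20, 20, 20]})
pred = pd.DataFrame({"hit_id": [1, 2, 3, 4, 5, 6],
                     "track_id": [0, 0, 1, 1, 1, 1]})
print(toy_association_score(pred, truth))  # 5/6 ≈ 0.833
```

Per the abstract, the official metric additionally weights associations by their importance to physics analysis, but the grouping-and-majority-matching structure is the same basic idea.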


2008 ◽  
Vol 01 (01) ◽  
pp. 259-302 ◽  
Author(s):  
Stanley Wojcicki

This article describes the beginnings of the Superconducting Super Collider (SSC). The narrative starts in the early 1980s with the discussion of the process that led to the recommendation by the US high-energy physics community to initiate work on a multi-TeV hadron collider. The article then describes the formation in 1984 of the Central Design Group (CDG), charged with directing and coordinating the SSC R&D, and the subsequent activities that led in early 1987 to the endorsement of the SSC by President Reagan. The last part of the article deals with the site selection process, the steps leading to the initial Congressional appropriation of SSC construction funds, and the creation of the management structure for the SSC Laboratory.

