The Impact of Microelectronics on High Energy Physics Innovation: The Role of 65 nm CMOS Technology on New Generation Particle Detectors

2021 · Vol. 9 · Author(s): N. Demaria

The High Luminosity Large Hadron Collider (HL-LHC) at CERN will constitute a new frontier for particle physics after the year 2027. Experiments will undergo major upgrades to meet this challenge, and innovative sensors and electronics will play a central role in them. This paper describes recent developments in 65 nm CMOS technology for readout ASICs in future High Energy Physics (HEP) experiments, which allow unprecedented performance in terms of speed, noise, power consumption and granularity of the tracking detectors.

2019 · Vol. 214 · pp. 02019 · Author(s): V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments were instrumental in producing physics measurements of unprecedented quality and precision, with a faster turnaround from data taking to journal submission than any previous hadron collider experiment. The material presented here contains highlights of a recent review on the impact of detector simulation in particle physics collider experiments published in Ref. [1]. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment is also presented. A discussion of future detector simulation needs, challenges and potential solutions to address them is included at the end.


Author(s): H. M. Gray

High-energy physics is facing a daunting computing challenge with the large datasets expected from the upcoming High-Luminosity Large Hadron Collider in the next decade, and even more so at future colliders. A key challenge in the reconstruction of events, in both simulated and collision data, is the pattern recognition algorithms used to determine the trajectories of charged particles. The field of quantum computing shows promise for transformative capabilities and is going through a cycle of rapid development, and hence might provide a solution to this challenge. This article reviews current studies of quantum computers for charged particle pattern recognition in high-energy physics. This article is part of the theme issue ‘Quantum technologies in particle physics’.
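Many studies in this area cast track finding as a quadratic unconstrained binary optimization (QUBO) problem, which quantum annealers can then sample. As a minimal sketch of that formulation (the coefficient matrix below is purely illustrative, standing in for real triplet-compatibility scores, and the brute-force solver is a classical stand-in for the quantum device):

```python
import itertools

# Toy QUBO for track finding: each binary variable selects one candidate
# hit-triplet; diagonal terms reward good triplets, off-diagonal terms
# reward compatible pairs and penalize conflicting ones.
def solve_qubo_brute_force(Q):
    """Exhaustively minimize E(x) = x^T Q x over binary vectors (tiny n only)."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        e = sum(Q[i][j] * bits[i] * bits[j]
                for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = bits, e
    return best_x, best_e

# Illustrative instance: triplets 0 and 1 are mutually compatible (bonus
# -0.5 each way), triplet 2 conflicts with both (penalty +2.0).
Q = [[-1.0, -0.5,  2.0],
     [-0.5, -1.0,  2.0],
     [ 2.0,  2.0, -1.0]]
x, e = solve_qubo_brute_force(Q)  # selects triplets 0 and 1
```

On real hardware the same QUBO would be handed to an annealer or a variational circuit instead of enumerated, but the energy landscape being minimized is the same.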


2004 · Vol. 14 (02) · pp. 379-399 · Author(s): F. Faccio

With the construction of the Large Hadron Collider at the European Center for Nuclear Research (CERN), the radiation levels at large High Energy Physics (HEP) experiments have increased significantly with respect to past experience. The approach the HEP community is using to ensure radiation tolerance of the electronics installed in these new-generation experiments is described. Particular attention is devoted to developments that led to original work: the estimate of the SEU rate in the complex LHC radiation environment and the use of hardness-by-design techniques to achieve radiation hardness of ASICs in a commercial CMOS technology.
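As a first-order orientation (the article's point is precisely that the mixed LHC radiation field requires more refined estimates than this), a single-event-upset rate is commonly approximated as flux times per-bit cross-section times bit count. All numbers below are assumed, illustrative values, not figures from the article:

```python
# Back-of-envelope SEU rate estimate:
#   rate [upsets/s] = hadron flux [cm^-2 s^-1]
#                   * SEU cross-section per bit [cm^2]
#                   * number of bits
def seu_rate_per_second(hadron_flux_cm2_s, cross_section_cm2_per_bit, n_bits):
    """Expected upsets per second for the whole memory."""
    return hadron_flux_cm2_s * cross_section_cm2_per_bit * n_bits

# Assumed values: 1e7 hadrons/cm^2/s, 1e-14 cm^2 per bit, 1 Mbit of memory.
rate = seu_rate_per_second(1e7, 1e-14, 1_048_576)
upsets_per_day = rate * 86_400
```

In the real LHC environment the cross-section depends on particle species and energy spectrum, which is what the original work in the article addresses.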


2021 · Vol. 251 · pp. 03051 · Author(s): Ali Hariri, Darya Dyachkova, Sergei Gleyzer

Accurate and fast simulation of particle physics processes is crucial for the high-energy physics community. Simulating particle interactions with the detector is both time consuming and computationally expensive. With its proton-proton collision energy of 13 TeV, the Large Hadron Collider is uniquely positioned to detect and measure the rare phenomena that can shape our knowledge of new interactions. The High-Luminosity Large Hadron Collider (HL-LHC) upgrade will put a significant strain on the computing infrastructure and budget due to increased event rate and levels of pile-up. Simulation of high-energy physics collisions needs to be significantly faster without sacrificing physics accuracy. Machine learning approaches can offer faster solutions while maintaining a high level of fidelity. We introduce a graph generative model that provides effective reconstruction of LHC events on the level of calorimeter deposits and tracks, paving the way for full detector-level fast simulation.
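The building block of such graph models is a message-passing update over nodes representing calorimeter deposits. A minimal sketch of one such step, not the authors' architecture: node features, edges and the weight matrix below are all invented for illustration.

```python
import numpy as np

# One message-passing step over a tiny graph of calorimeter deposits.
# Node features are (energy, eta, phi); edges connect nearby deposits.
def message_passing_step(node_feats, adjacency, weight):
    """Sum neighbor features, apply a linear update and a ReLU."""
    messages = adjacency @ node_feats          # aggregate from neighbors
    return np.maximum(0.0, (node_feats + messages) @ weight)

nodes = np.array([[1.0, 0.1, 0.2],            # deposit 0
                  [0.5, 0.1, 0.3],            # deposit 1 (neighbor of 0)
                  [2.0, 0.9, 0.8]])           # deposit 2 (isolated)
adj = np.array([[0, 1, 0],
                [1, 0, 0],
                [0, 0, 0]], dtype=float)
w = np.eye(3)                                  # identity "weights" for clarity
out = message_passing_step(nodes, adj, w)
```

A trained generative model stacks learned versions of such layers and samples new node features rather than propagating fixed ones, but the graph mechanics are the same.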


2018 · Vol. 68 (1) · pp. 291-312 · Author(s): Celine Degrande, Valentin Hirschi, Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges of hadron collider phenomenology. Such amplitudes are needed for simulations including next-to-leading-order corrections, which can be large at hadron colliders, and they also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as computation of the rational terms.
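The decomposition step mentioned above has a standard form: any one-loop amplitude can be written on a basis of scalar box, triangle, bubble and tadpole integrals plus a rational remainder,

```latex
\mathcal{A}^{\text{1-loop}}
  = \sum_i d_i\, I_4^{(i)}
  + \sum_i c_i\, I_3^{(i)}
  + \sum_i b_i\, I_2^{(i)}
  + \sum_i a_i\, I_1^{(i)}
  + R ,
```

where $I_n^{(i)}$ are the scalar $n$-point integrals, the coefficients $d_i, c_i, b_i, a_i$ are rational functions of the kinematics, and $R$ is the rational term. Automated codes compute the coefficients numerically (e.g. by unitarity cuts or integrand reduction) and look up the scalar integrals in dedicated libraries.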


2020 · pp. 183-203 · Author(s): M. Brugger, H. Burkhardt, B. Goddard, F. Cerutti, R. G. Alia

Abstract: With the exception of synchrotron radiation sources, beams of accelerated particles are generally designed to interact either with one another (in the case of colliders) or with a specific target (for the operation of fixed-target experiments, the production of secondary beams, and for medical applications). However, in addition to the desired interactions, there are unwanted interactions of the high-energy particles which can produce undesirable side effects. These interactions can arise from the unavoidable presence of residual gas in the accelerator vacuum chamber, or from the impact of particles lost from the beam on aperture limits around the accelerator, as well as on the final beam dump. Even the wanted collisions of the beams in a collider, which produce potentially interesting High Energy Physics events, reduce the density of the circulating beam and can produce high fluxes of secondary particles.
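The beam-gas losses mentioned above follow a simple first-order law. As a standard textbook estimate (not a formula from the chapter): for a residual-gas number density $n$, total interaction cross-section $\sigma$, and an ultrarelativistic beam of $N$ particles travelling at speed $\approx c$,

```latex
\frac{dN}{dt} = -\, N \, n \, \sigma \, c
\qquad \Longrightarrow \qquad
N(t) = N_0 \, e^{-t/\tau}, \quad \tau = \frac{1}{n \sigma c},
```

so improving the vacuum (lowering $n$) directly lengthens the beam-gas lifetime $\tau$.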


Energies · 2020 · Vol. 13 (14) · pp. 3569 · Author(s): Simone Cammarata, Gabriele Ciarpi, Stefano Faralli, Philippe Velha, Guido Magazzù, ...

Optical links are rapidly becoming pervasive in the readout chains of particle physics detector systems. Silicon photonics (SiPh) stands as an attractive candidate to sustain the radiation levels foreseen in the next-generation experiments, while guaranteeing, at the same time, multi-Gb/s and energy-efficient data transmission. Integrated electronic drivers are needed to enable the deployment of SiPh modulators in compact on-detector front-end modules. A current-mode logic-based driver harnessing a pseudo-differential output stage is proposed in this work to drive different types of SiPh devices by means of the same circuit topology. The proposed driver, realized in a 65 nm bulk technology and already tested to behave properly up to an 8 MGy total ionizing dose, is hybrid-integrated in this work with a lumped-element Mach–Zehnder modulator (MZM) and a ring modulator (RM), both fabricated in a 130 nm silicon-on-insulator (SOI) process. Bit-error-rate (BER) performance confirms the applicability of the selected architecture to both differential and single-ended loads. A 5 Gb/s data rate, in line with current high energy physics requirements, is achieved in the RM case, while a packaging-related performance degradation is observed in the MZM-based system, confirming the importance of interconnection modeling.
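For context on the BER figures quoted in such link tests, a standard textbook relation (not specific to this article) connects the bit-error rate of a Gaussian-noise-limited optical link to its Q-factor:

```python
import math

# BER = 0.5 * erfc(Q / sqrt(2)) for a binary link limited by Gaussian noise.
def ber_from_q(q_factor):
    """Bit-error rate corresponding to a given Q-factor."""
    return 0.5 * math.erfc(q_factor / math.sqrt(2.0))

# Q ~ 7 corresponds to a BER around 1e-12, a common target for
# high-speed optical links (an assumed, illustrative figure of merit).
ber_q7 = ber_from_q(7.0)
```

Measured eye diagrams are often summarized by an extracted Q, which this relation then converts into an expected error rate.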


2016 · Vol. 3 (2) · pp. 252-256 · Author(s): Ling Wang, Mu-ming Poo

Abstract: On 8 March 2012, Yifang Wang, co-spokesperson of the Daya Bay Experiment and Director of the Institute of High Energy Physics, Chinese Academy of Sciences, announced the discovery of a new type of neutrino oscillation with a surprisingly large mixing angle (θ13), signifying ‘a milestone in neutrino research’. Now his team is heading for a new goal—to determine the neutrino mass hierarchy and to precisely measure oscillation parameters using the Jiangmen Underground Neutrino Observatory, which is due for completion in 2020. Neutrinos are fundamental particles that play important roles in both microscopic particle physics and the evolution of the macroscopic universe. Studies of neutrinos, for example, may answer the question of why our universe consists of much more matter than antimatter. But this is not an easy task. Though they are among the most numerous particles in the universe and zip through our planet and bodies all the time, these tiny particles are like ‘ghosts’, difficult to capture. There are three flavors of neutrinos, known as the electron neutrino (νe), muon neutrino (νμ), and tau neutrino (ντ), and their flavors can change as they travel through space via quantum interference. This phenomenon is known as neutrino oscillation or neutrino mixing. To determine the absolute mass of each type of neutrino and to find out how they mix is very challenging. In a recent interview with NSR in Beijing, Yifang Wang explained how the Daya Bay Experiment on neutrino oscillation not only addressed a frontier problem in particle physics, but also harnessed the talents and existing technology of the Chinese physics community. This achievement, Wang reckons, will not be an exception in Chinese high energy physics, if appropriate funding and organization for big science projects can be efficiently realized in the future.
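The oscillation phenomenon described above is usually introduced through the two-flavor approximation, a standard textbook result (not taken from the interview): the probability that a neutrino produced in flavor α is detected in flavor β after travelling a distance $L$ with energy $E$ is

```latex
P(\nu_\alpha \to \nu_\beta)
  = \sin^2(2\theta)\,
    \sin^2\!\left( \frac{1.27\, \Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}
                        {E\,[\mathrm{GeV}]} \right),
```

where $\theta$ is the mixing angle and $\Delta m^2$ the squared-mass splitting. Reactor experiments like Daya Bay choose $L/E$ to sit near the first oscillation maximum, which is what made the θ13 measurement possible.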


2020 · Vol. 35 (18) · pp. 2030006 · Author(s): Goran Senjanović

I reflect on some of the basic aspects of present-day Beyond the Standard Model particle physics, focusing mostly on issues of naturalness, in particular on the so-called hierarchy problem. To all of us, physics as natural science emerged with Galileo and Newton, and led to centuries of unparalleled success in explaining and often predicting new phenomena of nature. I argue here that the long-standing obsession with the hierarchy problem as a guiding principle for the future of our field has had the tragic consequence of deviating high energy physics from its origins as natural philosophy, and turning it into a philosophy of naturalness.


1988 · Vol. 03 (04) · pp. 751-823 · Author(s): Torbjörn Sjöstrand

Phenomenological models of multiparticle production have become increasingly important for the interpretation of experimental data in high energy physics. The evolution of these models fills a gap left open by the present limited theoretical understanding of the hadronization process, i.e. the transformation of outgoing colored partons into color singlet hadrons. The three main schools of thought, string fragmentation, cluster fragmentation and independent fragmentation, are presented in this paper. Included are discussions on similarities and differences, successes and failures, and recent developments. Perturbative QCD aspects with strong ties to the multiparticle production picture are also covered, in particular parton showers. An account is given of experience gained in the comparison between data and models. Since fragmentation studies are particularly well developed for e+e− annihilation events, this field is described in detail. A few comments are also presented for leptoproduction and hadron collisions.
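Of the three schools discussed, string fragmentation is built around the Lund symmetric fragmentation function, $f(z) \propto (1/z)(1-z)^a \exp(-b\, m_T^2/z)$, which gives the fraction $z$ of light-cone momentum taken by each new hadron. A minimal rejection-sampling sketch (the parameter values are assumed defaults for illustration, not tuned results):

```python
import math
import random

# Sample z from the Lund symmetric fragmentation function
#   f(z) ~ (1/z) * (1-z)^a * exp(-b * mT^2 / z),  0 < z < 1,
# by simple rejection against a gridded upper bound.
def sample_lund_z(a=0.68, b=0.98, mt2=0.25):
    """Draw the light-cone momentum fraction z for a new hadron."""
    def f(z):
        return (1.0 / z) * (1.0 - z) ** a * math.exp(-b * mt2 / z)
    # Crude bound on f over (0, 1): scan a grid, pad with a safety factor.
    bound = 1.2 * max(f(0.01 * i) for i in range(1, 100))
    while True:
        z = random.uniform(1e-6, 1.0 - 1e-6)
        if random.random() * bound <= f(z):
            return z
```

In a full string model this draw is repeated along the string, with flavor and transverse-momentum choices made at each break-up; production codes use far more careful sampling than this sketch.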

