High energy physics, past, present and future

2017, Vol. 32 (09), pp. 1741017
Author(s): Hirotaka Sugawara

At the beginning of the last century we witnessed the emergence of new physics, quantum theory and gravitational theory, which gave us a correct understanding of the world of atoms and deep insight into the structure of the universe we live in. Towards the end of the century, string theory emerged as the most promising candidate to unify these two theories. In this talk, I would like to assert that understanding the origin of the physical constants, ℏ (the Planck constant) for quantum theory and G (Newton's gravitational constant) for gravitational theory, within the framework of string theory is the key to understanding string theory itself. I will then turn to experimental high energy physics and discuss the necessity of worldwide collaboration in superconducting technology, which is essential for constructing a 100 TeV hadron collider.

2019, Vol. 34 (02), pp. 1930002
Author(s): George Wei-Shu Hou

This brief review grew out of the HEP concluding talk of the 25th Anniversary of the Rencontres du Vietnam, held in August 2018 at Quy Nhon. The first two-thirds gives a summary and highlights, or snapshot, of High Energy Physics at the end of Large Hadron Collider (LHC) Run 2. It can be viewed as the combined effort of the program organizers and the invited plenary speakers, filtered into the present mosaic, and it certainly should not be viewed as comprehensive. In the remaining third, a more personal perspective and outlook is given, including my take on the flavor anomalies and why the next three years, the period of Long Shutdown 2 plus the first year (or more) of LHC Run 3, would be bright and flavorful, with much hope for uncovering New Physics. We advocate extra Yukawa couplings as the most likely New Physics to be tested next, the effect of which is already written in our Matter Universe.


2021, Vol. 251, pp. 02070
Author(s): Matthew Feickert, Lukas Heinrich, Giordon Stark, Ben Galewsky

In High Energy Physics, facilities that provide High Performance Computing environments offer an opportunity to efficiently perform the statistical inference required for analysis of data from the Large Hadron Collider, but they can pose problems with orchestration and efficient scheduling. The compute architectures at these facilities do not easily support the Python compute model, and the configuration and scheduling of batch jobs for physics often requires expertise in multiple job scheduling services. The combination of the pure-Python libraries pyhf and funcX reduces the common problem in HEP analyses of performing statistical inference with binned models, which would traditionally take multiple hours and bespoke scheduling, to an on-demand (fitting) “function as a service” that can scalably execute across workers in just a few minutes, offering reduced time to insight and inference. We demonstrate execution of a scalable workflow using funcX to simultaneously fit 125 signal hypotheses from a published ATLAS search for new physics using pyhf, with a wall time of under 3 minutes. We additionally show performance comparisons for other physics analyses with openly published probability models and argue for a blueprint for fitting-as-a-service systems at HPC centers.
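As a rough, hypothetical illustration of the fitting-as-a-service pattern described in this abstract, the sketch below wraps a single pyhf fit in a plain Python function of the kind that could be registered with funcX and dispatched once per signal hypothesis. It is a minimal sketch, not the authors' published workflow: the function name and the toy single-bin numbers are invented for illustration, it uses pyhf's simple uncorrelated-background model builder rather than the published ATLAS probability models, and the funcX endpoint and registration details are omitted.

```python
# Minimal sketch (assumes pyhf >= 0.6 is installed); not the authors' exact workflow.
# Each call fits one binned signal-plus-background hypothesis, the unit of work that
# a funcX endpoint would execute per signal point.
import pyhf


def fit_one_hypothesis(signal, background, bkg_uncertainty, observed):
    """Return the observed CLs for one (toy) signal hypothesis at mu = 1."""
    model = pyhf.simplemodels.uncorrelated_background(
        signal=signal, bkg=background, bkg_uncertainty=bkg_uncertainty
    )
    data = observed + model.config.auxdata  # observed counts plus auxiliary data
    cls_obs = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
    return float(cls_obs)


if __name__ == "__main__":
    # Toy single-bin numbers, for illustration only.
    print(fit_one_hypothesis(signal=[12.0], background=[50.0],
                             bkg_uncertainty=[3.0], observed=[52.0]))
```

In a full workflow of the kind the abstract describes, many such calls would be submitted asynchronously to an HPC endpoint, one per signal hypothesis, and the results gathered as they complete.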


Author(s): Richard Healey

The metaphor that fundamental physics is concerned to say what the natural world is like at the deepest level may be cashed out in terms of entities, properties, or laws. The role of quantum field theories in the Standard Model of high-energy physics suggests that fundamental entities, properties, and laws are to be sought in these theories. But the contextual ontology proposed in Chapter 12 would support no unified compositional structure for the world; a quantum state assignment specifies no physical property distribution sufficient even to determine all physical facts; and quantum theory posits no fundamental laws of time evolution, whether deterministic or stochastic. Quantum theory has made a revolutionary contribution to fundamental physics because its principles have permitted tremendous unification of science through the successful application of models constructed in conformity to them: but these models do not say what the world is like at the deepest level.


2018, Vol. 68 (1), pp. 291-312
Author(s): Celine Degrande, Valentin Hirschi, Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges for hadron collider phenomenology: one-loop amplitudes are needed for simulations that include next-to-leading-order corrections, which can be large at hadron colliders, and they also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as the computation of the rational terms.
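For orientation (this equation is added here and is not part of the original abstract), the decomposition step mentioned above is conventionally written, schematically, as a sum over box, triangle, bubble and tadpole scalar integrals plus a rational remainder, with process-dependent coefficients that the automated codes determine numerically or algebraically:

```latex
% Schematic one-loop decomposition onto a basis of scalar integrals.
% d_i, c_i, b_i, a_i are process-dependent coefficients; R collects the rational terms.
\mathcal{A}^{\text{1-loop}}
  = \sum_i d_i\,\mathrm{Box}_i
  + \sum_i c_i\,\mathrm{Triangle}_i
  + \sum_i b_i\,\mathrm{Bubble}_i
  + \sum_i a_i\,\mathrm{Tadpole}_i
  + R
```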


2021, Vol. 9
Author(s): N. Demaria

The High Luminosity Large Hadron Collider (HL-LHC) at CERN will constitute a new frontier for particle physics after the year 2027. Experiments will undertake a major upgrade in order to meet this challenge, and the use of innovative sensors and electronics will play a central role in it. This paper describes recent developments in 65 nm CMOS technology for readout ASIC chips in future High Energy Physics (HEP) experiments. These developments allow unprecedented performance in terms of speed, noise, power consumption and granularity of the tracking detectors.


2019, Vol. 214, pp. 02019
Author(s): V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments played a major role in enabling them to produce physics measurements of unprecedented quality and precision, with faster turnaround from data taking to journal submission than any previous hadron collider experiment. The material presented here contains highlights of a recent review on the impact of detector simulation in particle physics collider experiments, published in Ref. [1]. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment are also presented. A discussion of future detector simulation needs and challenges, and of potential solutions to address them, is included at the end.


2016, Vol. 25 (09), pp. 1641022
Author(s): Emanuele Berti, Vitor Cardoso, Luis C. B. Crispino, Leonardo Gualtieri, Carlos Herdeiro, ...

We review recent progress in the application of numerical relativity techniques to astrophysics and high-energy physics. We focus on recent developments regarding spin evolution in black hole binaries, high-energy black hole collisions, compact object solutions in scalar–tensor gravity, superradiant instabilities, hairy black hole solutions in Einstein’s gravity coupled to fundamental fields, and the possibility of gaining insight into these phenomena using analog gravity models.


2005, Vol. 20 (14), pp. 3021-3032
Author(s): Ian M. Fisk

In this review, the computing challenges facing the current and next generation of high energy physics experiments will be discussed. High energy physics computing represents an interesting infrastructure challenge, as the use of large-scale commodity computing clusters has increased. The causes and ramifications of these infrastructure challenges will be outlined. Increasing requirements, limited physical infrastructure at computing facilities, and limited budgets have driven many experiments to deploy distributed computing solutions to meet the growing computing needs for analysis, reconstruction, and simulation. The current generation of experiments has developed and integrated a number of solutions to facilitate distributed computing. The current work of the running experiments gives insight into the challenges that will be faced by the next generation of experiments and the infrastructure that will be needed.


2008, Vol. 01 (01), pp. 259-302
Author(s): Stanley Wojcicki

This article describes the beginnings of the Superconducting Super Collider (SSC). The narrative starts in the early 1980s with a discussion of the process that led to the recommendation by the US high energy physics community to initiate work on a multi-TeV hadron collider. The article then describes the formation in 1984 of the Central Design Group (CDG), charged with directing and coordinating SSC R&D, and the subsequent activities that led in early 1987 to the endorsement of the SSC by President Reagan. The last part of the article deals with the site selection process, the steps leading to the initial Congressional appropriation of SSC construction funds, and the creation of the management structure for the SSC Laboratory.


2013, Vol. 28 (02), pp. 1330003
Author(s): Daniel Green

The Higgs field was first proposed almost 50 years ago. Twenty years ago, the tools needed to discover the Higgs boson, the Large Hadron Collider and the CMS and ATLAS experiments, were initiated. Data taking began in 2010 and culminated in the announcement of the discovery of a "Higgs-like" boson on 4 July 2012. This discovery completes the Standard Model (SM) of high energy physics, if the new particle is indeed the hypothesized SM Higgs. Future data taking will explore the properties of the new 125 GeV particle to see whether it has all the attributes of an SM Higgs and to explore the mechanism that maintains its "low" mass.

