Phenomenology of a supersymmetric model inspired by inflation

2021 ◽  
Vol 81 (2) ◽  
Author(s):  
Wolfgang Gregor Hollik ◽  
Cheng Li ◽  
Gudrid Moortgat-Pick ◽  
Steven Paasch

The current challenges in high energy physics and cosmology are to build coherent particle physics models that describe both the phenomenology at colliders in the laboratory and the observations in the universe. Among these observations, the existence of an inflationary phase in the early universe provides guidance for particle physics models. We study a supersymmetric model that successfully incorporates inflation through a non-minimal coupling to supergravity and shows a unique collider phenomenology. Motivated by experimental data, we place special emphasis on a new singlet-like state at 97 GeV and single out possible observables for a future linear collider that permit a distinction of the model from a similar scenario without inflation. We define a benchmark scenario that agrees with current collider and Dark Matter constraints, and study the influence of the non-minimal coupling on the phenomenology. Measuring the singlet-like state with high precision at the percent level seems promising for distinguishing the models, even though the Standard Model-like Higgs couplings deviate only marginally. Moreover, a hypothetical singlet-like state at 97 GeV with couplings of about 20% of those of a Standard Model Higgs encourages further studies of such footprint scenarios of inflation.


2018 ◽  
Vol 33 (20) ◽  
pp. 1830017 ◽  
Author(s):  
Pran Nath

We give here an overview of recent developments in high energy physics and cosmology and their interconnections that relate to unification, and discuss prospects for the future. Currently three pieces of empirical data point to supersymmetry as an underlying symmetry of particle physics: the unification of gauge couplings within supersymmetry, the fact that nature respects the supersymmetry prediction that the Higgs boson mass lies below 130 GeV, and vacuum stability up to the Planck scale with a Higgs boson mass of ~125 GeV, which the Standard Model does not provide. Coupled with the fact that supersymmetry solves the big hierarchy problem related to the quadratic divergence in the Higgs boson mass squared, and the fact that there is no alternative paradigm that allows us to extrapolate physics from the electroweak scale to the grand unification scale consistent with experiment, supersymmetry remains a compelling framework for new physics beyond the Standard Model. The large loop correction needed in supersymmetry to lift the tree-level Higgs boson mass to the experimentally observed value indicates a larger scale of weak scale supersymmetry, making the observation of sparticles more challenging, but the lightest ones remain within reach at the LHC. Recent analyses show that a high energy LHC (HE-LHC) operating at 27 TeV and running at its optimal luminosity of [Formula: see text] can reduce the discovery period by several years relative to the HL-LHC and significantly extend the reach in the parameter space of models. In the coming years several experiments related to neutrino physics, supersymmetry searches, dark matter, and dark energy will have a direct impact on the unification frontier. Thus the discovery of sparticles would establish supersymmetry as a fundamental symmetry of nature and also lend direct support to strings. 
Further, the discovery of sparticles in association with missing energy would constitute a discovery of dark matter, with the LSP being the dark matter. On the cosmology front, a more accurate measurement of the dark energy equation of state, w, will shed light on the nature of dark energy. Specifically, w > −1 would likely indicate the existence of a dynamical field, possibly quintessence, responsible for dark energy, while w < −1 would indicate an entirely new sector of physics. Further, more precise measurements of the ratio r of the tensor to scalar power spectrum, of the scalar and tensor spectral indices n_s and n_t, and of non-Gaussianity will hopefully allow us to realize a Standard Model of inflation. These results will guide further model building that incorporates the unification of particle physics and cosmology.



2000 ◽  
Vol 15 (16) ◽  
pp. 2347-2353
Author(s):  
CLEMENS A. HEUSCH

It has become a natural mandate for the particle physics community to look beyond presently active and approved high-energy accelerators for precision work at the edge of, and beyond, the Standard Model. We stress the case for the complementary use of both charge modes of the electron collider, e+e− and e−e−. Choosing a few illustrative examples, we attempt to set the stage for the technical developments needed to define and execute the key experiments using two incoming electron beams.



2020 ◽  
Vol 245 ◽  
pp. 08026
Author(s):  
Leonid Serkin

The ATLAS Collaboration is releasing a new set of proton–proton collision data to the public for educational purposes. The data was collected by the ATLAS detector at the Large Hadron Collider at a centre-of-mass energy √s = 13 TeV during the year 2016 and corresponds to an integrated luminosity of 10 fb−1. This dataset is accompanied by simulated events describing several Standard Model processes, as well as hypothetical Beyond Standard Model signal processes. Associated computing tools are provided to make the analysis of the dataset easily accessible. In the following, we summarise the properties of the 13 TeV ATLAS Open Data set and the available analysis tools. Several examples intended as a starting point for further analysis work by users are shown. The general aim of the dataset and tools released is to provide user-friendly and straightforward interactive interfaces to replicate the procedures used by high-energy-physics researchers and enable users to experience the analysis of particle-physics data in educational environments.
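The notebooks accompanying such open datasets typically walk a user through the same event-loop pattern: apply a selection cut, compute a kinematic quantity, and fill a histogram. A minimal self-contained sketch of that pattern is shown below; it uses synthetic dimuon events rather than the actual Open Data files, and the event layout and cut values are illustrative assumptions, not taken from the ATLAS release.

```python
import math
import random

def invariant_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of two (approximately massless) particles
    from their transverse momentum, pseudorapidity and azimuth."""
    return math.sqrt(2 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

# Synthetic "events": each holds two muon candidates as (pt [GeV], eta, phi).
random.seed(1)
events = [
    {"mu1": (random.uniform(20, 60), random.uniform(-2.5, 2.5), random.uniform(-math.pi, math.pi)),
     "mu2": (random.uniform(20, 60), random.uniform(-2.5, 2.5), random.uniform(-math.pi, math.pi))}
    for _ in range(1000)
]

# Selection cut: both muons above a pt threshold, as in a typical tutorial.
selected = [e for e in events if e["mu1"][0] > 25 and e["mu2"][0] > 25]

# Fill a coarse histogram of the dimuon invariant mass (0-200 GeV, 20 bins).
hist = [0] * 20
for e in selected:
    m = invariant_mass(*e["mu1"], *e["mu2"])
    b = int(m // 10)
    if 0 <= b < 20:
        hist[b] += 1

print(len(selected), sum(hist))
```

In the real Open Data tutorials the event loop reads ROOT files with the provided tools instead of a Python list, but the cut-then-histogram structure is the same.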



2006 ◽  
Vol 84 (6-7) ◽  
pp. 419-435 ◽  
Author(s):  
D Scott

The Standard Model of Particle Physics (SMPP) is an enormously successful description of high-energy physics, driving ever more precise measurements to find "physics beyond the standard model", as well as providing motivation for developing more fundamental ideas that might explain the values of its parameters. Simultaneously, a description of the entire three-dimensional structure of the present-day Universe is being built up painstakingly. Most of the structure is stochastic in nature, being merely the result of the particular realization of the "initial conditions" within our observable Universe patch. However, governing this structure is the Standard Model of Cosmology (SMC), which appears to require only about a dozen parameters. Cosmologists are now determining the values of these quantities with increasing precision to search for "physics beyond the standard model", as well as trying to develop an understanding of the more fundamental ideas that might explain the values of its parameters. Although it is natural to see analogies between the two Standard Models, some intrinsic differences also exist, which are discussed here. Nevertheless, a truly fundamental theory will have to explain both the SMPP and SMC, and this must include an appreciation of which elements are deterministic and which are accidental. Considering different levels of stochasticity within cosmology may make it easier to accept that physical parameters in general might have a nondeterministic aspect. PACS Nos.: 98.80.–k, 98.80.Bp, 98.80.Es, 12.60.–i



2004 ◽  
Vol 19 (02) ◽  
pp. 179-204 ◽  
Author(s):  
I. HINCHLIFFE ◽  
N. KERSTING ◽  
Y. L. MA

We present a pedagogical review of particle physics models that are based on the noncommutativity of space–time, [x^μ, x^ν] = iθ^{μν}, with specific attention to the phenomenology these models predict in particle experiments either in existence or under development. We summarize results obtained for high energy scattering such as would occur, for example, in a future e+e− linear collider with [Formula: see text], as well as low energy experiments such as those pertaining to elementary electric dipole moments and other CP-violating observables, and finally comment on the status of phenomenological work in cosmology and extra dimensions.



2018 ◽  
Vol 68 (1) ◽  
pp. 291-312 ◽  
Author(s):  
Celine Degrande ◽  
Valentin Hirschi ◽  
Olivier Mattelaer

The automation of one-loop amplitudes plays a key role in addressing several computational challenges for hadron collider phenomenology: They are needed for simulations including next-to-leading-order corrections, which can be large at hadron colliders. They also allow the exact computation of loop-induced processes. A high degree of automation has now been achieved in public codes that do not require expert knowledge and can be widely used in the high-energy physics community. In this article, we review many of the methods and tools used for the different steps of automated one-loop amplitude calculations: renormalization of the Lagrangian, derivation and evaluation of the amplitude, its decomposition onto a basis of scalar integrals and their subsequent evaluation, as well as computation of the rational terms.



2003 ◽  
Vol 14 (09) ◽  
pp. 1273-1278 ◽  
Author(s):  
MICHAEL KLASEN

The Feynman diagram generator FeynArts and the computer algebra program FormCalc allow for the automatic computation of 2→2 and 2→3 scattering processes in High Energy Physics. We have extended this package by four new kinematical routines and adapted one existing routine in order to accommodate also two- and three-body decays of massive particles. This makes it possible to compute automatically two- and three-body particle decay widths and decay energy distributions, as well as resonant particle production, within the Standard Model and the Minimal Supersymmetric Standard Model at tree and loop level. The use of the program is illustrated with three standard examples: [Formula: see text], [Formula: see text], and [Formula: see text].



2021 ◽  
Vol 9 ◽  
Author(s):  
N. Demaria

The High Luminosity Large Hadron Collider (HL-LHC) at CERN will constitute a new frontier for particle physics after the year 2027. Experiments will undergo a major upgrade in order to meet this challenge: the use of innovative sensors and electronics will play a central role in this. This paper describes recent developments in 65 nm CMOS technology for readout ASIC chips in future High Energy Physics (HEP) experiments. These allow unprecedented performance in terms of speed, noise, power consumption and granularity of the tracking detectors.



10.14311/1718 ◽  
2013 ◽  
Vol 53 (1) ◽  
Author(s):  
Aleksander Filip Żarnecki ◽  
Lech Wiktor Piotrowski ◽  
Lech Mankiewicz ◽  
Sebastian Małek

The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in the International Linear Collider (ILC), a new High Energy Physics (HEP) project. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited to the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and its output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step by step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
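The processor-chain idea described above can be sketched in a few lines of Python. The processor names and the dictionary-based event store here are illustrative inventions, not the actual Luiza/Marlin API; they only show how each module reads from and appends to one shared collection.

```python
class Processor:
    """One step of the analysis chain: reads collections from the shared
    event store and adds its own output back to the same store."""
    def process(self, event):
        raise NotImplementedError

class PedestalSubtract(Processor):
    """Hypothetical first step: subtract a fixed pedestal from raw pixels."""
    def process(self, event):
        pedestal = 100
        event["calibrated"] = [x - pedestal for x in event["raw_pixels"]]

class LightCurvePoint(Processor):
    """Hypothetical second step: reduce the calibrated image to one flux value."""
    def process(self, event):
        # Each processor only sees the store, so the chain can be cut or
        # extended at any step and every intermediate result is preserved.
        event["flux"] = sum(event["calibrated"])

def run_chain(processors, event):
    for p in processors:
        p.process(event)
    return event

event = {"raw_pixels": [103, 110, 98, 105]}
run_chain([PedestalSubtract(), LightCurvePoint()], event)
print(event["flux"])  # -> 16
```

Because the output of each step is kept alongside the input, the chain can be stopped after any processor and the partial result inspected or fed to a different downstream module, which is the property the abstract highlights.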



2019 ◽  
Vol 214 ◽  
pp. 02019
Author(s):  
V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments played a major role for them to produce physics measurements of unprecedented quality and precision with faster turnaround, from data taking to journal submission, than any previous hadron collider experiment. The material presented here contains highlights of a recent review on the impact of detector simulation in particle physics collider experiments published in Ref. [1]. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment is also presented. A discussion on future detector simulation needs, challenges and potential solutions to address them is included at the end.


