The Heavy Photon Search (HPS) Software Environment

2020 ◽  
Vol 245 ◽  
pp. 02018
Author(s):  
Norman Graf

The Heavy Photon Search (HPS) is an experiment at the Thomas Jefferson National Accelerator Facility (JLab) designed to search for a hidden-sector photon (A′) in fixed-target electro-production. It uses a silicon microstrip tracking and vertexing detector placed inside a dipole magnet to measure charged-particle trajectories, and a fast lead-tungstate crystal calorimeter located just downstream of the magnet to provide a trigger and to identify electromagnetic showers. The HPS experiment uses both invariant-mass and secondary-vertex signatures to search for the A′. The experimental collaboration is small and quite heterogeneous: it is composed of members of both the nuclear physics and particle physics communities, from universities and national labs across the US and Europe. Enabling such a disparate group to concentrate on the physics aspects of the experiment required that the software be easy to install and use, and the limited manpower meant that existing solutions had to be exploited. HPS has successfully completed two engineering runs and its first physics run in the summer of 2019. We begin with an overview of the physics goals of the experiment, followed by a short description of the detector design. We then describe the software tools used to design the detector layout and simulate the expected detector performance. Event reconstruction, involving track, cluster and vertex finding and fitting for both simulated and real data, was, to first order, adopted from existing software originally developed for Linear Collider studies. Bringing it all together into a cohesive whole involved the use of multiple software solutions with common interfaces.

2021 ◽  
Vol 16 (6) ◽  
Author(s):  
Daniele P. Anderle ◽  
Valerio Bertone ◽  
Xu Cao ◽  
Lei Chang ◽  
Ningbo Chang ◽  
...  

Abstract: Lepton scattering is an established, ideal tool for studying the inner structure of small particles such as nucleons and nuclei. As a future high-energy nuclear physics project, an Electron-ion collider in China (EicC) has been proposed. It will be constructed on the basis of an upgraded heavy-ion accelerator, the High Intensity heavy-ion Accelerator Facility (HIAF), which is currently under construction, together with a new electron ring. The proposed collider will provide highly polarized electrons (with a polarization of ∼80%) and protons (with a polarization of ∼70%) at variable center-of-mass energies from 15 to 20 GeV and a luminosity of (2–3) × 10³³ cm⁻²·s⁻¹. Polarized deuterons and helium-3, as well as unpolarized ion beams from carbon to uranium, will also be available at the EicC. The main foci of the EicC will be precision measurements of the structure of the nucleon in the sea-quark region, including 3D tomography of the nucleon; the partonic structure of nuclei and the interaction of partons with the nuclear environment; and exotic states, especially those with heavy-flavor quark content. In addition, issues fundamental to understanding the origin of mass could be addressed by measurements of near-threshold heavy-quarkonium production at the EicC. In order to achieve these physics goals, a hermetic detector system will be constructed with cutting-edge technologies. This document is the result of collective contributions and valuable input from experts across the globe. The EicC physics program complements the ongoing scientific programs at the Jefferson Laboratory and the future EIC project in the United States. The success of this project will also advance both nuclear and particle physics as well as accelerator and detector technology in China.


2006 ◽  
Vol 05 (02) ◽  
pp. E
Author(s):  
Nico Pitrelli

The American particle physics community is in jeopardy and may end up drowning in a boundless sea trying to grasp at non-existent funds, dragging US physics and science as a whole to the bottom. This is a price the most powerful and high-tech country in the world cannot afford, as warned by the editors of a report published in late April by the National Academy of Sciences. Behind so much alarm is the International Linear Collider (ILC), a large particle accelerator facility which, according to the report, should be built on American territory if research on the elementary constituents of nature is to survive in the United States. The ILC will probably cost a total of five hundred million dollars in the first five years, whereas billions will have to be invested in the subsequent seven years. Hardly impressive, however, if compared with the Superconducting Super Collider (SSC), the biggest and costliest machine ever conceived in the history of science. Devised to describe the first instants of the universe, as many will recall, the SSC project was severely hampered by political and bureaucratic wrangling in 1993, when the Clinton administration decided to halt work on the accelerator, after ten years and approximately two billion dollars already spent.


Universe ◽  
2021 ◽  
Vol 7 (3) ◽  
pp. 72
Author(s):  
Clementina Agodi ◽  
Antonio D. Russo ◽  
Luciano Calabretta ◽  
Grazia D’Agostino ◽  
Francesco Cappuzzello ◽  
...  

The search for neutrinoless double-beta (0νββ) decay is currently a key topic in physics, due to its possible wide implications for nuclear physics, particle physics, and cosmology. The NUMEN project aims to provide experimental information on the nuclear matrix elements (NMEs) that enter the expression for the 0νββ decay half-life by measuring the cross sections of nuclear double-charge exchange (DCE) reactions. NUMEN has already demonstrated the feasibility of measuring these tiny cross sections for some nuclei of interest for 0νββ decay using the superconducting cyclotron (CS) and the MAGNEX spectrometer at the Laboratori Nazionali del Sud (LNS), Catania, Italy. However, since the DCE cross sections are very small and need to be measured with high sensitivity, the systematic exploration of all nuclei of interest requires a major upgrade of the facility. R&D for the technological tools has been completed. The realization of new radiation-tolerant detectors capable of sustaining high rates while preserving the requested resolution and sensitivity is underway, as is the upgrade of the CS to deliver beams of higher intensity. Strategies to carry out DCE cross-section measurements with high-intensity beams were developed in order to achieve the challenging sensitivity required to provide experimental constraints on the 0νββ NMEs.
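To see why higher-intensity beams are essential for measuring such tiny cross sections, a back-of-the-envelope counting estimate helps. The sketch below uses the standard fixed-target rate formula N = σ · (ions/s) · (target atoms/cm²) · t; all numerical inputs are purely illustrative placeholders, not actual NUMEN parameters.

```python
# Back-of-the-envelope event-count estimate for a fixed-target
# measurement of a rare reaction. All numbers are illustrative
# placeholders, not actual NUMEN running conditions.

AVOGADRO = 6.022e23          # atoms per mole
E_CHARGE = 1.602e-19         # coulombs per elementary charge

def expected_counts(sigma_nb, beam_nA, charge_state, target_mg_cm2,
                    target_mass_amu, days):
    """Expected events: N = sigma * (ions/s) * (atoms/cm^2) * time."""
    sigma_cm2 = sigma_nb * 1e-33                        # 1 nb = 1e-33 cm^2
    ions_per_s = beam_nA * 1e-9 / (charge_state * E_CHARGE)  # electrical nA
    atoms_per_cm2 = target_mg_cm2 * 1e-3 * AVOGADRO / target_mass_amu
    return sigma_cm2 * ions_per_s * atoms_per_cm2 * days * 86400.0

# Hypothetical run: a 10 nb cross section, 100 electrical nA of a q=8+
# beam, a 0.5 mg/cm^2 target of mass ~116 amu, 30 days of beam time.
n = expected_counts(sigma_nb=10.0, beam_nA=100.0, charge_state=8,
                    target_mg_cm2=0.5, target_mass_amu=116.0, days=30)
print(f"expected counts: {n:.0f}")
```

With these placeholder inputs the estimate lands at a few thousand counts per month; for cross sections in the nb range and below, the linear dependence on beam intensity makes the accelerator upgrade the deciding factor.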


2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Erik Buhmann ◽  
Sascha Diefenbacher ◽  
Engin Eren ◽  
Frank Gaede ◽  
Gregor Kasieczka ◽  
...  

Abstract: Accurate simulation of physical processes is crucial for the success of modern particle physics. However, simulating the development and interaction of particle showers with calorimeter detectors is a time-consuming process and drives the computing needs of large experiments at the LHC and future colliders. Recently, generative machine learning models based on deep neural networks have shown promise in speeding up this task by several orders of magnitude. We investigate the use of a new architecture, the Bounded Information Bottleneck Autoencoder, for modelling electromagnetic showers in the central region of the Silicon-Tungsten calorimeter of the proposed International Large Detector. Combined with a novel second post-processing network, this approach achieves an accurate simulation of differential distributions, including, for the first time, the shape of the minimum-ionizing-particle peak, when compared to a full Geant4 simulation for a high-granularity calorimeter with 27k simulated channels. The results are validated by comparison with established architectures. Our results further strengthen the case for using generative networks for fast simulation and demonstrate that physically relevant differential distributions can be described with high accuracy.


2004 ◽  
Vol 19 (02) ◽  
pp. 179-204 ◽  
Author(s):  
I. HINCHLIFFE ◽  
N. KERSTING ◽  
Y. L. MA

We present a pedagogical review of particle physics models that are based on the noncommutativity of space–time, [x^μ, x^ν] = iθ^μν, with specific attention to the phenomenology these models predict in particle experiments either in existence or under development. We summarize results obtained for high-energy scattering, such as would occur, for example, in a future e⁺e⁻ linear collider with [Formula: see text], as well as for low-energy experiments such as those pertaining to elementary electric dipole moments and other CP-violating observables, and finally comment on the status of phenomenological work in cosmology and extra dimensions.


2020 ◽  
pp. 171-254
Author(s):  
Hermann Kolanoski ◽  
Norbert Wermes

Detectors that record charged particles through their ionisation of gases are found in many experiments of nuclear and particle physics. By converting the charges created along a track into electrical signals, these detectors can measure particle trajectories in large volumes, also inside magnetic fields. The operation principles of gaseous detectors are explained, including charge generation, gas amplification, operation modes and gas mixtures. Different detector types are described in some detail, starting with ionisation chambers without gas amplification, proceeding to those with gas amplification such as spark and streamer chambers, parallel-plate arrangements, multi-wire proportional chambers, chambers with microstructured electrodes and drift chambers, and ending with time-projection chambers. The chapter closes with an overview of aging effects in gaseous detectors, which degrade the detector performance over time.


2012 ◽  
Vol 2012 ◽  
pp. 1-38 ◽  
Author(s):  
Andrea Giuliani ◽  
Alfredo Poves

This paper introduces the neutrinoless double-beta decay (the rarest nuclear weak process) and describes the status of the search for this transition, both from the point of view of theoretical nuclear physics and in terms of the present and future experimental scenarios. Implications of this phenomenon for crucial aspects of particle physics are briefly discussed. The calculations of the nuclear matrix elements in the case of the mass mechanism are reviewed, and a range for these quantities is proposed for the most appealing candidates. After introducing general experimental concepts—such as the choice of the best candidates, the different proposed technological approaches, and the sensitivity—we assess the experimental situation. Searches running or in preparation are described in a coherent presentation that highlights their similarities and differences. A critical comparison of the adopted technologies and of their physics reach (in terms of sensitivity to the effective Majorana neutrino mass) is performed. As a conclusion, we try to envisage what we expect around the corner and on a longer time scale.
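For the mass mechanism, the half-life and the effective Majorana mass are related by (T₁/₂⁰ⁿᵘ)⁻¹ = G⁰ⁿᵘ |M⁰ⁿᵘ|² (⟨m_ββ⟩/mₑ)², which is why the NME uncertainty propagates directly into the mass sensitivity. A minimal sketch of this relation and its inversion, with purely illustrative placeholder values for the phase-space factor G and matrix element M (the actual values are isotope-dependent and are precisely what the calculations reviewed here aim to pin down):

```python
# Half-life <-> effective Majorana mass for the 0nubb mass mechanism:
#   1 / T_half = G0nu * |M0nu|^2 * (m_bb / m_e)^2
# The G0nu and M0nu values used below are illustrative placeholders.

M_E_EV = 510998.95  # electron mass in eV

def half_life_years(G0nu_per_yr, M0nu, m_bb_eV):
    """Half-life (yr) from phase-space factor G (1/yr), dimensionless
    nuclear matrix element M, and effective Majorana mass (eV)."""
    return 1.0 / (G0nu_per_yr * M0nu**2 * (m_bb_eV / M_E_EV) ** 2)

def mass_from_half_life(G0nu_per_yr, M0nu, T_half_yr):
    """Inverse relation: effective mass (eV) probed by a half-life (yr)."""
    return M_E_EV / (M0nu * (G0nu_per_yr * T_half_yr) ** 0.5)

# Round-trip consistency check with placeholder inputs:
G, M = 1.0e-14, 3.0
T = half_life_years(G, M, m_bb_eV=0.05)
print(f"T = {T:.3e} yr, m_bb back = {mass_from_half_life(G, M, T):.3f} eV")
```

Because ⟨m_ββ⟩ enters quadratically, a factor-of-two spread in the NME translates into a factor-of-two spread in the extracted mass, which is what makes the comparison of calculations in this review experimentally relevant.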


Physics ◽  
2019 ◽  
Vol 1 (3) ◽  
pp. 375-391 ◽  
Author(s):  
Robin Smith ◽  
Jack Bishop

We present an open-source kinematic fitting routine designed for low-energy nuclear physics applications. Although kinematic fitting is commonly used in high-energy particle physics, it is rarely used in low-energy nuclear physics, despite its effectiveness. FORTRAN and ROOT C++ versions of the FUNKI_FIT kinematic fitting code have been developed and published open-access. The FUNKI_FIT code is universal in the sense that the constraint equations can be easily modified to suit different experimental set-ups and reactions. Two case studies for the use of this code, utilising experimental and Monte Carlo data, are presented: (1) charged-particle spectroscopy using silicon-strip detectors; (2) charged-particle spectroscopy using active-target detectors. The kinematic fitting routine provides an improvement in resolution in both cases, demonstrating, for the first time, the applicability of kinematic fitting across a range of nuclear physics applications. The ROOT macro has been developed so that this technique can be easily applied in the standard data-analysis routines used by the nuclear physics community.
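The core of any kinematic fit is a constrained least-squares adjustment: measured quantities y with covariance V are pulled to fitted values η that satisfy the constraints f(η) = 0 exactly, minimizing χ² = (y − η)ᵀ V⁻¹ (y − η). The sketch below shows the generic Lagrange-multiplier solution for a single linear constraint; it is not the FUNKI_FIT code itself, and the beam-energy example is hypothetical.

```python
import numpy as np

def kinematic_fit(y, V, D, d):
    """Constrained least-squares fit for linear constraints D @ eta + d = 0.

    y : measured values,  V : their covariance matrix,
    D : constraint Jacobian,  d : offsets so that f(eta) = D @ eta + d.
    Returns (eta, chi2), where eta satisfies the constraints exactly.
    """
    y = np.asarray(y, dtype=float)
    r = D @ y + d                   # constraint residual at the measurement
    S = D @ V @ D.T                 # covariance of that residual
    lam = np.linalg.solve(S, r)     # Lagrange multipliers
    eta = y - V @ D.T @ lam         # adjusted (fitted) values
    chi2 = float(r @ lam)           # chi^2 of the adjustment
    return eta, chi2

# Hypothetical example: two measured particle energies (GeV) constrained
# to sum to a 10 GeV beam energy, i.e. f(eta) = eta1 + eta2 - 10 = 0.
y = np.array([4.8, 5.4])
V = np.diag([0.2**2, 0.3**2])       # 200 and 300 MeV resolutions
D = np.array([[1.0, 1.0]])
d = np.array([-10.0])
eta, chi2 = kinematic_fit(y, V, D, d)
print(eta, chi2)   # eta sums to 10; the better-measured energy moves less
```

For nonlinear constraints (e.g. an invariant-mass condition), the same step is iterated, re-evaluating D and d at the current η until f(η) converges to zero; the chi2 value then serves as a goodness-of-fit test for rejecting mis-reconstructed events.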

