2015 ◽  
Vol 13 (4) ◽  
pp. 511-521 ◽  
Author(s):  
M. Battistin ◽  
S. Berry ◽  
A. Bitadze ◽  
P. Bonneau ◽  
J. Botelho-Direito ◽  
...  

Abstract The silicon tracker of the ATLAS experiment at the CERN Large Hadron Collider will operate around –15°C to minimize the effects of radiation damage. The present cooling system is based on a conventional evaporative circuit, removing around 60 kW of heat dissipated by the silicon sensors and their local electronics. The compressors in the present circuit have proved less reliable than originally hoped, and will be replaced with a thermosiphon. The working principle of the thermosiphon uses gravity to circulate the coolant without any mechanical components (compressors or pumps) in the primary coolant circuit. The fluorocarbon coolant will be condensed at a temperature and pressure lower than those in the on-detector evaporators, but at a higher altitude, taking advantage of the 92 m height difference between the underground experiment and the services located on the surface. An extensive campaign of tests, detailed in this paper, was performed using two small-scale thermosiphon systems. These tests confirmed the design specifications of the full-scale plant and demonstrated operation over the temperature range required for ATLAS. During the testing phase the system demonstrated unattended, stable long-term running over a period of several weeks. The commissioning of the full-scale thermosiphon is ongoing, with full operation planned for late 2015.
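The driving pressure available to such a thermosiphon can be estimated from the hydrostatic head of the liquid column. A minimal sketch, assuming a typical liquid density for a C3F8 fluorocarbon coolant (the density value is an assumption for illustration, not a figure from the paper):

```python
# Rough estimate of the pressure head provided by the 92 m liquid column
# in a gravity-driven thermosiphon. The coolant density below is an
# assumed textbook value for liquid C3F8, not a number from the paper.

RHO_COOLANT = 1350.0  # kg/m^3, assumed liquid density of the fluorocarbon
G = 9.81              # m/s^2, gravitational acceleration
HEIGHT = 92.0         # m, surface-to-cavern height difference (from the abstract)

delta_p_pa = RHO_COOLANT * G * HEIGHT  # hydrostatic pressure gain, Pa
delta_p_bar = delta_p_pa / 1e5         # convert Pa to bar

print(f"Hydrostatic head: {delta_p_bar:.1f} bar")
```

Under these assumptions the liquid column alone supplies on the order of 12 bar, which is what lets the condenser sit at a lower pressure than the on-detector evaporators while still feeding them.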


2021 ◽  
Vol 251 ◽  
pp. 04019
Author(s):  
Andrei Kazarov ◽  
Adrian Chitan ◽  
Andrei Kazymov ◽  
Alina Corso-Radu ◽  
Igor Aleksandrov ◽  
...  

The ATLAS experiment at the Large Hadron Collider (LHC) operated very successfully in the years 2008 to 2018, in two periods identified as Run 1 and Run 2. ATLAS achieved an overall data-taking efficiency of 94%, largely constrained by the irreducible dead-time introduced to accommodate the limitations of the detector read-out electronics. Of the 6% dead-time, only about 15% could be attributed to the central trigger and DAQ system, and of that, a negligible fraction was due to the Control and Configuration subsystem. Despite these achievements, and in order to further improve the already excellent efficiency of the whole DAQ system in the coming Run 3, a new campaign of software updates was launched for the second long LHC shutdown (LS2). This paper presents, using a few selected examples, how the work was approached and which new technologies were introduced into the ATLAS Control and Configuration software. Although these are specific to this system, many of the solutions can be considered and adapted for other distributed DAQ systems.
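The quoted percentages compose multiplicatively, which is easy to miscount; a back-of-the-envelope check of the figures in the abstract:

```python
# Breakdown of the dead-time figures quoted in the abstract:
# 94% overall efficiency -> 6% dead-time, of which ~15% is attributable
# to the central trigger and DAQ system.

efficiency = 0.94
dead_time = 1.0 - efficiency     # 6% overall dead-time
ctp_daq_share = 0.15             # share attributed to central trigger/DAQ

ctp_daq_fraction = dead_time * ctp_daq_share  # fraction of total running time

print(f"Trigger/DAQ dead-time: {ctp_daq_fraction:.1%} of total running time")
```

So the central trigger and DAQ system accounted for roughly 0.9% of total running time, with the Control and Configuration subsystem a negligible part of even that.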


2005 ◽  
Vol 20 (16) ◽  
pp. 3871-3873 ◽  
Author(s):  
DAVID MALON

Each new generation of collider experiments confronts the challenge of delivering an event store having at least the performance and functionality of current-generation stores, in the presence of an order of magnitude more data and new computing paradigms (object orientation just a few years ago; grid and service-based computing today). The ATLAS experiment at the Large Hadron Collider, for example, will produce 1.6-megabyte events at 200 Hz, an annual raw data volume of 3.2 petabytes. With derived and simulated data, the total volume may approach 10 petabytes per year. Scale, however, is not the only challenge. In the Large Hadron Collider (LHC) experiments, the preponderance of computing power will come from outside the host laboratory. More significantly, no single site will host a complete copy of the event store: data will be distributed, not simply replicated for convenience, and many physics analyses will routinely require distributed (grid) computing. This paper uses the emerging ATLAS computing model to provide a glimpse of how next-generation event stores are taking shape, touching on key issues in navigation, distribution, scale, coherence, data models and representation, metadata infrastructure, and the role(s) of databases in event store management.
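The quoted rates and the annual volume are mutually consistent if one assumes roughly 10^7 live seconds of data-taking per year, a standard rule of thumb in high-energy physics (the live-time figure is an assumption, not stated in the abstract):

```python
# Consistency check of the quoted numbers: 1.6 MB events at 200 Hz,
# assuming ~1e7 live seconds per year (a common HEP rule of thumb,
# not a figure given in the abstract).

event_size_mb = 1.6
rate_hz = 200
live_seconds = 1e7  # assumed annual live time

throughput_mb_s = event_size_mb * rate_hz           # raw throughput, MB/s
annual_pb = throughput_mb_s * live_seconds / 1e9    # MB -> PB per year

print(f"{throughput_mb_s:.0f} MB/s -> {annual_pb:.1f} PB/year")
```

This reproduces the abstract's 3.2 PB of raw data per year from a sustained 320 MB/s stream.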


2013 ◽  
Vol 22 (07) ◽  
pp. 1330015
Author(s):  
DOMIZIA ORESTANO

This document presents a brief overview of some of the experimental techniques employed by the ATLAS experiment at the CERN Large Hadron Collider (LHC) in the search for the Higgs boson predicted by the standard model (SM) of particle physics. The data and the statistical analyses that, in July 2012, only a few days before this presentation at the Marcel Grossmann Meeting, allowed the observation of a new particle to be firmly established are described. The additional studies needed to check the consistency between the newly discovered particle and the Higgs boson are also discussed.
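"Firmly established" conventionally means a local significance of at least five standard deviations in particle physics (the 5σ threshold is the field's convention, not a number taken from this abstract). The corresponding one-sided p-value can be computed directly:

```python
# Convert a Gaussian significance (in sigma) to a one-sided p-value.
# The 5-sigma discovery threshold is particle-physics convention,
# not a figure stated in the abstract.
import math

def one_sided_p_value(n_sigma: float) -> float:
    """One-sided Gaussian tail probability for a given significance."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

print(f"5 sigma -> p = {one_sided_p_value(5.0):.2e}")
```

A 5σ excess corresponds to a local p-value of about 2.9e-7, i.e. odds of roughly one in 3.5 million that a background fluctuation alone produces a signal at least that strong.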

