A high-speed DAQ framework for future high-level trigger and event building clusters

2017 ◽  
Vol 12 (03) ◽  
pp. C03015-C03015 ◽  
Author(s):  
M. Caselle ◽  
L.E. Ardila Perez ◽  
M. Balzer ◽  
T. Dritschler ◽  
A. Kopmann ◽  
...  
2019 ◽  
Vol 214 ◽  
pp. 08007 ◽  
Author(s):  
Edoardo Martelli ◽  
Loïc Brarda ◽  
Luis Granado Cardoso ◽  
Marc Collignon ◽  
Niko Neufeld ◽  
...  

In 2017 the CERN IT department and the major LHC experiments reviewed the possibility of building a common High Level Trigger facility, in order to share costs and use computing resources more efficiently. This paper describes a Proof of Concept built by the CERN IT-CS group and the LHCb online and offline teams to demonstrate the feasibility of the proposal: the dedicated high-speed connection between the LHCb detector and the IT data centre, the setup of the servers and how they were used for offline simulation and online data taking, the different technologies considered and used, and the challenges encountered and how they were tackled. The Proof of Concept exceeded expectations, and the remote servers were used for real data taking for more than two months during the summer of 2018.


2019 ◽  
Vol 214 ◽  
pp. 05010 ◽  
Author(s):  
Giulio Eulisse ◽  
Piotr Konopka ◽  
Mikolaj Krzewicki ◽  
Matthias Richter ◽  
David Rohr ◽  
...  

ALICE is one of the four major LHC experiments at CERN. When the accelerator enters the Run 3 data-taking period, starting in 2021, ALICE expects almost 100 times more central Pb-Pb collisions than now, resulting in a large increase of data throughput. In order to cope with this new challenge, the collaboration had to extensively rethink the whole data processing chain, with a tighter integration between the Online and Offline computing worlds. Such a system, code-named ALICE O2, is being developed in collaboration with the FAIR experiments at GSI. It is based on the ALFA framework, which provides a generalized implementation of the ALICE High Level Trigger approach, designed around distributed software entities coordinating and communicating via message passing. We will highlight our efforts to integrate ALFA within the ALICE O2 environment. We analyze the challenges arising from the different running environments for production and development, and derive requirements for a flexible and modular software framework. In particular, we will present the ALICE O2 Data Processing Layer, which deals with ALICE-specific requirements in terms of the data model. The main goal is to reduce the complexity of developing algorithms and managing a distributed system, thereby significantly simplifying the work of the large majority of ALICE users.
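The pattern described above, independent processing entities coordinating via message passing, can be illustrated with a minimal sketch. This is not the actual ALFA/FairMQ API; it is a generic Python illustration of chained "devices" connected by FIFO message queues, with an end-of-stream marker for shutdown:

```python
# Hedged illustration (not the ALFA/FairMQ API): independent processing
# "devices" connected by message queues, mirroring the general pattern of
# distributed software entities communicating via message passing.
import queue
import threading


def device(inbox, outbox, transform):
    """A processing entity: consume messages, transform them, forward them."""
    while True:
        msg = inbox.get()
        if msg is None:          # end-of-stream marker: propagate and stop
            outbox.put(None)
            return
        outbox.put(transform(msg))


def run_pipeline(data, stages):
    """Wire devices into a chain and push data through it."""
    queues = [queue.Queue() for _ in range(len(stages) + 1)]
    threads = [
        threading.Thread(target=device, args=(queues[i], queues[i + 1], fn))
        for i, fn in enumerate(stages)
    ]
    for t in threads:
        t.start()
    for item in data:
        queues[0].put(item)
    queues[0].put(None)          # signal end of stream
    results = []
    while True:
        out = queues[-1].get()
        if out is None:
            break
        results.append(out)
    for t in threads:
        t.join()
    return results
```

In a real deployment the queues would be inter-process message channels and each device a separate process, which is what allows the same topology to run on a laptop during development and on a large cluster in production.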


2009 ◽  
Author(s):  
R. Covarelli ◽  
Marvin L. Marshak

2021 ◽  
Author(s):  
Malene Hovgaard Vested ◽  
Erik Damgaard Christensen

Abstract The forces on marine and offshore structures are often affected by spilling breakers. A spilling breaker is characterized by a roller of mixed air and water with a forward speed approximately equal to the wave celerity. This high speed at the top of the wave has the potential to induce high wave loads on the upper parts of structures. This study analyzed the effect of the air content on the forces, using the Morison equation to examine how the percentage of air affects them. An experimental set-up was developed in which air was injected into an otherwise calm water body. The air injection introduced a high level of turbulence. It was possible to assess the air content in the water for different rates of air injection. In the mixture of air and water, the force on an oscillating square cylinder was measured for different levels of air content, including the case without air. The measurements indicated that force coefficients for clear water can be used in the Morison equation as long as the density of water is replaced by the density of the air-water mixture.
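The finding above, that clear-water force coefficients remain valid if the density is replaced by the mixture density, can be sketched with the standard Morison equation (drag plus inertia term). The coefficient values and dimensions below are illustrative assumptions, not values fitted to the experiment:

```python
# Hedged sketch: Morison-type inline force per unit length on a square
# cylinder, with the water density replaced by the bulk density of the
# air-water mixture. Cd and Cm values here are assumptions for
# illustration, not the coefficients measured in the study.
RHO_WATER = 1000.0   # kg/m^3
RHO_AIR = 1.2        # kg/m^3


def mixture_density(air_fraction):
    """Bulk density of an air-water mixture for a volumetric air fraction."""
    return (1.0 - air_fraction) * RHO_WATER + air_fraction * RHO_AIR


def morison_force(u, du_dt, side, cd=2.0, cm=2.5, air_fraction=0.0):
    """Inline force per unit length [N/m]: drag term + inertia term."""
    rho = mixture_density(air_fraction)
    area = side            # projected width per unit length of cylinder
    volume = side ** 2     # displaced volume per unit length
    drag = 0.5 * rho * cd * area * u * abs(u)
    inertia = rho * cm * volume * du_dt
    return drag + inertia
```

Since both terms scale linearly with the density, a 20% air fraction reduces the predicted force by roughly 20%, which is the practical consequence of the reported result.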


2019 ◽  
Vol 183 ◽  
pp. 261-275 ◽  
Author(s):  
Boliang Lin ◽  
Jianping Wu ◽  
Ruixi Lin ◽  
Jiaxi Wang ◽  
Hui Wang ◽  
...  

2020 ◽  
Vol 245 ◽  
pp. 07044 ◽  
Author(s):  
Frank Berghaus ◽  
Franco Brasolin ◽  
Alessandro Di Girolamo ◽  
Marcus Ebert ◽  
Colin Roy Leavett-Brown ◽  
...  

The Simulation at Point1 (Sim@P1) project was built in 2013 to take advantage of the ATLAS Trigger and Data Acquisition High Level Trigger (HLT) farm. The HLT farm provides around 100,000 cores, which are critical to ATLAS during data taking. When ATLAS is not recording data, such as during the long shutdowns of the LHC, this large compute resource is used to generate and process simulation data for the experiment. At the beginning of the second long shutdown of the Large Hadron Collider, the HLT farm, including the Sim@P1 infrastructure, was upgraded. Previous papers emphasised the need for simple, reliable, and efficient tools and assessed various options to quickly switch between data acquisition operation and offline processing. In this contribution, we describe the new mechanisms put in place for the opportunistic exploitation of the HLT farm for offline processing and give the results from the first months of operation.


2020 ◽  
Vol 245 ◽  
pp. 01031 ◽  
Author(s):  
Thiago Rafael Fernandez Perez Tomei

The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented on custom-designed electronics, and the High Level Trigger, a streamlined version of the CMS offline reconstruction software running on a computer farm. During its second phase, the LHC will reach a luminosity of 7.5 × 10³⁴ cm⁻² s⁻¹ with a pileup of 200 collisions, producing an integrated luminosity greater than 3000 fb⁻¹ over the full experimental run. To fully exploit the higher luminosity, the CMS experiment will introduce a more advanced Level-1 Trigger and increase the full readout rate from 100 kHz to 750 kHz. CMS is designing an efficient data-processing hardware trigger that will include tracking information and high-granularity calorimeter information. The current Level-1 conceptual design is expected to take full advantage of advances in FPGA and link technologies over the coming years, providing a high-performance, low-latency system for large throughput and sophisticated data correlation across diverse sources. The higher luminosity, event complexity and input rate present an unprecedented challenge to the High Level Trigger, which aims to achieve a similar efficiency and rejection factor as today despite the higher pileup and a purer preselection. In this presentation we will discuss the ongoing studies and prospects for the online reconstruction and selection algorithms for the high-luminosity era.
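The quoted pileup of 200 follows directly from the luminosity via μ = L·σ_inel / (n_bunch·f_rev). A back-of-envelope check, where the inelastic cross section and filling scheme are typical HL-LHC figures assumed here rather than values taken from the paper:

```python
# Back-of-envelope pileup check (assumed parameter values; sigma_inel and
# the bunch count are typical HL-LHC figures, not taken from this paper).
LUMI = 7.5e34            # cm^-2 s^-1, HL-LHC levelled luminosity
SIGMA_INEL = 8.0e-26     # cm^2 (~80 mb inelastic pp cross section)
N_BUNCH = 2760           # colliding bunch pairs (assumed filling scheme)
F_REV = 11245.0          # LHC revolution frequency, Hz

# Mean number of interactions per bunch crossing:
pileup = LUMI * SIGMA_INEL / (N_BUNCH * F_REV)
print(f"mean pileup ~ {pileup:.0f}")   # -> mean pileup ~ 193
```

The result of roughly 190 interactions per crossing is consistent with the quoted pileup of 200, and it is this factor that drives both the 750 kHz Level-1 readout rate and the event complexity the High Level Trigger must absorb.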

