Let’s get our hands dirty: a comprehensive evaluation of DAQDB, key-value store for petascale hot storage

2020 ◽  
Vol 245 ◽  
pp. 10004
Author(s):  
Adam Abed Abud ◽  
Danilo Cicalese ◽  
Grzegorz Jereczek ◽  
Fabrice Le Goff ◽  
Giovanna Lehmann Miotto ◽  
...  

Data acquisition systems are a key component for successful data taking in any experiment. The DAQ is a complex distributed computing system that coordinates all operations, from the selection of interesting events to their delivery to the storage elements. For the High Luminosity upgrade of the Large Hadron Collider, the experiments at CERN need to meet challenging requirements to record data with a much higher occupancy in the detectors. The DAQ system will receive and deliver data with a significantly increased trigger rate, one million events per second, and throughput, terabytes of data per second. An effective way to meet these requirements is to decouple real-time data acquisition from event selection. Data fragments can be temporarily stored in a large distributed key-value store. Fragments belonging to the same event can then be queried on demand by the data selection processes. Implementing such a model relies on a proper combination of emerging technologies, such as persistent memory, NVMe SSDs, scalable networking, and data structures, as well as high-performance, scalable software. In this paper, we present DAQDB (Data Acquisition Database), an open-source implementation of this previously presented design, together with an extensive evaluation of the approach, from single-node to distributed performance. Furthermore, we complement our study with a description of the challenges faced and the lessons learned while integrating DAQDB with the existing software framework of the ATLAS experiment.
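To make the decoupled model concrete, the sketch below shows the pattern in miniature: readout processes put event fragments into a key-value store, and data selection processes later assemble a full event on demand. The FragmentStore class and the (event_id, source_id) key layout are illustrative assumptions for this note, not the actual DAQDB API.

    # Minimal sketch of the decoupled DAQ pattern: readout puts fragments into a
    # key-value store, data selection assembles events on demand. Names are
    # illustrative only and are not the actual DAQDB API.
    from collections import defaultdict


    class FragmentStore:
        """In-memory stand-in for a distributed key-value store of fragments."""

        def __init__(self):
            self._data = {}                    # (event_id, source_id) -> bytes
            self._index = defaultdict(set)     # event_id -> {source_id, ...}

        def put(self, event_id: int, source_id: int, payload: bytes) -> None:
            self._data[(event_id, source_id)] = payload
            self._index[event_id].add(source_id)

        def get_event(self, event_id: int) -> dict:
            """Query all fragments of one event on demand (event building)."""
            return {s: self._data[(event_id, s)] for s in sorted(self._index[event_id])}


    # Readout side: fragments from two detector sources for event 42.
    store = FragmentStore()
    store.put(42, 0, b"fragment-from-source-0")
    store.put(42, 1, b"fragment-from-source-1")

    # Selection side: build the full event only if it is deemed interesting.
    print(store.get_event(42))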

2019 ◽  
Vol 214 ◽  
pp. 01033 ◽  
Author(s):  
Teo Mrnjavac ◽  
Vasco Chibante Barroso

The ALICE Experiment at the CERN LHC (Large Hadron Collider) is under preparation for a major upgrade that is scheduled to be deployed during Long Shutdown 2 in 2019-2020 and that includes new computing systems, called O2 (Online-Offline). To ensure the efficient operation of the upgraded experiment along with its newly designed computing system, a reliable, high-performance and automated control system will be developed with the goal of managing the lifetime of all the O2 processes, and of handling the various phases of the data taking activity by interacting with the detectors, the trigger system and the LHC. The ALICE O2 control system will be a distributed system based on state-of-the-art cluster management and microservices which have recently emerged in the distributed computing ecosystem. Such technologies were not available during the design and development of the original LHC computing systems, and their use will allow the ALICE collaboration to benefit from a vibrant and innovative open source community. This paper illustrates the O2 control system architecture. It evaluates several solutions that were considered during an initial prototyping phase and provides a rationale for the choices made. It also provides an in-depth overview of the components, features and design elements of the actual system.
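As an illustration of what such a control system drives for each managed process, the sketch below implements a tiny state machine and applies the same transition to a set of processes. The states and transitions are placeholder assumptions, not the actual O2/AliECS state model.

    # A minimal sketch of a process-lifecycle state machine driven by a central
    # controller. States and events are illustrative placeholders only.
    ALLOWED = {
        "STANDBY":    {"CONFIGURE": "CONFIGURED"},
        "CONFIGURED": {"START": "RUNNING", "RESET": "STANDBY"},
        "RUNNING":    {"STOP": "CONFIGURED"},
    }


    class ControlledProcess:
        def __init__(self, name: str):
            self.name = name
            self.state = "STANDBY"

        def trigger(self, event: str) -> str:
            try:
                self.state = ALLOWED[self.state][event]
            except KeyError:
                raise ValueError(f"{event} not allowed in state {self.state}")
            return self.state


    # A controller would apply the same transition to every process in the cluster.
    processes = [ControlledProcess(f"reco-{i}") for i in range(3)]
    for event in ("CONFIGURE", "START", "STOP"):
        for p in processes:
            p.trigger(event)
    print({p.name: p.state for p in processes})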


2021 ◽  
Vol 251 ◽  
pp. 04013
Author(s):  
Adam Abed Abud ◽  
Kurt Biery ◽  
Carlos Chavez ◽  
Pengfei Ding ◽  
Eric Flumerfelt ◽  
...  

DUNE is a neutrino physics experiment that is expected to start taking data in 2028. The data acquisition (DAQ) system of the experiment is designed to sustain several TB/s of incoming data, which will be temporarily buffered while being processed by a software-based data selection system. In DUNE, some rare physics processes (e.g. supernova burst events) require storing the full complement of data produced over a 1-2 minute window. These are recognised by the data selection system, which fires a specific trigger decision. Upon reception of this decision, data are moved from the temporary buffers to local, high-performance, persistent storage devices. In this paper we characterize the performance of novel 3D XPoint SSD devices under different workloads suitable for high-performance storage applications. We then illustrate how such devices may be applied to the DUNE use case: to store, upon a specific signal, 100 seconds of incoming data at 1.5 TB/s, distributed among 150 identical units each operating at approximately 10 GB/s.
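As a quick sanity check of the quoted figures, the snippet below divides the aggregate rate evenly across the 150 units (an assumption consistent with the description of identical units) and derives the per-unit rate and the per-unit capacity needed for one 100-second window.

    # Back-of-the-envelope check of the buffering numbers quoted above, assuming
    # the aggregate rate is split evenly across the identical units.
    AGGREGATE_RATE_TB_S = 1.5      # incoming data rate for the whole detector
    N_UNITS = 150                  # identical storage units
    WINDOW_S = 100                 # seconds of data to persist on a trigger

    per_unit_rate_gb_s = AGGREGATE_RATE_TB_S * 1000 / N_UNITS    # -> 10 GB/s per unit
    per_unit_capacity_tb = per_unit_rate_gb_s * WINDOW_S / 1000  # -> 1 TB per unit

    print(f"per-unit rate: {per_unit_rate_gb_s:.0f} GB/s, "
          f"per-unit capacity for one window: {per_unit_capacity_tb:.1f} TB")

Each unit therefore needs on the order of 1 TB of fast persistent storage to absorb a single 100-second window.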


2020 ◽  
Vol 245 ◽  
pp. 01033
Author(s):  
Teo Mrnjavac ◽  
Konstantinos Alexopoulos ◽  
Vasco Chibante Barroso ◽  
George Raduta

The ALICE Experiment at CERN’s Large Hadron Collider (LHC) is undertaking a major upgrade during LHC Long Shutdown 2 in 2019-2021, which includes a new computing system called O2 (Online-Offline). To ensure the efficient operation of the upgraded experiment and of its newly designed computing system, a reliable, high-performance, and automated experiment control system is being developed. The ALICE Experiment Control System (AliECS) is a distributed system based on state-of-the-art cluster management and microservices that have recently emerged in the distributed computing ecosystem. Such technologies will allow the ALICE collaboration to benefit from a vibrant and innovative open source community. This communication describes the AliECS architecture. It provides an in-depth overview of the system’s components, features, and design elements, as well as its performance. It also reports on the experience with AliECS as part of ALICE Run 3 detector commissioning setups.


2016 ◽  
Vol 11 (1) ◽  
pp. 72-80
Author(s):  
O.V. Darintsev ◽  
A.B. Migranov

The article considers one possible approach to the synthesis of group control for mobile robots based on the use of cloud computing. A distinctive feature of the proposed techniques is that the specifics of the application domain and of the tasks solved by the robot group are adequately reflected in the architecture of the control and information systems, the methods of organizing information exchange, etc. The approach proposed by the authors makes it possible to increase the reliability and robustness of robot collectives and to lower the requirements on on-board computers while preserving high overall performance.


2020 ◽  
Vol 16 ◽  
Author(s):  
Luxia Zheng ◽  
Xiong Shen ◽  
Yingchun Wang ◽  
Jian Liang ◽  
Mingming Xu ◽  
...  

Background: Phospholipids are widely used in the food and pharmaceutical industries as functional excipients. Despite the many analytical methods reported, there are very few reports concerning systematic research on and comparison of phospholipid excipients. Objective: To present a comprehensive evaluation of commercial natural phospholipid excipients (CNPEs). Methods: Seventeen batches of CNPEs from five manufacturing enterprises, isolated either from soybean or egg yolk, were investigated. The content and composition of phospholipids, fatty acids and sterols, taken as a whole, were considered as the evaluative index of CNPEs. Eight kinds of phospholipids were determined by supercritical fluid chromatography (SFC), twenty-one kinds of fatty acids were determined by gas chromatography (GC) after boron trifluoride-methanol derivatization, and nine kinds of sterols were determined by high-performance liquid chromatography (HPLC) after separation and derivatization of the unsaponifiable matter. Cluster analysis was employed for classification and identification of the CNPEs. Results: The results showed that each kind of CNPE had its characteristic content and composition of phospholipids, fatty acids and sterols. The seventeen batches of samples were divided into eight groups by the cluster analysis. CNPEs of the same type from different sources (soybean or egg yolk) or enterprises presented different contents and compositions of phospholipids, fatty acids and sterols. Conclusion: Each type of CNPE had its characteristic content and composition of phospholipids, fatty acids and sterols. The composition of phospholipids, fatty acids and sterols as a whole can be applied as an indicator of the quality and characteristics of CNPEs.
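For readers unfamiliar with the classification step, the sketch below shows one common way to perform such a cluster analysis on per-batch composition profiles (phospholipids, fatty acids and sterols concatenated into one feature vector). The data are synthetic placeholders, and the abstract does not specify the clustering settings, so Ward linkage is only an assumption here.

    # Hierarchical cluster analysis of per-batch composition profiles.
    # The profiles below are synthetic placeholders, not the measured values.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    n_batches, n_features = 17, 38          # 8 phospholipids + 21 fatty acids + 9 sterols
    profiles = rng.random((n_batches, n_features))

    # Ward linkage on the standardized profiles, cut into 8 groups as in the study.
    standardized = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)
    tree = linkage(standardized, method="ward")
    groups = fcluster(tree, t=8, criterion="maxclust")
    print(groups)   # cluster label assigned to each of the 17 batches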


2018 ◽  
Vol 935 (5) ◽  
pp. 54-63
Author(s):  
A.A. Maiorov ◽  
A.V. Materuhin ◽  
I.N. Kondaurov

Geoinformation technologies are now becoming “end-to-end” technologies of the new digital economy. There is a need for solutions for efficient processing of spatial and spatio-temporal data that could be applied in various sectors of this new economy. Such solutions are necessary, for example, for cyber-physical systems. Essential components of cyber-physical systems are high-performance and easily scalable data acquisition systems based on smart geosensor networks. This article discusses the problem of choosing a software environment for this kind of system, and provides a review and a comparative analysis of various open source software environments designed for processing large spatial data and spatio-temporal data streams in computer clusters. It is shown that the software framework STARK can be used to process spatio-temporal data streams in computer clusters. An extension of the STARK class system, based on the type system for spatio-temporal data streams developed by one of the authors of this article, is proposed. The models and data representations obtained as a result of the proposed extension can be used not only for processing spatio-temporal data streams in data acquisition systems based on smart geosensor networks, but also in geoinformation systems of various purposes that process data in computer clusters.
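The sketch below illustrates the kind of typed spatio-temporal record and windowed spatial filtering that such a type system covers. STARK itself runs on cluster frameworks rather than plain Python; the types and function names here are illustrative only and are not part of STARK's API.

    # Illustrative spatio-temporal record type and a bounding-box/time-window filter.
    from dataclasses import dataclass
    from typing import Iterable, Iterator


    @dataclass(frozen=True)
    class STEvent:
        sensor_id: str
        lon: float
        lat: float
        t: float          # seconds since epoch


    def within(events: Iterable[STEvent],
               bbox: tuple[float, float, float, float],
               t_start: float, t_end: float) -> Iterator[STEvent]:
        """Keep events inside a bounding box (lon_min, lat_min, lon_max, lat_max)
        and a time window [t_start, t_end]."""
        lon_min, lat_min, lon_max, lat_max = bbox
        for e in events:
            if (lon_min <= e.lon <= lon_max
                    and lat_min <= e.lat <= lat_max
                    and t_start <= e.t <= t_end):
                yield e


    stream = [STEvent("g1", 37.6, 55.7, 10.0), STEvent("g2", 30.3, 59.9, 20.0)]
    print(list(within(stream, (37.0, 55.0, 38.0, 56.0), 0.0, 15.0)))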


J ◽  
2021 ◽  
Vol 4 (2) ◽  
pp. 147-153
Author(s):  
Paula Morella ◽  
María Pilar Lambán ◽  
Jesús Antonio Royo ◽  
Juan Carlos Sánchez

Among the new technology trends that have emerged with Industry 4.0, Cyber-Physical Systems (CPS) and the Internet of Things (IoT) are crucial for real-time data acquisition. This data acquisition, together with its transformation into valuable information, is indispensable for the development of real-time indicators. Moreover, real-time indicators provide companies with a competitive advantage over the competition, since they improve calculation and speed up decision-making and failure detection. Our research highlights the advantages of real-time data acquisition for supply chains, developing indicators that would be impossible to achieve with traditional systems, improving the accuracy of the existing ones and enhancing real-time decision-making. Moreover, it brings out the importance of integrating Industry 4.0 technologies in industry, in this case CPS and IoT, and establishes the main points for a future research agenda on this topic.
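As one concrete example of a real-time indicator built from streaming acquisition data, the sketch below maintains a rolling throughput figure over a sliding time window. The event layout and the one-minute window are illustrative assumptions, not a prescription from the paper.

    # Rolling throughput indicator fed by timestamps of completed units, as they
    # might arrive from IoT sensors. Window length and event layout are assumptions.
    from collections import deque


    class RollingThroughput:
        """Units completed per minute over the last `window_s` seconds."""

        def __init__(self, window_s: float = 60.0):
            self.window_s = window_s
            self._events = deque()          # timestamps of completed units

        def record(self, timestamp: float) -> None:
            self._events.append(timestamp)

        def value(self, now: float) -> float:
            while self._events and now - self._events[0] > self.window_s:
                self._events.popleft()
            return len(self._events) * 60.0 / self.window_s


    kpi = RollingThroughput(window_s=60.0)
    for t in (1.0, 5.0, 30.0, 58.0):        # completion events from the shop floor
        kpi.record(t)
    print(kpi.value(now=60.0))              # units per minute in the last minute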

