Evolution of the ROOT Tree I/O

2020 ◽  
Vol 245 ◽  
pp. 02030
Author(s):  
Jakob Blomer ◽  
Philippe Canal ◽  
Axel Naumann ◽  
Danilo Piparo

The ROOT TTree data format encodes hundreds of petabytes of High Energy and Nuclear Physics events. Its columnar layout drives rapid analyses, as only those parts (“branches”) that are actually used in a given analysis need to be read from storage. Its unique feature is the seamless C++ integration, which allows users to directly store their event classes without explicitly defining data schemas. In this contribution, we present the status and plans of the future ROOT 7 event I/O. Along with the ROOT 7 interface modernization, we aim for robust and, where possible, compile-time-safe C++ interfaces to read and write event data. On the performance side, we show first benchmarks using ROOT’s new experimental I/O subsystem, which combines the best of TTrees with recent advances in columnar data formats. A core ingredient is a strong separation of the high-level logical data layout (C++ classes) from the low-level physical data layout (storage-backed nested vectors of simple types). We show how the new, optimized physical data layout speeds up serialization and deserialization and facilitates parallel, vectorized, and bulk operations. This lets ROOT I/O run optimally on upcoming ultra-fast NVRAM storage devices as well as on file-less storage systems such as object stores.
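As a rough illustration of the logical-versus-physical split described above, the sketch below (plain Python/NumPy, not the ROOT API; the event structure and field name `pt` are invented for the example) decomposes a nested event record into a flat value column plus an offset column, i.e., the "nested vectors of simple types" layout that the columnar physical format stores.

```python
import numpy as np

# Logical layout: a list of events, each holding a variable-length
# list of particle transverse momenta (a nested vector of floats).
events = [
    {"pt": [10.2, 33.1]},
    {"pt": []},
    {"pt": [5.7, 8.9, 21.0]},
]

# Physical layout: one flat column of leaf values plus one offset
# column marking where each event's particles begin and end.
pt_values = np.array([v for e in events for v in e["pt"]], dtype=np.float32)
pt_offsets = np.cumsum([0] + [len(e["pt"]) for e in events])

# Reading event i touches only the slice it needs -- the columnar
# analogue of reading a single branch instead of the whole event.
i = 2
event_pt = pt_values[pt_offsets[i]:pt_offsets[i + 1]]
print(event_pt)  # -> [ 5.7  8.9 21. ]
```

Because the leaf values sit in contiguous, simply typed arrays, they can be compressed, vectorized over, or fetched in bulk from an object store without touching the C++ class layout at all.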

2020 ◽  
Vol 245 ◽  
pp. 06024
Author(s):  
Jérôme Lauret ◽  
Juan Gonzalez ◽  
Gene Van Buren ◽  
Rafael Nuñez ◽  
Philippe Canal ◽  
...  

For the last 5 years, Accelogic has pioneered and perfected a radically new theory of numerical computing codenamed “Compressive Computing”, which has a profound impact on real-world computer science [1]. At the core of this theory is the discovery of one of its fundamental theorems, which states that, under very general conditions, the vast majority (typically between 70% and 80%) of the bits used in modern large-scale numerical computations are irrelevant to the accuracy of the end result. Compressive Computing provides mechanisms able to identify, with surgical accuracy, the number of bits (i.e., the precision) that can be used to represent numbers without affecting the substance of the end results, as they are computed and vary in real time. The bottom-line outcome would be a state-of-the-art compression algorithm that surpasses those currently available in the ROOT framework, with the purpose of enabling substantial economic and operational gains (including speedup) for High Energy and Nuclear Physics data storage and analysis. In our initial studies, a compression factor of nearly 4× (3.9) was achieved with RHIC/STAR data where ROOT compression managed only 1.4×. In this contribution, we present our concept of “functionally lossless compression”, glance at examples and achievements in other communities, present the results and outcome of our ongoing R&D, and give a high-level view of our plan to move toward a ROOT implementation that would deliver a basic solution readily integrated into HENP applications. As a collaboration of experimental scientists, private industry, and the ROOT team, our aim is to capitalize on the substantial success delivered by the initial effort and to produce a robust technology, properly packaged as an open-source tool, that could be used by virtually every experiment around the world as a means of improving data management and accessibility.
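The Accelogic algorithms themselves are proprietary and not reproduced here; the sketch below is only a minimal NumPy illustration of the general idea behind "functionally lossless" precision reduction: zeroing mantissa bits that do not affect the required accuracy makes the byte stream far more repetitive and therefore more compressible by a conventional codec (zlib is used here purely as a stand-in).

```python
import zlib
import numpy as np

def truncate_mantissa(x: np.ndarray, keep_bits: int) -> np.ndarray:
    """Zero the low-order mantissa bits of float32 values.

    keep_bits is how many of the 23 float32 mantissa bits are retained,
    so the relative rounding error stays below 2**-keep_bits.
    """
    drop = 23 - keep_bits
    mask = np.uint32(((1 << 32) - 1) ^ ((1 << drop) - 1))
    bits = np.ascontiguousarray(x, dtype=np.float32).view(np.uint32)
    return (bits & mask).view(np.float32)

rng = np.random.default_rng(42)
data = rng.normal(size=1_000_000).astype(np.float32)

raw = data.tobytes()
full = zlib.compress(raw)
trunc = zlib.compress(truncate_mantissa(data, keep_bits=10).tobytes())

print(f"full precision  : {len(raw) / len(full):.2f}x")
print(f"10 mantissa bits: {len(raw) / len(trunc):.2f}x")
# Even on worst-case random data the ratio improves once the irrelevant
# bits are zeroed; structured detector data gains considerably more.
```

The hard part, and the substance of the R&D described above, is deciding how many bits are actually irrelevant for a given physics result; the sketch simply fixes that number by hand.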


2005 ◽  
Vol 20 (16) ◽  
pp. 3777-3782 ◽  
Author(s):  
IVAN VITEV

The status of RHIC theory and phenomenology is reviewed with an emphasis on the indications for the creation of a new deconfined state of matter. The critical role of high energy nuclear physics in the development of theoretical tools that address various aspects of QCD many-body dynamics is highlighted. The perspectives for studying nuclear matter under even more extreme conditions at the LHC, and the overlap with high energy physics, are discussed.


2021 ◽  
Vol 251 ◽  
pp. 03038
Author(s):  
Antonio Augusto Alves ◽  
Maximilian Reininghaus ◽  
André Schmidt ◽  
Remy Prechelt ◽  
Ralf Ulrich ◽  
...  

The CORSIKA 8 project is an international collaboration of scientists working together to deliver the most modern, flexible, robust and efficient framework for the simulation of ultra-high-energy secondary particle cascades in matter. The main application is cosmic-ray air shower simulations, but it can also be applied to other problems in astro(particle) physics, particle physics and nuclear physics. Besides a comprehensive, state-of-the-art collection of physics models and algorithms relevant to the field, all possible interfaces to hardware acceleration (e.g., GPUs) and parallelization (vectorization, multi-threading, multi-core) will also be provided. We present the status and roadmap of this project. This code will soon be available for novel exploratory studies and phenomenological research, and at the same time for massive production runs for experiments.


2020 ◽  
Vol 245 ◽  
pp. 06015
Author(s):  
Thomas Britton ◽  
David Lawrence ◽  
Gagik Gavalian

Charged-particle tracking represents the largest consumer of CPU resources in high-data-volume Nuclear Physics (NP) experiments. An effort is underway to develop machine learning (ML) networks that will reduce the resources required for charged-particle tracking. Tracking in NP experiments presents some unique challenges compared to high energy physics (HEP). In particular, track finding typically represents only a small fraction of the overall tracking problem in NP. This presentation will outline the differences and similarities between NP and HEP charged-particle tracking and the areas where ML may provide a benefit. The status of the specific effort taking place at Jefferson Lab will also be shown.


Author(s):  
M. Gabriele ◽  
M. Previtali

Abstract. Proprietary-software investment in the data-integration field is increasing, and progress is visible in the ability to open a 3D software data format directly in a GIS environment. Still, this is limited to integration between proprietary data formats and standards, namely the ArcGIS shapefile multipatch and the Revit 3D model, using proprietary software (ArcGIS). This study builds on the lessons learnt in proprietary data integration and aims to replicate a similar result using the open IFC standard, which cannot be opened directly in a GIS interface and requires a conversion that in most cases leads to semantic and geometric losses. An IFC-to-shapefile data conversion was therefore performed, stressing (i) the way information is stored in the attribute table so that geometries can be queried and geoprocessed, (ii) implementing workarounds to keep the Revit instances’ shared parameters in the IFC file, and (iii) maintaining a high Level of Detail (LOD) of the HBIM. The IFC-to-shapefile conversion was carried out with FME (Feature Manipulation Engine), benefiting from the flexibility of the shapefile format and from the ability of IFC to retain a high LOD in the export phase. Both allowed the elements of an HBIM to be properly queried and managed in a GIS (ArcGIS environment) and, through relational attribute tables, to retrieve the information contained in each Revit instance’s property panel, such as the shared parameters that implement the BIM Level of Information (LOI).
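The study itself used FME; purely as an illustrative open-source counterpart to the attribute-table side of that workflow, the sketch below uses ifcopenshell and pandas (the file name, entity type, and property names are placeholders) to flatten IFC property sets, which is where exported Revit shared parameters end up, into one row per element, ready to be joined to a GIS attribute table.

```python
import ifcopenshell
import ifcopenshell.util.element as element_util
import pandas as pd

# Placeholder path: an IFC file exported from Revit with shared
# parameters mapped to IFC property sets.
model = ifcopenshell.open("hbim_model.ifc")

rows = []
for wall in model.by_type("IfcWall"):
    row = {"GlobalId": wall.GlobalId, "Name": wall.Name}
    # Property sets carry the exported shared parameters (the LOI data).
    for pset_name, props in element_util.get_psets(wall).items():
        for key, value in props.items():
            row[f"{pset_name}.{key}"] = value
    rows.append(row)

# One row per element; the columns become fields of a GIS attribute
# table (geometry conversion to multipatch/shapefile is a separate step).
table = pd.DataFrame(rows)
table.to_csv("ifc_walls_attributes.csv", index=False)
```

The GlobalId column is the natural key for relating this table back to the converted geometries, mirroring the relational-attribute-table approach described above.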


2019 ◽  
Vol 625 ◽  
pp. A10 ◽  
Author(s):  
C. Nigro ◽  
C. Deil ◽  
R. Zanin ◽  
T. Hassan ◽  
J. King ◽  
...  

The analysis and combination of data from different gamma-ray instruments involves the use of collaboration-proprietary software and case-by-case methods. The effort of defining a common data format for high-level data, namely event lists and instrument response functions (IRFs), has recently started for very-high-energy gamma-ray instruments, driven by the upcoming Cherenkov Telescope Array (CTA). In this work we implemented this prototypical data format for a small set of MAGIC, VERITAS, FACT, and H.E.S.S. Crab nebula observations, and we analyzed them with the open-source gammapy software package. By combining data from Fermi-LAT and from four of the currently operating imaging atmospheric Cherenkov telescopes, we produced a joint maximum-likelihood fit of the Crab nebula spectrum. Aspects of the statistical errors and the evaluation of systematic uncertainty are also commented upon, along with the release format of spectral measurements. The results presented in this work are obtained using open-access online assets that allow for long-term reproducibility of the results.
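A minimal sketch of what such a joint fit looks like with gammapy is shown below. The class names follow recent gammapy releases and the file names are placeholders, so treat the details as assumptions rather than the exact analysis of the paper; in particular, every instrument (including Fermi-LAT) is simplified to a 1-D on/off spectrum here.

```python
import astropy.units as u
from gammapy.datasets import Datasets, SpectrumDatasetOnOff
from gammapy.modeling import Fit
from gammapy.modeling.models import LogParabolaSpectralModel, SkyModel

# One dataset per instrument (placeholder paths), all written in the
# common high-level data format (event lists + IRFs reduced to OGIP spectra).
paths = ["magic.fits", "veritas.fits", "fact.fits", "hess.fits", "fermi.fits"]
datasets = Datasets([SpectrumDatasetOnOff.read(p) for p in paths])

# A single Crab nebula model shared by every dataset: the joint fit
# maximizes the product of the per-instrument likelihoods.
crab = SkyModel(
    spectral_model=LogParabolaSpectralModel(
        amplitude=3.8e-11 * u.Unit("cm-2 s-1 TeV-1"),
        reference=1 * u.TeV,
        alpha=2.5,
        beta=0.2,
    ),
    name="crab",
)
datasets.models = [crab]

result = Fit().run(datasets=datasets)
print(result)
print(crab.spectral_model.parameters.to_table())
```

Because every dataset exposes the same likelihood interface, adding or removing an instrument only changes the list of files, which is what makes the common data format useful for multi-instrument combinations.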


2021 ◽  
Vol 54 (2) ◽  
pp. 1-35
Author(s):  
Chenning Li ◽  
Zhichao Cao ◽  
Yunhao Liu

With the development of the Internet of Things (IoT), many kinds of wireless signals (e.g., Wi-Fi, LoRa, RFID) fill our living and working spaces nowadays. Beyond communication, wireless signals can sense the status of surrounding objects, known as wireless sensing, through their reflection, scattering, and refraction while propagating in space. In the last decade, many sophisticated wireless sensing techniques and systems have been widely studied for various applications (e.g., gesture recognition, localization, and object imaging). Recently, deep Artificial Intelligence (AI), also known as Deep Learning (DL), has shown great success in computer vision, and some works have initially shown that deep AI can benefit wireless sensing as well, leading to a brand-new step toward ubiquitous sensing. In this survey, we focus on the evolution of wireless sensing enhanced by deep AI techniques. We first present a general workflow of Wireless Sensing Systems (WSSs), which consists of signal pre-processing, high-level feature extraction, and sensing model formulation. For each module, existing deep AI-based techniques are summarized and compared with traditional approaches. Then, we provide a view of the issues and challenges induced by combining deep AI and wireless sensing. Finally, we discuss the future trends of deep AI in enabling ubiquitous wireless sensing.
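As a toy illustration of that three-stage workflow (signal pre-processing, feature extraction, sensing model), the sketch below, which is not taken from the survey and runs on synthetic data, turns a 1-D Wi-Fi-like amplitude stream into a spectrogram feature and feeds it to a nearest-centroid classifier standing in for the DL sensing model.

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)

def synthetic_csi(freq_hz: float, fs: float = 100.0, seconds: float = 2.0):
    """Synthetic amplitude stream: a gesture appears as a Doppler-like tone."""
    t = np.arange(0, seconds, 1 / fs)
    return np.sin(2 * np.pi * freq_hz * t) + 0.3 * rng.normal(size=t.size)

def spectrogram_feature(x, fs: float = 100.0):
    """Pre-processing + feature extraction: mean power spectrum over time."""
    _, _, Zxx = stft(x, fs=fs, nperseg=64)
    return np.abs(Zxx).mean(axis=1)

# Sensing model: nearest-centroid classifier as a stand-in for a DL network.
train = {"push": [synthetic_csi(5.0) for _ in range(10)],
         "wave": [synthetic_csi(15.0) for _ in range(10)]}
centroids = {label: np.mean([spectrogram_feature(x) for x in xs], axis=0)
             for label, xs in train.items()}

test = spectrogram_feature(synthetic_csi(15.0))
pred = min(centroids, key=lambda lbl: np.linalg.norm(test - centroids[lbl]))
print(pred)  # expected: "wave"
```

Deep AI-based systems replace the hand-crafted feature and the centroid rule with learned representations and models, but the overall pipeline structure remains the same.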


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Kazuaki Kisu ◽  
Sangryun Kim ◽  
Takara Shinohara ◽  
Kun Zhao ◽  
Andreas Züttel ◽  
...  

High-energy-density and low-cost calcium (Ca) batteries have been proposed as ‘beyond-Li-ion’ electrochemical energy storage devices. However, they have seen limited progress due to challenges associated with developing electrolytes showing reductive/oxidative stabilities and high ionic conductivities. This paper describes a calcium monocarborane cluster salt in a mixed solvent as a Ca-battery electrolyte with high anodic stability (up to 4 V vs. Ca²⁺/Ca), high ionic conductivity (4 mS cm⁻¹), and high Coulombic efficiency for Ca plating/stripping at room temperature. The developed electrolyte is a promising candidate for use in room-temperature rechargeable Ca batteries.


2004 ◽  
Vol 19 (02) ◽  
pp. 179-204 ◽  
Author(s):  
I. HINCHLIFFE ◽  
N. KERSTING ◽  
Y. L. MA

We present a pedagogical review of particle physics models that are based on the noncommutativity of space–time, [x^μ, x^ν] = iθ^{μν}, with specific attention to the phenomenology these models predict in particle experiments either in existence or under development. We summarize results obtained for high-energy scattering, such as would occur, for example, in a future e⁺e⁻ linear collider with [Formula: see text], as well as low-energy experiments such as those pertaining to elementary electric dipole moments and other CP-violating observables, and finally comment on the status of phenomenological work in cosmology and extra dimensions.

