Implementation and performance of a DPM federated storage and integration within the ATLAS environment

2020 · Vol 245 · pp. 04045
Author(s): Claire Adam Bourdarios, Jean-Claude Chevaleyre, Frédérique Chollet, Sabine Crépé-Renaudin, Christine Gondrand, ...

With the increase in storage needs at the High-Luminosity LHC horizon, data management and access will be very challenging. Evaluating possible solutions within the WLCG Data Organization, Management and Access (DOMA) activity is a major effort to select the ones best suited from the experiment and site points of view. Four teams hosting Tier-2s for ATLAS with storage based on DPM technology have pooled their expertise and computing infrastructures to build a testbed hosting a DPM federated storage called FR-ALPES. This note describes the infrastructure put in place and its integration within the ATLAS Grid infrastructure, and presents the first results.

2020 · Vol 245 · pp. 07027
Author(s): Santiago González de la Hoz, Carles Acosta-Silva, Javier Aparisi Pozo, Jose del Peso, Álvaro Fernández Casani, ...

The ATLAS Spanish Tier-1 and Tier-2s have more than 15 years of experience in the deployment and development of LHC computing components and their successful operation. The sites are already actively participating in, and even coordinating, emerging R&D computing activities, developing the new computing models needed for the Run 3 and High-Luminosity LHC periods. In this contribution, we present details on the integration of new components, such as High Performance Computing resources to execute ATLAS simulation workflows; the development of new techniques to improve efficiency in a cost-effective way, such as storage and CPU federations; and improvements in data organization, management and access through storage consolidations ("data lakes"), the use of data caches, and improved experiment data catalogues such as the Event Index. The design and deployment of new analysis facilities using GPUs together with CPUs, and techniques such as Machine Learning, are also presented. Tier-1 and Tier-2 sites are, and will be, contributing significant R&D in computing, evaluating different models for improving the performance of computing and data storage capacity in the High-Luminosity LHC era.


2019 · Vol 2019 · pp. 1-12
Author(s): Hyun Jae Baek, Min Hye Chang, Jeong Heo, Kwang Suk Park

Brain-computer interfaces (BCIs) aim to enable people to interact with the external world through an alternative, nonmuscular communication channel that uses brain signal responses to complete specific cognitive tasks. BCIs have been growing rapidly during the past few years, with most BCI research focusing on system performance, such as improving accuracy or information transfer rate. Despite these advances, BCI research and development is still in its infancy and requires further consideration to significantly affect human experience in most real-world environments. This paper reviews the most recent studies and findings about ergonomic issues in BCIs. We review dry electrodes that can detect brain signals with high enough quality for use in BCIs and discuss their advantages, disadvantages, and performance. We also provide an overview of the wide range of recent efforts to create new interface designs that do not induce fatigue or discomfort during everyday, long-term use. The basic principles of each technique are described, along with examples of current applications in BCI research. Finally, we demonstrate a user-friendly interface paradigm that uses dry capacitive electrodes that do not require any preparation procedure for EEG signal acquisition. We explore the capacitively measured steady-state visual evoked potential (SSVEP) response to an amplitude-modulated visual stimulus and the auditory steady-state response (ASSR) to an auditory stimulus modulated by familiar natural sounds to verify their suitability for BCI use. We report the first results of an online demonstration that adopted this ergonomic approach to evaluating BCI applications. We expect BCIs to become a routine clinical, assistive, and commercial tool through advanced EEG monitoring techniques and innovative interface designs.
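At its core, SSVEP-based selection comes down to comparing spectral power at the candidate stimulus frequencies. A minimal sketch on a synthetic signal (illustrative frequencies and sampling rate, not the authors' actual pipeline):

```python
import numpy as np

def detect_ssvep(eeg, fs, candidate_freqs):
    """Pick the candidate stimulus frequency with the highest spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = []
    for f in candidate_freqs:
        # Power in a narrow band around each candidate frequency.
        band = (freqs > f - 0.2) & (freqs < f + 0.2)
        powers.append(spectrum[band].sum())
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic "EEG": a 10 Hz SSVEP component buried in noise.
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

print(detect_ssvep(eeg, fs, [8.0, 10.0, 12.0]))  # prints 10.0
```

A real system would use longer windows, harmonics, and methods such as canonical correlation analysis, but the band-power comparison above is the underlying idea.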


2007 · Vol 659 (2) · pp. 997-1007
Author(s): Shai Kaspi, W. N. Brandt, Dan Maoz, Hagai Netzer, Donald P. Schneider, ...

2014 · Vol 31 · pp. 1460295
Author(s): Zhijun Liang

We report on the operation and performance of the ATLAS Semi-Conductor Tracker (SCT), which has been functioning for three years in the high-luminosity, high-radiation environment of the Large Hadron Collider at CERN, and on the improvements to the SCT foreseen for the high-energy run of the LHC. We find that 99.3% of the SCT modules are operational, that the noise occupancy and hit efficiency exceed the design specifications, and that the alignment is close enough to ideal to allow on-line track reconstruction and invariant-mass determination. We also give an overview of the issues encountered during operation. We observe a significant increase in leakage currents from bulk damage due to non-ionizing radiation and compare it with predictions.
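The radiation-induced leakage-current increase mentioned above is commonly parametrised with the Hamburg model, I = α · Φ_eq · V, where α is the damage constant, Φ_eq the 1 MeV neutron-equivalent fluence, and V the depleted sensor volume. A minimal sketch with illustrative numbers (order-of-magnitude α, toy sensor geometry and fluence, not the actual SCT conditions):

```python
# Hamburg-model estimate of radiation-induced leakage current.
# All values are illustrative, not the actual SCT conditions.
ALPHA = 4.0e-17                    # damage constant [A/cm], order of magnitude after annealing
volume_cm3 = 6.4 * 6.4 * 0.0285   # a ~285 um thick silicon sensor, 6.4 x 6.4 cm
fluence = 1.0e13                   # 1 MeV neutron-equivalent fluence [cm^-2]

i_leak = ALPHA * fluence * volume_cm3   # predicted current increase [A]
print(f"leakage current increase: {i_leak * 1e6:.1f} uA")
```

In practice the measured current must also be corrected to a reference temperature, since bulk leakage current depends strongly (roughly exponentially) on temperature.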


2013 · Vol 753-755 · pp. 3136-3139
Author(s): Tong Zhang, Shuai Tian, Ling Xu, Hong Yan Chen

Oriented toward the "Outstanding Engineers Training Plan" and taking into account the characteristics of post-1990s university students, the teaching methods required in civil engineering undergraduate experiments are studied. Taking the experimental teaching reform of "Principles of Concrete Structures" as an example, an organic experimental teaching platform is built that links experiments with theoretical teaching across the curriculum. The demonstration experiment on the flexural bearing capacity of reinforced concrete sections is optimized into a comprehensive design experiment, and the hardware, software, data organization and management of the experiments are clarified. This provides a reference basis for excellence-oriented experimental teaching in civil engineering undergraduate programmes.


2014 · Vol 7 (1) · pp. 769-817
Author(s): H. M. J. Barbosa, B. Barja, T. Pauliquevis, D. A. Gouveia, P. Artaxo, ...

Abstract. A permanent UV Raman lidar station, designed to perform continuous measurements of aerosols and water vapor and aimed at studying and monitoring the atmosphere on weather-to-climate time scales, became operational in central Amazonia in July 2011. Automated data acquisition and internet monitoring enabled extended hours of daily measurements compared to a manually operated instrument. This paper gives a technical description of the system and presents its experimental characterization and the algorithms used for obtaining the aerosol optical properties and identifying cloud layers. Data from one week of measurements during the dry season of 2011 were analyzed as a means of assessing the overall system capability and performance. A comparison of the aerosol optical depth (AOD) from the lidar and a co-located AERONET sun photometer showed a root mean square error of about 0.06, small compared to the range of observed AOD values (0.1 to 0.75) and to the typical AERONET AOD uncertainty (0.02). By combining nighttime measurements of the aerosol lidar ratio (50–65 sr), back-trajectory calculations and fire spots observed from satellites, we showed that the observed particles originated from biomass burning. Cirrus clouds were observed in 60% of our measurements; most of the time they were distributed in three layers between 11.5 and 13.4 km a.g.l. The systematic, long-term measurements made by this new scientific facility have the potential to significantly improve our understanding of the climatic implications of anthropogenic changes in aerosol concentrations over the pristine Amazônia.
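The lidar vs. AERONET comparison above is a plain root-mean-square error over coincident AOD retrievals. A minimal sketch with made-up values (the paper's actual RMSE over its dataset was about 0.06):

```python
import numpy as np

def rmse(a, b):
    """Root mean square error between two coincident AOD series."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Hypothetical coincident retrievals (lidar vs. sun photometer).
aod_lidar = [0.12, 0.30, 0.45, 0.70, 0.25]
aod_aeronet = [0.10, 0.35, 0.40, 0.75, 0.22]

print(f"RMSE = {rmse(aod_lidar, aod_aeronet):.3f}")  # prints RMSE = 0.042
```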


2020 · Vol 245 · pp. 02003
Author(s): Gilles Grasseau, Abhinav Kumar, Andrea Sartirana, Artur Lobanov, Florian Beaudette

For the High-Luminosity LHC, the CMS collaboration made the ambitious choice of a high-granularity design to replace the existing endcap calorimeters. Thousands of particles coming from the multiple interactions create showers in the calorimeters, depositing energy simultaneously in adjacent cells. The data are similar to a 3D gray-scale image that must be properly reconstructed. In this paper, we investigate how to localize and identify the thousands of showers in such events with a Deep Neural Network model. This problem is well known in the computer-vision domain, where it belongs to the challenging class of object detection. Our project shares many similarities with those treated in industry, but faces several technological challenges, such as the 3D treatment. We present the Mask R-CNN model, which has already proven its efficiency in industry (for 2D images), together with first results and our plans to extend it to 3D HGCAL data.
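The "3D gray-scale image" analogy can be made concrete by voxelising the cell energy deposits before feeding them to a detection model. A minimal sketch with a toy geometry and binning (not the actual HGCAL layout):

```python
import numpy as np

def voxelise(hits, shape=(8, 8, 8)):
    """Accumulate (x, y, layer, energy) hits into a 3D 'gray-scale' grid."""
    grid = np.zeros(shape)
    for x, y, layer, energy in hits:
        grid[x, y, layer] += energy  # overlapping deposits in one cell sum up
    return grid

# Two toy showers depositing energy in adjacent cells across two layers.
hits = [(1, 1, 0, 5.0), (1, 2, 0, 3.0), (1, 1, 1, 2.0),
        (6, 6, 0, 4.0), (6, 5, 0, 1.5)]
image = voxelise(hits)
print(image.sum(), image[1, 1, 0])  # prints 15.5 5.0
```

An object-detection model such as Mask R-CNN then takes such a grid (or 2D projections of it) and predicts one instance mask per shower.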


1979 · Vol 18 (04) · pp. 199-202
Author(s): F. Lustman, P. Lanthier, D. Charbonneau

A patient-oriented data management system is described. The environment was cardiology, with a heavy emphasis on research, and the MEDIC system was designed to meet the day-to-day needs of the programme. The data are organized in speciality files, with dynamic patient records composed of subrecords of different types. The schema is described by a data definition language. Application packages include data quality control, medical reporting and general inquiry. After five years of extensive use in various clinical applications, its utility as well as its low cost have been assessed. The disadvantages, the main one being the multifile structure, can now be restated as advantages, such as data independence and increased performance. Although the system is now partially outdated, the experience acquired with its use is proving very helpful in the selection process for the future database management system.
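The record layout described above, dynamic patient records built from typed subrecords, might be sketched roughly as follows; all names and fields are hypothetical, not MEDIC's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Subrecord:
    """A typed subrecord, e.g. a visit, an ECG reading, or a lab result."""
    rec_type: str
    data: dict

@dataclass
class PatientRecord:
    """Dynamic patient record: grows by appending subrecords of any type."""
    patient_id: int
    subrecords: list = field(default_factory=list)

    def add(self, rec_type, **data):
        self.subrecords.append(Subrecord(rec_type, data))

    def of_type(self, rec_type):
        # "General inquiry": select all subrecords of one type.
        return [s for s in self.subrecords if s.rec_type == rec_type]

p = PatientRecord(1001)
p.add("visit", date="1978-03-01")
p.add("ecg", rate=72)
print(len(p.of_type("visit")))  # prints 1
```

The appeal of this structure, as the abstract notes, is that each subrecord type can evolve independently of the others (data independence).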


2021
Author(s): Ivonne Anders, Swati Gehlot, Andrea Lammert, Karsten Peters-von Gehlen

<p>In recent years, research data management has become an increasingly important part of scientific projects, regardless of the number of topics, researchers or institutions involved. The bigger the project, the greater the data organization and data management requirements needed to assure the best outcome of the project. Despite this, projects rarely have clear structures or responsibilities for data management. The importance of clearly defining data management, and of budgeting for it, is often underestimated or neglected. Only a scarce number of reports and documentations explaining the research data management in particular projects and detailing best-practice examples can be found in the current literature. Additionally, these are often mixed up with topics of general project management. Furthermore, such examples are focused on the particular issues of the described projects, so transferring (or generally applying) the methods they provide is very difficult.</p><p>This contribution presents generic concepts of research data management, with an effort to separate them from general project management tasks. Project size, the diversity of topics and the researchers involved play an important role in shaping data management and determining which methods of data management can add value to the outcome of a project. We focus especially on different organisation types, including roles and responsibilities for data management in projects of different sizes. Additionally, we show how and when education should be included, and how important agreements within a project are.</p>

