GeantV

2021 ◽  
Vol 5 (1) ◽  
Author(s):  
G. Amadio ◽  
A. Ananya ◽  
J. Apostolakis ◽  
M. Bandieramonte ◽  
S. Banerjee ◽  
...  

Full detector simulation was among the largest CPU consumers in all CERN experiment software stacks for the first two runs of the Large Hadron Collider. In the early 2010s, it was projected that simulation demands would scale linearly with increasing luminosity, with only partial compensation from increasing computing resources. The extension of fast simulation approaches to cover more use cases that represent a larger fraction of the simulation budget is only part of the solution, because of intrinsic precision limitations. The remainder corresponds to speeding up the simulation software by several factors, which is not achievable by just applying simple optimizations to the current code base. In this context, the GeantV R&D project was launched, aiming to redesign the legacy particle transport code in order to benefit from features of fine-grained parallelism, including vectorization and increased locality of both instruction and data. This paper provides an extensive presentation of the results and achievements of this R&D project, as well as the conclusions and lessons learned from the beta version prototype.
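The contrast between track-by-track stepping and the basketized, vectorized stepping that motivated GeantV can be pictured with a small toy example. The NumPy sketch below is not GeantV code; the structure-of-arrays track container, the straight-line step, and all names are illustrative assumptions intended only to show why contiguous per-attribute arrays make a transport step vectorizable and cache-friendly.

```python
# Toy illustration of a basketized, structure-of-arrays track layout; not GeantV code.
import numpy as np

N = 1024  # number of tracks in a basket (illustrative)

# Structure-of-arrays container: each attribute is contiguous in memory,
# so one step update touches long, vectorizable arrays.
pos = np.random.rand(N, 3)                                # track positions
dirs = np.random.rand(N, 3)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)       # unit directions
step = np.random.exponential(scale=1.0, size=N)           # proposed step lengths

def transport_scalar(pos, dirs, step):
    """Track-by-track stepping (legacy style): one track at a time."""
    out = pos.copy()
    for i in range(len(step)):
        out[i] = pos[i] + step[i] * dirs[i]
    return out

def transport_vector(pos, dirs, step):
    """Basketized stepping: one fused, SIMD-friendly update for the whole basket."""
    return pos + step[:, None] * dirs

assert np.allclose(transport_scalar(pos, dirs, step),
                   transport_vector(pos, dirs, step))
```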

2019 ◽  
Vol 214 ◽  
pp. 02019
Author(s):  
V. Daniel Elvira

Detector simulation has become fundamental to the success of modern high-energy physics (HEP) experiments. For example, the Geant4-based simulation applications developed by the ATLAS and CMS experiments played a major role in enabling them to produce physics measurements of unprecedented quality and precision, with a faster turnaround from data taking to journal submission than any previous hadron collider experiment. The material presented here contains highlights of a recent review on the impact of detector simulation in particle physics collider experiments published in Ref. [1]. It includes examples of applications to detector design and optimization, software development and testing of computing infrastructure, and modeling of physics objects and their kinematics. The cost and economic impact of simulation in the CMS experiment are also presented. A discussion of future detector simulation needs, challenges and potential solutions to address them is included at the end.


Author(s):  
Nane Kratzke ◽  
Robert Siegfried

Cloud computing can be a game-changer for computationally intensive tasks like simulations. The computational power of Amazon, Google, or Microsoft is available even to a single researcher. However, the pay-as-you-go cost model of cloud computing influences how cloud-native systems are being built. We transfer these insights to the simulation domain. The major contributions of this paper are twofold: (A) we propose a cloud-native simulation stack and (B) derive expected software engineering trends for cloud-native simulation services. Our insights are based on systematic mapping studies on cloud-native applications, a review of cloud standards, action research activities with cloud engineering practitioners, and corresponding software prototyping activities. Two major trends have dominated cloud computing over the last 10 years: the size of deployment units has steadily shrunk, and the corresponding architectural styles favor fine-grained decompositions into independently deployable and horizontally scalable services. We forecast similar trends for cloud-native simulation architectures, which should make cloud-native simulation services more microservice-like: composable services that just “simulate one thing well.” However, merely transferring existing simulation models to the cloud can result in significantly higher costs. One critical insight of our (and other) research is that cloud-native systems should follow cloud-native architecture principles to get the most out of the pay-as-you-go cost model.
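As a rough sketch of what a composable service that just “simulates one thing well” could look like, the snippet below wraps a single stateless Monte Carlo estimator behind an HTTP endpoint; because requests are independent, the service can be replicated horizontally and billed per use. The choice of Flask, the route, and the parameter names are illustrative assumptions and not part of the paper's prototypes.

```python
# Minimal sketch of a stateless, horizontally scalable simulation microservice.
# Flask, the route, and the query parameter are illustrative assumptions.
import random
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/simulate/pi", methods=["GET"])
def simulate_pi():
    """Monte Carlo estimate of pi; each request is independent, so the
    service can scale out behind a load balancer under pay-as-you-go billing."""
    samples = int(request.args.get("samples", 100_000))
    hits = sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return jsonify({"samples": samples, "pi_estimate": 4.0 * hits / samples})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```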


Author(s):  
Chris Llewellyn Smith

The Large Hadron Collider (LHC) machine and detectors are now working superbly. There are good reasons to hope and expect that the new domain that the LHC is already exploring, operating at 7 TeV with a luminosity of 10³³ cm⁻² s⁻¹, or the much bigger domain that will be opened up as the luminosity increases to over 10³⁴ and the energy to 14 TeV, will provide clues that will usher in a new era in particle physics. The arguments that new phenomena will be found in the energy range that will be explored by the LHC have become stronger since they were first seriously analysed in 1984, although their essence has changed little. I will review the evolution of these arguments in a historical context, the development of the LHC project since 1984, and the outlook in the light of reports on the performance of the machine and detectors presented at this meeting.


2011 ◽  
Vol 90-93 ◽  
pp. 3108-3116
Author(s):  
Ben Yan Lu ◽  
Gang Wang

Earthquake codes have been revised and updated in recent years. The publication and implementation of guidelines for the seismic design of bridges have attracted the interest and attention of many researchers at home and abroad. The most important aspect of these codes is their adoption of “performance-based seismic design”. The main purpose of this study is to investigate the differences between the guidelines for seismic design of highway bridges and Eurocode 8 for bridges in performance criteria, seismic design categories, ground types, response spectrum, earthquake action and detailing of ductile piers. The differences in expressions, together with some important points concerning performance criteria, seismic design categories, ground types, response spectrum, earthquake action and detailing of ductile piers in the two codes, are briefly illustrated in tables and figures. Based on the lessons learned from significant earthquakes in recent years, the existing problems of the current codes are pointed out, and trends for future study are discussed.
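To make the response-spectrum comparison concrete, the sketch below evaluates the shape of the Eurocode 8 (EN 1998-1) Type 1 horizontal elastic response spectrum. The corner periods and soil factor used here are the recommended values for ground type C, and the peak ground acceleration is an arbitrary example; all values should be checked against the code and the relevant national annex before any real comparison.

```python
# Sketch of the Eurocode 8 (EN 1998-1) Type 1 horizontal elastic response spectrum.
# Parameter values are the recommended ones for ground type C and an example ag;
# verify them against the code and the applicable national annex.
import math

def ec8_elastic_spectrum(T, ag=0.3, S=1.15, TB=0.20, TC=0.60, TD=2.0, xi=5.0):
    """Spectral acceleration Se(T) in g, for period T in seconds."""
    eta = max(math.sqrt(10.0 / (5.0 + xi)), 0.55)  # damping correction factor
    if T <= TB:
        return ag * S * (1.0 + (T / TB) * (eta * 2.5 - 1.0))
    if T <= TC:
        return ag * S * eta * 2.5                   # constant-acceleration plateau
    if T <= TD:
        return ag * S * eta * 2.5 * (TC / T)        # constant-velocity branch
    return ag * S * eta * 2.5 * (TC * TD / T ** 2)  # constant-displacement branch

# Example ordinates for ag = 0.3 g and 5% damping.
print(ec8_elastic_spectrum(0.4))   # plateau value, ~0.86 g
print(ec8_elastic_spectrum(1.5))   # descending branch, ~0.35 g
```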


Author(s):  
CHUNG-HORNG LUNG ◽  
JEFFERY K. COCHRAN ◽  
GERALD T. MACKULAK ◽  
JOSEPH E. URBAN

Software reuse has drawn much attention in computing research. Domain analysis is considered a prerequisite to effective reuse of existing software. Several approaches and methodologies have been proposed for domain analysis or domain modeling, but not many case studies have been reported in the literature. The first objective of this paper is to present the concept and practical experiences of a domain analysis approach in discrete-event simulation in manufacturing: generic/specific modeling. A second objective of this paper is to present a meta-model based on the generic/specific approach from the software engineering perspective. The steps and knowledge required to build the model are described. Domain analysis lessons learned from the generic/specific approach in discrete-event simulation are discussed. Classification of this domain modeling approach was conducted through the Wartik and Prieto-Diaz criteria. The classification will facilitate the comparison with other domain analysis approaches. Similar modeling concepts or techniques may be beneficial to other researchers in their own application domains.
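The generic/specific idea can be sketched as a single generic workstation model that specific stations instantiate purely through parameters. The SimPy-based example below is only an illustration of that modeling style; the station names, capacities and timing parameters are hypothetical and not taken from the paper.

```python
# Illustrative sketch of generic/specific modeling in discrete-event simulation:
# one generic workstation model, specialized only through its parameters.
# SimPy usage and all names are illustrative, not taken from the paper.
import random
import simpy

def job(env, name, station, mean_service):
    """Generic model: request the station, then hold for a sampled service time."""
    with station.request() as req:
        yield req
        yield env.timeout(random.expovariate(1.0 / mean_service))
        print(f"{env.now:7.2f}  {name} finished")

def job_source(env, station_name, station, mean_service, mean_interarrival, n_jobs):
    """Generic arrival process, configured per specific station."""
    for i in range(n_jobs):
        yield env.timeout(random.expovariate(1.0 / mean_interarrival))
        env.process(job(env, f"{station_name}-job{i}", station, mean_service))

env = simpy.Environment()

# "Specific" models: the same generic logic, configured for two stations.
milling = simpy.Resource(env, capacity=2)
assembly = simpy.Resource(env, capacity=1)
env.process(job_source(env, "milling", milling, mean_service=4.0,
                       mean_interarrival=3.0, n_jobs=10))
env.process(job_source(env, "assembly", assembly, mean_service=6.0,
                       mean_interarrival=5.0, n_jobs=10))
env.run()
```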


2010 ◽  
Vol 163-167 ◽  
pp. 4395-4400
Author(s):  
Ben Yan Lu ◽  
Bo Quan Liu ◽  
Ming Liu ◽  
Guo Hua Xing

Earthquake codes have been revised and updated in recent years. The publication and implementation of guidelines for the seismic design of bridges have attracted the interest and attention of many researchers at home and abroad. In this paper, the provisions on performance criteria, seismic design categories, response spectrum and earthquake action in the guidelines for seismic design of highway bridges are compared with those of Eurocode 8 for bridges. The main purpose of this study is to investigate the differences between the two codes in performance criteria, seismic design categories, response spectrum and earthquake action. The results indicate that the guidelines for seismic design of highway bridges and Eurocode 8 for bridges are similar in performance criteria, seismic design categories and response spectrum. Based on the lessons learned from significant earthquakes in recent years, the existing problems of the current codes are pointed out, and trends for future study are discussed.


1992 ◽  
Vol 36 (15) ◽  
pp. 1092-1094
Author(s):  
L. A. Whitaker ◽  
W. F. Moroney

This paper describes the process involved in the development of a reaction time test bench for the Computer Aided Systems Human Engineering (CASHE) program, which is based on a strategy for converting human factors information into simulation software, using a test bench metaphor. The metaphor takes its strength from the familiarity systems designers have with test benches and breadboarding facilities currently at their disposal. The purpose of this paper is to provide a description of this software development activity, illustrate the procedure we followed, specify the decision points we encountered, and relate our lessons learned. Our goal was to convey functional specification information to the software developers in a parsimonious, unambiguous, structured manner to facilitate the development of both the software and the user interface, while complying with hardware system constraints. Development of the Reaction Time (RT) Test Benches involved the following tasks: collect and digest the Engineering Data Compendium entries; analyze the variables; determine the scope of the relevant variables to be tested; select the test bench phenomena to be demonstrated; and develop each of the deliverables. These deliverables included the variable range tables, initial variable settings, the control flow and storyboard graphics. We believe that this task is typical of the input human factors specialists can provide to designers in a variety of contexts and hence generalizes beyond this specific application.
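One of the deliverables listed above, a variable range table with initial settings, can be pictured as a small validated data structure. The sketch below is hypothetical: the variable names, ranges and units are placeholders, not values from the Engineering Data Compendium or the CASHE test benches.

```python
# Hypothetical sketch of a "variable range table" with initial settings for a
# reaction-time test bench; names, ranges and units are placeholders, not
# values taken from the Engineering Data Compendium or the CASHE program.
RT_VARIABLE_RANGES = {
    # variable            (min,    max,    initial, unit)
    "stimulus_intensity": (0.1,    100.0,  10.0,    "cd/m^2"),
    "stimulus_duration":  (10.0,   500.0,  100.0,   "ms"),
    "foreperiod":         (250.0,  4000.0, 1000.0,  "ms"),
    "n_alternatives":     (1,      8,      2,       "choices"),
}

def initial_settings(table):
    """Return the initial value of every test-bench variable."""
    return {name: spec[2] for name, spec in table.items()}

def validate(name, value, table):
    """Reject a setting outside its documented range before a simulation run."""
    lo, hi, _, unit = table[name]
    if not (lo <= value <= hi):
        raise ValueError(f"{name}={value} {unit} outside [{lo}, {hi}] {unit}")
    return value

settings = initial_settings(RT_VARIABLE_RANGES)
settings["stimulus_duration"] = validate("stimulus_duration", 250.0, RT_VARIABLE_RANGES)
```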


2016 ◽  
Vol 116 ◽  
pp. 07005
Author(s):  
Petros Giannakopoulos ◽  
Michail Gkoumas ◽  
Ioannis Diplas ◽  
Georgios Voularinos ◽  
Theofanis Vlachos ◽  
...  
