High Performance Optimizations for Nuclear Physics Code MFDn on KNL

Author(s):  
Brandon Cook ◽  
Pieter Maris ◽  
Meiyue Shao ◽  
Nathan Wichmann ◽  
Marcus Wagner ◽  
...  


SIMULATION ◽  
2020 ◽  
Vol 96 (10) ◽  
pp. 791-806 ◽  
Author(s):  
Milad Yousefi ◽  
Moslem Yousefi ◽  
Flavio S Fogliatto

Since high performance is essential to the functioning of emergency departments (EDs), they must constantly pursue sensible and empirically testable improvements. In light of recent advances in computer science, an increasing number of simulation-based approaches for studying and implementing ED performance optimizations have become available in the literature. This paper aims to offer a survey of these works, presenting progress made on the topic while indicating possible pitfalls and difficulties in EDs. With that in mind, this review considers research studies reporting simulation-based optimization experiments published between 2007 and 2019, covering 38 studies. This paper provides bibliographic background on issues covered, generates statistics on methods and tools applied, and indicates major trends in the field of simulation-based optimization. This review contributes to the state of the art on ED modeling by offering an updated picture of the present state of the field, as well as promising research gaps. In general, this review argues that future studies should focus on increasing the efficiency of multi-objective optimization problems by decreasing their cost in time and labor.
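The simulation-based optimization loop surveyed above can be made concrete with a minimal, hypothetical sketch: a crude discrete-event model of an ED treated as an M/M/c queue, wrapped in an exhaustive search over staffing levels. All parameters and names here (arrival rate, service rate, cost weights, simulate_ed, optimize_staffing) are illustrative assumptions, not values or methods taken from the reviewed studies.

```python
import random
import statistics

def simulate_ed(num_doctors, arrival_rate=10.0, service_rate=4.0,
                horizon=8.0, seed=0):
    """Crude discrete-event simulation of an ED as an M/M/c queue.

    Returns the mean patient waiting time (hours) over one shift.
    Rates are per hour; all numbers are illustrative only.
    """
    rng = random.Random(seed)
    t, free_at, waits = 0.0, [0.0] * num_doctors, []
    while t < horizon:
        t += rng.expovariate(arrival_rate)            # next patient arrival
        i = min(range(num_doctors), key=lambda k: free_at[k])
        start = max(t, free_at[i])                    # wait if all doctors busy
        waits.append(start - t)
        free_at[i] = start + rng.expovariate(service_rate)
    return statistics.mean(waits) if waits else 0.0

def optimize_staffing(max_doctors=8, wait_cost=100.0, doctor_cost=60.0):
    """Exhaustive search over staffing levels: the simplest possible
    optimizer driving the simulation."""
    best = None
    for c in range(1, max_doctors + 1):
        mean_wait = statistics.mean(
            simulate_ed(c, seed=s) for s in range(20)  # 20 replications
        )
        cost = wait_cost * mean_wait + doctor_cost * c
        if best is None or cost < best[1]:
            best = (c, cost, mean_wait)
    return best

if __name__ == "__main__":
    c, cost, wait = optimize_staffing()
    print(f"best staffing: {c} doctors, mean wait {wait:.2f} h, cost {cost:.1f}")
```

Studies covered by such reviews typically replace both pieces with far richer patient-flow models and multi-objective metaheuristics, but the simulate-evaluate-search structure remains the same.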


Author(s):  
Levente Hajdu ◽  
Jérôme Lauret ◽  
Radomir A. Mihajlović

In this chapter, the authors discuss issues surrounding High Performance Computing (HPC)-driven science, using as an example the peta-scale Monte Carlo experiments conducted at Brookhaven National Laboratory (BNL), one of the US Department of Energy (DOE) High Energy and Nuclear Physics (HENP) research sites. BNL, which hosts the only remaining US-based HENP experiments and apparatus, is well suited for studying the nature of High-Throughput Computing (HTC)-hungry experiments and the short historical development of the HPC technology used in them. The development of parallel processors, multiprocessor systems, custom clusters, supercomputers, networked super systems, and hierarchical parallelism is presented in an evolutionary manner. Coarse-grained, rigid Grid-system parallelism is contrasted with cloud computing, which the chapter classifies as flexible, fine-grained soft-system parallelism. In evaluating the various high-performance computing options, a clear distinction is drawn between high-availability-bound enterprise computing and high-scalability-bound scientific computing. This distinction is used to further differentiate cloud from pre-cloud computing technologies and to better fit cloud computing into scientific HPC.


Author(s):  
Huilan Liu ◽  
Yushou Song ◽  
Zhaoyang Xie ◽  
Baodong Sun

A low-background gamma spectrometer consists of a high-performance gamma detector and a low-background chamber. It is widely used to monitor environmental radiation levels and to identify radiological sources, and it is especially important for the analysis of nuclear accidents. Usually a high-purity germanium (HPGe) detector is used as the gamma-ray detector. To enhance detection accuracy and sensitivity, it is essential to improve the performance of the gamma detector. In recent years, the clover detector, composed of four coaxial HPGe crystals, has appeared and is widely used in experimental nuclear physics research. Because of its larger dimensions and segmented structure, it displays characteristics distinct from those of traditional HPGe detectors. With a clover detector as the main detector and the ORTEC HPLBS1 chamber as the lead chamber, the low-background gamma spectrometer is simulated with the Monte Carlo toolkit GEANT4, using the gamma-ray interaction processes provided by the GEANT4 physics list. The detection performance of the low-background gamma spectrometer, such as the detection efficiency and peak-to-total ratio, is given. The results indicate that a low-background gamma spectrometer with a clover as the main detector has better characteristics than one with a traditional single-crystal HPGe main detector.
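A back-of-the-envelope way to see what "detection efficiency" and "peak-to-total ratio" mean for such a simulation is to post-process the per-event energy deposits that a GEANT4 run can export. The Python sketch below is illustrative only: the 1332.5 keV 60Co line, the 2 keV peak window, and the fabricated deposits are assumptions, not outputs of the paper's clover/HPLBS1 model, and the add-back summing of the four clover crystals is omitted for brevity.

```python
import numpy as np

def spectrum_figures_of_merit(edep_keV, n_primaries, e_gamma_keV=1332.5,
                              window_keV=2.0):
    """Figures of merit for a simulated gamma spectrometer.

    edep_keV    : per-event energy deposits (keV) from the simulation,
                  with zero-deposit events already removed.
    n_primaries : number of gammas generated toward the detector.
    Returns (absolute detection efficiency, peak-to-total ratio).
    """
    edep = np.asarray(edep_keV, dtype=float)
    total_counts = edep.size                              # any recorded deposit
    in_peak = np.abs(edep - e_gamma_keV) <= window_keV    # full-energy peak window
    peak_counts = int(in_peak.sum())
    efficiency = total_counts / n_primaries
    peak_to_total = peak_counts / total_counts if total_counts else 0.0
    return efficiency, peak_to_total

# toy usage with fabricated deposits standing in for simulation output
rng = np.random.default_rng(1)
fake_edep = np.concatenate([
    rng.normal(1332.5, 0.8, 2_000),   # full-energy events
    rng.uniform(50, 1300, 8_000),     # Compton continuum
])
print(spectrum_figures_of_merit(fake_edep, n_primaries=100_000))
```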


2013 ◽  
Vol 66 (1) ◽  
pp. 431-487 ◽  
Author(s):  
Arslan Munir ◽  
Farinaz Koushanfar ◽  
Ann Gordon-Ross ◽  
Sanjay Ranka

Radiocarbon ◽  
2004 ◽  
Vol 46 (1) ◽  
pp. 97-104 ◽  
Author(s):  
Wolfango Plastino ◽  
Lauri Kaihola

Cosmic background and its variation have been removed in the Gran Sasso National Laboratory (National Institute of Nuclear Physics) by its 1400-m rock overburden. Stable, high-performance liquid scintillation counting conditions are obtained when any remaining variable components of the environmental background, such as radon, are eliminated. The ultra low-level liquid scintillation spectrometer Quantulus™ has an anti-Compton guard detector (guard for short) that allows monitoring of gamma radiation in the background. The guard detector efficiency in radiocarbon background reduction is 8% in the Gran Sasso National Laboratory, while 80% is observed in surface laboratories. At surface laboratories, atmospheric pressure variations additionally cause variation in the cosmic radiation flux. The Quantulus anti-Compton detector is highly efficient in detecting cosmic radiation, and the sample count rate remains stable in long-term counting. The correlation of sample backgrounds with environmental gamma radiation in various laboratories is also examined.
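The quoted guard efficiencies translate directly into residual background rates: an 80% efficient veto at the surface leaves 20% of the unguarded background, while the 8% figure underground reflects how little cosmic-induced background remains to be vetoed after the 1400-m overburden. A tiny illustrative calculation follows; the unguarded rate of 1 count per minute is an assumed placeholder, not a measured value from the paper.

```python
def residual_background(unguarded_cpm, guard_efficiency):
    """Background count rate left after the anticoincidence veto.

    guard_efficiency is the fraction of background events rejected by
    the anti-Compton guard (0.80 at surface, 0.08 underground per the
    abstract); unguarded_cpm is an illustrative assumption.
    """
    return unguarded_cpm * (1.0 - guard_efficiency)

# same assumed unguarded rate, very different payoff from the guard
print(residual_background(unguarded_cpm=1.0, guard_efficiency=0.80))  # surface
print(residual_background(unguarded_cpm=1.0, guard_efficiency=0.08))  # Gran Sasso
```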


2013 ◽  
Vol 2013 ◽  
pp. 1-14 ◽  
Author(s):  
J. Esposito ◽  
G. Vecchi ◽  
G. Pupillo ◽  
A. Taibi ◽  
L. Uccelli ◽  
...  

Following preliminary feasibility studies started at the Legnaro National Laboratories (LNL) in 2011, research activities of the Italian National Institute for Nuclear Physics (INFN) are underway aiming at alternative, accelerator-driven 99Mo/99mTc production routes. One of the most promising approaches is to use 100Mo-enriched (i.e., >99%) metallic molybdenum targets bombarded with high-beam-current, high-energy proton cyclotrons. In order to obtain a comprehensive map of the expected radionuclides, a detailed theoretical investigation has been carried out using the TALYS-TENDL 2012 excitation functions, extended up to the (p,6n), (p,p5n), and (p,2p4n) channels. A series of quality parameters has thus been calculated both at the end of beam (EOB) and at longer times. The results point out that accelerator-produced 99Mo is of limited interest for possible massive production because of its quite low specific activity with respect to reactor-produced 99Mo. The calculated quality parameters of accelerator-produced 99mTc (i.e., radionuclidic purity (RNP), isotopic purity (IP), and specific activities) are instead quite close to those of generator-produced Tc. Calculations at 15, 20, and 25 MeV have therefore been performed to assess the best irradiation conditions for 99mTc production while minimizing both the short-lived and long-lived Tc contaminant radionuclides. Although present in minimal quantities, Tc contaminants may indeed have an impact either on the radiopharmaceutical labeling procedures or on the patient radiation dose during diagnostic procedures.
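One of the quoted quality parameters, radionuclidic purity (RNP), can be sketched as the ratio of 99mTc activity to the total activity of all Tc radionuclides present, decayed from EOB to the time of interest. The short Python sketch below is a hypothetical post-processing step: the half-lives are approximate literature values, while the isotope list and EOB activities are placeholders, not the TALYS-TENDL results of the paper.

```python
import math

# approximate half-lives in hours (literature values); activities below are placeholders
HALF_LIFE_H = {
    "Tc99m": 6.01,
    "Tc99g": 2.11e5 * 365.25 * 24,
    "Tc96":  4.28 * 24,
    "Tc95":  20.0,
}

def activity(a0_mbq, half_life_h, t_h):
    """Decay an activity A0 (MBq) for t_h hours."""
    return a0_mbq * math.exp(-math.log(2) * t_h / half_life_h)

def radionuclidic_purity(a0_mbq_by_isotope, t_h):
    """RNP(t) = A(99mTc) / sum of activities of all Tc radionuclides."""
    a = {iso: activity(a0, HALF_LIFE_H[iso], t_h)
         for iso, a0 in a0_mbq_by_isotope.items()}
    return a["Tc99m"] / sum(a.values())

# illustrative EOB activities in MBq (not computed values from the paper)
eob = {"Tc99m": 1000.0, "Tc99g": 0.5, "Tc96": 2.0, "Tc95": 5.0}
for t in (0, 6, 12, 24):
    print(f"t = {t:2d} h after EOB: RNP = {radionuclidic_purity(eob, t):.4f}")
```

Because 99mTc decays faster than the longer-lived contaminants, a calculation of this shape shows RNP degrading with time after EOB, which is why quality parameters are evaluated both at EOB and at later times.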


2020 ◽  
Vol 4 (ICFP) ◽  
pp. 1-29 ◽  
Author(s):  
Bastian Hagedorn ◽  
Johannes Lenfers ◽  
Thomas Kœhler ◽  
Xueying Qin ◽  
Sergei Gorlatch ◽  
...  
