Texas A&M US Nuclear DATA Program

2021 ◽  
Vol 252 ◽  
pp. 08001
Author(s):  
Ninel Nica

Nuclear data evaluation is an independent expert activity, now roughly a century old, that has accompanied the development of nuclear physics. Its goal is to produce periodic surveys of the world literature in order to recommend and maintain the best set of nuclear data parameters for common use across the basic and applied sciences. After World War II the effort expanded and, while it became more international, it continued to be supported mainly by the US for the benefit of the whole world. The Evaluated Nuclear Structure Data File (ENSDF) is the most comprehensive nuclear structure database worldwide; it is maintained by the United States National Nuclear Data Center (NNDC) at Brookhaven National Laboratory (BNL) and mirrored by the IAEA Nuclear Data Services in Vienna. Part of the US Nuclear Data Program since 2005, the Cyclotron Institute is one of the important contributors to ENSDF. In 2018 we became an international evaluation center, working in a consortium of peers traditionally hosted by prestigious national institutes as well as universities. In this paper the main stages of the evaluation work are presented in order to give potential users a basic understanding of the process. Our goals are to maintain good productivity without sacrificing quality, to keep the data current, and to participate in the effort to modernize the structure of the ENSDF databases so that they are compatible with the data-centric paradigms of the future.
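For readers unfamiliar with ENSDF, its records are fixed-width, 80-column text lines, which is exactly the legacy representation the modernization effort aims to replace with data-centric formats such as JSON. The following minimal Python sketch uses a simplified approximation of the level-record column layout (an illustration, not the full ENSDF standard) to show the kind of translation involved:

```python
# Minimal sketch: parse a simplified, fixed-width ENSDF-style level record
# into a plain dictionary, a stand-in for the data-centric representations
# discussed in the modernization effort. The column layout below is an
# illustrative simplification, not the full ENSDF standard.

def parse_level_record(line: str) -> dict:
    """Extract a few fields from one 80-column level ('L') record."""
    line = line.ljust(80)                   # pad short lines to 80 columns
    return {
        "nucid": line[0:5].strip(),         # nuclide id, e.g. "160GD"
        "rtype": line[7],                   # record type, 'L' for a level
        "energy_keV": line[9:19].strip(),   # level energy field
        "jpi": line[21:39].strip(),         # spin-parity assignment
        "half_life": line[39:49].strip(),   # half-life field
    }

if __name__ == "__main__":
    # Illustrative record built to match the simplified layout; not real data.
    sample = "160GD".ljust(5) + "  " + "L" + " " + "988.56".ljust(10) + "  " + "2+"
    print(parse_level_record(sample))
```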

Author(s):  
J. Xu ◽  
C. Miller ◽  
C. Hofmayer ◽  
H. Graves

Motivated by many design considerations, several conceptual designs for advanced reactors propose that the entire reactor building and a significant portion of the steam generator building be either partially or completely embedded below grade. For the analysis of seismic events, the soil-structure interaction (SSI) effect and passive earth pressure for these types of deeply embedded structures will have a significant influence on the predicted seismic response. Sponsored by the US Nuclear Regulatory Commission (NRC), Brookhaven National Laboratory (BNL) is carrying out a research program to assess the significance of these proposed design features for advanced reactors and to evaluate whether existing analytical methods are applicable and adequate for capturing the seismic behavior of the proposed designs. This paper summarizes a literature review performed by BNL to determine the state of knowledge and practice for seismic analyses of deeply embedded and/or buried (DEB) nuclear containment-type structures. Included in the paper is BNL's review of the open literature on existing standards, tests, and practices that have been used in the design and analysis of DEB structures. The paper also provides BNL's evaluation of available codes and guidelines with respect to seismic design practice for DEB structures. Based on this review, a discussion is provided to highlight the applicability of existing technologies for seismic analyses of DEB structures and to identify gaps in knowledge and potential issues that may require better understanding and further research.


2007 ◽  
Author(s):  
A. A. Sonzogni ◽  
T. W. Burrows ◽  
B. Pritychenko ◽  
J. K. Tuli ◽  
D. F. Winchell

2020 ◽  
Vol 239 ◽  
pp. 15004
Author(s):  
Paraskevi Dimitriou ◽  
Shamsuzzoha Basunia ◽  
Lee Bernstein ◽  
Jun Chen ◽  
Zoltán Elekes ◽  
...  

Compilation, evaluation and dissemination of nuclear data are arduous tasks that rely on contributions from experts in both the basic and applied sciences communities whose efforts are coordinated by the International Atomic Energy Agency (IAEA). The Evaluated Nuclear Structure Data File (ENSDF) includes the most extensive and comprehensive set of nuclear structure and decay data evaluations performed by the international network of Nuclear Structure and Decay Data evaluators (NSDD) under the auspices of the IAEA. In this report we describe the recent NSDD activities supported by the IAEA and provide some future perspectives.


Author(s):  
Levente Hajdu ◽  
Jérôme Lauret ◽  
Radomir A. Mihajlović

In this chapter, the authors discuss issues surrounding High Performance Computing (HPC)-driven science, using the example of petascale Monte Carlo experiments conducted at Brookhaven National Laboratory (BNL), one of the US Department of Energy (DOE) High Energy and Nuclear Physics (HENP) research sites. BNL, which hosts the only remaining US-based HENP experiments and apparatus, is well suited to a study of the nature of High-Throughput Computing (HTC)-hungry experiments and of the short history of the HPC technology used in them. The development of parallel processors, multiprocessor systems, custom clusters, supercomputers, networked super systems, and hierarchical parallelism is presented in an evolutionary manner. Coarse-grained, rigid Grid-system parallelism is contrasted with cloud computing, which is classified within this chapter as flexible, fine-grained soft-system parallelism. In evaluating the various high-performance computing options, a clear distinction is made between high-availability-bound enterprise computing and high-scalability-bound scientific computing. This distinction is used to further differentiate cloud computing from pre-cloud technologies and to fit cloud computing better into scientific HPC.
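To make the coarse-grained pattern concrete, here is a minimal Python sketch (not from the chapter) of the embarrassingly parallel style typical of HTC Monte Carlo production: independent event batches run with no inter-job communication and are merged only at the end. All names and numbers are illustrative:

```python
# Minimal sketch of coarse-grained, HTC-style Monte Carlo parallelism:
# independent batches of simulated "events" are farmed out to worker
# processes, and only the per-batch summaries are merged at the end.
import random
from multiprocessing import Pool

def simulate_batch(args):
    """Toy Monte Carlo job: estimate pi from n random points in a square."""
    seed, n_events = args
    rng = random.Random(seed)      # independent random stream per batch
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_events))
    return hits, n_events

if __name__ == "__main__":
    batches = [(seed, 100_000) for seed in range(16)]  # 16 independent jobs
    with Pool() as pool:                               # one worker per core
        results = pool.map(simulate_batch, batches)    # no inter-job talk
    hits = sum(h for h, _ in results)
    total = sum(n for _, n in results)
    print("pi ~", 4 * hits / total)
```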


2008 ◽  
Vol 38 (4) ◽  
pp. 535-568 ◽  
Author(s):  
Robert P. Crease

The Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory was the first facility to move the subfield of nuclear physics into the relativistic (very high-energy) regime. From the time of its formal proposal in 1984 to the start of its operation in 2000, it anchored a profound reconfiguration of Brookhaven's mission. This article analyzes the process by which RHIC came to seem the best solution to a problem thrust upon the Brookhaven laboratory administration by the planning and funding demands of the early 1980s, which required creative reconfiguration of resources and programs from long-established national laboratories accustomed to pursuing particular kinds of science. The RHIC story is an example of "recombinant science," as Catherine Westfall has labeled it, which does not occur as a natural outgrowth of previous research. In the recombinant science that gave birth to RHIC, the ends as well as the means arose as the result of contingencies and convergences that required researchers from multiple subfields to adapt their intentions and methods, sometimes awkwardly. Against a backdrop of limited budgets, increasing oversight, and competitive claims from other labs and projects, this case study illustrates how many strands had to come together simultaneously in RHIC, including changes in theoretical interest, experimental developments, and the existence of hardware assets, plus leadership and several lucky breaks.


2019 ◽  
Vol 223 ◽  
pp. 01028
Author(s):  
F.G. Kondev ◽  
D.J. Hartley ◽  
R. Orford ◽  
J.A. Clark ◽  
G. Savard ◽  
...  

Properties of neutron-rich nuclei in the A ≈ 160 region are important for achieving a better understanding of nuclear structure in a region where little is known, owing to difficulties in producing these nuclei at present nuclear physics facilities. These properties are essential ingredients in the interpretation of the rare-earth peak at A ≈ 160 in the r-process abundance distribution, since theoretical models are sensitive to nuclear structure input. Motivated by these ideas, we have initiated a new experimental program at Argonne National Laboratory. During the first experiment, beams from the Californium Rare Isotope Breeder Upgrade radioactive beam facility were used in conjunction with the SATURN decay station and the X-array. We focused initially on several odd-odd nuclei, where β decays of both the ground state and an excited isomer were investigated. Because of the spin difference, a variety of structures in the daughter nuclei were selectively populated and characterized on the basis of their decay properties. Mass measurements with the Canadian Penning Trap, aimed at establishing the excitation energies of the β-decaying isomers, were also carried out. Evidence was found for a change in the single-particle structure, which in turn results in the formation of a sizable N = 98 sub-shell gap at large deformation. Results from the first experimental campaign using the newly commissioned β-decay station at Gammasphere are also presented.
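As a rough illustration of the mass-measurement step (not the authors' analysis code): in a Penning trap the cyclotron frequency is ν_c = qB/2πm, so for the ground state and isomer measured in the same charge state and field, the excitation energy follows from the frequency ratio alone. A minimal Python sketch with purely illustrative numbers:

```python
# Sketch (illustrative, not the authors' analysis): in a Penning trap,
#   nu_c = qB / (2*pi*m),
# so for the same charge state q and field B,
#   m_isomer / m_gs = nu_gs / nu_isomer,
# and the isomer excitation energy is
#   E_x = (m_isomer - m_gs) * c^2 = m_gs * c^2 * (nu_gs / nu_isomer - 1).
U_TO_KEV = 931_494.102   # 1 atomic mass unit in keV/c^2

def excitation_energy_keV(mass_gs_u: float, freq_ratio: float) -> float:
    """freq_ratio = nu_c(ground state) / nu_c(isomer), same q and B."""
    return mass_gs_u * U_TO_KEV * (freq_ratio - 1.0)

# Illustrative numbers only: for a ~160 u nucleus, a frequency ratio that
# differs from unity by ~1e-6 corresponds to roughly 150 keV.
print(excitation_energy_keV(160.0, 1.0 + 1.0e-6))
```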


2019 ◽  
Author(s):  
Mathew Hauer ◽  
James Byars

BACKGROUND: The Internal Revenue Service's (IRS) county-to-county migration data are an incredible resource for understanding migration in the United States. Produced annually since 1990 in conjunction with the US Census Bureau, the IRS migration data represent 95 to 98 percent of the tax-filing universe and their dependents, making them one of the largest sources of migration data. However, any analysis using the IRS migration data must process at least seven legacy formats of these public data across more than 2,000 data files, a serious burden for migration scholars. OBJECTIVE: To produce a single, flat data file containing complete county-to-county IRS migration flow data and to make the computer code that processes the migration data freely available. METHODS: This paper uses R to process more than 2,000 IRS migration files into a single, flat data file for use in migration research. CONTRIBUTION: To encourage and facilitate the use of these data, we provide a single, standardized, flat data file containing county-to-county one-year migration flows for the period 1990-2010 (163,883 dyadic county pairs, yielding 3.2 million county-pair-year observations and over 343 million migrants in total) and provide the full R script to download, process, and flatten the IRS migration data.
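The authors distribute the actual pipeline as an R script; purely to illustrate the flattening idea, here is a minimal Python sketch that stacks per-year county-to-county flow files into one long table. The filenames and column names are hypothetical, not the IRS's:

```python
# Illustration only: the paper's pipeline is an R script. This Python
# sketch shows the flattening step: read per-year county-to-county flow
# files and stack them into a single long table, one row per
# origin-destination county pair per year. Filenames and column names
# below are hypothetical.
import glob
import pandas as pd

frames = []
for path in sorted(glob.glob("irs_flows_*.csv")):    # e.g. irs_flows_1990.csv
    year = int(path.split("_")[-1].split(".")[0])    # year from the filename
    df = pd.read_csv(path, dtype={"origin_fips": str, "dest_fips": str})
    df["year"] = year                                # tag each dyad-year row
    frames.append(df)

flat = pd.concat(frames, ignore_index=True)          # one flat table
flat.to_csv("irs_county_migration_flat.csv", index=False)
```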

