New features and improvements in the NEA nuclear data tool suite

2020 ◽  
Vol 239 ◽  
pp. 19003
Author(s):  
M. Fleming ◽  
I. Hill ◽  
J. Dyrda ◽  
L. Fiorito ◽  
N. Soppera ◽  
...  

The OECD Nuclear Energy Agency (NEA) has developed and maintains several products that are used in the verification and validation of nuclear data, including the Java-based Nuclear Data Information System (JANIS) and the Nuclear Data Sensitivity Tool (NDaST). These integrate other collections of the NEA, including the International Handbooks of benchmark experiments on Criticality Safety and Reactor Physics (ICSBEP and IRPhEP) and their supporting relational databases (DICE and IDAT). Recent developments of the JANIS, DICE and NDaST systems have resulted in the ability to perform uncertainty propagation utilising Legendre polynomial sensitivities, calculation of case-to-case covariances and correlations, use of spectrum weighting in perturbations, calculation of statistical results with suites of randomly sampled nuclear data files, and new command-line interfaces to automate analyses and generate XML outputs. All of the most recent major nuclear data libraries have been fully processed and incorporated, along with new visualisation features for covariances and sensitivities, an expanded set of reaction channel definitions, and new EXFOR data types defined by the NRDC. Optimisation of numerical methods has also improved performance, with a more than order-of-magnitude speed-up in the case of sensitivity-uncertainty calculations.
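For context, the first-order "sandwich rule" that underlies sensitivity-uncertainty calculations of this kind can be stated as follows (the generic textbook formulation, not a description of NDaST internals):

```latex
% S is the vector of (relative) sensitivities of a response R, e.g. k-eff,
% to the nuclear data x, and M is the relative covariance matrix of x.
\left(\frac{\Delta R}{R}\right)^{2} = S^{\mathsf{T}} M S,
\qquad
S_{i} = \frac{x_{i}}{R}\,\frac{\partial R}{\partial x_{i}}
```

The Legendre polynomial sensitivities mentioned above enter S as additional components for the moments of the angular distributions.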

Author(s):  
Masao Yamanaka

Excess reactivity and control rod worth are generally considered important reactor physics parameters for experimentally examining the neutron characteristics of criticality in a core, and for maintaining safe operation of the reactor core in terms of neutron multiplication. For excess reactivity and control rod worth at KUCA, as well as at the Fast Critical Assembly of the Japan Atomic Energy Agency, special attention is given to analyzing the uncertainty induced by nuclear data libraries on the basis of experimental criticality data in representative cores (the EE1 and E3 cores). The effect of decreasing this uncertainty on the accuracy of criticality is also discussed in this study. At KUCA, experimental results are accumulated through measurements of excess reactivity and control rod worth. To evaluate the accuracy of the experiments for benchmarks, the uncertainty originating from the modeling of the core configuration should be discussed in addition to the uncertainty induced by nuclear data, since the modeling uncertainty has the potential to exceed the eigenvalue bias attributable to nuclear data. Here, to investigate the uncertainty of criticality as a function of the neutron spectrum of the cores, it is very useful to analyze the reactivity from a large number of measurements in typical hard (EE1) and soft (E3) spectrum cores at KUCA.
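For reference, the measured quantities follow the standard definitions (not specific to KUCA):

```latex
% Reactivity from the effective multiplication factor; excess reactivity is
% rho with all control rods withdrawn, and a rod worth is the reactivity
% difference between the rod-out and rod-in states.
\rho = \frac{k_{\mathrm{eff}} - 1}{k_{\mathrm{eff}}},
\qquad
\Delta\rho_{\mathrm{rod}} = \rho_{\mathrm{rod\,out}} - \rho_{\mathrm{rod\,in}}
```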


2021 ◽  
Vol 247 ◽  
pp. 15007
Author(s):  
Liangzhi Cao ◽  
Zhuojie Sui ◽  
Bo Wang ◽  
Chenghui Wan ◽  
Zhouyu Liu

A method of Covariance-Oriented Sample Transformation (COST) was proposed in our previous work to provide converged uncertainty analysis results with a minimal sample size. The transient calculation of a nuclear reactor is a key part of reactor-physics simulation, so the accuracy and confidence of the neutron kinetics results have attracted much attention. In this paper, the Uncertainty Quantification (UQ) function of the high-fidelity neutronics code NECP-X has been developed based on our home-developed uncertainty analysis code UNICORN, building a platform for the UQ of the transient calculation. Furthermore, the well-known space-time heterogeneous neutron kinetics benchmark C5G7 and the propagation of uncertainty from the nuclear data to the key parameters of interest of the core have been investigated. To address the problem of "the curse of dimensionality" caused by the large number of input parameters, the COST method has been applied to generate multivariate normal-distribution samples in the uncertainty analysis. As a result, the time evolution of the assembly/pin normalized power and of its uncertainty after introducing an instantaneous perturbation has been obtained. From the numerical results, it can be observed that the maximum relative uncertainty for the assembly normalized power can reach about 1.65%, and the value for the pin-wise power distributions about 2.71%.
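The idea of transforming a small sample set so that it reproduces a target covariance can be sketched as follows (an illustrative covariance-matching transformation under invented names; the published COST algorithm may differ in detail):

```python
import numpy as np

def covariance_matched_samples(mean, cov, n, seed=0):
    """Draw n samples whose *sample* mean and covariance reproduce the
    targets exactly (requires n > len(mean) so the empirical covariance
    is invertible). Illustrative only, not the authors' COST code."""
    d = len(mean)
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, d))
    x -= x.mean(axis=0)                      # enforce zero sample mean
    l_emp = np.linalg.cholesky(np.cov(x, rowvar=False))
    l_tgt = np.linalg.cholesky(cov)
    # whiten with the empirical factor, re-colour with the target factor
    y = x @ np.linalg.inv(l_emp).T @ l_tgt.T
    return mean + y
```

With such a transformation the sampled covariance no longer fluctuates with sample size, which is what allows converged uncertainty estimates from a minimal number of transient calculations.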


2021 ◽  
Vol 247 ◽  
pp. 09007
Author(s):  
Isabelle Duhamel ◽  
Nicolas Leclaire ◽  
Luiz Leal ◽  
Atsushi Kimura ◽  
Shoji Nakamura

Available nuclear data for molybdenum in the evaluated libraries are not of sufficient quality for reactor physics or criticality safety applications, and information on uncertainties and covariances is either missing or leaves much to be desired. Therefore, IRSN and JAEA performed experimental measurements on molybdenum at the J-PARC (Japan Proton Accelerator Research Complex) facility in Japan. The aim was to measure the capture cross section and transmission of natural molybdenum at the ANNRI (Accurate Neutron-Nucleus Reaction measurement Instrument) in the MLF (Materials and Life Science Experimental Facility) of J-PARC. The measurements were performed on metallic natural molybdenum samples of various thicknesses. A NaI detector, placed at a flight-path length of about 28 m, was used for the capture measurements, and a Li-glass detector (flight-path length of about 28.7 m) for the transmission measurements. Following the data reduction process, the measured data are being analyzed and evaluated to produce more accurate cross sections and associated uncertainties.
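For the transmission part, the link between the measured quantity and the total cross section is the standard attenuation relation:

```latex
% T(E): background-corrected transmission through the sample;
% n: areal density of the sample (atoms/barn);
% sigma_tot(E): total cross section to be extracted in the evaluation.
T(E) = e^{-\,n\,\sigma_{\mathrm{tot}}(E)}
```

This is why samples of several thicknesses are used: thin samples constrain the large resonance peaks, while thick samples constrain the small cross-section regions between resonances.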


2018 ◽  
Vol 4 (1) ◽  
pp. 57-63
Author(s):  
Anatoliy Yuferov

The article considers the conversion of ENDF-format systems of constants to relational databases. Such a conversion can become one of the tools facilitating the development and use of factual information, techniques and algorithms in the field of nuclear data and, therefore, increasing the efficiency of the corresponding computational codes. The work briefly examines an infological model of the ENDF libraries. The possible structure of the tables of the corresponding relational database is described. The proposed database schema and the form of the tables take into account the presence of both single and multiple properties of the isotopes under consideration. Consideration is given to the differing organizational requirements of transferring constants from relational tables to programs and of visual analysis of the data in tables by a physicist-evaluator. The conversion algorithms and results are described for the ROSFOND-A and ENDF/B-VII.1 libraries. It is shown that performing calculations directly in the DBMS environment has advantages in terms of simplifying programming and eliminating the need to solve a number of data verification and validation problems. Possible approaches are indicated for operating inherited software together with nuclear data libraries in the relational format. Some terminological refinements are proposed to facilitate constructing an infological model for the ENDF format. The conversion programs and the ENDF/B-VII.1 library in relational format are available on a public site.
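A minimal sketch of what such a relational layout might look like (a hypothetical schema for illustration; the article's actual table structure is more detailed):

```python
import sqlite3

# Hypothetical minimal schema: "single" properties of a nuclide go into a
# materials table, "multiple" (energy-dependent) properties into a long
# point-data table, reflecting the single/multiple distinction noted above.
con = sqlite3.connect("endf.sqlite")
con.executescript("""
CREATE TABLE IF NOT EXISTS material (
    mat INTEGER PRIMARY KEY,  -- ENDF MAT number
    za  INTEGER NOT NULL,     -- 1000*Z + A
    awr REAL    NOT NULL      -- atomic weight ratio (single-valued property)
);
CREATE TABLE IF NOT EXISTS cross_section (
    mat    INTEGER REFERENCES material(mat),
    mt     INTEGER NOT NULL,  -- reaction identifier (ENDF MF=3 section)
    energy REAL    NOT NULL,  -- eV
    sigma  REAL    NOT NULL   -- barns
);
CREATE INDEX IF NOT EXISTS ix_xs ON cross_section (mat, mt, energy);
""")
# "Calculations directly in the DBMS environment": simple aggregations can
# be done in SQL without any host-language loop, e.g. (MAT 9228 = U-235)
rows = con.execute(
    "SELECT mt, MAX(sigma) FROM cross_section WHERE mat = ? GROUP BY mt",
    (9228,)).fetchall()
```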


2021 ◽  
Vol 247 ◽  
pp. 07003
Author(s):  
A. Sargeni ◽  
E. Ivanov

The paper presents our first results for exercise III-I-2c of the OECD-NEA UAM-LWR benchmark, which is intended to elaborate a methodology of uncertainty propagation. The considered case studies the behavior of a full PWR core in a fast (~0.1 s) rod ejection transient. According to the benchmark, the core is at a Hot Zero Power state. The authors used brute-force sampling to propagate nuclear data and thermal-hydraulic uncertainties with the 3D IRSN computational chain HEMERA, which couples the reactor physics code CRONOS and the thermal-hydraulic core code FLICA4. The nuclear data uncertainties were represented in the form of cross-section standard deviations (as percentages of the mean cross-section values) supplied by the UAM team. In addition to the original benchmark, the study includes a case with an increased power peak produced by a supplementary rod ejection, i.e. with higher reactivity. Both results are similar to what we obtained for the mini-core rod ejection: the power standard deviation, expressed as a percentage of the mean power, follows the mean power curve. We split the variance by direct calculation: in one run the cross sections are modified while the thermal-hydraulic inputs are kept constant, and in the other run the reverse. The results show that uncertainties due to nuclear data dominate over those due to thermal-hydraulics. Furthermore, the major contributors to the peak-of-power variance lie in the fast-group cross sections.
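The variance-splitting procedure described above can be mimicked with a toy model (everything here is invented for illustration; the real study runs the HEMERA chain, not a formula):

```python
import numpy as np

rng = np.random.default_rng(0)

def power_peak(xs, th):
    # stand-in for a full HEMERA (CRONOS + FLICA4) transient calculation:
    # xs = nuclear-data perturbation, th = thermal-hydraulic perturbation
    return 1.0 + 0.08 * xs + 0.02 * th + 0.01 * xs * th

n = 1000
xs = rng.normal(0.0, 1.0, n)   # sampled cross-section perturbations
th = rng.normal(0.0, 1.0, n)   # sampled thermal-hydraulic perturbations

var_total = np.var(power_peak(xs, th))
var_xs = np.var(power_peak(xs, 0.0))  # thermal-hydraulics frozen at nominal
var_th = np.var(power_peak(0.0, th))  # cross sections frozen at nominal
print(var_xs / var_total, var_th / var_total)
```

Freezing one input family at its nominal value while sampling the other isolates that family's contribution to the output variance, which is the splitting used in the paper.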


2021 ◽  
Vol 247 ◽  
pp. 06002
Author(s):  
Ben Lindley ◽  
Brendan Tollit ◽  
Peter Smith ◽  
Alan Charles ◽  
Robert Mason ◽  
...  

For liquid metal-cooled fast reactors (LMFRs), improved predictive modelling is desirable to facilitate reactor licensing and operation and move towards a best estimate plus uncertainty (BEPU) approach. A key source of uncertainty in fast reactor calculations arises from the underlying nuclear data. Addressing the propagation of such uncertainties through multiphysics calculation schemes is therefore of importance, and is being addressed through international projects such as the Sodium-cooled Fast Reactor Uncertainty Analysis in Modelling (SFR-UAM) benchmark. In this paper, a methodology for propagation of nuclear data uncertainties within WIMS is presented. Uncertainties on key reactor physics parameters are calculated for selected SFR-UAM benchmark exercises, with good agreement with previous results. A methodology for coupled neutronic-thermal-hydraulic calculations within WIMS is developed, where thermal feedback is introduced to the neutronic solution through coupling with the ARTHUR subchannel code within WIMS, and applied to steady-state analysis of the Horizon 2020 ESFR-SMART project reference core. Finally, integration of reactor physics and fuel performance calculations is demonstrated through linking of the WIMS reactor physics code to the TRAFIC fast reactor fuel performance code, through a Fortran-C-Python (FCP) interface. Given the 3D multiphysics calculation methodology, thermal-hydraulic and fuel performance uncertainties can ultimately be sampled alongside the nuclear data uncertainties. Together, these developments are therefore an important step towards enabling propagation of uncertainties through high fidelity, multiphysics SFR calculations and hence facilitate BEPU methodologies.
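The neutronic-thermal-hydraulic coupling described above is, at its core, a fixed-point iteration; a schematic sketch follows (stand-in solver functions, not the WIMS/ARTHUR API):

```python
def coupled_steady_state(solve_neutronics, solve_subchannel,
                         t_fuel0, tol=1e-6, max_iter=50):
    """Picard iteration between a neutronics solve (power from feedback
    temperatures) and a subchannel solve (temperatures from power)."""
    t_fuel = t_fuel0
    for _ in range(max_iter):
        power = solve_neutronics(t_fuel)   # flux/power with current feedback
        t_new = solve_subchannel(power)    # coolant/fuel temperatures
        if max(abs(a - b) for a, b in zip(t_new, t_fuel)) < tol:
            return power, t_new
        t_fuel = t_new
    raise RuntimeError("coupling did not converge")
```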


2001 ◽  
Vol 89 (4-5) ◽  
Author(s):  
O. Schwerer ◽  
P. Obložinský

This paper summarizes the various nuclear data types and libraries available free of charge from the IAEA Nuclear Data Section, many of which are relevant to medical applications. The databases are collected, maintained and made available within the framework of an international nuclear data centers network. Particular emphasis is given to online services via the Internet. The URL address of the IAEA Nuclear Data Services is


2021 ◽  
Vol 247 ◽  
pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at identifying the reactivity worth of materials, spectral ratios of cross sections, and reactivity coefficients have ensured that reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and they underpinned the need for experiments through the 1950s, 60s, 70s and 80s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general nuclear data libraries or, more commonly, in the local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or parameters (e.g. fission gas release, yields). Over the last three decades, the model has changed. In tandem, access to computing power and Monte Carlo codes rose dramatically. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, there was a decline in the production of local libraries, as new nuclear systems were not being built and the existing systems were considered adequately predicted. The cost-to-benefit ratio of validating new libraries relative to their improved prediction capability became less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is highly skewed towards testing against criticality benchmarks, ignoring many of the high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries. Continued increases in computing power, methodology (GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensured applicability to a wide variety of nuclear systems; it often has not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be utilized to assist in identifying available benchmarks in the reactor physics experiments handbook and in expediting their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation will be highlighted to illustrate potential improvements in the verification and validation process.


2019 ◽  
Vol 211 ◽  
pp. 07003
Author(s):  
Luca Fiorito ◽  
James Dyrda ◽  
Michael Fleming

Providing reliable estimates of the nuclear data contribution to the uncertainty of well-known integral benchmarks is fundamental to the validation and verification process for a nuclear data library. The Nuclear Energy Agency has produced and maintains the NDaST sensitivity tool, which integrates the DICE sensitivities and nuclear data covariances. This system has been used to rigorously and efficiently provide direct feedback to evaluators and streamline validation. For its future evolution and to identify high-priority development areas, NDaST is continuously compared against state-of-the-art codes that use different uncertainty propagation methodologies. In this work, NDaST was compared to the nuclear data sampling code SANDY for several ICSBEP criticality benchmarks using the JEFF-3.3 evaluated data. Despite excellent overall agreement for cross sections and fission neutron multiplicities, discrepancies due to processed covariance descriptions for angular distributions and prompt fission neutron spectra have identified areas where coordinated development of nuclear data covariance descriptions should be prioritised.
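The two propagation routes compared here can be checked against each other on a linearised toy response (all numbers invented):

```python
import numpy as np

# Deterministic "sandwich" propagation (NDaST-style) versus random sampling
# of the nuclear data (SANDY-style) for a linear response R = S . x.
rng = np.random.default_rng(1)
S = np.array([0.9, -0.3, 0.15])        # made-up sensitivity vector
C = np.diag([0.02, 0.05, 0.03]) ** 2   # made-up (diagonal) covariance
sandwich = np.sqrt(S @ C @ S)          # sqrt(S^T C S)
sampled = (rng.multivariate_normal(np.zeros(3), C, 100_000) @ S).std()
print(sandwich, sampled)               # agree to within sampling noise
```

For linear responses and Gaussian data the two routes must agree, so the discrepancies reported above point at differences in the covariance data as processed by each code rather than at the methods themselves.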


2020 ◽  
Vol 239 ◽  
pp. 19001
Author(s):  
Tim Ware ◽  
David Hanlon ◽  
Glynn Hosking ◽  
Ray Perry ◽  
Simon Richards

The JEFF-3.3 and ENDF/B-VIII.0 evaluated nuclear data libraries were released in December 2017 and February 2018 respectively. Both represent a comprehensive update to their predecessor evaluations. The ANSWERS Software Service produces the MONK® and MCBEND Monte Carlo codes, and the WIMS deterministic code, for nuclear criticality, shielding and reactor physics applications. MONK and MCBEND can utilise continuous energy nuclear data provided by the BINGO nuclear data library, while MONK and WIMS can utilise broad energy group data (172-group XMAS scheme) via the WIMS nuclear data library. To produce the BINGO library, the BINGO Pre-Processor code is used to process ENDF-6 format evaluations. This utilises the RECONR-BROADR-PURR sequence of NJOY2016 to reconstruct and Doppler broaden the free-gas neutron cross sections, together with bespoke routines to generate cumulative distributions for the S(α,β) tabulations and equi-probable bins or probability functions for the secondary angle and energy data. To produce the WIMS library, NJOY2016 is again used to reconstruct and Doppler broaden the cross sections. The THERMR module is used to process the thermal scattering data, and data for system-dependent resonance shielding of some nuclides are prepared. GROUPR is then used to produce the group-averaged data before all the data are transformed into the specific WIMS library format. The MONK validation includes analyses based on around 800 configurations for a range of fuel and moderator types. The WIMS validation includes analyses of zero-energy critical and sub-critical, commissioning, operational and post-irradiation experiments for a range of fuel and moderator types. This paper presents and discusses the results of MONK and WIMS validation benchmark calculations using the JEFF-3.3 and ENDF/B-VIII.0 based BINGO and WIMS nuclear data libraries.
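One of the bespoke processing steps mentioned above, building equi-probable bins for secondary angular data, follows a generic CDF-inversion construction that can be sketched as follows (the actual BINGO Pre-Processor routines are not public; this is only the textbook idea):

```python
import numpy as np

def equiprobable_edges(mu, pdf, nbins=32):
    """Given a tabulated angular PDF p(mu) on [-1, 1], return nbins+1 bin
    edges such that each bin carries equal probability 1/nbins."""
    # trapezoidal CDF of the tabulated PDF, normalised to 1
    cdf = np.concatenate(
        [[0.0], np.cumsum(np.diff(mu) * 0.5 * (pdf[1:] + pdf[:-1]))])
    cdf /= cdf[-1]
    # invert the CDF at equally spaced probability levels
    return np.interp(np.linspace(0.0, 1.0, nbins + 1), cdf, mu)

# sanity check: an isotropic distribution gives evenly spaced edges
edges = equiprobable_edges(np.linspace(-1.0, 1.0, 201), np.ones(201))
```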

