IDENTIFICATION OF REACTOR PHYSICS BENCHMARKS FOR NUCLEAR DATA TESTING: TOOLS AND EXAMPLES

2021 ◽  
Vol 247 ◽  
pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at identifying the reactivity worth of materials, spectral ratios of cross sections, and reactivity coefficients have ensured that reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and they underpinned the need for experiments from the 1950s through the 1980s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general-purpose nuclear data libraries or, more commonly, through local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or specific parameters (e.g. fission gas release, yields). Over the last three decades the model has changed. Access to computing power and Monte Carlo codes rose dramatically in tandem. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, the production of local libraries declined as new nuclear systems were not being built and the existing systems were considered adequately predicted; the cost-to-benefit ratio of validating new libraries relative to their improved prediction capability became less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is highly skewed towards testing against criticality benchmarks, ignoring many of the high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries. Continued increases in computing power, improved methodology (e.g. generalized perturbation theory, GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensure applicability to a wide variety of nuclear systems; in practice, it often has not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be used to identify and interrogate available benchmarks in the reactor physics experiments handbook and to expedite their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation are highlighted to illustrate potential improvements in the verification and validation process.
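As a minimal sketch of the kind of library testing the abstract advocates, the snippet below scores a set of benchmarks by their C/E ratios and deviations in units of experimental uncertainty. All benchmark names and numbers are placeholders, not results from the handbook.

```python
# Minimal sketch (hypothetical data): scoring a nuclear data library against a
# mixed set of criticality and reactor physics benchmarks via C/E statistics.
from statistics import mean, pstdev

# Each entry: benchmark name, calculated value, experimental value,
# experimental 1-sigma uncertainty. Placeholder values only.
results = [
    ("crit-benchmark-1",  1.00123,  1.0000, 0.0011),
    ("crit-benchmark-2",  0.99875,  1.0000, 0.0009),
    ("spectral-ratio-1",  0.1482,   0.1500, 0.0030),
    ("reactivity-coef-1", -2.61,   -2.55,   0.08),
]

ratios = [calc / expt for _, calc, expt, _ in results]          # C/E = 1 is perfect agreement
deviations = [(calc - expt) / unc for _, calc, expt, unc in results]

print(f"mean C/E        : {mean(ratios):.5f}")
print(f"spread of C/E   : {pstdev(ratios):.5f}")
print(f"max |C-E|/sigma : {max(abs(d) for d in deviations):.2f}")
```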

2018 ◽  
Vol 4 (1) ◽  
pp. 57-63
Author(s):  
Anatoliy Yuferov

The article considers the issues of converting ENDF-format systems of constants to relational databases. This conversion can become one of the tools facilitating the development and use of factual information, techniques and algorithms in the field of nuclear data and, therefore, increasing the efficiency of the corresponding computational codes. The work briefly examines an infological model of ENDF libraries. The possible structure of the tables of the corresponding relational database is described. The proposed database schema and the form of the tables take into account the presence of both single and multiple properties of the isotopes under consideration. Consideration is given to the differing requirements for transferring constants from relational tables to programs and for visual analysis of the tabulated data by a physicist-evaluator. The conversion algorithms and results are described for the ROSFOND-A and ENDF/B-VII.1 libraries. It is shown that performing calculations directly in the DBMS environment has advantages in simplifying programming and eliminating a number of data verification and validation problems. Possible approaches are indicated for enabling legacy software to operate with nuclear data libraries in the relational format. Some terminological refinements are proposed to facilitate constructing an infological model for the ENDF format. The conversion programs and the ENDF/B-VII.1 library in relational format are available on a public site.
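The sketch below is an illustrative, much-simplified relational layout in the spirit of the article; the actual schema used for ROSFOND-A and ENDF/B-VII.1 is the author's. Single-valued isotope properties go in one table, while multi-valued data such as pointwise cross sections go in a child table keyed by material, and a simple query shows a calculation performed directly in the DBMS.

```python
# Hypothetical schema, not the article's: demonstrates the single-valued vs.
# multi-valued property split for ENDF-style data in a relational database.
import sqlite3

conn = sqlite3.connect("endf_demo.sqlite")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE IF NOT EXISTS material (
    mat      INTEGER PRIMARY KEY,   -- ENDF MAT number
    symbol   TEXT NOT NULL,         -- e.g. 'U-235'
    za       INTEGER NOT NULL,      -- 1000*Z + A
    awr      REAL NOT NULL          -- atomic weight ratio (single-valued property)
);
CREATE TABLE IF NOT EXISTS cross_section (
    mat      INTEGER REFERENCES material(mat),
    mf       INTEGER NOT NULL,      -- ENDF file number (3 = cross sections)
    mt       INTEGER NOT NULL,      -- reaction identifier (e.g. 18 = fission)
    energy   REAL NOT NULL,         -- eV
    sigma    REAL NOT NULL          -- barns
);
""")

# A query executed directly in the DBMS environment: summarize each tabulated
# reaction's energy grid without transferring the data to an external program.
cur.execute("""
SELECT mat, mt, COUNT(*) AS points, MIN(energy), MAX(energy)
FROM cross_section
GROUP BY mat, mt
""")
print(cur.fetchall())
conn.close()
```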


2021 ◽  
Vol 247 ◽  
pp. 10005
Author(s):  
R. Ichou ◽  
B. Dechenaux

The validation of the VESTA 2.2.0 Monte Carlo depletion code has been initiated using the Spent Fuel Isotopic Composition Database (SFCOMPO). The work presented in this paper is limited to one fuel sample, the GU3 PWR-UOX sample from the ARIANE program, which has a reported burnup of 52.5 MWd·kgHM-1. The chemical analyses of the studied fuel sample were performed by two independent laboratories at the end of irradiation and cooling time. US and European evaluated nuclear data libraries, namely ENDF/B-VII.1 and JEFF-3.2, as well as the more recent ENDF/B-VIII.0 and JEFF-3.3, are used for the VESTA 2.2.0 calculations. The isotopic concentration results are compared to experimental data, and the C/E agreement is analyzed in light of the previous VESTA 2.1.5 validation results obtained using the ENDF/B-VII.0 and JEFF-3.1 nuclear data libraries.
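A small illustration of the C/E comparison described above is given below. The nuclides, concentrations, and library results are placeholder numbers, not ARIANE GU3 data or VESTA output.

```python
# Hedged illustration (placeholder numbers only): comparing calculated isotopic
# concentrations from several library runs against a measurement as C/E values.
measured = {"Cs-137": 1.23e-3, "Nd-148": 4.56e-4, "Pu-239": 5.67e-3}   # placeholder units
calculated = {
    "ENDF/B-VIII.0": {"Cs-137": 1.25e-3, "Nd-148": 4.52e-4, "Pu-239": 5.80e-3},
    "JEFF-3.3":      {"Cs-137": 1.21e-3, "Nd-148": 4.60e-4, "Pu-239": 5.61e-3},
}

for library, conc in calculated.items():
    print(library)
    for nuclide, expt in measured.items():
        ce = conc[nuclide] / expt
        print(f"  {nuclide:8s} C/E = {ce:6.3f}  ({(ce - 1) * 100:+.1f} %)")
```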


2007 ◽  
Author(s):  
S. Sato ◽  
K. Ochiai ◽  
M. Wada ◽  
M. Yamauchi ◽  
H. Iida ◽  
...  

2019 ◽  
Vol 211 ◽  
pp. 07007
Author(s):  
Henrik Sjöstrand ◽  
Georg Schnabel

Integral experiments can be used to adjust nuclear data libraries. Here a Bayesian Monte Carlo method based on assigning weights to the different random files is used. It is shown that if the experiments are inconsistent among themselves or with the nuclear data, the adjustment procedure can lead to undesirable results. Therefore, a technique to treat inconsistent data is presented. The technique is based on optimization of the marginal likelihood, which is approximated by a sample of model calculations. The sources of the inconsistencies are discussed, and the importance of considering correlations between the different experiments is emphasized. It is found that the technique can address inconsistencies in a desirable way.
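The following is a minimal sketch of the generic Bayesian Monte Carlo weighting idea, not the authors' exact implementation: each random nuclear data file k yields model predictions y_k, and its weight is proportional to the likelihood of the integral experiments given those predictions. Experimental correlations enter through the off-diagonal terms of the covariance matrix. All numbers are synthetic.

```python
# Generic BMC weighting sketch with synthetic data (assumption: diagonal
# experimental covariance here; correlated experiments would add off-diagonals).
import numpy as np

rng = np.random.default_rng(0)
n_files, n_experiments = 500, 3

y = rng.normal(1.000, 0.004, size=(n_files, n_experiments))   # predictions per random file
z = np.array([1.002, 0.998, 1.001])                           # integral measurements
cov = np.diag([0.002, 0.002, 0.002]) ** 2                     # experimental covariance
cov_inv = np.linalg.inv(cov)

# chi-square of each random file against all experiments
resid = y - z
chi2 = np.einsum("ki,ij,kj->k", resid, cov_inv, resid)

w = np.exp(-0.5 * (chi2 - chi2.min()))   # subtract minimum for numerical stability
w /= w.sum()

posterior_mean = w @ y                   # weighted (adjusted) prediction
effective_files = 1.0 / np.sum(w**2)     # collapses toward 1 when data are inconsistent
print(posterior_mean, effective_files)
```

When the experiments are mutually inconsistent, the weights concentrate on very few files (small effective sample), which is one symptom of the undesirable behaviour the paper addresses by optimizing the marginal likelihood.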


2020 ◽  
Vol 239 ◽  
pp. 19001
Author(s):  
Tim Ware ◽  
David Hanlon ◽  
Glynn Hosking ◽  
Ray Perry ◽  
Simon Richards

The JEFF-3.3 and ENDF/B-VIII.0 evaluated nuclear data libraries were released in December 2017 and February 2018 respectively. Both evaluations represent a comprehensive update to their predecessor evaluations. The ANSWERS Software Service produces the MONK® and MCBEND Monte Carlo codes, and the WIMS deterministic code for nuclear criticality, shielding and reactor physics applications. MONK and MCBEND can utilise continuous energy nuclear data provided by the BINGO nuclear data library and MONK and WIMS can utilise broad energy group data (172 group XMAS scheme) via the WIMS nuclear data library. To produce the BINGO library, the BINGO Pre-Processor code is used to process ENDF-6 format evaluations. This utilises the RECONR-BROADR-PURR sequence of NJOY2016 to reconstruct and Doppler broaden the free gas neutron cross sections together with bespoke routines to generate cumulative distributions for the S(α,β) tabulations and equi-probable bins or probability functions for the secondary angle and energy data. To produce the WIMS library, NJOY2016 is again used to reconstruct and Doppler broaden the cross sections. The THERMR module is used to process the thermal scattering data. Preparation of data for system-dependent resonance shielding of some nuclides is performed. GROUPR is then used to produce the group averaged data before all the data are transformed into the specific WIMS library format. The MONK validation includes analyses based on around 800 configurations for a range of fuel and moderator types. The WIMS validation includes analyses of zero-energy critical and sub-critical, commissioning, operational and post-irradiation experiments for a range of fuel and moderator types. This paper presents and discusses the results of MONK and WIMS validation benchmark calculations using the JEFF-3.3 and ENDF/B-VIII.0 based BINGO and WIMS nuclear data libraries.
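As a sketch of the processing route named in this abstract, the snippet below only lists the NJOY2016 module sequences for the continuous-energy (BINGO-style) and multigroup (WIMS-style) libraries and pipes a deck to an assumed `njoy` executable; the card-level input per module is deliberately omitted and should be taken from the NJOY2016 manual.

```python
# Sketch only: orchestrating the NJOY2016 module sequences mentioned in the
# abstract (RECONR -> BROADR -> PURR, and RECONR -> BROADR -> THERMR -> GROUPR).
# Assumes an 'njoy' executable on PATH and ENDF tapes already in the work directory.
import subprocess
from pathlib import Path

CE_MODULES    = ["moder", "reconr", "broadr", "purr"]               # continuous-energy route
GROUP_MODULES = ["moder", "reconr", "broadr", "thermr", "groupr"]   # 172-group WIMS route

def run_njoy(deck_text: str, workdir: Path) -> None:
    """Run NJOY2016 with the given input deck piped to stdin."""
    subprocess.run(["njoy"], input=deck_text.encode(), cwd=workdir, check=True)

# In a real processing script each module name would be followed by its input
# cards (tape units, MAT numbers, temperatures, tolerances, ...).
print("continuous-energy route :", " -> ".join(CE_MODULES))
print("multigroup route        :", " -> ".join(GROUP_MODULES))
```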


2020 ◽  
Author(s):  
A. Kahler ◽  
A. Koning ◽  
C. Jouanne ◽  
D. Rochman ◽  
J. Leppanen ◽  
...  

Author(s):  
Masao Yamanaka

Abstract: Excess reactivity and control rod worth are generally considered important reactor physics parameters for experimentally examining the neutron characteristics of criticality in a core, and for maintaining safe operation of the reactor core in terms of neutron multiplication. For excess reactivity and control rod worth at KUCA, as well as at the Fast Critical Assembly of the Japan Atomic Energy Agency, special attention is given to analyzing the uncertainty induced by nuclear data libraries, based on experimental data of criticality in representative cores (the EE1 and E3 cores). The effect of decreasing this uncertainty on the accuracy of criticality is also discussed in this study. At KUCA, experimental results have been accumulated through measurements of excess reactivity and control rod worth. To evaluate the accuracy of the experiments for benchmark purposes, the uncertainty originating from modeling of the core configuration should be discussed in addition to the uncertainty induced by nuclear data, since the modeling uncertainty can account for more of the eigenvalue bias than the nuclear data uncertainty. Here, to investigate how the uncertainty of criticality depends on the neutron spectrum of the cores, it is very useful to analyze the reactivity from a large number of measurements in the typical hard (EE1) and soft (E3) spectrum cores at KUCA.
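The abstract's argument, that modeling uncertainty can account for the eigenvalue bias as much as or more than nuclear data uncertainty, can be illustrated with a simple quadrature combination of uncertainty components. The numbers below are illustrative only, not KUCA results.

```python
# Illustrative numbers only: comparing the eigenvalue bias C-E with the
# nuclear-data-induced and core-modeling uncertainties combined in quadrature.
import math

k_calc, k_exp = 1.00230, 1.00000      # placeholder eigenvalues
sigma_nd    = 0.00120                 # nuclear-data-induced uncertainty (delta-k)
sigma_model = 0.00180                 # core-configuration modeling uncertainty
sigma_exp   = 0.00020                 # experimental uncertainty

bias = k_calc - k_exp
sigma_total = math.sqrt(sigma_nd**2 + sigma_model**2 + sigma_exp**2)

print(f"bias C-E         = {bias:.5f}")
print(f"combined 1-sigma = {sigma_total:.5f}")
print(f"bias / sigma     = {bias / sigma_total:.2f}")
# If sigma_model dominates sigma_nd, the modeling uncertainty alone can cover
# the observed bias regardless of the nuclear data library used.
```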

