Monte Carlo analysis of KRITZ-2 critical benchmarks on the reactivity temperature coefficient using ENDF/B-VII.1 and JENDL-4.0 nuclear data libraries

2016 ◽  
Vol 87 ◽  
pp. 107-118 ◽  
Author(s):  
S. El Ouahdani ◽  
H. Boukhal ◽  
L. Erradi ◽  
E. Chakir ◽  
T. El Bardouni ◽  
...  
2021 ◽  
Vol 247 ◽  
pp. 10005
Author(s):  
R. Ichou ◽  
B. Dechenaux

The validation of the VESTA 2.2.0 Monte Carlo depletion code has been initiated using the Spent Fuel Isotopic Composition Database (SFCOMPO). The work presented in this paper is limited to one fuel sample, the GU3 PWR-UOX sample from the ARIANE program, which has a reported burnup of 52.5 MWd·kgHM⁻¹. The chemical analyses of the studied fuel sample were performed by two independent laboratories at the end of the irradiation and cooling periods. US and European evaluated nuclear data libraries, namely ENDF/B-VII.1 and JEFF-3.2, but also the more recent ENDF/B-VIII.0 and JEFF-3.3, are used for the VESTA 2.2.0 calculations. The isotopic concentration results are compared to experimental data, and the C/E agreement is analyzed in the light of the previous VESTA 2.1.5 validation results obtained with the ENDF/B-VII.0 and JEFF-3.1 nuclear data libraries.
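
A minimal sketch of the kind of C/E (calculated-over-experimental) comparison described above, assuming simple per-nuclide concentration values; the nuclides and numbers are placeholders for illustration only, not the GU3 or VESTA 2.2.0 results:

```python
# Hedged sketch: tabulate C/E ratios for isotopic concentrations.
# Nuclides and values below are placeholders, not the actual GU3 data.

def c_over_e(calculated, measured):
    """Return C/E ratios and relative deviations (C/E - 1) in percent."""
    report = {}
    for nuclide, c in calculated.items():
        e = measured[nuclide]
        ratio = c / e
        report[nuclide] = (ratio, 100.0 * (ratio - 1.0))
    return report

# Placeholder concentrations (arbitrary units, illustrative only).
calc = {"U-235": 5.1, "Pu-239": 6.3, "Cs-137": 1.70}
meas = {"U-235": 5.0, "Pu-239": 6.5, "Cs-137": 1.68}

for nuc, (ratio, dev) in c_over_e(calc, meas).items():
    print(f"{nuc}: C/E = {ratio:.3f} ({dev:+.1f} %)")
```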


2007 ◽  
Author(s):  
S. Sato ◽  
K. Ochiai ◽  
M. Wada ◽  
M. Yamauchi ◽  
H. Iida ◽  
...  

2021 ◽  
Vol 247 ◽  
pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at determining the reactivity worth of materials, spectral ratios of cross sections, and reactivity coefficients have ensured that reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and they underpinned the need for experiments through the 1950s, 60s, 70s and 80s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general nuclear data libraries or, more commonly, in the local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or specific parameters (e.g. fission gas release, yields). Over the last three decades, the model has changed. Access to computing power and Monte Carlo codes rose dramatically in tandem. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, the production of local libraries declined, as new nuclear systems were not being built and the existing systems were considered adequately predicted; the cost-to-benefit ratio of validating new libraries relative to their improved prediction capability became less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is heavily skewed towards testing against criticality benchmarks, ignoring many of the high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries. Continued increases in computing power, improved methodology (e.g. generalized perturbation theory, GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensure applicability to a wide variety of nuclear systems; however, it often has not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be utilized to assist in identifying available benchmarks in the reactor physics experiments handbook and in expediting their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation will be highlighted to illustrate potential improvements in the verification and validation process.


2019 ◽  
Vol 211 ◽  
pp. 07007
Author(s):  
Henrik Sjöstrand ◽  
Georg Schnabel

Integral experiments can be used to adjust nuclear data libraries. Here, a Bayesian Monte Carlo method based on assigning weights to the different random files is used. It is shown that if the experiments are inconsistent among themselves or with the nuclear data, the adjustment procedure can lead to undesirable results. Therefore, a technique to treat inconsistent data is presented. The technique is based on the optimization of the marginal likelihood, which is approximated by a sample of model calculations. The sources of the inconsistencies are discussed, and the importance of considering correlations between the different experiments is emphasized. It is found that the technique can address inconsistencies in a desirable way.
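
A minimal sketch of the weighting idea, assuming Gaussian likelihoods: each random nuclear data file receives a weight proportional to exp(−χ²/2) against the integral experiments, and an extra-uncertainty parameter (here a single scale factor on the experimental covariance) is chosen by maximizing the sample-approximated marginal likelihood. The variable names and the simple scale-factor treatment are illustrative assumptions, not the authors' exact formulation:

```python
# Hedged sketch of Bayesian Monte Carlo weighting of random nuclear data files
# against integral experiments. The single covariance scale factor used to
# handle inconsistent data is an illustrative assumption only.
import numpy as np

def log_weights(calc, exp, cov):
    """Gaussian log-likelihood of each random file given the experiments.

    calc: (n_files, n_exp) model results, one row per random file
    exp:  (n_exp,) measured values
    cov:  (n_exp, n_exp) experimental covariance (correlations included)
    """
    d = calc - exp                                  # residuals per file
    chi2 = np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)
    _, logdet = np.linalg.slogdet(2.0 * np.pi * cov)
    return -0.5 * (chi2 + logdet)

def marginal_likelihood(calc, exp, cov, scale):
    """Sample approximation of the log marginal likelihood for a covariance scale."""
    lw = log_weights(calc, exp, cov * scale**2)
    return np.log(np.mean(np.exp(lw - lw.max()))) + lw.max()

rng = np.random.default_rng(0)
calc = rng.normal(1.00, 0.01, size=(500, 3))        # toy results for 500 random files
exp = np.array([1.002, 0.998, 1.010])               # toy experimental values
cov = np.diag([0.002, 0.002, 0.002]) ** 2           # toy uncorrelated uncertainties

# Pick the extra-uncertainty scale that maximizes the marginal likelihood,
# then form posterior (weighted) averages over the random files.
scales = np.linspace(1.0, 5.0, 41)
best = scales[np.argmax([marginal_likelihood(calc, exp, cov, s) for s in scales])]
lw = log_weights(calc, exp, cov * best**2)
w = np.exp(lw - lw.max()); w /= w.sum()
print("covariance scale:", best, "posterior means:", w @ calc)
```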


Author(s):  
Tomáš Czakoj ◽  
Evžen Losa

The three-dimensional Monte Carlo code KENO-VI of the SCALE-6.2.2 code system was applied to criticality calculations of the LR-0 reactor core. A central module placed in the center of the core was filled with graphite, lithium fluoride–beryllium fluoride (FLIBE), and lithium fluoride–sodium fluoride (FLINA) compounds. The multiplication factor was obtained for all cases using both the ENDF/B-VII.0 and ENDF/B-VII.1 nuclear data libraries. The obtained results were compared with benchmark calculations performed in MCNP6 using the ENDF/B-VII.0 library. The results of the KENO-VI calculations are found to be in good agreement with those obtained by MCNP6; the discrepancies are typically within tens of pcm, excluding the case with the FLINA filling. Sensitivities and uncertainties of the reference case with no filling were determined by the continuous-energy version of the TSUNAMI sequence of SCALE-6.2.2. The obtained uncertainty in the multiplication factor due to the uncertainties in nuclear data is about 650 pcm with ENDF/B-VII.1.
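
As an aside on the units quoted above, a discrepancy between two multiplication factors is commonly expressed as a reactivity difference in pcm (1 pcm = 10⁻⁵); a minimal sketch with placeholder k-eff values, not the KENO-VI or MCNP6 results of the paper:

```python
# Hedged sketch: express a discrepancy between two multiplication factors as a
# reactivity difference in pcm. The k-eff values are placeholders only.

def reactivity_pcm(k):
    """Reactivity rho = (k - 1) / k, expressed in pcm."""
    return (k - 1.0) / k * 1.0e5

k_keno, k_mcnp = 1.00120, 1.00095    # placeholder multiplication factors
delta = reactivity_pcm(k_keno) - reactivity_pcm(k_mcnp)
print(f"reactivity difference: {delta:.1f} pcm")
```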

