JEFF-3.3 covariance application to ICSBEP using SANDY and NDAST

2019 ◽  
Vol 211 ◽  
pp. 07003
Author(s):  
Luca Fiorito ◽  
James Dyrda ◽  
Michael Fleming

Providing reliable estimates of the nuclear data contribution to the uncertainty of well-known integral benchmarks is fundamental to the validation and verification process for a nuclear data library. The Nuclear Energy Agency has produced and maintains the NDaST sensitivity tool, which integrates the DICE sensitivities and nuclear data covariances. This system has been used to rigorously and efficiently provide direct feedback to evaluators and to streamline validation. To guide its future evolution and identify high-priority development areas, NDaST is continuously compared against state-of-the-art codes that use different uncertainty propagation methodologies. In this work, NDaST was compared to the nuclear data sampling code SANDY for several ICSBEP criticality benchmarks using the JEFF-3.3 evaluated data. Despite excellent overall agreement for cross sections and fission neutron multiplicities, discrepancies arising from the processed covariance descriptions for angular distributions and prompt fission neutron spectra identify areas where coordinated development of nuclear data covariance descriptions should be prioritised.
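As a rough illustration of the two propagation routes being cross-checked here, the sketch below (entirely hypothetical numbers, not the paper's data) contrasts first-order "sandwich" propagation, as used by NDaST, with Monte Carlo sampling of nuclear data, as performed by SANDY; for a linear response the two agree, so residual discrepancies point back to the covariance processing itself.

```python
# A minimal sketch (not the authors' code) contrasting the two propagation
# methods compared in the paper: first-order "sandwich" propagation versus
# Monte Carlo sampling of nuclear data. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(42)

ngroups = 5                                   # coarse multigroup structure (illustrative)
S = np.array([0.02, 0.05, 0.10, 0.08, 0.01])  # hypothetical k-eff sensitivities, dk/k per dx/x
M = 0.01**2 * np.eye(ngroups)                 # hypothetical relative covariance matrix

# Sandwich rule: var(k)/k^2 = S^T M S
var_sandwich = S @ M @ S

# Sampling: draw correlated relative perturbations, apply a first-order model
samples = rng.multivariate_normal(np.zeros(ngroups), M, size=10000)
dk = samples @ S                              # linear response to each sampled file
var_sampling = dk.var()

print(f"sandwich : {np.sqrt(var_sandwich):.5f}")
print(f"sampling : {np.sqrt(var_sampling):.5f}")  # agrees when the response is linear
```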

2021 ◽  
Vol 247 ◽  
pp. 15003
Author(s):  
G. Valocchi ◽  
P. Archier ◽  
J. Tommasi

In this paper, we present a sensitivity analysis of the effective delayed neutron fraction (beta effective) to nuclear data for the UM17x17 experiment performed in the EOLE reactor. This work is carried out using the APOLLO3® platform. The flux calculation uses the standard two-step (lattice/core) approach, whereas the delayed nuclear data are processed for direct use in the core calculation, bypassing the lattice step. We use the JEFF-3.1.1 nuclear data library for cross-sections and delayed data. The k-effective and beta effective calculations are validated against TRIPOLI4® results, while the main sensitivities are validated against direct calculations. Finally, uncertainty propagation is performed using the COMAC-V2.0 covariance library.
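The propagation step named in the abstract is conventionally the first-order "sandwich" rule; the generic form below uses our own notation and is not reproduced from the paper:

```latex
% First-order ("sandwich") uncertainty propagation onto the effective
% delayed neutron fraction; S is the relative sensitivity vector and M the
% relative covariance matrix (e.g. from COMAC-V2.0). Notation illustrative.
\left(\frac{\Delta\beta_{\mathrm{eff}}}{\beta_{\mathrm{eff}}}\right)^{2}
  = S^{\mathsf{T}} M\, S ,
\qquad
S_i = \frac{\sigma_i}{\beta_{\mathrm{eff}}}\,
      \frac{\partial \beta_{\mathrm{eff}}}{\partial \sigma_i} .
```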


2019 ◽  
Vol 211 ◽  
pp. 07002 ◽  
Author(s):  
Shengli Chen ◽  
David Bernard ◽  
Pascal Archier ◽  
Cyrille De Saint Jean ◽  
Gilles Noguere ◽  
...  

Correlations involving the angular distributions of neutron inelastic scattering are not included in the Joint Evaluated Fission and Fusion (JEFF) nuclear data library, even though they are key quantities for the propagation of nuclear data uncertainties. By reproducing the angle-integrated cross sections and uncertainties of JEFF-3.1.1, the present work obtains the covariance matrix of the high-energy model parameters using the least-squares method implemented in the CONRAD code. With this matrix, it is possible to generate correlations between angle-integrated cross sections and angular distributions, the latter usually represented by Legendre coefficients. As expected, strong correlations are found, for example, between the Legendre coefficients of elastic and first-level inelastic scattering and the angle-integrated total, elastic, and total inelastic cross sections.
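A schematic of the two-step procedure described here, under stated assumptions: the Jacobian and data uncertainties below are invented for illustration, and CONRAD's actual fit is far more sophisticated.

```python
# Illustrative sketch (ours, not CONRAD itself): (1) a least-squares fit of
# model parameters to evaluated cross sections and uncertainties yields a
# parameter covariance; (2) propagating it through the model Jacobian gives
# cross-correlations between angle-integrated cross sections and Legendre
# coefficients of the angular distributions.
import numpy as np

# Hypothetical Jacobian G: rows = observables, columns = model parameters.
G = np.array([
    [1.0, 0.3],   # total cross section
    [0.8, 0.1],   # elastic cross section
    [0.2, 0.9],   # first-level inelastic cross section
    [0.5, 0.4],   # elastic P1 Legendre coefficient
    [0.1, 0.7],   # inelastic P1 Legendre coefficient
])

# Parameter covariance from a generalized least-squares fit,
#   M_p = (G^T V^-1 G)^-1, with V the covariance of the fitted data.
V = np.diag([0.02, 0.02, 0.05, 0.10, 0.10]) ** 2
M_p = np.linalg.inv(G.T @ np.linalg.inv(V) @ G)

# Propagate: covariance, and hence correlations, among all observables.
M_obs = G @ M_p @ G.T
corr = M_obs / np.sqrt(np.outer(np.diag(M_obs), np.diag(M_obs)))
print(np.round(corr, 2))   # off-diagonal terms: sigma <-> Legendre correlations
```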


2018 ◽  
Vol 4 ◽  
pp. 14 ◽  
Author(s):  
James Dyrda ◽  
Ian Hill ◽  
Luca Fiorito ◽  
Oscar Cabellos ◽  
Nicolas Soppera

Uncertainty propagation to keff using a Total Monte Carlo sampling process is commonly used to address non-linear dependencies and non-Gaussian nuclear parameter distributions. We suggest that, in general, keff sensitivities to nuclear data perturbations are not problematic and remain linear over a large range; the same cannot be said definitively for nuclear data parameters and their impact on final cross-sections and distributions. Instead of running hundreds or thousands of neutronics calculations, we therefore investigate the possibility of taking those many cross-section file samples and performing ‘cheap’ sensitivity perturbation calculations. This is efficiently possible with the NEA Nuclear Data Sensitivity Tool (NDaST), and we name this process the half Monte Carlo method (HMM). We demonstrate that this is indeed possible with a test example, JEZEBEL (PMF001) drawn from the ICSBEP handbook, comparing keff values calculated directly with SERPENT to those predicted with NDaST. Furthermore, we show that one retains the normal NDaST benefits: a deeper analysis of the resultant effects in terms of reaction and energy breakdown, without the usual computational burden of Monte Carlo (results within minutes rather than days). Finally, we assess the rationality of using either full or half Monte Carlo methods by also using the covariance data to perform simple linear ‘sandwich formula’ propagation of uncertainty onto the selected benchmarks. This allows us to draw some broad conclusions about the relative merits of selecting a technique with a full, half, or zero degree of Monte Carlo simulation.
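A minimal sketch of the HMM idea follows, with stand-in numbers rather than JEZEBEL data: each sampled cross-section file is folded into pre-computed keff sensitivities instead of being run through a transport calculation.

```python
# Half Monte Carlo sketch (names and numbers hypothetical): fold each sampled
# file's relative perturbation into a pre-computed k-eff sensitivity profile.
import numpy as np

rng = np.random.default_rng(1)
ngroups, nfiles = 30, 1000

S = rng.uniform(0.0, 0.01, ngroups)          # stand-in sensitivities, dk/k per dx/x
sigma_ref = np.ones(ngroups)                 # reference cross sections (normalised)
sigma_samples = rng.normal(1.0, 0.02, (nfiles, ngroups))  # e.g. sampled random files

# First-order response per file: dk/k = sum_g S_g * (sigma_g - sigma_ref_g) / sigma_ref_g
dk_over_k = (sigma_samples - sigma_ref) / sigma_ref @ S

print(f"mean dk/k = {dk_over_k.mean():+.5f}")
print(f"std  dk/k = {dk_over_k.std():.5f}")   # nuclear-data uncertainty on k-eff
```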


2020 ◽  
Vol 239 ◽  
pp. 09001
Author(s):  
Zhigang Ge ◽  
Ruirui Xu ◽  
Haicheng Wu ◽  
Yue Zhang ◽  
Guochang Chen ◽  
...  

A new version of the Chinese Evaluated Nuclear Data Library, CENDL-3.2, has been completed through the joint efforts of the CENDL working group. It is constructed as a general-purpose library providing high-quality nuclear data for modern nuclear science and engineering. In total, CENDL-3.2 covers 272 nuclides from light to heavy, and the data for 134 nuclides are new or updated evaluations in the energy region from 10⁻⁵ eV to 20 MeV. The data for most of the key nuclides in nuclear applications, such as U, Pu, Th, and Fe, have been revised and improved, and various evaluation techniques have been developed to produce nuclear data of good quality. Moreover, model-dependent covariance data for the main reaction cross sections have been added for 70 fission product nuclides. To assess the accuracy of CENDL-3.2 in applications, the data have been tested with the criticality and shielding benchmarks collected in ENDITS-1.0.


Metrology ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 1-18
Author(s):  
Nikolay V. Kornilov ◽  
Vladimir G. Pronyaev ◽  
Steven M. Grimes

Each experiment provides new information about the value of some physical quantity. However, the uncertainties assigned to the measured values are as important a part of the results as the values themselves. The metrological guides recommend that the statistical and systematic components of uncertainty be explained, estimated, and presented separately as part of the measurement results. Experimental set-ups, and the models of experiments used to derive physical values from the primary measured quantities, are products of human activity, which makes this a rather subjective field. A Systematic Distortion Factor (SDF) may exist in any experiment: it biases the measured value away from an unknown “true” value, and it appears as a real physical effect if it is not removed by additional measurements or analysis. For a set of measured data with a best evaluated true value, differences beyond the stated uncertainties can be explained by the presence of Unrecognized Sources of Uncertainty (USU) in these data. We can link the presence of USU in the data with the presence of SDF in the results of measurements. The paper demonstrates the existence of SDF in Prompt Fission Neutron Spectra (PFNS) measurements, in measurements of fission cross sections, and in measurements of Maxwellian spectrum averaged neutron capture cross sections for astrophysical applications. The paper discusses introducing and accounting for USU in data evaluation when the SDF cannot be eliminated. As an example, the model case of the 238U(n,f)/235U(n,f) cross-section ratio evaluation is presented.
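One common recipe for accounting for USU, shown here purely as an illustration of the idea (the paper's evaluation procedure is more elaborate), is to add a common uncertainty component in quadrature to every data point until the set becomes statistically consistent:

```python
# Illustrative USU treatment with hypothetical data: inflate uncertainties
# until the reduced chi-square of the weighted mean drops to 1.
import numpy as np

y = np.array([1.02, 0.97, 1.10, 0.93])   # hypothetical measured values
u = np.array([0.02, 0.02, 0.03, 0.02])   # reported standard uncertainties

def weighted_mean(y, u):
    w = 1.0 / u**2
    mean = np.sum(w * y) / np.sum(w)
    chi2_red = np.sum(w * (y - mean) ** 2) / (len(y) - 1)
    return mean, np.sqrt(1.0 / np.sum(w)), chi2_red

mean, umean, chi2 = weighted_mean(y, u)
print(f"no USU : mean={mean:.4f} +/- {umean:.4f}, chi2/dof={chi2:.2f}")

# Scan a common USU component added in quadrature to every point.
for usu in np.linspace(0.0, 0.10, 201):
    mean, umean, chi2 = weighted_mean(y, np.sqrt(u**2 + usu**2))
    if chi2 <= 1.0:
        print(f"USU={usu:.3f}: mean={mean:.4f} +/- {umean:.4f}, chi2/dof={chi2:.2f}")
        break
```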


2020 ◽  
Vol 239 ◽  
pp. 20001
Author(s):  
M. Fleming ◽  
J-C. David ◽  
J.L. Rodríguez-Sánchez ◽  
L. Fiorito ◽  
M. Gilbert ◽  
...  

It is standard practice for nuclear data files to include tabulated data for distinct reaction channels for incident energies up to 20-30 MeV. Above these energies, the assumptions implicit in the definition of individual channels break down, and event generators are typically used within codes that simulate nuclear observables in applications. These offer robust simulation of the physics but increase the computational burden. So-called ‘high-energy’ nuclear data files have been produced, but the well-known libraries are more than a decade old and rely upon models developed many years before their release. This presentation describes a modern library with a high level of production automation, offering regular updates as the models it is based upon are improved. The most recent versions of the intra-nuclear cascade and de-excitation models available within Geant4 were used to generate tabulated data of residual nuclide production. For the first released library, the INCL++5.3 and ABLA versions within Geant4 v10.3 were used to simulate over 10¹² incident protons on 2095 target isotopes with incident energies up to 1 GeV. These were collated into tabulated data in the international-standard ENDF-6 format. The resulting files were provided as group-wise files and distributed as HEIR-0.1 with the FISPACT-II version 4.0 release. A second library, HEIR-0.2, has been generated using the new INCL++6.0 and the C++ translation of the ABLA07 model available within Geant4 v10.4. Simulations were performed using incident protons, neutrons, deuterons and π±. Improved agreement with experimental data is observed, not only between the two versions but also against the other well-known high-energy nuclear data files and models available within Geant4. This benchmark includes mass and isotopic distributions, as well as incident-energy-dependent cumulative and independent cross sections from the EXFOR database.
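The collation step, reduced to its bare bones (our sketch; the real pipeline writes ENDF-6 formatted, group-wise files and normalises to measured reaction cross sections), amounts to histogramming event-generator residuals into incident-energy groups per target:

```python
# Schematic collation of residual-production events into energy groups.
# Event records and normalisation are invented stand-ins for illustration.
import numpy as np
from collections import defaultdict

group_edges = np.logspace(np.log10(1.0), np.log10(1000.0), 21)  # MeV, illustrative

# Hypothetical event records: (incident energy [MeV], residual ZA identifier)
rng = np.random.default_rng(0)
events = [(float(rng.uniform(1.0, 1000.0)), int(rng.choice([26056, 25055, 24052])))
          for _ in range(100_000)]

counts = defaultdict(lambda: np.zeros(len(group_edges) - 1))
for energy, za in events:
    g = np.searchsorted(group_edges, energy) - 1
    counts[za][g] += 1

# Convert counts to per-group yields; a real pipeline would instead normalise
# to the number of simulated projectiles and the reaction cross section.
protons_per_group = len(events) / (len(group_edges) - 1)
for za, hist in sorted(counts.items()):
    print(za, np.round(hist[:5] / protons_per_group, 3))
```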


2020 ◽  
Vol 239 ◽  
pp. 19003
Author(s):  
M. Fleming ◽  
I. Hill ◽  
J. Dyrda ◽  
L. Fiorito ◽  
N. Soppera ◽  
...  

The OECD Nuclear Energy Agency (NEA) has developed and maintains several products that are used in the verification and validation of nuclear data, including the Java-based Nuclear Data Information System (JANIS) and the Nuclear Data Sensitivity Tool (NDaST). These integrate other collections of the NEA, including the International Handbooks of benchmark experiments on Criticality Safety and Reactor Physics (ICSBEP and IRPhEP) and their supporting relational databases (DICE and IDAT). Recent development of the JANIS, DICE and NDaST systems has resulted in the ability to perform uncertainty propagation utilising Legendre polynomial sensitivities, calculation of case-to-case covariances and correlations, use of spectrum weighting in perturbations, calculation of statistical results with suites of randomly sampled nuclear data files, and new command-line interfaces to automate analyses and generate XML outputs. All of the most recent major nuclear data libraries have been fully processed and incorporated, along with new visualisation features for covariances and sensitivities, an expanded set of reaction channel definitions, and new EXFOR data types defined by the NRDC. Optimisation of numerical methods has also improved performance, with an over order-of-magnitude speed-up in the case of sensitivity-uncertainty calculations.
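For the case-to-case covariances mentioned above, the underlying quantity has a compact sandwich form; the sketch below uses invented sensitivities and covariances, not NDaST internals:

```python
# Case-to-case covariance/correlation in the sandwich framework: benchmarks
# i and j with sensitivity vectors S_i, S_j share nuclear data with
# covariance M, giving cov(k_i, k_j) = S_i^T M S_j. Numbers are hypothetical.
import numpy as np

M = 0.02**2 * np.eye(4)                    # hypothetical relative covariance
S_i = np.array([0.10, 0.05, 0.00, 0.02])   # hypothetical sensitivities, case i
S_j = np.array([0.08, 0.01, 0.04, 0.02])   # hypothetical sensitivities, case j

cov_ij = S_i @ M @ S_j
corr_ij = cov_ij / np.sqrt((S_i @ M @ S_i) * (S_j @ M @ S_j))
print(f"case-to-case correlation: {corr_ij:.3f}")
```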


2020 ◽  
Vol 29 (08) ◽  
pp. 2050052
Author(s):  
Dashty T. Akrawy ◽  
Ali H. Ahmed ◽  
E. Tel ◽  
A. Aydin ◽  
L. Sihver

An empirical formula for calculating neutron-induced reaction cross-sections at 14.5 MeV for 183 target nuclei is presented. Evaluated cross-section data from the TENDL nuclear data library were used to test and benchmark the formula. In this new formula, the nonelastic cross-section term is replaced by the atomic number Z, while the asymmetry-parameter-dependent exponential term has been retained. The calculated results are compared with seven previously published formulae, and we show that the new formula agrees significantly better with the measured values.
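The structure of such a formula can be illustrated as follows; the coefficients a, b, c are placeholders, not the fitted values from the paper, and the asymmetry parameter takes its standard definition s = (N - Z)/A:

```python
# Illustrative functional form: a power of the atomic number Z in place of
# the nonelastic cross-section term, with the exponential in the asymmetry
# parameter retained. Coefficients are hypothetical.
import math

def empirical_xs(Z: int, A: int, a: float = 1.0, b: float = 1.0, c: float = 30.0) -> float:
    """Cross section in arbitrary units: sigma = a * Z**b * exp(-c * s)."""
    s = (A - 2 * Z) / A          # asymmetry parameter (N - Z)/A with N = A - Z
    return a * Z**b * math.exp(-c * s)

# Example: compare the trend across a few targets (units arbitrary here).
for Z, A in [(26, 56), (42, 98), (82, 208)]:
    print(Z, A, f"{empirical_xs(Z, A):.3f}")
```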


2020 ◽  
Vol 239 ◽  
pp. 14006
Author(s):  
Tim Ware ◽  
David Hanlon ◽  
Tara Hanlon ◽  
Richard Hiles ◽  
Malcolm Lingard ◽  
...  

Until recently, criticality safety assessment codes had a minimum temperature at which calculations could be performed. Where criticality assessment has been required for lower temperatures, indirect methods, including reasoned argument or extrapolation, have been needed to assess the associated reactivity changes. The ANSWERS Software Service MONK® version 10B Monte Carlo criticality code is capable of performing criticality calculations at any temperature within the limits of the underlying nuclear data in the BINGO continuous-energy library. The temperature range of the nuclear data has been extended below the traditional lower limit of 293.6 K down to 193 K in a prototype BINGO library, primarily based on JEFF-3.1.2 data. The temperature range of the thermal bound scattering data for the key moderator materials was extended by reprocessing the NJOY LEAPR inputs used to produce bound data for JEFF-3.1.2 and ENDF/B-VIII.0. To give confidence in the low-temperature nuclear data, a series of MONK and MCBEND calculations has been performed and the results compared against external data sources. MCBEND is a Monte Carlo code for shielding and dosimetry that shares commonalities with its sister code MONK, including the BINGO nuclear data library. Good agreement has been achieved between calculated and experimental cross sections for ice, between k-effective results for low-temperature criticality benchmarks, and between calculated and experimentally determined eigenvalues for thermal neutron diffusion in ice. To quantify the differences between ice and water bound scattering data, a number of MONK criticality calculations were performed for nuclear fuel transport flask configurations. The results demonstrate good agreement with extrapolation methods, with a discernible difference between the use of ice and water data.

