Manufacturing Data Uncertainties Propagation Method in Burn-Up Problems

2017, Vol. 2017, pp. 1-10
Author(s): Thomas Frosio, Thomas Bonaccorsi, Patrick Blaise

A nuclear-data-based uncertainty propagation methodology is extended to enable the propagation of manufacturing/technological data (TD) uncertainties in a burn-up calculation problem, taking into account correlation terms between the Boltzmann and Bateman problems. The methodology is applied to reactivity and power distributions in a Material Testing Reactor benchmark. Owing to the inherent statistical behavior of manufacturing tolerances, a Monte Carlo sampling method is used to determine output perturbations on integral quantities. A global sensitivity analysis (GSA) is performed for each manufacturing parameter, allowing the influential parameters whose tolerances need to be better controlled to be identified and ranked. We show that the overall impact of some TD uncertainties, such as uranium enrichment or fuel plate thickness, on the reactivity is negligible because the different core areas induce compensating effects on this global quantity. Local quantities such as power distributions, however, are strongly affected by the propagated TD uncertainties. For isotopic concentrations, no clear trends appear in the results.
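
The workflow described above can be illustrated with a minimal sketch: Gaussian sampling of a few manufacturing parameters around their tolerances, propagation through a placeholder core model, and a crude variance-based ranking. The parameter names, tolerances and the linear stand-in model below are hypothetical, not the paper's actual Boltzmann/Bateman chain or its GSA implementation.

    # Minimal sketch: Monte Carlo propagation of manufacturing-tolerance
    # uncertainties plus a simple sensitivity ranking.  The "core model" is a
    # hypothetical placeholder for a coupled Boltzmann/Bateman burn-up solver.
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical manufacturing parameters: (nominal value, 1-sigma tolerance).
    params = {
        "u235_enrichment":  (0.1995, 0.0005),   # weight fraction
        "fuel_plate_thick": (0.0510, 0.0002),   # cm
        "clad_thickness":   (0.0380, 0.0001),   # cm
    }

    def core_model(x):
        """Placeholder for the burn-up calculation; returns one integral
        quantity (e.g. a reactivity effect in pcm)."""
        e, t, c = x
        return 1.0e5 * (0.9 * (e - 0.1995) - 0.4 * (t - 0.0510) + 0.1 * (c - 0.0380))

    names = list(params)
    mu  = np.array([params[k][0] for k in names])
    sig = np.array([params[k][1] for k in names])

    # Sample the tolerances (assumed Gaussian) and propagate through the model.
    n_samples = 1000
    X = rng.normal(mu, sig, size=(n_samples, len(names)))
    y = np.array([core_model(x) for x in X])
    print(f"output mean = {y.mean():.1f}, std = {y.std(ddof=1):.1f}")

    # Crude ranking via squared correlation coefficients, standing in for the
    # full global sensitivity analysis used in the paper.
    r2 = [np.corrcoef(X[:, j], y)[0, 1] ** 2 for j in range(len(names))]
    for name, s in sorted(zip(names, r2), key=lambda p: -p[1]):
        print(f"{name:18s} explains ~{100 * s:.0f}% of the output variance")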

2018, Vol. 4, pp. 14
Author(s): James Dyrda, Ian Hill, Luca Fiorito, Oscar Cabellos, Nicolas Soppera

Uncertainty propagation to keff using a Total Monte Carlo sampling process is commonly used to address the issues associated with non-linear dependencies and non-Gaussian nuclear parameter distributions. We suggest that, in general, keff sensitivities to nuclear data perturbations are not problematic and remain linear over a large range; the same cannot be said definitively for nuclear data parameters and their impact on final cross sections and distributions. Instead of running hundreds or thousands of neutronics calculations, we therefore investigate the possibility of taking those many cross-section file samples and performing 'cheap' sensitivity perturbation calculations. This can be done efficiently with the NEA Nuclear Data Sensitivity Tool (NDaST), and we name this process the half Monte Carlo method (HMM). We demonstrate that this is indeed possible with a test example, JEZEBEL (PMF001), drawn from the ICSBEP handbook, comparing keff values calculated directly with SERPENT to those predicted with NDaST. Furthermore, we show that one may retain the normal NDaST benefits: a deeper analysis of the resulting effects in terms of reaction and energy breakdown, without the usual computational burden of Monte Carlo (results within minutes rather than days). Finally, we assess the rationale for using either the full or the half Monte Carlo method by also using the covariance data to perform simple linear 'sandwich formula' propagation of uncertainty onto the selected benchmarks. This allows us to draw some broad conclusions about the relative merits of selecting a technique with a full, half or zero degree of Monte Carlo simulation.
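
The contrast between the half Monte Carlo idea and the sandwich formula can be sketched as follows: given a fixed first-order keff sensitivity vector, sampled cross-section perturbations are folded with it linearly instead of re-running transport, and the result is compared against the deterministic covariance propagation. The sensitivity vector and covariance matrix below are assumed, illustrative values, not NDaST or SERPENT data.

    # Illustrative "half Monte Carlo" sketch: combine randomly sampled
    # cross-section perturbations with a fixed first-order sensitivity vector,
    # instead of running one transport calculation per sample.
    import numpy as np

    rng = np.random.default_rng(0)
    n_groups, n_samples = 5, 500

    # Assumed k-eff sensitivity vector S_g = (dk/k) / (dsigma_g/sigma_g).
    S = np.array([0.02, 0.10, 0.25, 0.15, 0.05])

    # Assumed relative covariance matrix of the cross-section groups.
    std  = np.array([0.05, 0.03, 0.02, 0.02, 0.04])
    corr = 0.5 * np.ones((n_groups, n_groups)) + 0.5 * np.eye(n_groups)
    cov  = np.outer(std, std) * corr

    # Half Monte Carlo: sample relative perturbations and fold them with S.
    d_sigma   = rng.multivariate_normal(np.zeros(n_groups), cov, size=n_samples)
    dk_over_k = d_sigma @ S                        # first-order estimate per sample
    print(f"HMM      : dk/k std = {dk_over_k.std(ddof=1):.5f}")

    # Deterministic sandwich formula for comparison: var(k) = S^T C S.
    print(f"Sandwich : dk/k std = {np.sqrt(S @ cov @ S):.5f}")

For linear responses the two estimates agree within sampling noise; discrepancies between them flag the non-linear or non-Gaussian behaviour that motivates keeping some Monte Carlo in the chain.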


2011, Vol. 59 (2(3)), pp. 1191-1194
Author(s): D. Rochman, A. J. Koning, D. F. Dacruz, S. C. van der Marck

2019, Vol. 211, pp. 03003
Author(s): Vincent Lamirand, Axel Laureau, Dimitri Rochman, Gregory Perret, Adrien Gruel, ...

The PETALE experimental programme in the CROCUS reactor at EPFL aims to contribute to the validation and improvement of neutron nuclear data in the MeV energy range for stainless steel, particularly in view of the heavy reflector elements of pressurized water reactors. It mainly consists of several transmission experiments: first through metallic sheets of nuclear-grade stainless steel interleaved with dosimeter foils, and then, successively, through its elemental components of interest: iron, nickel, and chromium. The present article describes the study carried out to optimize the response of the dosimetry experiments to the nuclear data of interest.
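
A toy narrow-beam attenuation model gives a feel for why dosimeter position matters in such transmission experiments: the deeper a foil sits behind the sheets, the more sensitive its response is to the total cross section. The cross section, sheet thickness and the exponential model below are illustrative assumptions, not PETALE design values.

    # Toy narrow-beam transmission model (not the PETALE design calculation).
    import numpy as np

    sigma_t   = 0.25      # assumed macroscopic total cross section of steel, 1/cm
    thickness = 2.0       # assumed thickness of one sheet, cm
    n_sheets  = np.arange(1, 7)

    # Transmitted fraction after n sheets (simple exponential attenuation).
    T = np.exp(-sigma_t * thickness * n_sheets)

    # First-order relative sensitivity of the transmission to sigma_t:
    # (dT/T) / (dsigma/sigma) = -sigma_t * x, so deeper positions respond more
    # strongly to a cross-section perturbation.
    sensitivity = -sigma_t * thickness * n_sheets

    for n, t, s in zip(n_sheets, T, sensitivity):
        print(f"after {n} sheets: transmission = {t:.3f}, rel. sensitivity = {s:+.2f}")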


2012, Vol. 15 (1), pp. 55-70
Author(s): V. Moya Quiroga, I. Popescu, D. P. Solomatine, L. Bociort

There is an increased awareness of the importance of flood management aimed at preventing human and material losses. A wide variety of numerical modelling tools have been developed in order to make decision-making more efficient and to better target management actions. Hydroinformatics takes a holistic, integrated approach to managing the information propagating through models, and the analysis of uncertainty propagation through models is an important part of such studies. Many popular approaches to uncertainty analysis involve some strategy of Monte Carlo sampling of the uncertain variables and/or parameters and running a model a large number of times, so that for complex river systems the procedure becomes very time-consuming. In this study the popular modelling systems HEC-HMS, HEC-RAS and Sobek1D2D were applied to modelling the hydraulics of the Timis–Bega basin in Romania. We considered the problem of how flood inundation is influenced by uncertainties in the water levels of the reservoirs in the catchment and by uncertainties in the digital elevation model (DEM) used in the 2D hydraulic model. For this we used cloud computing (the Amazon Elastic Compute Cloud platform) and cluster computing built from a number of office desktop computers, and demonstrated their efficiency, achieving a considerable reduction in the computer time required for the uncertainty analysis of complex models. The experiments allowed us to associate probabilities with the various areas prone to flooding. This study leads us to conclude that cloud and cluster computing offer an effective and efficient technology that makes uncertainty-aware modelling a practical possibility, even when complex models are used.
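
The overall pattern, sampling the uncertain inputs, running the hydraulic model once per sample in parallel, and reducing the ensemble to a per-cell flooding probability, can be sketched as below. The dummy inundation model, synthetic terrain and distribution parameters are purely illustrative stand-ins for the HEC-HMS/HEC-RAS/Sobek1D2D runs and the cloud/cluster setup used in the study.

    # Minimal sketch of uncertainty-aware flood mapping: sample uncertain
    # reservoir levels and DEM errors, run a dummy inundation model per sample
    # in parallel, and derive a per-cell flooding probability.
    import numpy as np
    from multiprocessing import Pool

    rng = np.random.default_rng(1)
    NX = NY = 50
    dem = rng.random((NY, NX)).cumsum(axis=0) * 0.1        # synthetic terrain, m

    def inundation_run(args):
        """Dummy hydraulic model: a cell floods if the sampled water level
        exceeds its perturbed elevation."""
        level, dem_error = args
        return (dem + dem_error) < level                   # boolean flood mask

    n_samples = 200
    samples = [(rng.normal(2.0, 0.3),                      # reservoir level, m
                rng.normal(0.0, 0.15, size=dem.shape))     # DEM error, m
               for _ in range(n_samples)]

    if __name__ == "__main__":
        with Pool() as pool:                               # local "cluster"
            masks = pool.map(inundation_run, samples)
        flood_probability = np.mean(masks, axis=0)         # per-cell probability
        print("cells with P(flood) > 0.5:", int((flood_probability > 0.5).sum()))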


2014, Vol. 118, pp. 535-537
Author(s): J.J. Herrero, R. Ochoa, J.S. Martínez, C.J. Díez, N. García-Herranz, ...

2020, Vol. 239, pp. 19003
Author(s): M. Fleming, I. Hill, J. Dyrda, L. Fiorito, N. Soppera, ...

The OECD Nuclear Energy Agency (NEA) has developed and maintains several products used in the verification and validation of nuclear data, including the Java-based Nuclear Data Information System (JANIS) and the Nuclear Data Sensitivity Tool (NDaST). These integrate other NEA collections, including the International Handbooks of benchmark experiments on Criticality Safety and Reactor Physics (ICSBEP and IRPhEP) and their supporting relational databases (DICE and IDAT). Recent developments of the JANIS, DICE and NDaST systems have added the ability to perform uncertainty propagation utilising Legendre polynomial sensitivities, to calculate case-to-case covariances and correlations, to use spectrum weighting in perturbations, to compute statistical results with suites of randomly sampled nuclear data files, and to drive analyses through new command-line interfaces that generate XML outputs. All of the most recent major nuclear data libraries have been fully processed and incorporated, along with new visualisation features for covariances and sensitivities, an expanded set of reaction channel definitions, and new EXFOR data types defined by the NRDC. Optimisation of the numerical methods has also improved performance, with an over order-of-magnitude speed-up in sensitivity-uncertainty calculations.
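
Two of the operations named above, sandwich-formula uncertainty propagation and case-to-case correlation between benchmarks sharing a covariance matrix, reduce to short linear-algebra expressions. The sensitivity vectors and covariance matrix below are small hypothetical arrays, not DICE sensitivities or NDaST output.

    # Sketch: sandwich-formula propagation and a case-to-case correlation
    # coefficient (often called c_k) between two benchmark cases.
    import numpy as np

    # Assumed group-wise k-eff sensitivities of two benchmark cases.
    S_a = np.array([0.05, 0.20, 0.10, 0.02])
    S_b = np.array([0.04, 0.15, 0.18, 0.01])

    # Assumed relative covariance matrix of the underlying cross sections.
    std  = np.array([0.04, 0.02, 0.03, 0.05])
    corr = np.array([[1.0, 0.6, 0.2, 0.0],
                     [0.6, 1.0, 0.4, 0.1],
                     [0.2, 0.4, 1.0, 0.3],
                     [0.0, 0.1, 0.3, 1.0]])
    C = np.outer(std, std) * corr

    unc_a = np.sqrt(S_a @ C @ S_a)              # propagated uncertainty, case A
    unc_b = np.sqrt(S_b @ C @ S_b)              # propagated uncertainty, case B
    c_k   = (S_a @ C @ S_b) / (unc_a * unc_b)   # case-to-case correlation

    print(f"unc(A) = {unc_a:.4f}, unc(B) = {unc_b:.4f}, c_k = {c_k:.3f}")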

