Uncertainty analysis of the prototype FBR Monju with the JENDL-4.0 nuclear data set

2013 ◽  
Vol 51 ◽  
pp. 257-273 ◽  
Author(s):  
P. Tamagno ◽  
W.F.G. van Rooijen

2002 ◽  
Vol 39 (sup2) ◽  
pp. 998-1001 ◽  
Author(s):  
Svetlana Zabrodskaia ◽  
Tatiana Ivanova ◽  
Vladimir Koshcheev ◽  
Gennadi Manturov ◽  
Mark Nikolaev ◽  
...  

2021 ◽  
Vol 19 (10) ◽  
pp. 01-07
Author(s):  
M.H. Asmaa ◽  
Sami A. Habana

Electron density and temperature of laser-induced iron plasma, among other parameters, were estimated. The plasma was produced through the interaction of a high peak-power Nd:YAG laser at the fundamental wavelength of 1064 nm with a pellet target containing a small quantity of lipstick from local markets. Lines from Fe II at 238.502 nm, 254.904 nm, 262.370 nm, and 286.545 nm, and from Fe I at 349.779 nm, were used to evaluate the plasma parameters. The present investigation was carried out to estimate the electron temperature (Te), electron density (ne), plasma frequency, Debye length, and Debye number (ND). The laser-induced breakdown spectroscopy (LIBS) technique was used to examine and determine the spectral emission lines. Identification of the transition lines in all spectra was carried out by comparing the observed spectral lines with the NIST atomic database.
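The plasma parameters named above follow from standard textbook relations. A minimal sketch in SI units; the example Te and ne values below are assumptions for illustration, not values from the paper:

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
M_E = 9.1093837015e-31       # electron mass, kg

def debye_length(te_ev, ne_m3):
    """Debye length (m) for electron temperature Te in eV, density ne in m^-3."""
    return math.sqrt(EPS0 * te_ev * E_CHARGE / (ne_m3 * E_CHARGE**2))

def plasma_frequency(ne_m3):
    """Electron plasma (angular) frequency, rad/s."""
    return math.sqrt(ne_m3 * E_CHARGE**2 / (EPS0 * M_E))

def debye_number(te_ev, ne_m3):
    """Electrons in a Debye sphere: N_D = (4/3) * pi * lambda_D**3 * ne."""
    return (4.0 / 3.0) * math.pi * debye_length(te_ev, ne_m3)**3 * ne_m3
```

For, say, Te = 1 eV and ne = 1e23 m^-3 (typical orders of magnitude in LIBS plasmas), the Debye length comes out at a few tens of nanometres.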


2021 ◽  
pp. 1-22
Author(s):  
Xu Guo ◽  
Zongliang Du ◽  
Chang Liu ◽  
Shan Tang

Abstract In the present paper, a new uncertainty analysis-based framework for data-driven computational mechanics (DDCM) is established. Compared with its classical counterpart, the distinctive feature of this framework is that uncertainty analysis is introduced explicitly into the problem formulation. Instead of focusing on a single solution in phase space, a solution set is sought in order to account for the influence of the multi-source uncertainties associated with the data set on the data-driven solutions. An illustrative example shows that the proposed framework is not only conceptually new, but also has the potential to circumvent the intrinsic numerical difficulties of the classical DDCM framework.
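The classical, distance-minimizing DDCM idea that this framework generalizes can be sketched for the simplest possible case, a single statically determinate bar, where equilibrium fixes the stress and the material response is taken from the nearest point of a measured data set. All numbers below are illustrative, not from the paper:

```python
# Single-bar illustration of classical (distance-minimizing) data-driven
# computational mechanics: equilibrium fixes sigma = F/A, and the material
# state is the data point closest to that stress. The uncertainty-aware
# framework of the paper would return a *set* of admissible states instead.

def ddcm_one_bar(data, force, area):
    """Return the (strain, stress) data point whose stress is closest to
    the equilibrium stress sigma = F/A of a statically determinate bar."""
    sigma_eq = force / area
    return min(data, key=lambda p: abs(p[1] - sigma_eq))

# Synthetic data set sampled from a linear material, E = 200 GPa (toy values)
E = 200e9
data = [(i * 1e-4, E * i * 1e-4) for i in range(21)]

strain, stress = ddcm_one_bar(data, force=2.0e5, area=1.0e-3)  # sigma = 200 MPa
```

With noisy or sparse data the nearest point is no longer unique or trustworthy, which is precisely the situation the solution-set formulation is meant to address.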


The Auk ◽  
2007 ◽  
Vol 124 (1) ◽  
pp. 71-84 ◽  
Author(s):  
W. Andrew Cox ◽  
Rebecca T. Kimball ◽  
Edward L. Braun

Abstract The evolutionary relationship between the New World quail (Odontophoridae) and other groups of Galliformes has been an area of debate. In particular, the relationship between the New World quail and guineafowl (Numidinae) has been difficult to resolve. We analyzed >8 kb of DNA sequence data from 16 taxa that represent all major lineages of Galliformes to resolve the phylogenetic position of New World quail. A combined data set of eight nuclear loci and three mitochondrial regions analyzed with maximum parsimony, maximum likelihood, and Bayesian methods provides congruent and strong support for New World quail being basal members of a phasianid clade that excludes guineafowl. By contrast, the three mitochondrial regions exhibit modest incongruence with each other. This is reflected in the combined mitochondrial analyses that weakly support the Sibley-Ahlquist topology, which placed the New World quail basal in relation to guineafowl and led to the placement of New World quail in its own family, sister to the Phasianidae. However, simulation-based topology tests using the mitochondrial data were unable to reject the topology suggested by our combined (mitochondrial and nuclear) data set. By contrast, similar tests using our most likely topology and our combined nuclear and mitochondrial data allow us to strongly reject the Sibley-Ahlquist topology and a topology based on morphological data that unites Old and New World quail.
Posición Filogenética de las Codornices del Nuevo Mundo (Odontophoridae): Ocho Loci Nucleares y Tres Regiones Mitocondriales Contradicen la Morfología y la Filogenia de Sibley y Ahlquist


2019 ◽  
Vol 129 ◽  
pp. 308-315
Author(s):  
Abdulaziz Ahmed ◽  
H. Boukhal ◽  
T. El Bardouni ◽  
M. Makhloul ◽  
E. Chakir ◽  
...  

Author(s):  
Leonid Gutkin ◽  
Suresh Datla ◽  
Christopher Manu

Canadian Nuclear Standard CSA N285.8, “Technical requirements for in-service evaluation of zirconium alloy pressure tubes in CANDU® reactors”(1), permits the use of probabilistic methods when assessments of the reactor core are performed. A non-mandatory annex has been proposed for inclusion in the CSA Standard N285.8 to provide guidelines for performing uncertainty analysis in probabilistic fitness-for-service evaluations within the scope of this Standard, such as the probabilistic evaluation of leak-before-break. The proposed annex outlines the general approach to uncertainty analysis as being comprised of the following major activities: identification of influential variables, characterization of uncertainties in influential variables, and subsequent propagation of these uncertainties through the evaluation framework or code. The proposed methodology distinguishes between two types of non-deterministic variables by the method used to obtain their best estimate. Uncertainties are classified by their source, and different uncertainty components are considered when the best estimates for the variables of interest are obtained using calibrated parametric models or analyses and when these estimates are obtained using statistical models or analyses. The application of the proposed guidelines for uncertainty analysis was exercised by performing a pilot study for one of the evaluations within the scope of the CSA Standard N285.8, the probabilistic evaluation of leak-before-break based on a postulated through-wall crack. The pilot study was performed for a representative CANDU reactor unit using the recently developed software code P-LBB that complies with the requirements of Canadian Nuclear Standard CSA N286.7 for quality assurance of analytical, scientific, and design computer programs for nuclear power plants. 
This paper discusses the approaches used and the results obtained in the second stage of this pilot study, the uncertainty characterization of influential variables identified as discussed in the companion paper presented at the PVP 2018 Conference (PVP2018-85010). In the proposed methodology, statistical assessment and expert judgment are recognized as two complementary approaches to uncertainty characterization. In this pilot study, the uncertainty characterization was limited to cases where statistical assessment could be used as the primary approach. Parametric uncertainty and uncertainty due to numerical solutions were considered as the uncertainty components for variables represented by parametric models. Residual uncertainty and uncertainty due to imbalances in the model-basis data set were considered as the uncertainty components for variables represented by statistical models. In general, the uncertainty due to numerical solutions was found to be substantially smaller than the parametric uncertainty for variables represented by parametric models, and the uncertainty due to imbalances in the model basis data set was found to be substantially smaller than the residual uncertainty for variables represented by statistical models.
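The propagation step described above, pushing characterized input uncertainties through an evaluation code by sampling, can be sketched generically. The model, distributions, and numbers below are hypothetical stand-ins for the P-LBB evaluation framework, chosen only to show the mechanics:

```python
import random
import statistics

# Illustrative Monte Carlo propagation of parametric uncertainty through a
# simple evaluation model. The response function and input distributions
# are assumptions for this sketch, not quantities from the pilot study.

def model(wall_thickness, crack_length):
    # Hypothetical response: a margin that shrinks with crack length
    return wall_thickness - 0.1 * crack_length

random.seed(42)
samples = []
for _ in range(10_000):
    t = random.gauss(4.2, 0.1)    # wall thickness, mm: best estimate +/- uncertainty
    a = random.gauss(20.0, 2.0)   # crack length, mm: best estimate +/- uncertainty
    samples.append(model(t, a))

mean_margin = statistics.fmean(samples)   # propagated best estimate
sd_margin = statistics.stdev(samples)     # propagated uncertainty
```

For a linear model like this one, the propagated standard deviation agrees with the analytic combination sqrt(0.1**2 + (0.1 * 2.0)**2) ≈ 0.22, a useful sanity check on the sampling.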


2021 ◽  
Vol 247 ◽  
pp. 09026
Author(s):  
A.G. Nelson ◽  
K.M. Ramey ◽  
F. Heidet

The nuclear data evaluation process inherently yields a nuclear data set designed to produce accurate results for the neutron energy spectra corresponding to a specific benchmark suite of experiments. When studying reactors with spectral conditions outside of, or not well represented by, the experimental database used to evaluate the nuclear data, care should be given to the relevance of the nuclear data used. In such cases, larger biases or uncertainties may be present than in a reactor with well-represented spectra. The motivation of this work is to understand the magnitude of differences between recent nuclear data libraries to provide estimates for expected variability in criticality and power distribution results for sodium-cooled, steel-reflected, metal-fueled fast reactor designs. This work was specifically performed by creating a 3D OpenMC model of a sodium-cooled, steel-reflected, metal-fueled fast reactor similar to the FASTER design but without a thermal test region. This OpenMC model was used to compare the differences in eigenvalues, reactivity coefficients, and the spatial and energetic effects on flux and power distributions between the ENDF/B-VII.0, ENDF/B-VII.1, ENDF/B-VIII.0, JEFF-3.2, and JEFF-3.3 nuclear data libraries. These investigations have revealed that reactivity differences between the above libraries can vary by nearly 900 pcm and the fine-group fluxes can vary by up to 18% in individual groups. Results also show a strong variation in the flux and power distributions near the fuel/reflector interface due to the high variability in the 56Fe cross sections in the libraries examined. This indicates that core design efforts of a sodium-cooled, steel-reflected, metal-fueled reactor will require the application of relatively large nuclear data uncertainties and/or the development of a representative benchmark-quality experiment.
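Reactivity differences of the kind quoted above are conventionally computed from the eigenvalues via rho = (k - 1)/k, expressed in pcm (1 pcm = 1e-5). A small sketch with illustrative k-eff values, not the paper's results:

```python
def reactivity_pcm(k):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1 pcm = 1e-5)."""
    return (k - 1.0) / k * 1.0e5

def reactivity_difference_pcm(k_a, k_b):
    """Reactivity difference between two eigenvalues, in pcm."""
    return reactivity_pcm(k_a) - reactivity_pcm(k_b)

# Example: two libraries predicting k-eff of 1.00000 vs 0.99100
# gives a spread of roughly 900 pcm, the order reported above.
delta = reactivity_difference_pcm(1.00000, 0.99100)
```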


2021 ◽  
Vol 247 ◽  
pp. 15007
Author(s):  
Liangzhi Cao ◽  
Zhuojie Sui ◽  
Bo Wang ◽  
Chenghui Wan ◽  
Zhouyu Liu

A method of Covariance-Oriented Sample Transformation (COST) has been proposed in our previous work to provide converged uncertainty analysis results with a minimal sample size. The transient calculation of a nuclear reactor is a key part of reactor-physics simulation, so the accuracy and confidence of the neutron kinetics results have attracted much attention. In this paper, the Uncertainty Quantification (UQ) function of the high-fidelity neutronics code NECP-X has been developed based on our home-developed uncertainty analysis code UNICORN, building a platform for UQ of the transient calculation. Furthermore, the well-known space-time heterogeneous neutron kinetics benchmark C5G7 and its uncertainty propagation from the nuclear data to the key parameters of interest of the core have been investigated. To address the problem of "the curse of dimensionality" caused by the large number of input parameters, the COST method has been applied to generate multivariate normal-distribution samples in the uncertainty analysis. As a result, the time evolution of the assembly/pin normalized power and its uncertainty after introducing an instantaneous perturbation has been obtained. From the numerical results, it can be observed that the maximum relative uncertainty can be up to about 1.65% for the assembly normalized power and about 2.71% for the pin-wise power distributions.
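The sampling step can be illustrated with the standard approach that COST refines: drawing correlated normal samples from a covariance matrix via its Cholesky factor (COST additionally transforms a small sample so that its empirical covariance reproduces the target exactly). A toy two-parameter example with illustrative mean and covariance values:

```python
import math
import random
import statistics

# Toy example: draw correlated normal samples from a target covariance
# matrix using its Cholesky factor. Mean and covariance are illustrative.
mean = [1.0, 2.0]
cov = [[0.04, 0.01],
       [0.01, 0.09]]

# Cholesky factor L of the 2x2 covariance: cov = L @ L.T
l11 = math.sqrt(cov[0][0])
l21 = cov[1][0] / l11
l22 = math.sqrt(cov[1][1] - l21**2)

random.seed(1)
xs, ys = [], []
for _ in range(200_000):
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    xs.append(mean[0] + l11 * z1)
    ys.append(mean[1] + l21 * z1 + l22 * z2)

# Empirical moments should reproduce the target covariance (only
# approximately for finite samples -- the gap COST is designed to close)
var_x = statistics.variance(xs)                        # ~0.04
var_y = statistics.variance(ys)                        # ~0.09
mx, my = statistics.fmean(xs), statistics.fmean(ys)
cov_xy = sum((x - mx) * (y - my)
             for x, y in zip(xs, ys)) / (len(xs) - 1)  # ~0.01
```

With a small sample the empirical covariance can deviate noticeably from the target; that sampling error is exactly what motivates a transformation like COST.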


Author(s):  
Guanlin Shi ◽  
Yishu Qiu ◽  
Kan Wang

As nuclear safety analysis receives increasing attention, sensitivity and uncertainty analysis has become a research hotspot. In our previous research, we developed an integrated, built-in stochastic sampling module in the Reactor Monte Carlo code RMC [1], with which nuclear data uncertainty analysis can be performed. At that time, however, the uncertainty of the fission spectrum was not considered. In this work, the capability of computing the uncertainty of keff induced by the uncertainty of the fission spectrum, in both tabular and formula form, is implemented in the RMC code based on the stochastic sampling method. The algorithms and the keff uncertainty capability in RMC are verified by comparison with results calculated by the first-order uncertainty quantification method [2].
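The two verification routes mentioned above can be sketched side by side for a scalar response: stochastic sampling of an uncertain input versus the first-order estimate sigma_k ≈ |dk/dx| * sigma_x. The response model below is a hypothetical linear stand-in for keff as a function of a fission-spectrum parameter, not RMC itself:

```python
import random
import statistics

# Toy linear response standing in for keff(spectrum parameter x);
# dk/dx = 0.05 and sigma_x = 0.02 are assumed values for illustration.
def keff(x):
    return 1.0 + 0.05 * x

sigma_x = 0.02

# Route 1: stochastic sampling of the uncertain input
random.seed(7)
ks = [keff(random.gauss(0.0, sigma_x)) for _ in range(100_000)]
sigma_sampled = statistics.stdev(ks)

# Route 2: first-order uncertainty propagation
sigma_first_order = 0.05 * sigma_x   # |dk/dx| * sigma_x
```

For a linear response the two routes agree to within sampling noise; for nonlinear responses the sampled estimate captures effects the first-order formula misses, which is the point of the comparison.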

