Sensitivity and Uncertainty of Criticality

Author(s):  
Masao Yamanaka

Abstract Excess reactivity and control rod worth are generally considered important reactor physics parameters for experimentally examining the neutron characteristics of criticality and for maintaining safe operation of the reactor core in terms of neutron multiplication. For excess reactivity and control rod worth at KUCA, as well as at the Fast Critical Assembly of the Japan Atomic Energy Agency, special attention is given to analyzing the uncertainty induced by nuclear data libraries on the basis of experimental criticality data in representative cores (EE1 and E3). The effect of decreasing this uncertainty on the accuracy of criticality is also discussed in this study. At KUCA, experimental results have been accumulated through measurements of excess reactivity and control rod worth. To evaluate the accuracy of the experiments for benchmark purposes, the uncertainty originating from the modeling of the core configuration should be discussed in addition to the uncertainty induced by nuclear data, since the modeling uncertainty can contribute more to the eigenvalue bias than the nuclear-data uncertainty does. Here, to investigate how the uncertainty of criticality depends on the neutron spectrum of the core, it is very useful to analyze the reactivity from a large number of measurements in typical hard-spectrum (EE1) and soft-spectrum (E3) cores at KUCA.
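Nuclear-data-induced uncertainty analyses of the kind described above typically rest on the first-order "sandwich rule", which combines a sensitivity vector with a nuclear-data covariance matrix. A minimal sketch (all numbers are illustrative, not from the KUCA analysis):

```python
import numpy as np

# Hypothetical sensitivities of k-eff to three nuclear-data parameters
# (relative: % change in k per % change in the parameter).
S = np.array([-0.30, 0.55, 0.10])

# Hypothetical relative covariance matrix of those parameters.
C = np.array([
    [0.0004, 0.0001, 0.0000],
    [0.0001, 0.0009, 0.0000],
    [0.0000, 0.0000, 0.0001],
])

# Sandwich rule: relative variance of k-eff is S^T C S.
var_k = S @ C @ S
sigma_k_pct = float(np.sqrt(var_k)) * 100.0  # relative 1-sigma, in percent
```

The same formula, with library covariances (e.g. from ENDF/B or JEFF) and code-computed sensitivity profiles, gives the "uncertainty induced by nuclear data libraries" quoted in such studies.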

2019, Vol 6 (1)
Author(s):
Massimo Sarotto
Gabriele Firpo
Anatoly Kochetkov
Antonin Krása
Emil Fridman
...  

Abstract During the EURATOM FP7 project FREYA, a number of experiments were performed in a critical core assembled in the VENUS-F zero-power reactor, which is able to reproduce the ALFRED lead-cooled fast reactor spectrum in a dedicated island. The experiments dealt with measurements of integral and local neutronic parameters, such as the core criticality, the control rod and lead void reactivity worths, the axial distributions of fission rates for the nuclides of major interest in a fast spectrum, and the spectral indices of important actinides (238U, 239Pu, 237Np) with respect to 235U. With the main aim of validating the neutronic codes adopted for the ALFRED core design, the VENUS-F core and its characterization measurements were simulated with both deterministic (ERANOS) and stochastic (MCNP, SERPENT) codes, adopting different nuclear data libraries (JEFF, ENDF/B, JENDL, TENDL). This paper summarizes the main results, highlighting a general agreement between measurements and simulations, with a few discrepancies for some parameters that are discussed here. Additionally, a sensitivity and uncertainty analysis was performed with deterministic methods for the core reactivity: it clearly indicates that the small over-criticality estimated by the different codes/libraries is lower than the uncertainties due to nuclear data.


2020, Vol 239, pp. 22011
Author(s):
Peng Hong Liem
Zuhair
Donny Hartanto

The results of criticality, sensitivity and uncertainty (S/U) analyses on the first core criticality of the Indonesian 30 MWth Multipurpose Reactor RSG GAS (MPR-30), using recent nuclear data libraries (ENDF/B-VII.1 and JENDL-4.0) and analytical tools available at present (WHISPER-1.1), are presented. Two groups of criticality benchmark cases were carefully selected from the experiments conducted during the first criticality approach and the control rod calibrations. The C/E value of the effective neutron multiplication factor (keff) for the worst case was found to be around 1.005. Large negative sensitivities were found in the (n,γ) reaction of H-1, U-235, Al-27, U-238 and Be-9, while large positive sensitivities were found in U-235 (total nu and fission), H-1 (elastic), Be-9 (free gas, elastic) and H-1 S(α,β) (lwtr.20t, inelastic). The S/U analysis concluded that the uncertainty of keff originating from the nuclear data was around 0.6%, which covered well the |C/E−1| values. Differences in the sensitivities between the two nuclear data libraries were also identified, and a recommendation for improving the nuclear data library was given.
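The coverage argument made above (a ~0.6% nuclear-data uncertainty in keff covering the |C/E−1| values) amounts to a simple comparison between the calculation bias and the propagated 1-sigma uncertainty. A sketch with purely illustrative numbers:

```python
# Illustrative coverage check: is the calculation bias |C/E - 1| within
# the 1-sigma uncertainty of keff due to nuclear data? (numbers hypothetical)
k_calc = 1.00502   # calculated k-eff (C)
k_exp = 1.00000    # measured critical configuration (E)
sigma_nd = 0.006   # ~0.6% relative uncertainty from nuclear data

bias = abs(k_calc / k_exp - 1.0)
covered = bias <= sigma_nd
```

When `covered` holds, the discrepancy between calculation and experiment is consistent with what the nuclear-data covariances alone can explain.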


2021, Vol 2048 (1), pp. 012029
Author(s):
Suwoto
H Adrial
T Setiadipura
Zuhair
S Bakhri

Abstract One of the main critical issues for a nuclear reactor is its safety and control system. The control rod worth plays an important role in the safety and control of nuclear reactors, and the control rod worth calculation is used to specify the safety margin of the reactor. The main objective of this work is to investigate the impact of different nuclear data libraries on the calculated control rod reactivity worth of a small pebble bed reactor. The control rod reactivity worth in a small high temperature gas cooled reactor has been calculated using the Monte Carlo N-Particle 6 (MCNP6) code with different nuclear data libraries: the well-known evaluated continuous-energy libraries JENDL-4.0u, ENDF/B-VII.1 and JEFF-3.2. The overall calculation results for the integral control rod worth show that the ENDF/B-VII.1, JENDL-4.0u and JEFF-3.2 files give values of −17.814 %Δk/k, −18.0204 %Δk/k and −18.0267 %Δk/k, respectively. Calculations using ENDF/B-VII.1 give a slightly lower value than the others, while the JENDL-4.0u file gives results close to the JEFF-3.2 file. The different nuclear data libraries thus have a relatively small impact on the control rod worth of the small pebble bed reactor. Accurate prediction of control rod worth by simulation is very important for the safe operation of all reactor types, especially for new reactor designs.
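Integral control rod worth in %Δk/k is obtained from the multiplication factors of the rods-out and rods-in states. A minimal sketch (the k values below are hypothetical, chosen only to land near the magnitudes quoted above):

```python
def rod_worth_pct(k_out: float, k_in: float) -> float:
    """Integral control rod worth in %dk/k: difference of the static
    reactivities rho = (k - 1)/k between rods-in and rods-out states."""
    rho_out = (k_out - 1.0) / k_out
    rho_in = (k_in - 1.0) / k_in
    return (rho_in - rho_out) * 100.0

# Hypothetical eigenvalues from two Monte Carlo runs of the same core.
worth = rod_worth_pct(k_out=1.00000, k_in=0.84730)  # about -18 %dk/k
```

In practice `k_out` and `k_in` would each come from a separate MCNP6 eigenvalue calculation with the chosen nuclear data library, which is how the library-to-library spread quoted above arises.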


Author(s):  
Tomáš Czakoj
Evžen Losa

The three-dimensional Monte Carlo code KENO-VI of the SCALE-6.2.2 code system was applied to criticality calculations of the LR-0 reactor core. A central module placed in the center of the core was filled with graphite, lithium fluoride–beryllium fluoride (FLIBE), and lithium fluoride–sodium fluoride (FLINA) compounds. The multiplication factor was obtained for all cases using both the ENDF/B-VII.0 and ENDF/B-VII.1 nuclear data libraries. The obtained results were compared with benchmark calculations in MCNP6 using the ENDF/B-VII.0 library. The results of the KENO-VI calculations are in good agreement with those obtained by MCNP6: the discrepancies are typically within tens of pcm, excluding the case with the FLINA filling. Sensitivities and uncertainties of the reference case with no filling were determined with the continuous-energy version of the TSUNAMI sequence of SCALE-6.2.2. The obtained uncertainty in the multiplication factor due to uncertainties in nuclear data is about 650 pcm with ENDF/B-VII.1.


2019, Vol 211, pp. 03004
Author(s):
Antonín Krása
Anatoly Kochetkov
Nadia Messaoudi
Alexey Stankovskiy
Guido Vittiglio
...  

Delayed neutron parameters of fast VENUS-F reactor core configurations are determined with Monte Carlo calculations using various nuclear data libraries. Differences in the calculated effective delayed neutron fraction, and the impact of the delayed neutron data (6- or 8-group precursors) applied in the experimental data analysis on the measured reactivity effects, are studied. Considerable differences are found due to the application of 235U and 238U delayed neutron data from the JEFF, JENDL and ENDF evaluations.
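The effective delayed neutron fraction sets the scale for converting measured reactivity into dollar units, which is why the choice of 6- or 8-group precursor data propagates directly into the measured reactivity effects. A sketch with hypothetical group fractions (not VENUS-F data):

```python
# Hypothetical 6-group delayed neutron fractions (beta_i) for a fast core.
beta_groups = [0.00024, 0.00123, 0.00117, 0.00262, 0.00108, 0.00045]
beta_eff = sum(beta_groups)  # total effective delayed neutron fraction

# A reactivity of 350 pcm expressed in dollars scales as 1/beta_eff,
# so any library-to-library change in beta_eff rescales the measurement.
rho_pcm = 350.0
rho_dollars = rho_pcm * 1e-5 / beta_eff
```

A different evaluation (JEFF vs JENDL vs ENDF) changes the `beta_groups` values, and hence the dollar value inferred from the very same raw measurement.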


2020, Vol 239, pp. 19003
Author(s):
M. Fleming
I. Hill
J. Dyrda
L. Fiorito
N. Soppera
...  

The OECD Nuclear Energy Agency (NEA) has developed and maintains several products that are used in the verification and validation of nuclear data, including the Java-based Nuclear Data Information System (JANIS) and the Nuclear Data Sensitivity Tool (NDaST). These integrate other collections of the NEA, including the International Handbooks of benchmark experiments on Criticality Safety and Reactor Physics (ICSBEP and IRPhEP) and their supporting relational databases (DICE and IDAT). Recent developments of the JANIS, DICE and NDaST systems have resulted in the ability to perform uncertainty propagation utilising Legendre polynomial sensitivities, calculation of case-to-case covariances and correlations, use of spectrum weighting in perturbations, calculation of statistical results with suites of randomly sampled nuclear data files, and new command-line interfaces to automate analyses and generate XML outputs. All of the most recent major nuclear data libraries have been fully processed and incorporated, along with new visualisation features for covariances and sensitivities, an expanded set of reaction channel definitions, and new EXFOR data types defined by the NRDC. Optimisation of numerical methods has also improved performance, with an over order-of-magnitude speed-up in the case of sensitivity-uncertainty calculations.
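The case-to-case covariances and correlations mentioned above follow from applying the sandwich rule to pairs of benchmark sensitivity vectors against a shared nuclear-data covariance matrix. A minimal sketch with invented numbers:

```python
import numpy as np

# Hypothetical sensitivity vectors of two benchmark cases to the same
# nuclear-data parameters, plus a shared relative covariance matrix.
s1 = np.array([0.4, -0.2, 0.1])
s2 = np.array([0.3, -0.1, 0.3])
C = np.diag([0.0004, 0.0009, 0.0001])

cov12 = s1 @ C @ s2                 # case-to-case covariance
sig1 = float(np.sqrt(s1 @ C @ s1))  # 1-sigma of case 1
sig2 = float(np.sqrt(s2 @ C @ s2))  # 1-sigma of case 2
corr = cov12 / (sig1 * sig2)        # case-to-case correlation
```

A correlation near 1 indicates that the two benchmarks probe the nuclear data in almost the same way, so they provide largely redundant validation evidence.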


Author(s):  
Kazuya Ohgama
Gerardo Aliberti
Nicolas E. Stauff
Shigeo Ohki
Taek K. Kim

Under the cooperative effort of the Civil Nuclear Energy R&D Working Group within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and the Japan Atomic Energy Agency (JAEA) have been performing a benchmark study using the Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best-estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by the two institutions agree well, with less than 200 pcm of discrepancy on the neutron multiplication factor and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The stochastic and deterministic results were compared by each party to investigate the impact of the deterministic approximations and to understand potential variations in the results due to the different calculation methodologies employed. The impact of the nuclear data libraries was also investigated using a sensitivity analysis methodology.


2015, Vol 2015, pp. 1-9
Author(s):
A. Rais
D. Siefman
G. Girardin
M. Hursin
A. Pautz

In order to analyze the steady state and transient behavior of the CROCUS reactor, several methods and models need to be developed in the areas of reactor physics, thermal-hydraulics, and multiphysics coupling. The long-term objectives of this project are to work towards the development of a modern method for the safety analysis of research reactors and to update the Final Safety Analysis Report of the CROCUS reactor. The first part of the paper deals with the generation of a core simulator nuclear data library for the CROCUS reactor using the Serpent 2 Monte Carlo code, and with reactor core modeling using the PARCS code. PARCS eigenvalue, radial power distribution, and control rod reactivity worth results were benchmarked against Serpent 2 full-core model results. Using the Serpent 2 model as reference, PARCS eigenvalue predictions were within 240 pcm, radial power was within 3% in the central region of the core, and control rod reactivity worth was within 2%. The second part reviews the current methodology used for the safety analysis of the CROCUS reactor and presents the envisioned approach for the multiphysics modeling of the reactor.
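A code-to-code benchmark of this kind reduces to tolerance checks of the nodal-code results against the reference Monte Carlo solution. A sketch using the thresholds quoted above with hypothetical code outputs:

```python
# Hypothetical PARCS vs Serpent 2 results; thresholds from the abstract.
k_serpent, k_parcs = 1.00120, 1.00290
dk_pcm = abs(k_parcs - k_serpent) * 1e5          # eigenvalue difference, pcm
eigenvalue_ok = dk_pcm <= 240

p_serpent, p_parcs = 1.042, 1.070                # relative radial power, one node
power_err_pct = abs(p_parcs - p_serpent) / p_serpent * 100.0
power_ok = power_err_pct <= 3.0
```

In a full comparison the power check runs over every assembly in the central region, and a similar 2% check applies to the control rod worths.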


2021, Vol 247, pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at identifying the reactivity worth of materials, spectral ratios of cross-sections, and reactivity coefficients have ensured that reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and underpinned the need for experiments through the 1950s, 60s, 70s and 80s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general nuclear data libraries or, more commonly, in the local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or specific parameters (e.g. fission gas release, yields). Over the last three decades, the model has changed. In tandem, access to computing power and Monte Carlo codes rose dramatically. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, there was a decline in the production of local libraries, as new nuclear systems were not being built and the existing systems were considered adequately predicted. The cost-to-benefit ratio of validating new libraries relative to their improved prediction capability was less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is highly skewed towards testing against criticality benchmarks, ignoring many of the high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries.
However, continued increases in computing power, improved methodology (generalized perturbation theory, GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensure their applicability to a wide variety of nuclear systems. Often it has not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be utilized to assist in identifying available benchmarks in the reactor physics experiments handbook and in expediting their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation will be highlighted, to illustrate potential improvements in the verification and validation process.


Author(s):  
E. Temesvari
B. Batki
M. Gren

In the ESNII+ EU FP7 project, a reactor physics benchmark aiming at a whole-core calculation with the reflectors and a detailed description of the structural elements was specified. This benchmark is based on the 2009 CEA concept of the ALLEGRO core. Fixed nominal technological data at the nominal reactor state (geometry, composition) were prescribed, which had to be modified in specified calculation branches according to different types of thermal expansion and control rod positions. In this way, the parameters of the point kinetic model to be applied in a system thermal-hydraulic code had to be determined. Static mechanical models of the expansion processes were specified by the benchmark. The goal of the calculation exercise was to verify the reactor physics codes, namely to obtain information about the modelling uncertainties and, subsequently, their influence on the calculated results of the safety analyses. The deviations obtained between the participants characterize the user effects, the modelling uncertainties and the influence of the nuclear data differences together, without the possibility of separating them, because of the complexity of the benchmark problem. A conclusion could be drawn that a step-by-step procedure starting from simple problems (homogeneous material, Wigner-Seitz cell or subassembly in the asymptotic approach) is necessary if we wish to identify the reasons for the deviations. For the Doppler effect, a decision was made in this direction already in the ESNII+ project, where an infinite regular lattice problem without any leakage had to be solved. This simple approach is followed by the present benchmarks (one rod and one assembly), extending the simple benchmarks with burnup calculations and taking leakage into account in the asymptotic approximation, while neglecting the complicated processes necessary in the reflector regions.
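The point kinetic model whose parameters such a benchmark determines can be sketched, in its simplest one-delayed-group form, as two coupled ODEs for the neutron population and the precursor concentration. All numbers below are illustrative, not ALLEGRO values:

```python
# One-delayed-group point kinetics, explicit Euler (illustrative sketch).
beta = 0.0035      # effective delayed neutron fraction (hypothetical)
lam = 0.08         # precursor decay constant, 1/s (hypothetical)
Lam = 5.0e-7       # neutron generation time, s (hypothetical)
rho = 0.0007       # step reactivity insertion (absolute units)

n = 1.0                     # neutron population (normalized)
c = beta / (Lam * lam)      # equilibrium precursor concentration at t=0
dt = 1.0e-6                 # time step resolving the prompt timescale
for _ in range(200_000):    # simulate 0.2 s of transient
    dn = ((rho - beta) / Lam) * n + lam * c
    dc = (beta / Lam) * n - lam * c
    n += dn * dt
    c += dc * dt
# After the prompt jump, n is near beta / (beta - rho) = 1.25
```

The benchmark's job is to supply consistent values of quantities like `beta`, `Lam`, and the reactivity feedbacks folded into `rho`, so that a system thermal-hydraulic code using this model reproduces the neutronics of the full core.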

