Yalina-Thermal facility neutron characteristics computational study: 129I, 237Np and 243Am transmutation reaction rate calculations

2020 ◽  
Vol 239 ◽  
pp. 22013
Author(s):  
Tamara Korbut ◽  
Maksim Kravchenko ◽  
Ivan Edchik ◽  
Sergey Korneev

The present work describes Monte Carlo calculations of the neutron field and of minor actinide transmutation reaction rates within the Yalina-Thermal sub-critical assembly of the Joint Institute for Power and Nuclear Research – Sosny of the National Academy of Sciences of Belarus. A computer model of the facility was prepared for the corresponding calculations with the MCU-PD and MCNP Monte Carlo codes. Neutron characteristics of the model were estimated and a nuclear safety analysis was performed. The up-to-date ENDF/B-VIII, JEFF-3.3 and JENDL-4.0 nuclear data libraries were used in the study.
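A transmutation reaction rate of the kind tallied in such calculations is, at its core, a fold of the group-wise neutron flux with the corresponding cross section and the nuclide number density. A minimal sketch, using entirely hypothetical group fluxes, cross sections and sample loading rather than the actual Yalina-Thermal model or evaluated library data:

```python
import numpy as np

# Hypothetical 3-group fluxes and group-averaged cross sections -- illustrative
# values only, not the actual Yalina-Thermal model or evaluated library data.
group_flux = np.array([1.2e12, 4.5e11, 8.0e10])        # n/(cm^2 s): thermal, epithermal, fast
sigma_np237_ng = np.array([180.0, 35.0, 0.9]) * 1e-24  # 237Np (n,gamma) group cross sections, cm^2
number_density = 2.0e20                                # assumed 237Np atoms per cm^3 in the sample

# Transmutation reaction rate: R = N * sum_g(sigma_g * phi_g), in reactions/(cm^3 s)
reaction_rate = number_density * np.sum(sigma_np237_ng * group_flux)
print(f"237Np(n,gamma) reaction rate: {reaction_rate:.3e} reactions/(cm^3 s)")
```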

2021 ◽  
Vol 7 (2) ◽  
Author(s):  
Mikita Sobaleu ◽  
Michal Košťál ◽  
Jan Šimon ◽  
Evžen Losa

Abstract: Neutron field shaping is a suitable method for validating cross sections in various energy regions. By increasing the share of neutrons in a certain energy interval and decreasing the share of others, a reaction becomes more sensitive to the selected neutrons, so its cross section can be validated more precisely in the chosen energy region. The shaping can be carried out either with neutron filters, i.e. materials with high absorption in some energy region, or with a diffusion material that changes the shape of the neutron spectrum through the slowing-down process. In the presented experiments, the neutron field of the LR-0 light-water zero-power research reactor was shaped both with graphite blocks inserted into the core and with Cd cladding used to increase the epithermal share of the total reaction rates. The calculations were carried out with the Monte Carlo N-Particle Transport Code 6 (MCNP6) and the most recent nuclear data libraries. The results in the pure graphite neutron field are in good agreement; in the case of Cd cladding, significant discrepancies were found. For the 23Na(n,γ)24Na reaction, an overestimation of about 14% was obtained with the International Reactor Dosimetry and Fusion File (IRDFF-II), while the results with the other libraries are comparable. For 58Fe(n,γ)59Fe, an overestimation as high as 18% is reported for IRDFF-II. For 64Zn(n,γ)65Zn, reasonable agreement was reached with the Evaluated Nuclear Data File (ENDF/B-VIII), where discrepancies in the pure graphite neutron field or in the case of Cd cladding are about 10–15%.
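The "epithermal share" manipulated by the Cd cladding can be pictured as the fraction of a reaction rate contributed by neutrons above the cadmium cutoff (roughly 0.5 eV). A minimal sketch with hypothetical group data, not the LR-0 measurements:

```python
import numpy as np

# Hypothetical group boundaries and group-wise 23Na(n,gamma)24Na reaction rates;
# illustrative numbers only, not the LR-0 experiment.
group_upper_eV = np.array([0.5, 1.0e5, 2.0e7])    # thermal, epithermal, fast group tops
group_rates    = np.array([8.0e5, 3.0e5, 1.0e4])  # arbitrary units

cd_cutoff_eV = 0.5  # approximate cadmium cutoff energy

# Cd cladding suppresses the thermal contribution, so the measured rate approaches
# the epithermal + fast share of the bare (unshaped) reaction rate.
epicadmium_share = group_rates[group_upper_eV > cd_cutoff_eV].sum() / group_rates.sum()
print(f"Epicadmium share of the total reaction rate: {epicadmium_share:.1%}")
```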


2020 ◽  
Vol 239 ◽  
pp. 21003
Author(s):  
Prasoon Raj ◽  
Ulrich Fischer ◽  
Axel Klix ◽  
JET Contributors

The neutron flux spectrum in a fusion device is frequently determined with activation foils and adjustment of a guess spectrum in unfolding codes. Since spectral adjustment is a rather complex and uncertain procedure, we are carefully streamlining and evaluating it for upcoming experiments. Input nuclear cross-section data holds a vital position in this. This paper presents a survey of common dosimetry reactions and the available data files relevant for fusion applications. While the IRDFF v1.05 library is the recommended source, many reactions of interest to us are missing from it. We therefore investigated other standard sources: ENDF/B-VIII.0, EAF-2010, TENDL-2017, JENDL-4.0, etc. We also analysed two experiments to ascertain the sensitivity of the spectral adjustment to the choice of nuclear data: one performed with D-D neutrons (approx. 2.5 MeV peak) at the Joint European Torus (JET) machine and another with a white neutron field (approx. 33 MeV endpoint energy) at the Nuclear Physics Institute (NPI) in Řež. The choice of cross-section source affected the integral fluxes (<5%), reaction rates (<10%), total fluxes in some sensitive energy regions (>20%) and individual group fluxes (<30%). Based on this experience, essential qualitative conclusions are drawn for improving fusion activation spectrometry.
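Spectral adjustment of the kind described here amounts to correcting a guess spectrum so that the folded reaction rates reproduce the measured foil activities. A minimal least-squares sketch with a hypothetical response matrix and measurements, not the JET/NPI data and not the full generalized least-squares treatment with covariances used by dedicated unfolding codes:

```python
import numpy as np

# Hypothetical 5-group guess spectrum and a 3-foil response matrix of group cross
# sections -- illustrative only, not the actual dosimetry reactions or library data.
guess_flux = np.array([1.0e10, 5.0e9, 2.0e9, 8.0e8, 1.0e8])          # n/(cm^2 s) per group
response = np.array([[1.5e-24, 8.0e-25, 3.0e-25, 1.0e-25, 2.0e-26],  # cm^2, one row per foil reaction
                     [2.0e-26, 1.0e-25, 6.0e-25, 9.0e-25, 4.0e-25],
                     [0.0,     1.0e-27, 5.0e-26, 3.0e-25, 8.0e-25]])

# Pretend the measured foil reaction rates sit 7% above the guess-spectrum prediction.
measured_rates = 1.07 * (response @ guess_flux)

# Adjust the spectrum multiplicatively, flux = guess * (1 + d), with the smallest
# relative group corrections d that reproduce the measurements (minimum-norm least squares).
sensitivity = response * guess_flux              # d(rate) / d(relative group change)
residual = measured_rates - response @ guess_flux
d, *_ = np.linalg.lstsq(sensitivity, residual, rcond=None)
adjusted_flux = guess_flux * (1.0 + d)

print("Relative group adjustments:", np.round(d, 4))
print("Adjusted vs measured rates:", response @ adjusted_flux, measured_rates)
```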


2021 ◽  
Vol 2021 (2/2021) ◽  
pp. 61-65
Author(s):  
Anguel Demerdjiev ◽  
Dimitar Tonev ◽  
Nikolai Goutev

The Institute for Nuclear Research and Nuclear Energy at the Bulgarian Academy of Sciences is working on the construction of a cyclotron centre. The facility is currently at the design stage, and an important task at this point of the project is the radiation shielding assessment of the facility. Monte Carlo transport codes have become the tool of choice for solving this type of problem. In the current paper, the transport code FLUKA, which is widely applied to the shielding design and analysis of accelerators and their components, is used for the calculations. The distributions of the radiation fields inside and outside the cyclotron bunker are presented. Different irradiation scenarios and bunker configurations are considered in the conducted Monte Carlo simulations. The results will serve as guidance in site planning.
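As a rough cross-check of detailed Monte Carlo shielding results, a point-source estimate with inverse-square falloff and tenth-value-layer attenuation is often useful at the design stage. A minimal sketch with assumed numbers, not the INRNE cyclotron centre design parameters or FLUKA output:

```python
# Back-of-the-envelope bunker-wall check: inverse-square falloff plus
# tenth-value-layer attenuation. All numbers are assumptions for illustration.
source_dose_rate_1m = 5.0e3   # uSv/h at 1 m from the target (assumed source term)
distance_m = 4.0              # target-to-wall-exterior distance
wall_thickness_cm = 200.0     # assumed ordinary-concrete wall thickness
tvl_concrete_cm = 45.0        # assumed tenth-value layer for the relevant neutron spectrum

dose_outside = (source_dose_rate_1m / distance_m**2) * 10.0 ** (-wall_thickness_cm / tvl_concrete_cm)
print(f"Estimated dose rate outside the wall: {dose_outside:.2e} uSv/h")
```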


2012 ◽  
Vol 2012 ◽  
pp. 1-6 ◽  
Author(s):  
Ho Jin Park ◽  
Hyung Jin Shim ◽  
Chang Hyo Kim

In Monte Carlo (MC) burnup analyses, the uncertainty of a tally estimate at a burnup step may arise from four sources: the statistical uncertainty caused by the finite number of simulated histories, the nuclear covariance data, the uncertainties of the number densities, and the cross-correlations between the nuclear data and the number densities. In this paper, the uncertainties of kinf, reaction rates, and number densities for a PWR pin-cell benchmark problem are quantified by an uncertainty propagation formulation in the MC burnup calculations. The required sensitivities of the tallied parameters to the microscopic cross sections and the number densities are estimated by the MC differential operator sampling method accompanied by fission source perturbation. The uncertainty propagation analyses are conducted with two sets of nuclear covariance data, the ENDF/B-VII.1 and SCALE6.1/COVA libraries, and the numerical results are compared with each other.
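First-order propagation of nuclear covariance data to a tally such as kinf follows the usual "sandwich" rule, var(Q)/Q^2 = S^T C S, where S holds the relative sensitivities and C the relative covariance matrix. A minimal sketch with hypothetical sensitivities and a hypothetical covariance matrix, not the ENDF/B-VII.1 or SCALE6.1/COVA data:

```python
import numpy as np

# Hypothetical relative sensitivities of kinf to three cross sections and a
# hypothetical relative covariance matrix -- illustrative values only.
S = np.array([0.35, -0.20, 0.05])        # (dk/k) / (dsigma/sigma) per cross section
C = np.array([[4.0e-4, 1.0e-4, 0.0   ],
              [1.0e-4, 9.0e-4, 2.0e-5],
              [0.0,    2.0e-5, 1.6e-3]])

# First-order ("sandwich") propagation: var(k)/k^2 = S^T C S
rel_var = S @ C @ S
print(f"Relative kinf uncertainty from nuclear data: {np.sqrt(rel_var):.4%}")
```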


2021 ◽  
Vol 247 ◽  
pp. 10005
Author(s):  
R. Ichou ◽  
B. Dechenaux

The validation of the VESTA 2.2.0 Monte Carlo depletion code has been initiated using the Spent Fuel Isotopic Composition Database (SFCOMPO). The work presented in this paper is limited to one fuel sample, the GU3 PWR-UOX sample from the ARIANE program, which has a reported burnup of 52.5 MWd/kgHM. The chemical analyses of the studied fuel sample were performed by two independent laboratories at the end of the irradiation and cooling time. US and European evaluated nuclear data libraries, namely ENDF/B-VII.1 and JEFF-3.2, as well as the more recent ENDF/B-VIII.0 and JEFF-3.3, are used for the VESTA 2.2.0 calculations. The isotopic concentration results are compared with the experimental data, and the C/E agreement is analyzed in the light of the previous VESTA 2.1.5 validation results obtained with the ENDF/B-VII.0 and JEFF-3.1 nuclear data libraries.
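The C/E agreement referred to here is simply the ratio of each calculated isotopic concentration to the measured one. A minimal sketch with hypothetical concentrations, not the ARIANE GU3 measurements or VESTA 2.2.0 results:

```python
import numpy as np

# Hypothetical calculated and measured concentrations (grams per gram of initial
# heavy metal) -- illustrative only.
isotopes     = ["U-235", "Pu-239", "Cs-137", "Nd-148"]
calculated   = np.array([7.9e-3, 6.1e-3, 1.4e-3, 1.9e-3])
experimental = np.array([8.1e-3, 6.0e-3, 1.5e-3, 1.9e-3])

# C/E ratio and percent deviation per isotope
c_over_e = calculated / experimental
for iso, ce in zip(isotopes, c_over_e):
    print(f"{iso:8s} C/E = {ce:.3f} ({(ce - 1.0) * 100:+.1f}%)")
```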


2007 ◽  
Author(s):  
S. Sato ◽  
K. Ochiai ◽  
M. Wada ◽  
M. Yamauchi ◽  
H. Iida ◽  
...  

2021 ◽  
Vol 247 ◽  
pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at identifying the reactivity worth of materials, spectral ratios of cross sections, and reactivity coefficients have ensured that reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and they underpinned the need for experiments through the 1950s, 60s, 70s and 80s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general-purpose libraries or, more commonly, through the local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or specific parameters (e.g. fission gas release, yields). Over the last three decades, the model has changed. Access to computing power and Monte Carlo codes rose dramatically in tandem. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, the production of local libraries declined, as new nuclear systems were not being built and the existing systems were considered adequately predicted; the cost-to-benefit ratio of validating new libraries relative to their improved prediction capability was less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is highly skewed towards testing against criticality benchmarks, ignoring many high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries. Continued increases in computing power, improved methodology (GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensure applicability to a wide variety of nuclear systems; however, they often have not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be utilized to assist in identifying available benchmarks in the reactor physics experiments handbook and in expediting their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation will be highlighted to illustrate potential improvements in the verification and validation process.

