SNUPAR: a nuclear parameter code for nuclear geophysics applications

1989 ◽ Vol 36 (1) ◽ pp. 1215-1219 ◽ Author(s): D.C. McKeon, H.D. Scott
1975 ◽ Vol 38 (6) ◽ pp. 557-558 ◽ Author(s): J. A. Czubek

2018 ◽ Vol 4 ◽ pp. 14 ◽ Author(s): James Dyrda, Ian Hill, Luca Fiorito, Oscar Cabellos, Nicolas Soppera

Uncertainty propagation to keff using a Total Monte Carlo sampling process is commonly used to solve the issues associated with non-linear dependencies and non-Gaussian nuclear parameter distributions. We suggest that, in general, keff sensitivities to nuclear data perturbations are not problematic and remain linear over a large range; the same cannot be said definitively for nuclear data parameters and their impact on final cross-sections and distributions. Instead of running hundreds or thousands of neutronics calculations, we therefore investigate the possibility of taking those many cross-section file samples and performing 'cheap' sensitivity perturbation calculations. This is efficiently possible with the NEA Nuclear Data Sensitivity Tool (NDaST), a process we name the half Monte Carlo method (HMM). We demonstrate that this is indeed possible with a test example, JEZEBEL (PMF001), drawn from the ICSBEP handbook, comparing keff values calculated directly with SERPENT to those predicted with NDaST. Furthermore, we show that one may retain the normal NDaST benefits: a deeper analysis of the resultant effects in terms of reaction and energy breakdown, without the normal computational burden of Monte Carlo (results within minutes, rather than days). Finally, we assess the rationality of using either the full or the half Monte Carlo method by also using the covariance data to perform simple linear 'sandwich formula' propagation of uncertainty onto the selected benchmarks. This allows us to draw some broad conclusions about the relative merits of selecting a technique with a full, half, or zero degree of Monte Carlo simulation.
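The linear 'sandwich formula' propagation mentioned in the abstract can be sketched as follows. The sensitivity vector and relative covariance matrix below are illustrative toy values, not JEZEBEL or any evaluated nuclear data; a real calculation would use sensitivities from a tool such as NDaST and covariances from an evaluated library.

```python
import numpy as np

# Hypothetical sensitivity vector S (relative change in keff per unit
# relative change in each nuclear-data parameter) and relative
# covariance matrix C for those parameters. Toy values for illustration.
S = np.array([0.3, -0.1, 0.05])
C = np.array([
    [4.0e-4, 1.0e-4, 0.0],
    [1.0e-4, 9.0e-4, 0.0],
    [0.0,    0.0,    1.0e-4],
])

# Sandwich rule: var(keff)/keff^2 = S^T C S
var_k = S @ C @ S
sigma_k = np.sqrt(var_k)  # relative standard deviation of keff
print(f"relative keff uncertainty: {sigma_k:.4%}")
```

Because the formula is a single quadratic form, it costs essentially nothing compared with sampling, but it is exact only under the linearity and Gaussian assumptions that the Total Monte Carlo and half Monte Carlo approaches are designed to test.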


1978 ◽ Author(s): Allan B. Tanner, F.E. Senftle

Author(s): V. Lobankov, L. Akhmetova, N. Mamontov, V. Sviatokhin

2014 ◽ Vol 2014 ◽ pp. 1-9 ◽ Author(s): Ivan Tsyfra, Tomasz Czyżycki

We propose a group-theoretical approach that enables one to generate solutions of equations of mathematical physics in nonhomogeneous media from solutions of the same problem in a homogeneous medium. The efficiency of this method is illustrated with examples of thermal neutron diffusion problems, which arise in neutron physics and nuclear geophysics. The method also applies to nonstationary problems and to differential equations that are not integrable in quadratures.


2018 ◽ Vol 4 ◽ pp. 20 ◽ Author(s): Massimo Salvatores, Giuseppe Palmiotti

Nuclear data users’ requirements for uncertainty data date back to the 1970s, when several fast reactor projects made extensive use of “statistical data adjustments” to improve data for core and shielding design. However, it was only some 20-30 years later, and in particular since about 2005, that a major effort began to produce scientifically based covariance data. Most of the work since then has yielded spectacular achievements and an enhanced understanding both of the uncertainty evaluation process and of the use of the data in V&V. This paper summarizes some key developments and still-open challenges.
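The “statistical data adjustment” technique the authors refer to is commonly formulated as a generalized least-squares (GLS) update of prior parameters against integral measurements. The sketch below uses toy matrices purely for illustration; the structure of the update, not the numbers, is the point.

```python
import numpy as np

# Toy GLS statistical data adjustment (all values illustrative):
x = np.array([1.00, 2.00])      # prior nuclear-data parameters
C = np.diag([0.04, 0.09])       # prior parameter covariance
G = np.array([[0.5, 0.5]])      # sensitivity of the integral quantity
V = np.array([[0.01]])          # experimental covariance
y = np.array([1.60])            # measured integral value
y_calc = G @ x                  # value calculated with prior data

# GLS update: x' = x + C G^T (G C G^T + V)^{-1} (y - G x)
K = C @ G.T @ np.linalg.inv(G @ C @ G.T + V)
x_adj = x + (K @ (y - y_calc)).ravel()
C_adj = C - K @ G @ C           # posterior covariance is reduced
```

The adjustment pulls the parameters toward agreement with the measurement in proportion to their prior uncertainty and sensitivity, and shrinks the posterior covariance, which is exactly why the quality of the covariance data the paper discusses matters so much for the credibility of the adjusted library.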

