Converting ENDF libraries into relational format

2018 ◽  
Vol 4 (1) ◽  
pp. 57-63
Author(s):  
Anatoliy Yuferov

The article considers the conversion of ENDF-format systems of constants to relational databases. Such a conversion can become one of the tools that facilitate the development and use of factual information, techniques, and algorithms in the field of nuclear data, thereby increasing the efficiency of the corresponding computational codes. The work briefly examines an infological model of ENDF libraries and describes a possible table structure for the corresponding relational database. The proposed database schema and table layout account for both single-valued and multi-valued properties of the isotopes under consideration. Attention is given to the differing organizational requirements of transferring constants from relational tables to programs and of visual analysis of the tabulated data by a physicist-evaluator. The conversion algorithms and results are described for the ROSFOND-A and ENDF/B-VII.1 libraries. It is shown that performing calculations directly in the DBMS environment has advantages: it simplifies programming and eliminates a number of data verification and validation problems. Possible approaches are indicated for operating legacy software together with nuclear data libraries in the relational format. Some terminological refinements are proposed to facilitate constructing an infological model for the ENDF format. The conversion programs and the ENDF/B-VII.1 library in relational format are available on a public site.
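A minimal sketch of how such a schema might look, using Python's built-in sqlite3 module. The table and column names (isotopes, xs_pointwise, etc.) are illustrative assumptions, not the schema proposed in the article; the split into one table for single-valued isotope properties and one for multi-valued (per-energy) data mirrors the single/multiple-property distinction described above.

```python
import sqlite3

# Illustrative schema only; table/column names are assumptions,
# not the schema actually proposed in the article.
conn = sqlite3.connect("endf_demo.db")
cur = conn.cursor()

# Single-valued isotope properties: one row per isotope.
cur.execute("""
    CREATE TABLE IF NOT EXISTS isotopes (
        isotope_id INTEGER PRIMARY KEY,
        zaid       INTEGER NOT NULL,      -- 1000*Z + A
        symbol     TEXT    NOT NULL,      -- e.g. 'U-235'
        awr        REAL                   -- atomic weight ratio
    )
""")

# Multi-valued properties: many rows per isotope, e.g. a
# pointwise cross-section for a given reaction (MT number).
cur.execute("""
    CREATE TABLE IF NOT EXISTS xs_pointwise (
        isotope_id INTEGER REFERENCES isotopes(isotope_id),
        mt         INTEGER NOT NULL,      -- ENDF reaction number
        energy_ev  REAL    NOT NULL,
        xs_barn    REAL    NOT NULL
    )
""")

# A query of the kind a physicist-evaluator might run directly
# in the DBMS: fission cross-section (MT=18) of U-235 below 1 eV.
cur.execute("""
    SELECT e.energy_ev, e.xs_barn
    FROM xs_pointwise AS e
    JOIN isotopes AS i ON i.isotope_id = e.isotope_id
    WHERE i.symbol = 'U-235' AND e.mt = 18 AND e.energy_ev < 1.0
    ORDER BY e.energy_ev
""")
conn.close()
```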

2021 ◽  
Vol 247 ◽  
pp. 10028
Author(s):  
I. Hill

Measurements of reactor physics quantities aimed at identifying the reactivity worth of materials, spectral ratios of cross-sections, and reactivity coefficients have ensured reactor physics codes can accurately predict nuclear reactor systems. These measurements were critical in the absence of sufficiently accurate differential data, and underpinned the need for experiments through the 1950s, 1960s, 1970s and 1980s. Data from experimental campaigns were routinely incorporated into nuclear data libraries, either through changes to general nuclear data libraries or, more commonly, in the local libraries generated by a particular institution or consortium interested in accurately predicting a specific nuclear system (e.g. fast reactors) or parameters (e.g. fission gas release, yields). Over the last three decades, the model has changed. In tandem, access to computing power and Monte Carlo codes rose dramatically. The Monte Carlo codes were well suited to computing k-eff, and owing to the availability of high-quality criticality benchmarks, these benchmarks were increasingly used to test the nuclear data. Meanwhile, there was a decline in the production of local libraries, as new nuclear systems were not being built and the existing systems were considered adequately predicted. The cost-to-benefit ratio of validating new libraries relative to their improved prediction capability became less attractive. These trends have continued. It is widely acknowledged that the checking of new nuclear data libraries is highly skewed towards testing against criticality benchmarks, ignoring many of the high-quality reactor physics benchmarks during the testing and production of general-purpose nuclear data libraries. Continued increases in computing power, improved methodology (generalized perturbation theory, GPT), and the additional availability of reactor physics experiments from sources such as the International Handbook of Evaluated Reactor Physics Experiments should result in better testing of new libraries and ensure applicability to a wide variety of nuclear systems. It often has not. Leveraging the wealth of historical reactor physics measurements represents perhaps the simplest way to improve the quality of nuclear data libraries in the coming decade. Resources at the Nuclear Energy Agency can be utilized to assist in identifying available benchmarks in the reactor physics experiments handbook and in expediting their use in verification and validation. Additionally, high-quality experimental campaigns that should be examined in validation will be highlighted to illustrate potential improvements in the verification and validation process.
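As an illustration of the kind of benchmark testing discussed here, the sketch below computes calculated-to-experimental (C/E) ratios and biases for a few benchmark cases; all names and numbers are invented for demonstration and do not come from any handbook or library.

```python
# Hypothetical benchmark comparison: C/E ratios and simple
# bias metrics. All values are invented for illustration.
benchmarks = {
    # name: (calculated k-eff, experimental k-eff, exp. uncertainty)
    "fast-core-1":    (1.00123, 1.00000, 0.00090),
    "thermal-core-2": (0.99871, 1.00050, 0.00120),
    "spectral-3":     (1.00310, 1.00100, 0.00150),
}

for name, (calc, expt, sigma) in benchmarks.items():
    ce = calc / expt                      # C/E ratio
    bias_pcm = (calc - expt) * 1e5        # bias in pcm
    n_sigma = (calc - expt) / sigma       # discrepancy in std devs
    print(f"{name}: C/E={ce:.5f}, bias={bias_pcm:+.0f} pcm, "
          f"{n_sigma:+.1f} sigma")
```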


2020 ◽  
Author(s):  
A. Kahler ◽  
A. Koning ◽  
C. Jouanne ◽  
D. Rochman ◽  
J. Leppanen ◽  
...  

Author(s):  
Tomáš Czakoj ◽  
Evžen Losa

The three-dimensional Monte Carlo code KENO-VI of the SCALE-6.2.2 code system was applied to criticality calculations of the LR-0 reactor core. A central module placed in the center of the core was filled with graphite, lithium fluoride-beryllium fluoride (FLIBE), and lithium fluoride-sodium fluoride (FLINA) compounds. The multiplication factor was obtained for all cases using both the ENDF/B-VII.0 and ENDF/B-VII.1 nuclear data libraries. The results were compared with benchmark calculations in MCNP6 using the ENDF/B-VII.0 library. The KENO-VI results are found to be in good agreement with those obtained by MCNP6; the discrepancies are typically within tens of pcm, excluding the case with the FLINA filling. Sensitivities and uncertainties of the reference case with no filling were determined by the continuous-energy version of the TSUNAMI sequence of SCALE-6.2.2. The obtained uncertainty in the multiplication factor due to the uncertainties in nuclear data is about 650 pcm with ENDF/B-VII.1.
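For reference, a discrepancy in pcm (per cent mille, 1 pcm = 1e-5) between two multiplication factors is commonly expressed as a reactivity difference. A minimal sketch; the k-eff values below are placeholders, not results from the paper:

```python
# Reactivity difference between two k-eff values, in pcm.
# The values are placeholders, not results from the paper.
def delta_rho_pcm(k_ref: float, k_test: float) -> float:
    """Reactivity difference (1/k_ref - 1/k_test) * 1e5."""
    return (1.0 / k_ref - 1.0 / k_test) * 1e5

k_mcnp6 = 0.99850   # hypothetical MCNP6 reference
k_keno  = 0.99878   # hypothetical KENO-VI result
print(f"discrepancy: {delta_rho_pcm(k_mcnp6, k_keno):+.0f} pcm")
```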


2011 ◽  
Vol 59 (2(3)) ◽  
pp. 1361-1364
Author(s):  
V. Jagannathan ◽  
U. Pal ◽  
R. Karthikeyan ◽  
A. Srivastava ◽  
S. A. Khan

Author(s):  
Jaroslav Zendulka

Modeling techniques play an important role in the development of database applications. Well-known entity-relationship modeling and its extensions have become a widely accepted approach for relational database conceptual design. The object-oriented approach has brought a new view of conceptual modeling: a class, as a fundamental concept of the object-oriented approach, encapsulates both data and behavior, whereas traditional relational databases are able to store only data. In the early 1990s, the difference between the relational and object-oriented (OO) technologies, which were, and still are, used together to build complex software systems, was labeled the object-relational impedance mismatch (Ambler, 2003). The object-oriented approach and the need of new application areas to store complex data have greatly influenced database technology since that time. Besides the appearance of object-oriented database systems, which fully implement the object-oriented paradigm in a database environment (Cattell et al., 2003), traditional relational database management systems have become object-relational (Stonebraker & Brown, 1999). The most recent versions of the SQL standard, SQL:1999 (Melton & Simon, 2001) and SQL:2003 (Eisenberg et al., 2004), introduced object-relational features to the standard, and leading database producers have already released packages which incorporate them.
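A minimal illustration of the impedance mismatch in Python (the class and table are invented for this example): the class bundles data with behavior, while the relational table can persist only the data, so the behavior must be re-attached when rows are mapped back to objects.

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Account:
    """A class encapsulates both data and behavior."""
    owner: str
    balance: float

    def withdraw(self, amount: float) -> None:  # behavior
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (owner TEXT, balance REAL)")

# ...but the relational table stores only the data part.
acct = Account("alice", 100.0)
conn.execute("INSERT INTO accounts VALUES (?, ?)",
             (acct.owner, acct.balance))

# Mapping rows back to objects restores the behavior.
row = conn.execute("SELECT owner, balance FROM accounts").fetchone()
restored = Account(*row)
restored.withdraw(25.0)
print(restored)   # Account(owner='alice', balance=75.0)
```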


2009 ◽  
pp. 2360-2383
Author(s):  
Guntis Barzdins ◽  
Janis Barzdins ◽  
Karlis Cerans

This chapter introduces the UML profile for OWL as an essential instrument for bridging the gap between legacy relational databases and OWL ontologies. We address one of the long-standing relational database design problems, where the initial conceptual model (a semantically clear domain conceptualization ontology) gets “lost” during conversion into the normalized database schema. The problem is that such a “loss” makes the database inaccessible for direct querying by domain experts familiar with the conceptual model only. This problem can be avoided by exporting the database into RDF according to the original conceptual model (OWL ontology) and formulating semantically clear queries in SPARQL over the RDF database. Through a detailed example we show how the UML/OWL profile facilitates this new and promising approach.
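A small sketch of the query side of this approach using Python's rdflib library; the ontology terms (ex:Employee, ex:worksFor) and the triples are invented for illustration, standing in for RDF exported from a relational database according to the conceptual model.

```python
# pip install rdflib
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/ontology#")

# Stand-in for RDF exported from the relational database
# according to the conceptual model (OWL ontology).
g = Graph()
g.add((EX.alice, RDF.type, EX.Employee))
g.add((EX.alice, EX.name, Literal("Alice")))
g.add((EX.alice, EX.worksFor, EX.research_dept))

# A semantically clear query phrased in the ontology's terms,
# not in terms of the normalized table layout.
query = """
    PREFIX ex: <http://example.org/ontology#>
    SELECT ?name WHERE {
        ?p a ex:Employee ;
           ex:name ?name ;
           ex:worksFor ex:research_dept .
    }
"""
for row in g.query(query):
    print(row.name)   # -> Alice
```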

