Oak Ridge
Recently Published Documents


TOTAL DOCUMENTS: 3128 (FIVE YEARS: 173)
H-INDEX: 47 (FIVE YEARS: 3)

Author(s):  
A. Hughes ◽  
D.H. Rood ◽  
D.E. DeVecchio ◽  
A.C. Whittaker ◽  
R.E. Bell ◽  
...  

The quantification of rates for the competing forces of tectonic uplift and erosion has important implications for understanding topographic evolution. Here, we quantify the complex interplay between tectonic uplift, topographic development, and erosion recorded in the hanging walls of several active reverse faults in the Ventura basin, southern California, USA. We use cosmogenic ²⁶Al/¹⁰Be isochron burial dating and ¹⁰Be surface exposure dating to construct a basin-wide geochronology, which includes burial dating of the Saugus Formation: an important, but poorly dated, regional Quaternary strain marker. Our ages for the top of the exposed Saugus Formation range from 0.36 +0.18/−0.22 Ma to 1.06 +0.23/−0.26 Ma, and our burial ages near the base of shallow marine deposits, which underlie the Saugus Formation, increase eastward from 0.60 +0.05/−0.06 Ma to 3.30 +0.30/−0.41 Ma. Our geochronology is used to calculate rapid long-term reverse fault slip rates of 8.6–12.6 mm yr⁻¹ since ca. 1.0 Ma for the San Cayetano fault and 1.3–3.0 mm yr⁻¹ since ca. 1.0 Ma for the Oak Ridge fault, both broadly consistent with contemporary reverse slip rates derived from mechanical models driven by global positioning system (GPS) data. We also calculate terrestrial cosmogenic nuclide (TCN)-derived, catchment-averaged erosion rates ranging from 0.05 to 1.14 mm yr⁻¹ and discuss the applicability of such rates in rapidly uplifting, landslide-prone landscapes. We compare patterns in erosion rates and tectonic rates to fluvial response times and geomorphic landscape parameters to show that in young, rapidly uplifting mountain belts, catchments may attain a quasi-steady state on timescales of <10⁵ years even if catchment-averaged erosion rates are still adjusting to tectonic forcing.
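The two-nuclide relation behind burial ages of this kind is compact enough to sketch. In the simplest case, the ²⁶Al/¹⁰Be ratio decays from its surface production value at the difference of the two decay constants. The minimal Python sketch below uses commonly cited half-lives and a production ratio of ≈6.75; these constants are literature values rather than numbers from this paper, and real isochron burial dating involves corrections (post-burial production, inheritance) that this toy calculation omits.

```python
import numpy as np

# Commonly used constants (assumptions, not values from this paper)
T_HALF_BE10 = 1.387   # 10Be half-life, Myr
T_HALF_AL26 = 0.705   # 26Al half-life, Myr
R0 = 6.75             # 26Al/10Be surface production ratio

LAM_BE = np.log(2) / T_HALF_BE10   # decay constants, 1/Myr
LAM_AL = np.log(2) / T_HALF_AL26

def simple_burial_age(ratio):
    """Burial age (Myr) from a measured 26Al/10Be ratio, assuming a
    single rapid burial event and no post-burial nuclide production."""
    return np.log(R0 / ratio) / (LAM_AL - LAM_BE)

print(f"{simple_burial_age(4.5):.2f} Myr")  # a measured ratio of 4.5 gives ~0.84 Myr
```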


Author(s):  
I Wayan Sutrisna Yasa ◽  
Komang Tri Werthi ◽  
I Putu Satwika

This study discusses the application of the Analytical Hierarchy Process (AHP) as a decision support system, with a case study on selecting the best lecturer. The study also computes the consistency ratio using several random index values established by earlier researchers: those of Saaty, Noble, Oak Ridge, Golden Wang, Tumala Wan, Aguaron, and Alonso Lamata. Four criteria are used in this case: Education, Research, Community Service, and Support. Each criterion has an initial weight: Education 40%, Research 25%, Community Service 25%, and Support 10%. The aim of this study is to perform the calculation using the AHP method and to measure the consistency ratio obtained with the random index values of Saaty, Noble, Oak Ridge, Golden Wang, Tumala Wan, Aguaron, and Alonso Lamata in determining the best lecturer. The research was conducted at the STMIK Primakara campus, Jalan Tukad Badung No. 135 Renon, Denpasar, Bali. Keywords — Decision Support System, Analytical Hierarchy Process (AHP), Best Lecturer, STMIK Primakara.
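The consistency check the abstract describes reduces to CR = CI / RI, where CI = (λmax − n)/(n − 1) and RI is the random index being compared. The Python sketch below shows the mechanics, assuming an invented 4×4 pairwise comparison matrix (the abstract does not reproduce its matrices); Saaty's RI for n = 4 is the standard 0.90, the Alonso-Lamata entry is approximate, and the other tables the paper compares (Noble, Oak Ridge, Golden Wang, Tumala Wan, Aguaron) should be taken from the original sources.

```python
import numpy as np

# Illustrative 4x4 pairwise comparison matrix for the criteria
# (Education, Research, Community Service, Support); entries are
# invented for demonstration, not taken from the paper.
A = np.array([
    [1.0,  2.0, 2.0, 4.0],
    [0.5,  1.0, 1.0, 3.0],
    [0.5,  1.0, 1.0, 3.0],
    [0.25, 1/3, 1/3, 1.0],
])
n = A.shape[0]

# Priority weights: normalized principal eigenvector
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency index CI = (lambda_max - n) / (n - 1)
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)

# Random index for n = 4; only Saaty's value is quoted with confidence,
# the Alonso-Lamata figure is approximate.
for name, RI in {"Saaty": 0.90, "Alonso-Lamata (approx.)": 0.8815}.items():
    print(f"{name}: CR = {CI / RI:.4f}  (conventionally consistent if < 0.10)")
```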


2021 ◽  
Vol 2122 (1) ◽  
pp. 011001

Abstract Thirty-three years ago, because of the dramatic increase in the power and utility of computer simulations, The University of Georgia formed the first institutional unit devoted to the application of simulations in research and teaching: the Center for Simulational Physics. Then, as the international simulations community expanded further, we sensed the need for a meeting place where both experienced simulators and newcomers could discuss inventive algorithms and recent results in an environment that promoted lively discussion. As a consequence, the Center for Simulational Physics established an annual workshop series on Recent Developments in Computer Simulation Studies in Condensed Matter Physics. This year’s highly interactive workshop was the 32nd in the series, marking our efforts to promote high-quality research in simulational physics. The continued interest shown by the scientific community amply demonstrates the useful purpose that these meetings have served. The latest workshop was held at The University of Georgia from February 18–22, 2019. These Proceedings provide a “status report” on a number of important topics. This on-line “volume” is published with the goal of timely dissemination of the material to a wider audience. These Proceedings contain both invited papers and contributed presentations on problems in both classical and quantum condensed matter physics. The Workshop was prefaced by a special tutorial, presented by colleagues from Oak Ridge National Laboratory, on a powerful software suite: OWL (Oak Ridge Wang-Landau). The first manuscript in these Proceedings is devoted to this tutorial material. The Workshop topics, as usual, ranged from hard and soft condensed matter to biologically inspired problems and purely methodological advances. We hope that readers will benefit from specialized results as well as profit from exposure to new algorithms, methods of analysis, and conceptual developments. D. P. Landau, M. Bachmann, S. P. Lewis, and H.-B. Schüttler


2021 ◽  
Vol 2122 (1) ◽  
pp. 012001
Author(s):  
Ying Wai Li ◽  
Krishna Chaitanya Pitike ◽  
Markus Eisenbach ◽  
Valentino R. Cooper

Abstract The Oak Ridge Wang-Landau (OWL) package is open-source scientific software specialized for large-scale Monte Carlo simulations for the study of materials properties at finite temperature. In this paper, we discuss the main features and capabilities of OWL, followed by detailed descriptions of building and running the code. Readers are guided through the usage and functionality of the code with a few hands-on examples. This paper is based on a tutorial on OWL given at the 32nd Center for Simulational Physics Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics.
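For readers new to the method OWL implements, a bare-bones illustration of Wang-Landau sampling may help orient the tutorial. The sketch below estimates the density of states of a tiny 2D Ising model by flat-histogram sampling; it is a toy rendering of the algorithm, not OWL's API, and the lattice size, flatness threshold (80%), and stopping criterion are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4                     # tiny periodic lattice so the sketch runs quickly
N = L * L
spins = rng.choice([-1, 1], size=(L, L))

def energy(s):
    """Total energy of a periodic 2D Ising lattice (J = 1)."""
    return -int(np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1))))

levels = np.arange(-2 * N, 2 * N + 1, 4)       # possible energy levels
idx = {int(e): i for i, e in enumerate(levels)}
log_g = np.zeros(len(levels))                   # running estimate of ln g(E)
hist = np.zeros(len(levels))
lnf = 1.0                                       # modification factor ln f

E = energy(spins)
while lnf > 1e-4:
    for _ in range(20000):
        i, j = rng.integers(L, size=2)
        spins[i, j] *= -1                       # propose a spin flip
        E_new = energy(spins)
        # accept with prob min(1, g(E)/g(E_new)) to flatten the histogram
        if np.log(rng.random()) < log_g[idx[E]] - log_g[idx[E_new]]:
            E = E_new
        else:
            spins[i, j] *= -1                   # reject: undo the flip
        log_g[idx[E]] += lnf
        hist[idx[E]] += 1
    visited = hist > 0                          # a few levels are unreachable
    if hist[visited].min() > 0.8 * hist[visited].mean():
        lnf /= 2.0                              # histogram "flat enough": refine
        hist[:] = 0

print(log_g - log_g[idx[-2 * N]])               # ln g(E) relative to the ground state
```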


Energies ◽  
2021 ◽  
Vol 14 (21) ◽  
pp. 7092
Author(s):  
Hany Abdel-Khalik ◽  
Dongli Huang ◽  
Ugur Mertyurek ◽  
William Marshall ◽  
William Wieselquist

To establish confidence in the results of computerized physics models, a key regulatory requirement is to develop a scientifically defensible process. The methods employed for confidence, characterization, and consolidation, or C3, are statistically involved and are often accessible only to avid statisticians. This manuscript serves as a pedagogical presentation of the C3 process to all stakeholders, including researchers, industrial practitioners, and regulators, to impart an intuitive understanding of the key concepts and mathematical methods entailed by C3. The primary focus is on the calculation of tolerance limits, which is the overall goal of the C3 process. Tolerance limits encode the confidence in the calculation results as communicated to the regulator. Understanding the C3 process is especially critical today, as the nuclear industry is considering more innovative ways to assess new technologies, including new reactor and fuel concepts, via an integrated approach that optimally combines modeling and simulation with minimal targeted validation experiments. This manuscript employs intuitive, analytical, numerical, and visual representations to explain how tolerance limits may be calculated for a wide range of configurations, and it also describes how their values may be interpreted. Various verification tests have been developed to test the calculated tolerance limits and to help delineate their values. The manuscript demonstrates the calculation of tolerance limits for TSURFER, a computer code developed by Oak Ridge National Laboratory for criticality safety applications. The goal is to evaluate the tolerance limit for TSURFER-determined criticality biases to support the determination of upper subcritical limits for regulatory purposes.
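To make the central quantity concrete, here is a minimal Python sketch of a one-sided 95%/95% tolerance limit on synthetic data, using the textbook normal k-factor (via the noncentral t distribution) alongside the nonparametric Wilks rule. This illustrates the general statistical concept the abstract discusses, not TSURFER's implementation; the synthetic "biases" are invented inputs.

```python
import numpy as np
from scipy import stats

def normal_upper_tolerance(x, coverage=0.95, confidence=0.95):
    """One-sided upper tolerance limit for a normal sample: with the given
    confidence, at least `coverage` of the population lies below the limit."""
    n = len(x)
    z_p = stats.norm.ppf(coverage)
    # exact one-sided k-factor via the noncentral t distribution
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return np.mean(x) + k * np.std(x, ddof=1)

# Nonparametric (Wilks) check: the maximum of n >= 59 samples is itself a
# 95/95 upper limit, with no distributional assumption (1 - 0.95**59 > 0.95).
rng = np.random.default_rng(1)
biases = rng.normal(loc=0.0, scale=0.005, size=59)  # synthetic example data
print(normal_upper_tolerance(biases), biases.max())
```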


2021 ◽  
Author(s):  
Chloé Lerin ◽  
Scott Curran ◽  
Melanie Moses-DeBusk ◽  
Adian Cook ◽  
Vicente Boronat Colomer ◽  
...  

Abstract Hybrid electric powertrains are a growing market in medium- and heavy-duty applications. There is a lack of available information on the challenges of integrating engine platforms into electrified powertrains, such as the effects of cold-start, restart, and load reduction on emissions and emission control devices. Results from the Heavy Heavy-Duty Diesel Truck (HHDDT) cycle using a conventional medium-duty diesel engine were compared with those of a parallel hybrid architecture. Oak Ridge National Laboratory, in collaboration with the US Department of Energy and Odyne Systems, LLC, developed a powertrain in a hardware-in-the-loop environment by integrating the Odyne medium-duty parallel hybrid system, which was used for the hybrid portion of this study. Experiments under the HHDDT cycle showed increasing improvements in fuel consumption and engine-out emissions with the integration of stop/start, hybrid, and hybrid with stop/start. However, the effects of load reduction and lower exhaust temperature on the thermal management strategy produced an increase in fueling in the second part of the HHDDT cycle. Four configurations of medium-duty electrification were studied, contributing to a unique data set containing combustion, emissions, and system integration data. Each electrification level was compared with the conventional baseline. The calibration of the conventional engine was not altered for this study. Opportunities to tailor the combustion process were identified with the stop/start strategy.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Vladimir Sobes ◽  
Briana Hiscox ◽  
Emilian Popov ◽  
Rick Archibald ◽  
Cory Hauck ◽  
...  

Abstract The authors developed an artificial intelligence (AI)-based algorithm for the design and optimization of a nuclear reactor core based on a flexible geometry and demonstrated a 3× improvement in the selected performance metric: temperature peaking factor. The rapid development of advanced manufacturing, specifically additive manufacturing (3-D printing), and its introduction into advanced nuclear core design through the Transformational Challenge Reactor program have presented the opportunity to explore the arbitrary-geometry design of nuclear-heated structures. The primary challenge is that the arbitrary-geometry design space is vast and requires the computational evaluation of many candidate designs, while the multiphysics simulation of nuclear systems is very time-intensive. Therefore, the authors developed a machine learning-based multiphysics emulator and evaluated thousands of candidate geometries on Summit, Oak Ridge National Laboratory’s leadership-class supercomputer. The results presented in this work demonstrate temperature distribution smoothing in a nuclear reactor core through the manipulation of the geometry, which is traditionally achieved in light water reactors through variable assembly loading in the axial direction and fuel shuffling during refueling in the radial direction. The conclusions discuss the future implications for nuclear systems design with arbitrary geometry and the potential for AI-based autonomous design algorithms.
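The emulator-in-the-loop pattern the abstract describes can be sketched generically: train a cheap surrogate on a handful of expensive multiphysics runs, screen many candidate geometries with it, and send only the best back to full physics. The Python sketch below is a hedged illustration of that pattern under invented assumptions; the `expensive_simulation` placeholder, the 8-parameter geometry encoding, and the random-forest surrogate are stand-ins, not the paper's actual emulator or workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_simulation(x):
    # placeholder for a costly multiphysics solve returning a peaking factor
    return 1.0 + np.sum((x - 0.5) ** 2) + 0.01 * rng.normal()

# A modest training set of "expensive" evaluations
X_train = rng.random((200, 8))                 # hypothetical geometry parameters
y_train = np.array([expensive_simulation(x) for x in X_train])

# Cheap surrogate standing in for the paper's ML-based multiphysics emulator
emulator = RandomForestRegressor(n_estimators=200, random_state=0)
emulator.fit(X_train, y_train)

# Screen thousands of candidates at negligible cost; keep the best few
candidates = rng.random((10000, 8))
scores = emulator.predict(candidates)
best = candidates[np.argsort(scores)[:10]]     # lowest predicted peaking factor
print(scores.min())
```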


Chemosphere ◽  
2021 ◽  
Vol 280 ◽  
pp. 130629
Author(s):  
Runwei Li ◽  
Lin Qi ◽  
Victor Ibeanusi ◽  
Veera Badisa ◽  
Scott Brooks ◽  
...  

2021 ◽  
Vol 104 (4) ◽  
pp. 003685042110549
Author(s):  
Henry K. Obeng ◽  
Sylvester A. Birikorang ◽  
Kwame Gyamfi ◽  
Simon Adu ◽  
Andrew Nyamful

The International Atomic Energy Agency defines a nuclear and radiation accident as an occurrence that leads to the release of radiation with significant consequences for people, the environment, or the facility. During such an event involving a nuclear reactor, the reactor core is a critical component that, when damaged, will lead to the release of significant amounts of radionuclides. Assessment of the radiation effects that emanate from reactor accidents is paramount for the safety of people and the environment: whether or not the released radiation causes an exposure rate above the recommended threshold for nuclear reactor safety. During safety analysis in the nuclear industry, radiological accident analyses are usually carried out based on hypothetical scenarios. Such assessments mostly define the effects associated with the accident and when and how to apply the appropriate safety measures. In this study, a typical radiological assessment was carried out for the Ghana Research Reactor-1. The study considered the available reactor core inventory, the released radionuclides, the radiation doses, and the detailed process of obtaining all the aforementioned parameters. The Oak Ridge Isotope Generation-2 (ORIGEN-2) code was used for core inventory calculations, and HotSpot 3.01 was used to model the radionuclide dispersion trajectory and calculate the released doses. The radionuclides considered include I-131, Sr-90, Cs-137, and Xe-137. The total effective dose equivalent from the released radionuclides, the ground deposition activity, and the respiratory time-integrated air concentration were estimated. The maximum total effective dose equivalent of 5.6 × 10⁻⁹ Sv was estimated to occur at 0.1 km from the point of release. The maximum ground deposition activity was estimated to be 2.5 × 10⁻³ kBq/m² at a distance of 0.1 km from the release point. All the estimated values were found to be far below the annual regulatory limit of 1 mSv for the general public as stated in IAEA BSS GSR Part 3.
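HotSpot belongs to the Gaussian-plume family of dispersion codes, and the plume equation itself is short enough to sketch. The Python example below computes the ground-level centerline air concentration with ground reflection; the dispersion coefficients are the Briggs rural class-D fits and the inputs are invented, so this illustrates the model family only, not HotSpot's internal implementation or this study's source term.

```python
import numpy as np

# Briggs open-country (rural) class-D dispersion fits; treat these
# coefficients as assumptions, not HotSpot's internal values.
def sigma_y(x):
    return 0.08 * x / np.sqrt(1 + 0.0001 * x)

def sigma_z(x):
    return 0.06 * x / np.sqrt(1 + 0.0015 * x)

def ground_concentration(Q, u, H, x):
    """Ground-level centerline air concentration (Bq/m^3) at downwind
    distance x (m), for release rate Q (Bq/s), wind speed u (m/s), and
    effective release height H (m), including ground reflection."""
    sy, sz = sigma_y(x), sigma_z(x)
    return Q / (np.pi * sy * sz * u) * np.exp(-H**2 / (2 * sz**2))

# Invented example inputs: 1 GBq/s release, 3 m/s wind, 10 m stack, 100 m downwind
print(ground_concentration(Q=1e9, u=3.0, H=10.0, x=100.0))
```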

