GEOSX: A Multiphysics, Multilevel Simulator Designed for Exascale Computing

2021
Author(s):  
Herve Gross ◽  
Antoine Mazuyer

Abstract Evaluating large basin-scale formations for CO2 sequestration is one of the most important challenges facing our industry. The technical complexity and the quantification of risks associated with these operations call for new reservoir engineering and reservoir simulation tools. The impact of multiple coupled physical phenomena, the century timescale, and the basin-sized models involved force us to take apart and revisit the numerical backbone of existing simulation tools: we need a reservoir simulator designed for scalability and portability on high-performance computing architectures. To achieve this, we propose a new, open-source, multiphysics, multilevel simulation tool called GEOSX, jointly created by Lawrence Livermore National Laboratory, Stanford University, and Total. It is designed for scalability across multiple CPUs and GPUs and offers a suite of physics solvers that can be extended easily while balancing performance and portability. GEOSX initially targets multiphysics simulations coupling geomechanics, flow, and transport, but its open architecture also exposes high-performance physics solvers as building blocks for other multiphysics problems and provides users with a suite of tools for numerical optimization across platforms. In this paper, we introduce GEOSX, expose its fundamental architectural principles, and show an example of modeling geological CO2 sequestration on real data. We demonstrate our ability to simulate fluid and rock poromechanical interactions over long time periods and basin-scale dimensions. GEOSX proves useful for such complex and large problems, and scalable and portable across multiple high-performance systems.
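The abstract describes GEOSX's architecture only at a high level, so the sketch below is a hedged illustration, in Python, of the sequential coupling pattern such a flow/geomechanics simulator typically exposes. It is not GEOSX source code (GEOSX is a C++ code driven by XML input decks); every name and placeholder update rule in it is a hypothetical stand-in for the coupling idea, in which each single-physics solver stays a reusable building block.

```python
# Illustrative sketch of sequentially coupled flow/mechanics time stepping.
# NOT GEOSX source code; all names and "physics" below are placeholders.

def flow_step(pressure, displacement, dt):
    """Advance pore pressure one time step with the skeleton held fixed."""
    return pressure + dt * (1.0 - 0.05 * displacement)   # placeholder physics

def mechanics_step(pressure, displacement):
    """Re-equilibrate rock deformation for the updated pressure field."""
    return 0.5 * displacement + 0.01 * pressure           # placeholder physics

def run_coupled(pressure, displacement, t_end, dt, tol=1e-6, max_outer=25):
    t = 0.0
    while t < t_end:
        p_old = pressure
        for _ in range(max_outer):                 # outer coupling iterations
            p_new = flow_step(p_old, displacement, dt)
            u_new = mechanics_step(p_new, displacement)
            change = abs(u_new - displacement)     # schematic coupling residual
            pressure, displacement = p_new, u_new
            if change < tol:
                break
        t += dt
    return pressure, displacement

print(run_coupled(pressure=10.0, displacement=0.0, t_end=100.0, dt=1.0))
```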

1987
Vol 112
Author(s):  
Daniel B. Bullen ◽  
Gregory E. Gdowski ◽  
R. Daniel McCright

Abstract The Nuclear Waste Management Program at Lawrence Livermore National Laboratory is responsible for developing the waste package design to meet Nuclear Regulatory Commission licensing requirements for the Nevada Nuclear Waste Storage Investigations (NNWSI) Project. The metallic container component of the waste package is required to assist in providing substantially complete containment of the waste for a period of up to 1000 years. Long-term phase stability of the austenitic candidate materials (304L and 316L stainless steels and alloy 825) over this time period at moderate temperatures (100–250°C) can impact the mechanical and corrosion behavior of the metal barrier. A review of the technical literature on the phase stability of 304L, 316L, and 825 is presented. The impact of martensitic transformations, carbide precipitation, and intermediate (σ, χ, and η) phase formation on the mechanical properties and corrosion behavior of these alloys at repository-relevant conditions is discussed. The effect of sensitization on intergranular stress corrosion cracking (IGSCC) of each alloy is also addressed. A summary of the impact of phase stability on the degradation of each alloy in the proposed repository environment is included.
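The abstract does not say how 1000-year exposures at 100–250°C are related to short-term laboratory data, but a standard device for such comparisons is an Arrhenius time-temperature equivalence for diffusion-controlled processes such as carbide precipitation. The sketch below shows only the form of that estimate; the activation energy and the reference exposure are assumed, illustrative values, not results from the paper.

```python
# Hedged illustration of an Arrhenius time-temperature extrapolation.
# The activation energy and reference exposure are assumed placeholders.

import math

R = 8.314    # gas constant, J/(mol*K)
Q = 250e3    # ASSUMED activation energy, J/mol (illustrative only)

def equivalent_time(t_ref_hours, T_ref_C, T_target_C):
    """Time at T_target giving the same thermal exposure as t_ref at T_ref."""
    T_ref = T_ref_C + 273.15
    T_tgt = T_target_C + 273.15
    return t_ref_hours * math.exp(Q / R * (1.0 / T_tgt - 1.0 / T_ref))

# Example: 100 h of sensitization at 650 C mapped down to a 250 C container wall.
hours = equivalent_time(100.0, 650.0, 250.0)
print(f"equivalent exposure at 250 C: {hours:.2e} h (~{hours / 8766:.1e} years)")
```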


2017
Vol 27 (03n04)
pp. 1750006
Author(s):  
Farhad Merchant ◽  
Anupam Chattopadhyay ◽  
Soumyendu Raha ◽  
S. K. Nandy ◽  
Ranjani Narayan

Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) form basic building blocks for several High Performance Computing (HPC) applications and hence dictate the performance of those applications. Performance in such tuned packages is attained by tuning several algorithmic and architectural parameters, such as the number of parallel operations in the Directed Acyclic Graph of the BLAS/LAPACK routines, the sizes of the memories in the memory hierarchy of the underlying platform, the memory bandwidth, and the structure of the compute resources in the underlying platform. In this paper, we closely investigate the impact of the Floating Point Unit (FPU) micro-architecture on performance tuning of BLAS and LAPACK. We present a theoretical analysis of pipeline depth for different floating point operations (multiplier, adder, square root, and divider), followed by a characterization of BLAS and LAPACK to determine the parameters required by the theoretical framework for deciding the optimum pipeline depth of the floating point units. A simple design of a Processing Element (PE) is presented and shown to outperform the most recent custom realizations of BLAS and LAPACK by 1.1x to 1.5x in GFlops/W and 1.9x to 2.1x in GFlops/mm². Compared with multicore, General Purpose Graphics Processing Unit (GPGPU), Field Programmable Gate Array (FPGA), and ClearSpeed CSX700 platforms, a performance improvement of 1.8x to 80x is reported for the PE.
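As a rough illustration of the pipeline-depth trade-off analyzed in the paper, the toy model below balances a shorter clock period against stalls on dependent operations. The delay constants and the dependent-operation fraction are assumptions made up for this sketch, not values from the paper's BLAS/LAPACK characterization.

```python
# Toy model: deeper FPU pipelines shorten the clock period, but dependent
# operations must wait out the full pipeline latency. All constants are
# illustrative assumptions.

def throughput(depth, logic_delay_ns=5.0, latch_overhead_ns=0.2, dep_fraction=0.3):
    """Sustained results per nanosecond for a pipeline of the given depth."""
    cycle_ns = logic_delay_ns / depth + latch_overhead_ns      # clock period
    # Independent ops issue every cycle; dependent ops wait out the latency.
    cycles_per_op = (1.0 - dep_fraction) + dep_fraction * depth
    return 1.0 / (cycles_per_op * cycle_ns)

for d in (1, 2, 4, 8, 16):
    print(f"depth {d:2d}: {throughput(d):.3f} results/ns")

best = max(range(1, 33), key=throughput)
print("optimum depth under these assumptions:", best)
```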


Author(s):  
John W. Pastrnak

Researchers at Lawrence Livermore National Laboratory are developing a high-performance, filament-wound composite firing vessel intended to contain one-time detonations of explosive assemblies that contain toxic metals and gaseous by-products. A 2-meter-diameter pressure vessel is being designed to contain up to 80 lb TNT-equivalent explosive without leakage. Because of the difficulty of assuring good o-ring sealing for explosive-generated dynamic pressures in excess of 40,000 psig (280 MPa), multiple seals in series are used at the vessel openings. To assess and monitor the integrity of these seals during actual detonations within the vessel, miniature pressure and gas sample measurements were made on the interstitial volume between the o-ring seals. Recent results from this prototype monitoring system indicated that at least two of the seven o-ring seals were required to adequately prevent transient leakage of toxic particulates during test series CVD-2a, as evidenced by mass-spectrograph measurements of the 10% argon vessel pre-charge used as a fiducial indicator gas and later confirmed by particulate swipes for metals.
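To make the fiducial-gas idea concrete, the small calculation below shows how a known argon pre-charge lets measured argon in the interstitial volume be converted into the fraction of sampled gas that bypassed the inner seal. Only the 10% pre-charge comes from the abstract; the atmospheric background and the example reading are assumed values.

```python
# Illustration of the fiducial-gas accounting behind the seal monitoring:
# argon above atmospheric background in the interstitial sample must have
# come from the pre-charged vessel interior. Numbers other than the 10%
# pre-charge are assumed for illustration.

AR_ATMOSPHERIC = 0.0093   # natural argon mole fraction in air
AR_PRECHARGE = 0.10       # argon fraction pre-charged into the vessel

def leaked_fraction(ar_measured):
    """Fraction of the interstitial gas sample that originated inside the vessel."""
    excess = max(ar_measured - AR_ATMOSPHERIC, 0.0)
    return excess / (AR_PRECHARGE - AR_ATMOSPHERIC)

# e.g. a hypothetical mass-spectrometer reading of 1.2% argon in the sample:
print(f"{leaked_fraction(0.012):.1%} of the sampled gas came from the vessel")
```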


1990
Vol 203
Author(s):  
Barry C. Johnson

ABSTRACT High Performance Integrated Circuits form the basic building blocks of modern electronic systems that are designed to process ever larger numbers of electrical signals at greater signal velocity and fidelity. In such applications, each circuit must be packaged in order to provide it with the necessary mechanical support, environmental protection, electrical interconnection, and thermal cooling. The package, however, can also impose certain constraints on the chip: it can degrade electrical performance, add size and weight, introduce reliability problems, and increase cost. Thus, packaging can be viewed as a complex balance between the provision of desired functions and the reduction of associated constraints. The ability to strike a proper balance has become increasingly difficult in recent years due to the relentless march of integrated circuits toward higher levels of complexity, size, speed, heat flux, and customization. It is anticipated that the continuing evolution of high performance circuits and systems will soon be limited by package designs and materials of construction, rather than by the devices on the semiconductor chip. The intent of this talk is to provide a brief overview of high performance packaging and the related materials issues. The approach is to (a) present the forecasted trends in relevant circuit performance characteristics, (b) discuss the impact of these characteristics on current chip- and board-level packaging methods, and (c) present new package and materials concepts that might furnish potential solutions to the developing circuit-package performance gap.
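As a minimal illustration of the thermal constraint described above, the sketch below estimates junction temperature from a simple series thermal-resistance chain. The resistances, ambient temperature, and temperature limit are generic assumed values, not figures from the talk.

```python
# Minimal sketch of the packaging thermal budget: the package must keep the
# junction temperature within limits as chip power grows. All values below
# are assumed, generic numbers used only for illustration.

def junction_temp(power_w, r_junction_case=0.5, r_case_ambient=1.5, t_ambient=45.0):
    """Steady-state junction temperature (C) for a series thermal-resistance chain."""
    return t_ambient + power_w * (r_junction_case + r_case_ambient)

for p in (10, 25, 50, 100):
    tj = junction_temp(p)
    flag = "OK" if tj <= 110.0 else "exceeds assumed 110 C limit"
    print(f"{p:4d} W -> Tj = {tj:6.1f} C  ({flag})")
```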


2010
Vol 132 (2)
Author(s):  
Roger Schmidt ◽  
Madhusudan Iyengar ◽  
Joe Caricari

With the ever-increasing heat dissipated by information technology (IT) equipment housed in data centers, it is becoming more important to project the changes that occur in a data center as newer, higher-powered hardware is installed. The computational fluid dynamics (CFD) software that is available has improved over the years, and CFD software specific to data center thermal analysis has also been developed. This has shortened the time needed to provide quick analyses of the effects of installing new hardware in the data center. But it is critically important that this software give the user an accurate picture of the effects of adding the new hardware. The purpose of this paper is to examine a large cluster installation and compare the CFD analysis with environmental measurements obtained from the same site. The paper presents measurements and CFD results for racks with power levels as high as 27 kW, clustered such that heat fluxes in some regions of the data center exceeded 700 W per square foot. It describes the thermal profile of a high performance computing cluster located in a data center and a comparison of that cluster modeled via CFD. The high performance advanced simulation and computing (ASC) cluster had a peak performance of 77.8 TFlop/s and employed more than 12,000 processors, 50 Tbytes of memory, and 2 Pbytes of globally accessible disk space. The cluster was first tested in the manufacturer’s development laboratory in Poughkeepsie, New York, and then shipped to Lawrence Livermore National Laboratory in Livermore, California, where it was installed to support the national security mission of the U.S. Detailed measurements were taken in both data centers and were previously reported. The Poughkeepsie results are reported here along with a comparison to CFD modeling results. In some areas of the Poughkeepsie data center, regions exceeded the equipment inlet air temperature specifications by a significant amount. These areas are highlighted, and reasons are given for why they failed to meet the criteria. The modeling results by region showed trends that compared somewhat favorably with the measurements, but some rack thermal profiles deviated quite significantly from them.
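A useful sanity check behind both the measurements and the CFD model is the energy balance Q = m_dot * cp * dT relating rack power to delivered airflow and exhaust temperature rise. The sketch below applies it to the 27 kW rack figure quoted in the abstract; the airflow values are assumed for illustration.

```python
# Back-of-envelope airflow/temperature balance for a high-powered rack.
# The 27 kW figure comes from the abstract; airflow values are assumed.

RHO_AIR = 1.16    # kg/m^3, near sea level and ~30 C
CP_AIR = 1005.0   # J/(kg*K)

def exhaust_rise(power_kw, airflow_m3_per_s):
    """Air temperature rise (K) across a rack for a given delivered airflow."""
    m_dot = RHO_AIR * airflow_m3_per_s
    return (power_kw * 1000.0) / (m_dot * CP_AIR)

for cfm in (2000, 3000, 4000):          # assumed delivered airflows
    m3s = cfm * 0.000471947             # convert CFM to m^3/s
    print(f"{cfm} CFM -> dT = {exhaust_rise(27.0, m3s):.1f} K")
```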


2008
Vol 86 (1)
pp. 231-240
Author(s):  
F S Porter ◽  
B R Beck ◽  
P Beiersdorfer ◽  
K R Boyce ◽  
G V Brown ◽  
...  

NASA’s X-ray spectrometer (XRS) microcalorimeter instrument has been operating at the electron beam ion trap (EBIT) facility at Lawrence Livermore National Laboratory since July of 2000. The spectrometer is currently undergoing its third major upgrade to become an easy-to-use, extremely high-performance instrument for a broad range of EBIT experiments. The spectrometer itself is broadband, capable of operating simultaneously from 0.1 to 12 keV, and has been operated at up to 100 keV by manipulating its operating conditions. The spectral resolution closely follows that of the spaceflight version of the XRS: 10 eV FWHM at 6 keV in 2000, upgraded to 5.5 eV in 2003, and expected to reach ~3.8 eV in the fall of 2007. Here we review the operating principles of this unique instrument, the extraordinary science that has been performed at EBIT over the last six years, and prospects for future upgrades. Specifically, we discuss upgrades to cover the high-energy band (to at least 100 keV) with a high-quantum-efficiency detector and prospects for using a new superconducting detector to reach 0.8 eV resolution at 1 keV and 2 eV at 6 keV with high counting rates. PACS Nos.: 52.25.Os, 52.70.La, 95.85.Nv, 32.30.Rj, 07.85.Fv, 78.70.En
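For context, the quoted resolutions can be restated as resolving power E/ΔE; the short calculation below simply does that arithmetic for the milestones listed in the abstract.

```python
# Resolving power E / dE(FWHM) for the resolution milestones quoted above.

milestones = {
    "2000 baseline":             (6000.0, 10.0),
    "2003 upgrade":              (6000.0, 5.5),
    "planned upgrade (~2007)":   (6000.0, 3.8),
    "future detector at 1 keV":  (1000.0, 0.8),
    "future detector at 6 keV":  (6000.0, 2.0),
}

for label, (energy_ev, fwhm_ev) in milestones.items():
    print(f"{label:>26s}: E/dE = {energy_ev / fwhm_ev:,.0f}")
```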


2007
Vol 17 (03)
pp. 485-494
Author(s):  
SHAIKH AHMED ◽  
GERHARD KLIMECK ◽  
DERRICK KEARNEY ◽  
MICHAEL MCLENNAN ◽  
M. P. ANANTRAM

Undesirable short-channel effects associated with the relentless downscaling of conventional CMOS devices have led to the emergence of new classes of MOSFETs, which in turn pose new and unprecedented challenges in computational nanoelectronics. Device sizes have already reached the level of tens of nanometers, where the quantum nature of charge carriers dominates device operation and performance. The goal of this paper is to describe an ongoing initiative on nanoHUB.org to provide new models, algorithms, approaches, and a comprehensive suite of freely available web-based simulation tools for nanoscale devices with capabilities not yet available commercially. Three software packages, nanoFET, nanoMOS, and QuaMC, are benchmarked in the simulation of a widely studied, high-performance novel MOSFET device. The impact of quantum mechanical effects on the device properties is elucidated, and key design issues are identified.
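A quick way to see why quantum effects dominate at these dimensions is to compare the thermal de Broglie wavelength of a channel electron with a ~10 nm feature size. The sketch below uses an assumed textbook effective mass for silicon; it is an order-of-magnitude illustration, not a parameter taken from the benchmarked tools.

```python
# Thermal de Broglie wavelength estimate for a channel electron at 300 K.
# The silicon effective mass is an assumed textbook value.

import math

H = 6.626e-34       # Planck constant, J*s
KB = 1.381e-23      # Boltzmann constant, J/K
M0 = 9.109e-31      # electron rest mass, kg
M_EFF = 0.26 * M0   # assumed conductivity effective mass in silicon

def thermal_de_broglie_nm(temperature_k=300.0):
    """lambda = h / sqrt(2*pi*m*kB*T), returned in nanometers."""
    lam = H / math.sqrt(2.0 * math.pi * M_EFF * KB * temperature_k)
    return lam * 1e9

print(f"lambda ~ {thermal_de_broglie_nm():.1f} nm vs a ~10 nm channel feature")
```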


1997
Vol 77 (03)
pp. 504-509
Author(s):  
Sarah L Booth ◽  
Jacqueline M Charnley ◽  
James A Sadowski ◽  
Edward Saltzman ◽  
Edwin G Bovill ◽  
...  

Summary Case reports cited in Medline or Biological Abstracts (1966-1996) were reviewed to evaluate the impact of dietary vitamin K1 intake on the stability of anticoagulant control in patients using coumarin derivatives. Reported nutrient-drug interactions cannot always be explained by the vitamin K1 content of the food items. However, metabolic data indicate that a consistent dietary intake of vitamin K is important for attaining a daily equilibrium in vitamin K status. We report a diet that provides a stable intake of vitamin K1, equivalent to the current U.S. Recommended Dietary Allowance, using food composition data derived from high-performance liquid chromatography. Inconsistencies in the published literature indicate that prospective clinical studies should be undertaken to clarify the putative dietary vitamin K1-coumarin interaction. The dietary guidelines reported here may be used in such studies.

