A Massively Parallel Reservoir Simulator on the GPU Architecture

2021
Author(s):
Usuf Middya
Abdulrahman Manea
Maitham Alhubail
Todd Ferguson
Thomas Byer
...  

Abstract Reservoir-simulation computational costs have been growing continuously due to high-resolution reservoir characterization, increasing model complexity, and uncertainty-analysis workflows. Reducing simulation costs by upscaling is often necessary to meet operational requirements. Fast-evolving HPC technologies offer opportunities to reduce cost without compromising fidelity. This work presents a novel in-house massively parallel full-physics reservoir simulator running on the emerging GPU architecture. Almost all of the simulation kernels have been designed and implemented to honor the GPU SIMD programming paradigm. These kernels include physical-property calculations, phase-equilibrium computations, Jacobian construction, linear and nonlinear solvers, and wells. Novel techniques are devised in various kernels to expose enough parallelism to ensure that the control- and data-flow patterns are well suited to the GPU environment. Mixed-precision computation is also employed where appropriate (e.g., in derivative calculation) to reduce computational costs without compromising solution accuracy. The GPU implementation of the simulator is tested and benchmarked using various reservoir models, ranging from the synthetic SPE10 benchmark (Christie & Blunt, 2001) to several industrial-scale models. These real field models range in size from tens of millions of cells to more than a billion cells, with black-oil and multicomponent compositional fluids. The GPU simulator is benchmarked on the IBM AC922 massively parallel architecture with tens of NVIDIA Volta V100 GPUs. To compare performance with CPU architectures, an optimized CPU implementation of the simulator is benchmarked on the IBM AC922 CPUs and on a cluster consisting of thousands of Intel Haswell-EP Xeon® E5-2680 v3 CPUs. A detailed analysis of several numerical experiments comparing the simulator's performance on the GPU and CPU architectures is presented. In almost all cases, the analysis shows that hardware acceleration offers substantial benefits in terms of wall time and power consumption. This novel in-house full-physics, black-oil and compositional reservoir simulator employs several novel techniques in various simulation kernels to ensure full utilization of GPU resources. A detailed analysis is presented to highlight the simulator's performance in terms of runtime reduction, parallel scalability, and power savings.
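The abstract gives no implementation details, but the mixed-precision idea it mentions for derivative calculations can be illustrated with a short sketch: evaluate and store the Jacobian in single precision while keeping residuals in double precision, so that Newton's method still converges to full accuracy. Everything below (function names, the toy system) is an illustrative assumption, not the simulator's code.

```python
import numpy as np

def newton_mixed_precision(residual, jacobian, x0, tol=1e-10, max_iter=25):
    # Residuals stay in float64 (they set the final accuracy); the Jacobian
    # is truncated to float32 before the solve, mimicking single-precision
    # derivative storage. Inexact-Newton theory says the iteration still
    # converges to the double-precision root, just at a slightly slower rate.
    x = np.asarray(x0, dtype=np.float64)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        J = jacobian(x).astype(np.float32)           # single-precision derivatives
        dx = np.linalg.solve(J.astype(np.float64), r)
        x = x - dx                                   # double-precision update
    return x

# Toy 2x2 nonlinear system with root (1, 2).
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
print(newton_mixed_precision(F, J, [1.0, 1.0]))
```

Because the Jacobian only steers the search direction while convergence is judged on the double-precision residual, truncating derivatives to float32 costs a little convergence rate but not final accuracy, which is the usual justification for mixed precision in this setting.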

2002
Vol 5 (01)
pp. 11-23
Author(s):
A.H. Dogru
H.A. Sunaidi
L.S. Fung
W.A. Habiballah
N. Al-Zamel
...  

Summary A new parallel, black-oil-production reservoir simulator (Powers) has been developed and fully integrated into the pre- and post-processing graphical environment. Its primary use is to simulate the giant oil and gas reservoirs of the Middle East using millions of cells. The new simulator has been designed for parallelism and scalability, with the aim of making megacell simulation a day-to-day reservoir-management tool. Upon its completion, the parallel simulator was validated against published benchmark problems and other industrial simulators. Several giant oil-reservoir studies have been conducted with million-cell descriptions. This paper presents the model formulation, parallel linear solver, parallel locally refined grids, and parallel well management. The benefits of using megacell simulation models are illustrated by a real field example used to confirm bypassed oil zones and obtain a history match in a short time period. With the new technology, preprocessing, construction, running, and post-processing of megacell models is finally practical. A typical history-match run for a field with 30 to 50 years of production takes only a few hours.

Introduction With the development of early parallel computers, the attractive speed of these machines caught the attention of oil-industry researchers. Initial questions were concentrated along these lines: Can one develop a truly parallel reservoir-simulator code? What type of hardware and programming languages should be chosen? In contrast to seismic processing, it is well known that reservoir-simulator algorithms are not naturally parallel; they are more recursive, and variables display a strong dependency on each other (strong coupling and nonlinearity). This poses a big challenge for parallelization. On the other hand, if one could develop a parallel code, the speed of computations would increase by at least an order of magnitude; as a result, many large problems could be handled. This capability would also aid our understanding of fluid flow in a complex reservoir. Additionally, the proper handling of reservoir heterogeneities should result in more realistic predictions. The other benefit of megacell description is the minimization of upscaling effects and numerical dispersion. Megacell simulation has a natural application in simulating the world's giant oil and gas reservoirs. For example, a grid size of 50 m or less is used widely for the small and medium-size reservoirs of the world. In contrast, many giant reservoirs in the Middle East use a gridblock size of 250 m or larger; this easily yields a model with more than 1 million cells. Therefore, it is of specific interest to have a megacell description and still be able to run fast. Such capability is important for the day-to-day reservoir management of these fields. This paper is organized as follows: first, the relevant work in the petroleum-reservoir-simulation literature is reviewed. This is followed by a description of the new parallel simulator and a presentation of the numerical solution and parallelism strategies. (Details of the data structures, well handling, and parallel input/output operations are placed in the appendices.) The main text also contains a brief description of the parallel linear solver, locally refined grids, and well management. A brief description of megacell pre- and post-processing is presented. Next, we address performance and parallel scalability; this is a key section that demonstrates the degree of parallelization of the simulator. The last section presents four real-field simulation examples. These example cases cover all stages of the simulator and provide actual central-processing-unit (CPU) execution time for each case. As a byproduct, the benefits of megacell simulation are demonstrated by two examples: locating bypassed oil zones and obtaining a quicker history match. Details of each section can be found in the appendices.

Previous Work In the 1980s, research on parallel reservoir simulation was intensified by the further development of shared-memory and distributed-memory machines. In 1987, Scott et al.1 presented a Multiple Instruction Multiple Data (MIMD) approach to reservoir simulation. Chien2 investigated parallel processing on shared-memory computers. In early 1990, Li3 presented a parallelized version of a commercial simulator on a shared-memory Cray computer. For distributed-memory machines, Wheeler4 developed a black-oil simulator on a hypercube in 1989. In the early 1990s, Killough and Bhogeswara5 presented a compositional simulator on an Intel iPSC/860, and Rutledge et al.6 developed an Implicit Pressure Explicit Saturation (IMPES) black-oil reservoir simulator for the CM-2 machine. They showed that reservoir models of over 2 million cells could be run on this type of machine with 65,536 processors. That paper stated that computational speeds on the order of 1 gigaflop in the matrix construction and solution were achievable. In the mid-1990s, more investigators published reservoir-simulation papers focused on distributed-memory machines. Kaarstad7 presented a 2D oil/water research simulator running on a 16,384-processor MasPar MP-2 machine. He showed that a model problem using 1 million gridpoints could be solved in a few minutes of computer time. Rame and Delshad8 parallelized a chemical-flooding code (UTCHEM) and tested it on a variety of systems for scalability. Their paper also included test results on the Intel iPSC/860, CM-5, Kendall Square, and Cray T3D.
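As a rough worked illustration of the gridblock arithmetic above (the field dimensions are assumed for illustration, not taken from the paper): a 100 km × 25 km giant field gridded at 250 m areally with 25 layers already reaches

$$\frac{100{,}000\ \text{m}}{250\ \text{m}} \times \frac{25{,}000\ \text{m}}{250\ \text{m}} \times 25 \;=\; 400 \times 100 \times 25 \;=\; 1{,}000{,}000\ \text{cells},$$

and halving the gridblock size to 125 m would quadruple the areal cell count.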


2015
Author(s):
A. Kozlova
Z. Li
J.R. Natvig
S. Watanabe
Y. Zhou
...  

SPE Journal
2016
Vol 21 (06)
pp. 2049-2061
Author(s):
A. Kozlova
Z. Li
J.R. Natvig
S. Watanabe
Y. Zhou
...  

Summary Simulation technology is constantly evolving to take advantage of the best available computational algorithms and computing hardware. A new technology is being jointly developed by an integrated energy company and a service company to provide a step change in reservoir-simulator performance. Multiscale methods have developed rapidly during the past few years, and multiscale technology promises to improve simulation run time by an order of magnitude compared with current simulator performance in traditional reservoir-engineering workflows. Following that trend, the two companies have been collaborating on a multiscale algorithm that significantly increases the performance of reservoir simulators. In this paper, we report the development of multiscale black-oil reservoir-simulation technology in a reservoir simulator used by the industry, as well as the performance and accuracy of the results obtained with this implementation. The multiscale method has proved accurate and reliable for large real-data models, and the new solver is capable of solving very large models an order of magnitude faster than the current commercial version of the solver.
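The abstract describes the solver only at a high level. The sketch below illustrates the two-level idea such methods build on, namely a cheap coarse-grid solve that captures long-range pressure coupling, followed by fine-scale smoothing, for a 1D heterogeneous pressure equation in Python (assuming NumPy). Industrial multiscale solvers construct flow-adapted basis functions from local fine-scale problems rather than the piecewise-constant prolongation used here, so this is an analogue of the idea, not the commercial implementation.

```python
import numpy as np

n, m = 64, 8                          # fine cells, coarsening factor
rng = np.random.default_rng(0)
k = 10.0 ** rng.uniform(-2, 2, n)     # heterogeneous permeability field

# Fine-scale two-point-flux pressure matrix on a unit-spaced grid,
# with Dirichlet pressures p=1 at the left boundary and p=0 at the right.
T = 2.0 / (1.0 / k[:-1] + 1.0 / k[1:])            # harmonic face transmissibilities
A = np.diag(np.r_[T, 0] + np.r_[0, T]) - np.diag(T, 1) - np.diag(T, -1)
A[0, 0] += 2 * k[0]
A[-1, -1] += 2 * k[-1]
b = np.zeros(n)
b[0] = 2 * k[0] * 1.0                              # left-boundary inflow term

# Piecewise-constant prolongation P and Galerkin coarse operator.
P = np.kron(np.eye(n // m), np.ones((m, 1)))       # (n x n/m) aggregation
Ac = P.T @ A @ P                                   # coarse-grid system

p = P @ np.linalg.solve(Ac, P.T @ b)               # coarse solve, prolonged
for _ in range(20):                                # Jacobi sweeps restore
    p += (b - A @ p) / np.diag(A)                  # local fine-scale detail

print("fine-grid residual norm:", np.linalg.norm(b - A @ p))
```

The coarse operator Ac captures the global pressure coupling at a fraction of the fine-grid cost, and the smoothing sweeps recover local detail; this division of labor is where the order-of-magnitude run-time advantage of hierarchical solvers comes from.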


2021
Author(s):
Soham Sheth
Francois McKee
Kieran Neylon
Ghazala Fazil

Abstract We present a novel reservoir-simulator time-step selection approach that uses machine-learning (ML) techniques to analyze the mathematical and physical state of the system and predict time-step sizes that are large while still being efficient to solve, thus making the simulation faster. An optimal time-step choice avoids wasted nonlinear- and linear-equation set-up work when the time-step is too small, and avoids highly nonlinear systems that take many iterations to solve. Typical time-step selectors use a limited set of features to heuristically predict the size of the next time-step. While they have been effective for simple simulation models, as model complexity increases there is an increasing need for robust data-driven time-step selection algorithms. We propose two workflows, static and dynamic, that use a diverse set of physical (e.g., well data) and mathematical (e.g., CFL) features to build a predictive ML model. This model can be pre-trained or dynamically trained to generate an inference model. The trained model can also be reinforced as new data becomes available and used efficiently for transfer learning. We present the application of these workflows in a commercial reservoir simulator using distinct types of simulation model, including black-oil, compositional, and thermal steam-assisted gravity drainage (SAGD) models. We have found that history-match and uncertainty/optimization studies benefit most from the static approach, while the dynamic approach produces optimal step sizes for prediction studies. We use a confidence monitor to manage the ML time-step selector at runtime: if the confidence level falls below a threshold, we switch to a traditional heuristic method for that time-step. This avoids any degradation in performance when the model features are outside the training space. Application to several complex cases, including a large field study, shows a significant speedup for single simulations and even better results for multiple simulations. We demonstrate that any simulation can take advantage of the stored state of the trained model and even augment it when new situations are encountered, so the system becomes more effective as it is exposed to more data.
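The paper does not publish its model internals, but the runtime pattern it describes, an ML regressor proposing the next step size with a confidence monitor that falls back to a heuristic when features leave the training space, can be sketched as follows (assuming scikit-learn; the feature set, ensemble-variance confidence measure, and threshold are illustrative assumptions, not the paper's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class MLTimestepSelector:
    """Hypothetical sketch of an ML time-step selector with a confidence
    monitor; feature names and thresholds are illustrative."""

    def __init__(self, conf_threshold=0.25):
        self.model = RandomForestRegressor(n_estimators=50, random_state=0)
        self.conf_threshold = conf_threshold

    def fit(self, features, accepted_dt):
        # features: per-step state, e.g. columns such as
        # [CFL number, max saturation change, nonlinear iters, well-rate change]
        self.model.fit(features, np.log(accepted_dt))   # regress on log(dt)

    def heuristic(self, prev_dt, nonlinear_iters, target_iters=8):
        # Classic fallback: grow/shrink the step based on solver effort.
        return prev_dt * target_iters / max(nonlinear_iters, 1)

    def next_dt(self, x, prev_dt, nonlinear_iters):
        preds = np.array([t.predict(x.reshape(1, -1))[0]
                          for t in self.model.estimators_])
        if preds.std() > self.conf_threshold:   # trees disagree: low confidence,
            return self.heuristic(prev_dt, nonlinear_iters)  # use heuristic
        return float(np.exp(preds.mean()))

# Illustrative use: train on past (features, accepted dt) pairs, then query.
# sel = MLTimestepSelector(); sel.fit(X_history, dt_history)
# dt_next = sel.next_dt(current_features, prev_dt=5.0, nonlinear_iters=12)
```

Here low confidence is proxied by disagreement among the forest's trees; a production selector could equally use distance-to-training-set or calibrated predictive intervals.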


2020
Author(s):  
Jiayi Lai

The next generation of weather and climate models will have an unprecedented level of resolution and model complexity, while also increasing demands on computation and memory speed. Reducing the precision of certain variables and using mixed-precision methods in atmospheric models can greatly improve computational and memory performance. However, to guarantee accurate results, most models over-engineer their numerical precision, so the resources occupied are much larger than those actually required. Previous studies have shown that the precision necessary for an accurate weather model has a clear scale dependence, with large spatial scales requiring higher precision than small scales; even at large scales, the necessary precision is far below double precision. However, it has been difficult to find a principled method for assigning different precisions to different variables and so avoid unnecessary waste. This paper takes CESM1.2.1 as its research object, conducts a large number of reduced-precision tests, and proposes a new discrimination method similar to the CFL criterion. The method verifies variables individually, thereby determining which variables can use lower precision without degrading the accuracy of the results.
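A common way to run such reduced-precision experiments in software, broadly in the spirit of reduced-precision emulators though not necessarily the method used in this work, is to truncate mantissa bits of double-precision variables and check whether a chosen diagnostic degrades. A minimal sketch (the diagnostic and data are placeholders):

```python
import numpy as np

def reduce_precision(x, mantissa_bits):
    # float64 has 52 explicit mantissa bits; keep the top `mantissa_bits`
    # and zero the rest, emulating a lower-precision format. This truncates
    # toward zero rather than rounding, which is adequate for a sketch.
    x = np.asarray(x, dtype=np.float64)
    mask = np.uint64(0xFFFFFFFFFFFFFFFF) << np.uint64(52 - mantissa_bits)
    return (x.view(np.uint64) & mask).view(np.float64)

# Sensitivity test sketch: degrade one variable's precision and compare a
# model diagnostic against the double-precision reference.
state = np.random.default_rng(1).standard_normal(1000)
reference = np.mean(np.cumsum(state))            # stand-in diagnostic
for bits in (52, 23, 10):                        # double, single, half mantissas
    degraded = np.mean(np.cumsum(reduce_precision(state, bits)))
    print(bits, abs(degraded - reference))
```

Applying such a per-variable test across a model is one way to decide which fields tolerate reduced precision, in the spirit of the discrimination method proposed here.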


2014
Vol 22 (1)
pp. 43-62
Author(s):  
Raluca Moldovan

Abstract The present study aims to investigate the contribution that actor Edward G. Robinson brought to the American film industry, beginning with his iconic role as gangster Little Caesar in Mervyn Le Roy’s 1931 production, and continuing with widely-acclaimed parts in classic film noirs such as Double Indemnity, The Woman in the Window and Scarlet Street. Edward G. Robinson was actually a Romanian Jew, born Emmanuel Goldenberg in Bucharest, in 1893, a relatively little known fact nowadays. By examining his biography, filmography and his best-known, most successful films (mentioned above), I show that Edward G. Robinson was one of classical Hollywood’s most influential actors; for instance, traits of his portrayal of Little Caesar (one of the very first American gangster films) can be found in almost all subsequent cinematic gangster figures, from Scarface to Vito Corleone. In the same vein, the doomed noir characters he played in Fritz Lang’s The Woman in the Window and Scarlet Street are still considered by film critics today to be some of the finest, most nuanced examples of noir heroes. Therefore, the main body of my article will be dedicated to a more detailed analysis of these films, while the introductory section will trace his biography and discuss some of his better-known films, such as Confessions of a Nazi Spy and Key Largo. The present study highlights Edward G. Robinson’s merits and impact on the cinema industry, proving that this diminutive Romanian Jew of humble origins was indeed something of a giant during Hollywood’s classical era.


Author(s):  
Anita Theresa Panjaitan
Rachmat Sudibjo
Sri Fenny

Y Field, located around 28 km southeast of Jakarta, was discovered in 1989. Three wells have been drilled and suspended. The initial gas in place (IGIP) of the field is 40.53 BSCF. The field will be developed in 2011. In this study, a reservoir-simulation model was built to determine the optimum development strategy for the field. The model consists of 1,575,064 grid cells built in a black-oil simulator. Two field-development scenarios were defined, with and without a compressor. Simulation results show that the recovery factor at the end of the contract is 61.40% and 62.14% for Scenarios I and II, respectively, without a compressor. When a compressor is applied, the recovery factors of Scenarios I and II are 68.78% and 74.58%, respectively. Based on the economic parameters, Scenario II with a compressor is the most attractive case, with an IRR of 41%, a POT of 2.9 years, and an NPV of 14,808 MUS$.
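As a worked check on what these recovery factors mean in volume terms (simple arithmetic on the figures quoted above): Scenario II with compression recovers roughly 0.7458 × 40.53 BSCF ≈ 30.2 BSCF, versus about 0.6214 × 40.53 BSCF ≈ 25.2 BSCF without it, so compression adds on the order of 5 BSCF of produced gas.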


Author(s):  
Erhui Luo
Yongle Hu
Jianjun Wang
Zifei Fan
Qingying Hou
...  

CO2 displacement is one of the gasflooding enhanced oil recovery (EOR) methods. Its application to fluids ranging from volatile oil to black oil is popular mainly because CO2 requires a relatively low miscibility pressure, attainable under most reservoir conditions. However, injected CO2 often contains impurities such as CH4, H2S, and N2, which alter phase behavior and flooding efficiency. Whether gasflooding achieves miscible displacement depends on the reservoir pressure and temperature and on the injected-solvent and crude-oil compositions. Three different types of oil samples from real fields are therefore selected, and mixtures of CH4, H2S, and N2 at various CO2 concentrations are considered as solvents. After a series of experimental data are closely matched, three nine-pseudocomponent models are generated based on a thermodynamic equation of state (EoS), capable of accurately predicting the complicated phase behavior. Three common tools, pressure–temperature (P–T), pressure–composition (P–X), and pseudoternary diagrams, are used to display and analyze the alteration of phase behavior and the types of displacement mechanism. Simulation results show that H2S is favorable for attaining miscibility while CH4 and N2 are adverse; H2S can reduce the multiple-contact miscibility (MCM) pressure by up to 1.675 MPa per 0.1 mole fraction. In addition, the phase envelope of CO2/H2S mixtures displacing the reservoir oil on the pseudoternary diagram takes a triangular shape, indicating a condensing-dominated process, while most phase envelopes of CO2/CH4 and CO2/N2 exhibit trumpet and bell shapes, revealing vaporizing-dominated MCM.
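At each pressure and temperature, the flash calculation underlying such pseudocomponent EoS models reduces to solving the Rachford–Rice equation for the vapor fraction once equilibrium ratios are known. Below is a minimal textbook sketch in Python (assuming NumPy and SciPy); the three-component composition and K-values are illustrative stand-ins, not the paper's fitted nine-pseudocomponent data.

```python
import numpy as np
from scipy.optimize import brentq

def rachford_rice(z, K):
    # Solve sum_i z_i (K_i - 1) / (1 + V (K_i - 1)) = 0 for vapor fraction V,
    # bracketing the root between the asymptotes set by K_max and K_min.
    f = lambda V: np.sum(z * (K - 1.0) / (1.0 + V * (K - 1.0)))
    Vmin = 1.0 / (1.0 - K.max()) + 1e-9
    Vmax = 1.0 / (1.0 - K.min()) - 1e-9
    V = brentq(f, Vmin, Vmax)
    x = z / (1.0 + V * (K - 1.0))     # liquid-phase composition
    y = K * x                          # vapor-phase composition
    return V, x, y

# Toy 3-component mixture: CO2-rich solvent contacting a light oil.
z = np.array([0.60, 0.25, 0.15])       # CO2, C1, C7+ (illustrative)
K = np.array([1.8, 3.0, 0.05])         # K-values at some p, T (assumed)
V, x, y = rachford_rice(z, K)
print(f"vapor fraction {V:.3f}")
```

In a full EoS flash, the K-values are themselves updated from fugacity coefficients until equilibrium, with this solve as the inner kernel.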


2020
Vol 53 (2)
pp. 49-61
Author(s):  
Lisa Peschel

The World War II Jewish ghetto at Theresienstadt, forty miles northwest of Prague, was the site of an uncommonly active cultural life. Survivor testimony about the prisoners' theatrical performances inspired a question: why were almost all of the scripts that were written in the ghetto comedies? The recent rediscovery of several scripts has made possible a detailed analysis that draws on recent research into the psychological effects of different types of humour. This analysis reveals that, regardless of age, language or nationality, the Theresienstadt authors universally drew upon two potentially adaptive types of humour (self-enhancing and affiliative) rather than the two potentially maladaptive types (aggressive and self-defeating). Perhaps instinctively, they chose the very types of humour that have a demonstrated association with psychological health and that may have helped them preserve their psychological equilibrium in the potentially traumatising environment of the ghetto.

