Aspects of Multiscale Flow Simulation with Potential to Enhance Reservoir Engineering Practice

SPE Journal ◽  
2021 ◽  
pp. 1-19
Author(s):  
Sanjoy Kumar Khataniar ◽  
Daniel de Brito Dias ◽  
Rong Xu

Summary A multiscale sequential fully implicit (MS SFI) reservoir simulation method implemented in a commercial simulator is applied to a set of reservoir engineering problems to understand its potential. Our assessment highlights workflows where the approach brings substantial performance advantages and generates insight. The understanding gained during commercialization on approximately 40 real-world models is illustrated through simpler but representative data sets available in the public domain. The main characteristics of the method and key features of the implementation are briefly discussed. The robust fully implicit (FI) simulation method is used as a benchmark. The implementation of the MS SFI method is found to faithfully reproduce FI results for black-oil problems. We provide evidence and analysis of why the MS SFI approach can achieve high levels of performance and fidelity. The method supports the solution of unique problems that would benefit from incorporating multiscale geology and multiscale flow physics. The MS SFI implementation was used to successfully simulate a typical sector model used for field pilots at extremely high "whole core" scale resolution within a practical time frame by leveraging high-performance computing (HPC). This could not be achieved with the FI approach. A combination of MS SFI and HPC offers immense potential to simulate geological models using grids fine enough to capture mesoscopic or laminar-scale geology. The method, by design, demands fewer computing resources than FI, making it far more cost-effective for such high-resolution models. We conclude that the MS SFI method has a distinct capability to enhance reservoir engineering practice in high-resolution, simulation-driven workflows in the context of subsurface uncertainty quantification, field development planning, and reservoir performance optimization. NOTE: This paper is published as part of the 2021 SPE Reservoir Simulation Conference Special Issue.
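The summary above refers to the sequential fully implicit (SFI) structure without detailing it. As a minimal conceptual sketch, the Python example below alternates an implicit pressure solve with an implicit upwind saturation update for 1D incompressible two-phase flow. It is not the commercial MS SFI implementation; the grid, rates, and quadratic relative permeabilities are assumptions chosen only to make the example runnable.

```python
import numpy as np

# Minimal sketch of a sequential (pressure/transport) splitting for 1D
# incompressible two-phase flow. It only illustrates the outer-loop structure
# that SFI-type schemes build on; it is NOT the commercial MS SFI method,
# and all rock/fluid numbers below are assumptions.
nx, L, phi = 50, 100.0, 0.2            # cells, domain length [m], porosity
dx = L / nx
mu_w, mu_o = 1.0, 5.0                  # water/oil viscosities (assumed)
s = np.zeros(nx)                       # initial water saturation
q_inj = 1.0e-2                         # water injection rate into the first cell
dt, nsteps = 2.0, 200

def total_mobility(sw):
    return sw**2 / mu_w + (1.0 - sw)**2 / mu_o   # quadratic rel-perms (assumed)

def frac_flow(sw):
    lw = sw**2 / mu_w
    return lw / (lw + (1.0 - sw)**2 / mu_o + 1e-12)

for _ in range(nsteps):
    # Pressure step: implicit solve with total mobility frozen at current s.
    lam = total_mobility(s)
    T = 0.5 * (lam[:-1] + lam[1:]) / dx          # face transmissibilities (k = 1)
    A = np.zeros((nx, nx)); b = np.zeros(nx)
    for i in range(nx - 1):
        A[i, i] += T[i];         A[i, i + 1] -= T[i]
        A[i + 1, i + 1] += T[i]; A[i + 1, i] -= T[i]
    b[0] += q_inj                                # injector source
    A[-1, -1] += 1.0e6                           # producer: pin pressure ~ 0
    p = np.linalg.solve(A, b)

    # Transport step: implicit upwind update, swept in the flow direction so
    # each cell reduces to a scalar nonlinear equation solved by Newton.
    flux = T * (p[:-1] - p[1:])                  # total flux, left -> right
    s_new = s.copy()
    for i in range(nx):
        water_in = q_inj if i == 0 else flux[i - 1] * frac_flow(s_new[i - 1])
        total_out = flux[i] if i < nx - 1 else q_inj
        si = s[i]
        for _ in range(30):                      # Newton on the cell residual
            R = phi * dx * (si - s[i]) / dt - water_in + total_out * frac_flow(si)
            dR = (phi * dx * (si + 1e-6 - s[i]) / dt - water_in
                  + total_out * frac_flow(si + 1e-6) - R) / 1e-6
            si = min(max(si - R / dR, 0.0), 1.0)
        s_new[i] = si
    s = s_new

print("mean water saturation after %d steps: %.3f" % (nsteps, s.mean()))
```

In SFI schemes both sub-steps are implicit and are typically iterated within each timestep until a coupled residual converges, and the multiscale variant accelerates the pressure step with coarse-scale basis functions; the sketch performs only a single pass and omits both refinements.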

2021 ◽  
Author(s):  
Sanjoy Kumar Khataniar ◽  
Daniel De Brito Dias ◽  
Rong Xu

Abstract A new implementation of a multiscale sequential fully implicit (MS SFI) reservoir simulation method is applied to a set of reservoir engineering problems to understand its utility. An assessment is made to highlight areas where the approach brings a substantial performance advantage and addresses problems not successfully resolved by existing methods. This work makes use of the first-ever implementation of the multiscale sequential fully implicit method in a commercial reservoir simulator. The key features of the method and implementation are briefly discussed. The learnings gained during field testing and commercialization on about forty real-world models are illustrated through simpler but representative data sets available in the public domain. The workhorse robust fully implicit (FI) method is used as a reference for benchmarking. The MS SFI method can faithfully reproduce FI results for black-oil problems. We conclude that the MS SFI method has the capability to support reservoir engineering decision making, especially in the areas of subsurface uncertainty quantification, representative model selection, model calibration, and optimization. The MS SFI method shows immense potential for handling pronounced levels of reservoir heterogeneity. The challenge of including fine-scale heterogeneity, which is often overlooked when scaling up EOR processes from laboratory to field, appears to have found a practical solution in the combination of MS SFI and high-performance computing (HPC).


SPE Journal ◽  
2020 ◽  
Vol 25 (05) ◽  
pp. 2801-2821 ◽  
Author(s):  
Zaid Alrashdi ◽  
Karl D. Stephen

Summary Optimization is an essential task for field development planning because it has the potential to increase the economic value of the project. Because of advancements in technology, optimization is now shifting from manual to automated schemes. However, because of the complexity and high dimensionality of the field-development optimization problem, automated schemes have received less attention. In this paper, we demonstrate increased efficiency of optimization algorithms in well-placement problems through the application of reservoir-engineering practices. Analytical and empirical solutions to the flow equations, as developed for reservoir engineering, are added to the algorithm to improve its sampling efficiency. The main engineering methods applied in this project are Buckley-Leverett (BL) analysis and decline curve analysis (DCA). We begin by modifying and validating these analytical functions for different reservoir and well conditions. They are then applied in two stages: the initialization stage and the optimization stage. In the former, numerous samples are created and evaluated by these analytical functions to identify good candidates for the initial population of the optimization algorithm. In the latter, these functions are used as a proxy model of the full simulation. The proposed algorithm is compared with a base-case approach in which no engineering-based analytical functions are used. The use of these engineering aspects is shown to be beneficial in both stages, and considerable computation time can be saved by applying them.
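As an illustration of the kind of analytical screening this abstract describes (not the authors' actual formulation), the sketch below uses a Welge tangent construction on a Corey-type fractional-flow curve to estimate water-breakthrough time for candidate injector-producer spacings; the relative-permeability curves, viscosities, rates, and geometry are all assumed values.

```python
import numpy as np

# Illustrative screening proxy: Buckley-Leverett breakthrough estimate used to
# rank candidate injector-producer spacings before any full simulation run.
# Rel-perm curves, viscosities, and rock/fluid numbers below are assumptions.
mu_w, mu_o, swc, sor = 0.5, 2.0, 0.2, 0.2

def frac_flow(sw):
    swn = (sw - swc) / (1.0 - swc - sor)          # normalized saturation
    krw, kro = swn**2, (1.0 - swn)**2             # Corey-type curves (assumed)
    return (krw / mu_w) / (krw / mu_w + kro / mu_o + 1e-12)

def breakthrough_pvi():
    """Pore volumes injected at water breakthrough via the Welge tangent."""
    sw = np.linspace(swc + 1e-4, 1.0 - sor, 2000)
    slope = frac_flow(sw) / (sw - swc)            # chord slope from (swc, 0)
    return 1.0 / slope.max()                      # t_D at breakthrough = 1 / f'(Swf)

def score_candidate(spacing_m, rate_m3d, phi=0.25, area_m2=1.0e4):
    """Cheap physics-based objective: time to water breakthrough, in days."""
    pore_volume = phi * area_m2 * spacing_m
    return breakthrough_pvi() * pore_volume / rate_m3d

# Rank a few hypothetical candidate spacings by breakthrough time.
for spacing in (200.0, 400.0, 600.0):
    print(spacing, "m spacing -> breakthrough in",
          round(score_candidate(spacing, 500.0), 1), "days")
```

A cheap score of this kind could be used to pre-rank candidates for the initial population, which is the role the analytical functions play in the initialization stage described above.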


2019 ◽  
Vol 67 (1) ◽  
pp. 47-50
Author(s):  
Mohammad Amirul Islam ◽  
ASM Woobaidullah ◽  
Badrul Imam

In the oil and gas industry, production data analysis tools and reservoir simulation techniques are widely available. Scientists and engineers use these tools and techniques to generate authentic and valuable information about a reservoir for planning and development. In this study, production rate decline and streamline reservoir simulation are analyzed in an integrated way to determine well life, flow rate, producible reserve, drainage volume, and the reserve within that volume. Well SY-7 of the Haripur oil field produced 0.531 million barrels of oil from 1987 to 1994. An exponential decline rate matched to the production profile indicates a well life of 10 years and a producible reserve of 0.7 million barrels. In addition, the drainage volume around the well, estimated from the well life using streamline simulation, is 158 million cubic feet, and the oil reserve within this drainage volume is estimated at 2 million barrels. This reserve information carries value, authenticity, and reliability for field development planning.
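The decline-curve portion of this workflow is straightforward to reproduce. The sketch below fits an exponential decline q(t) = qi·exp(−Dt) to a synthetic yearly rate history (not the SY-7 data) and derives well life and producible reserve down to an assumed economic-limit rate; the rate history and the 30 BOPD limit are illustrative assumptions.

```python
import numpy as np

# Illustrative exponential-decline workflow of the kind described above:
# fit q(t) = qi * exp(-D t), then estimate well life and producible reserve.
# The rate history below is synthetic; it is not the SY-7 production data.
t_years = np.arange(0, 8)                       # 8 yearly points (e.g. 1987-1994)
noise = np.random.default_rng(0).standard_normal(8)
q_bopd = 400.0 * np.exp(-0.25 * t_years) * (1.0 + 0.03 * noise)

# Fit decline rate D and initial rate qi from ln(q) vs t (least squares).
slope, intercept = np.polyfit(t_years, np.log(q_bopd), 1)
D, qi = -slope, np.exp(intercept)               # per-year decline, initial rate

q_limit = 30.0                                  # assumed economic-limit rate, BOPD
well_life = np.log(qi / q_limit) / D            # years until the limit is reached
reserve_bbl = (qi - q_limit) * 365.25 / D       # exponential-decline EUR above the limit

print(f"D = {D:.3f}/yr, qi = {qi:.0f} BOPD")
print(f"well life ~ {well_life:.1f} years, producible reserve ~ {reserve_bbl/1e6:.2f} MMbbl")
```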


2017 ◽  
Vol 2017 ◽  
pp. 1-16 ◽  
Author(s):  
Nhat-Phuong Tran ◽  
Myungho Lee ◽  
Sugwon Hong

The lattice Boltzmann method (LBM) is a powerful numerical method for simulating fluid flow. With its data-parallel nature, it is a promising candidate for parallel implementation on a GPU. The LBM, however, is heavily data intensive and memory bound. In particular, moving data to adjacent cells in the streaming phase incurs many uncoalesced memory accesses on the GPU, which degrades overall performance. Furthermore, the main computation kernels of the LBM use a large number of registers per thread, which limits the thread parallelism available at run time because of the fixed number of registers on the GPU. In this paper, we develop a high-performance parallelization of the LBM on a GPU by minimizing the overheads associated with uncoalesced memory accesses while improving cache locality through a tiling optimization combined with a data layout change. Furthermore, we aggressively reduce register usage in the LBM kernels to increase run-time thread parallelism. Experimental results on the Nvidia Tesla K20 GPU show that our approach delivers impressive throughput: 1210.63 million lattice updates per second (MLUPS).
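To make the layout point concrete, here is a small NumPy sketch (conceptual only, not the authors' CUDA kernels) of the D2Q9 streaming step using a structure-of-arrays layout, in which each lattice direction is stored as its own contiguous plane; keeping directions contiguous is the kind of data layout change that lets neighbouring GPU threads issue coalesced loads and stores.

```python
import numpy as np

# Conceptual sketch (NumPy, not CUDA): D2Q9 streaming with a structure-of-arrays
# layout, i.e. one contiguous plane per lattice direction, so that per-direction
# reads and writes of neighbouring cells stay contiguous in memory.
nx, ny = 256, 256
# D2Q9 lattice velocities (cx, cy) for directions 0..8
c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])
f = np.random.rand(9, ny, nx)        # SoA: f[direction, y, x]

def stream(f):
    """Periodic streaming: shift each direction's plane by its lattice velocity."""
    out = np.empty_like(f)
    for k, (cx, cy) in enumerate(c):
        out[k] = np.roll(np.roll(f[k], cy, axis=0), cx, axis=1)
    return out

f = stream(f)
print("post-streaming mass:", f.sum())   # streaming only moves data, conserving mass
```

On a GPU this layout is typically combined with tiling so the shifted reads for each direction stay within a cached block, which is the cache-locality optimization the abstract refers to.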


2003 ◽  
Vol 2003.3 (0) ◽  
pp. 161-162
Author(s):  
Hiroshi IKEDA ◽  
Takeshi SHIMIZU ◽  
Tadashi NARABAYASHI ◽  
Koji NISHIDA ◽  
Toshihiko FUKUDA ◽  
...  

2009 ◽  
Vol 12 (01) ◽  
pp. 149-158 ◽  
Author(s):  
Dean C. Rietz ◽  
Adnan H. Usmani

Summary Continuous improvements in reservoir simulation software and the availability of high-performance computing equipment are making the use of simulation models commonplace for field development and planning purposes. Naturally, this trend has also increased interest in using reservoir simulation results in the oil and gas reserves estimation process. As simulation specialists who work in a primarily reserves-evaluation company, the authors are routinely asked to evaluate, and in many cases incorporate, simulation results in the reserves estimation process. In addition, the authors are required to opine on the approach and tactics used by clients when they incorporate numerical models in their reserves bookings. Because limited published discussion exists on this topic, the purpose of this paper is to provide some examples of the approach used by the authors. We believe this approach to be appropriate and within the spirit of reserves interpretation as used by typical reserves regulatory bodies such as the U.S. Securities and Exchange Commission (SEC). Previously published papers have discussed the use of models in the reserves process, including the evaluation of the models themselves (Palke and Rietz 2001; Rietz and Usmani 2005). In contrast, this paper provides three case studies that illustrate how results from various models have been used to assist in quantifying reserves. Two of the examples are based on history-matched models, while the third focuses on a pre-production reservoir where no adequate history is available and probabilistic methods were incorporated to help understand the uncertainty in the forecasts. While there is no "cookbook" or step-by-step procedure for using simulation results to estimate reserves, the case studies presented in this paper are intended both to show examples and to spark debate and discussion. Undoubtedly there will be some disagreement with our techniques, but an open discussion should prove beneficial for both reserves evaluators and simulation specialists.


2012 ◽  
Vol 43 (1-2) ◽  
pp. 54-63 ◽  
Author(s):  
Baohong Lu ◽  
Huanghe Gu ◽  
Ziyin Xie ◽  
Jiufu Liu ◽  
Lejun Ma ◽  
...  

Stochastic simulation is widely applied for estimating the design flood of various hydrosystems. The design flood at a reservoir site should consider the impact of upstream reservoirs, along with any development of hydropower. This paper investigates and applies a stochastic simulation approach for determining the design flood of a complex cascade of reservoirs in the Longtan watershed, southern China. The magnitude of the design flood is smaller when the impact of the upstream reservoirs is considered than when it is not. In particular, the stochastic simulation model takes into account both systematic and historical flood records. Because the reliability of the frequency analysis increases with more representative samples, it is desirable to incorporate historical flood records, if available, into the stochastic simulation model. This study shows that the design values from the stochastic simulation method with historical flood records are higher than those without historical flood records. The paper demonstrates the advantages of adopting a stochastic flow simulation approach to address design-flood-related issues for a complex cascade reservoir system.
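The effect of appending historical floods to a systematic record can be illustrated with a deliberately simplified frequency analysis. The sketch below uses synthetic data (not the Longtan records) and a simple lognormal moment fit rather than the censored-sample likelihood a rigorous study would use; it only shows why the added historical extremes tend to raise the estimated design value.

```python
import numpy as np
from statistics import NormalDist

# Simplified illustration: compare a design-flood quantile estimated from
# systematic annual maxima alone with one estimated after appending known
# historical extreme floods. All numbers below are assumptions.
rng = np.random.default_rng(1)
systematic = rng.lognormal(mean=8.0, sigma=0.5, size=50)   # 50 yr of gauged peaks (m3/s)
historical = np.array([12000.0, 15000.0])                  # two reconstructed extremes (assumed)

def lognormal_design_flood(sample, return_period=1000.0):
    """Moment fit of log(Q) and the T-year quantile of a 2-parameter lognormal."""
    mu, sigma = np.log(sample).mean(), np.log(sample).std(ddof=1)
    z = NormalDist().inv_cdf(1.0 - 1.0 / return_period)
    return float(np.exp(mu + sigma * z))

q_sys = lognormal_design_flood(systematic)
q_all = lognormal_design_flood(np.concatenate([systematic, historical]))
print(f"1000-yr flood, systematic only : {q_sys:,.0f} m3/s")
print(f"1000-yr flood, with historical : {q_all:,.0f} m3/s")
```

Because the reconstructed historical extremes sit in the upper tail, including them raises both the fitted mean and the spread of log discharge, which is why the design value increases, consistent with the finding reported above.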


Author(s):  
Seyed Kourosh Mahjour ◽  
Antonio Alberto Souza Santos ◽  
Manuel Gomes Correia ◽  
Denis José Schiozer

Abstract The simulation process under uncertainty requires numerous reservoir models, which can be very time-consuming to run. Hence, it is necessary to select representative models (RMs) that capture the uncertainty space of the full ensemble. In this work, we compare two scenario reduction techniques: (1) Distance-based Clustering with Simple Matching Coefficient (DCSMC), applied before the simulation process using reservoir static data, and (2) a metaheuristic algorithm (the RMFinder technique), applied after the simulation process using reservoir dynamic data. We use these two methods as examples to investigate the effect of static and dynamic data usage on the accuracy and speed of the scenario reduction process, with a focus on field development purposes. A synthetic benchmark case named UNISIM-II-D, which considers flow-unit modelling, is used. The results show that both scenario reduction methods are reliable for selecting the RMs from a specific production strategy. However, the RMs obtained for a given strategy using the DCSMC method can be applied to other strategies while preserving the representativeness of the models, whereas the choice of strategy strongly influences the RMs selected by the metaheuristic method, so each strategy has its own set of RMs. Because of the field development workflow in which the metaheuristic algorithm is used, the number of required flow simulations and the computational time are greater than in the workflow in which the DCSMC method is applied. Hence, it can be concluded that using static reservoir data in the scenario reduction process can be more reliable during the field development phase.
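A minimal sketch of the static, distance-based selection idea (not the UNISIM-II-D workflow or the authors' code) is shown below: each realization is reduced to a vector of categorical attributes, such as a facies or flow-unit code per coarse region, pairwise distances use the simple matching coefficient, and one medoid per cluster is retained as a representative model. The ensemble size, attribute count, and PAM-style medoid update are illustrative assumptions.

```python
import numpy as np

# Illustrative static scenario reduction: SMC distances between categorical
# model descriptions, then a simple k-medoids loop to pick representative models.
rng = np.random.default_rng(0)
n_models, n_attrs, n_codes, k = 60, 200, 4, 5
models = rng.integers(0, n_codes, size=(n_models, n_attrs))   # synthetic categorical grids

# SMC distance: fraction of attributes that do NOT match.
dist = np.zeros((n_models, n_models))
for i in range(n_models):
    dist[i] = 1.0 - (models == models[i]).mean(axis=1)

# Simple k-medoids (PAM-style) on the SMC distance matrix.
medoids = list(rng.choice(n_models, k, replace=False))
for _ in range(20):
    labels = np.argmin(dist[:, medoids], axis=1)              # assign to nearest medoid
    new_medoids = []
    for c in range(k):
        members = np.where(labels == c)[0]
        if len(members) == 0:
            new_medoids.append(medoids[c]); continue
        within = dist[np.ix_(members, members)].sum(axis=1)   # total distance inside cluster
        new_medoids.append(int(members[np.argmin(within)]))
    if new_medoids == medoids:
        break
    medoids = new_medoids

print("representative models (indices):", sorted(int(m) for m in medoids))
```

Because the clustering uses only static data, it can run before any flow simulation, which is the cost advantage the abstract attributes to the DCSMC workflow.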

