Multi-Fidelity Bayesian Approach for History Matching in Reservoir Simulation

2021
Author(s):  
Ryan Santoso ◽  
Xupeng He ◽  
Marwa Alsinan ◽  
Ruben Figueroa Hernandez ◽  
Hyung Kwak ◽  
...  

Abstract History matching is a critical step within the reservoir management process to synchronize the simulation model with the production data. The history-matched model can be used for planning optimum field development and performing optimization and uncertainty quantification. We present a novel history matching workflow based on a Bayesian framework that accommodates subsurface uncertainties. Our workflow involves three different model resolutions within the Bayesian framework: 1) a coarse low-fidelity model to update the prior range, 2) a fine low-fidelity model to represent the high-fidelity model, and 3) a high-fidelity model to reconstruct the real response. The low-fidelity model is constructed from a multivariate polynomial function, while the high-fidelity model is based on the reservoir simulation model. We first develop a coarse low-fidelity model using a two-level Design of Experiment (DoE), which aims to provide a better prior. We then use Latin Hypercube Sampling (LHS) to construct the fine low-fidelity model to be deployed in the Bayesian runs, where we use the Metropolis-Hastings algorithm. Finally, the posterior is fed into the high-fidelity model to evaluate the matching quality. This work demonstrates the importance of including uncertainties in history matching. The Bayesian framework provides a robust means of quantifying uncertainty within reservoir history matching. Under a uniform prior, the convergence of the Bayesian inference is very sensitive to the parameter ranges. When the solution is far from the mean of the parameter ranges, the inference becomes biased and deviates from the observed data. Our results show that updating the prior from the coarse low-fidelity model accelerates the Bayesian convergence and improves the matching quality. Bayesian inference requires a huge number of runs to produce an accurate posterior, and running the high-fidelity model multiple times is expensive. Our workflow tackles this problem by deploying a fine low-fidelity model to represent the high-fidelity model in the main runs. This fine low-fidelity model is fast to run while honoring the physics and accuracy of the high-fidelity model. We also use ANOVA sensitivity analysis to measure the importance of each parameter; the ranking highlights the significant parameters that contribute most to the matching accuracy. We demonstrate our workflow on a geothermal reservoir with static and operational uncertainties. Our workflow produces accurate matching of the thermal recovery factor and produced-enthalpy rate with physically consistent posteriors. We present a novel workflow to account for uncertainty in reservoir history matching involving multi-resolution interaction. The proposed method is generic and can be readily applied within existing history-matching workflows in reservoir simulation.
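The abstract gives no implementation details, but the core loop it describes (Metropolis-Hastings sampling against a polynomial low-fidelity surrogate under a uniform prior on [lo, hi]) can be sketched as below. This is a minimal illustration, not the authors' code: the quadratic surrogate form, the Gaussian likelihood, and all names are assumptions, with the coefficients presumed fit by least squares to Latin Hypercube runs of the simulator.

```python
# A minimal sketch, assuming a quadratic polynomial surrogate (the "fine
# low-fidelity model") and a Gaussian likelihood; all names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def surrogate(theta, coeffs):
    """Multivariate quadratic in the parameters: [1, theta, theta**2] @ coeffs."""
    features = np.concatenate(([1.0], theta, theta**2))
    return features @ coeffs

def log_posterior(theta, coeffs, d_obs, sigma, lo, hi):
    """Gaussian likelihood around the observed response, uniform prior on [lo, hi]."""
    if np.any(theta < lo) or np.any(theta > hi):
        return -np.inf                           # outside the prior range
    resid = surrogate(theta, coeffs) - d_obs
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis_hastings(coeffs, d_obs, sigma, lo, hi, n_steps=20000, step=0.05):
    theta = 0.5 * (lo + hi)                      # start at the centre of the prior
    logp = log_posterior(theta, coeffs, d_obs, sigma, lo, hi)
    chain = []
    for _ in range(n_steps):
        proposal = theta + step * (hi - lo) * rng.standard_normal(theta.size)
        logp_new = log_posterior(proposal, coeffs, d_obs, sigma, lo, hi)
        if np.log(rng.uniform()) < logp_new - logp:   # accept/reject step
            theta, logp = proposal, logp_new
        chain.append(theta.copy())
    return np.array(chain)

# Toy usage with two parameters (1 + dim + dim quadratic coefficients)
dim = 2
coeffs = np.array([0.1, 1.0, -0.5, 0.3, 0.2])
lo, hi = np.zeros(dim), np.ones(dim)
chain = metropolis_hastings(coeffs, d_obs=0.6, sigma=0.05, lo=lo, hi=hi)
```

Tightening lo and hi from the two-level DoE pass is precisely the prior update the authors credit with accelerating convergence when the solution lies far from the centre of the parameter ranges.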

2021
Author(s):  
Obinna Somadina Ezeaneche ◽  
Robinson Osita Madu ◽  
Ishioma Bridget Oshilike ◽  
Orrelo Jerry Athoja ◽  
Mike Obi Onyekonwu

Abstract Proper understanding of the reservoir producing mechanism forms the backbone of optimal fluid recovery in any reservoir. Such an understanding is usually fostered by detailed petrophysical evaluation, structural interpretation, geological description and modelling, as well as production performance assessment prior to history matching and reservoir simulation. In this study, gravity drainage was identified as the primary production mechanism in reservoir X, located in the Niger Delta province, and this required proper model calibration by varying the vertical anisotropy ratio across identified facies, as opposed to a single-value method, which does not capture heterogeneity properly. Using structural maps generated from interpretation of seismic data, together with petrophysical parameters from available well logs and core data such as porosity, permeability and facies descriptions based on environment of deposition, a geological model capturing the structural dips, facies distribution and well locations was built. Dynamic modeling was conducted on the base case model and also on the low and high case conceptual models to capture different structural dips of the reservoir. The result from history matching of the base case model reveals that varying the vertical anisotropy ratio (i.e. kv/kh) based on identified facies across the system is more effective in capturing heterogeneity than the more common single deterministic value. In addition, gas segregated fastest in the high case model with the steepest dip compared to the base and low case models. An improved dynamic model saturation match was achieved in line with the geological description and the observed reservoir performance. Quick-win scenarios were identified, leading to an additional reserves yield of over 1 MMSTB. Therefore, structural control, facies type, reservoir thickness and oil volatility are the key factors driving the gravity drainage mechanism.
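A minimal sketch of what facies-based assignment of the vertical anisotropy ratio might look like in practice; the facies codes and kv/kh values below are illustrative, not taken from the study.

```python
# Facies-based kv/kh: each facies class carries its own vertical anisotropy
# ratio instead of one global deterministic value. Values are illustrative.
import numpy as np

kh = np.array([120.0, 450.0, 80.0, 300.0])   # horizontal permeability per cell, mD
facies = np.array([0, 1, 0, 2])              # facies code per cell (from the EOD model)
kv_kh = np.array([0.10, 0.50, 0.01])         # one kv/kh ratio per facies class 0, 1, 2

kv = kh * kv_kh[facies]                      # vertical permeability per cell
# During history matching, kv_kh is adjusted per facies until the saturation match improves.
```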


1986
Vol 26 (1)
pp. 447
Author(s):  
A.M. Younes ◽  
G.O. Morrell ◽  
A.B. Thompson

The West Kingfish Field in the Gippsland Basin, offshore Victoria, has been developed from the West Kingfish platform by Esso Australia Ltd (operator) and BHP Petroleum. The structure is an essentially separate, largely stratigraphic accumulation that forms the western flank of the Kingfish feature. A total of 19 development wells were drilled from the West Kingfish platform between October 1982 and May 1984. Information provided by these wells was used in a West Kingfish post-development geologic study and a reservoir simulation study. As a result of these studies the estimated recoverable oil volume has been increased 55 per cent to 27.0 stock tank gigalitres (170 million stock tank barrels). The studies also formed the technical basis for obtaining new oil classification of the P-1.1 reservoir, which is the only sand body that has been found in the Gurnard Formation in the Kingfish area. The simulation study was accomplished with an extremely high level of efficiency due to the extensive and effective use of computer graphics technology in model construction, history matching and predictions. Computer graphics technology has also been used very effectively in presenting the simulation study results in an understandable way to audiences with various backgrounds. A portable microcomputer has been used to store hundreds of graphic displays which are projected with a large-screen video projector. Presentations using this new display technology have been well received and have been very successful in conveying the results of a complex reservoir simulation study and in identifying future field development opportunities.


Author(s):  
Denis José Schiozer ◽  
Antonio Alberto de Souza dos Santos ◽  
Susana Margarida de Graça Santos ◽  
João Carlos von Hohendorff Filho

This work describes a new methodology for integrated decision analysis in the development and management of petroleum fields, considering reservoir simulation, risk analysis, history matching, uncertainty reduction, representative models, and production strategy selection under uncertainty. Based on the concept of closed-loop reservoir management, we establish 12 steps to assist engineers in model updating and production optimization under uncertainty. The methodology is applied to UNISIM-I-D, a benchmark case based on the Namorado field in the Campos Basin, Brazil. The results show that the method is suitable for practical application to complex reservoirs at different field stages (development and management). First, uncertainty is characterized in detail, and then scenarios are generated using an efficient sampling technique, which reduces the number of evaluations and is suitable for use with numerical reservoir simulation. We then perform multi-objective history-matching procedures, integrating static data (geostatistical realizations generated using reservoir information) and dynamic data (well production and pressure) to reduce uncertainty and thus provide a set of matched models for production forecasts. We select a small set of Representative Models (RMs) for decision risk analysis, integrating reservoir, economic and other uncertainties to base decisions on risk-return techniques. We optimize the production strategies for (1) each individual RM, to obtain different specialized solutions for field development, and (2) all RMs simultaneously, in a probabilistic procedure, to obtain a robust strategy. While the second approach ensures the best performance under uncertainty, the first provides valuable insights for the expected value of information and flexibility analyses. Finally, we integrate the reservoir and production systems to ensure realistic production forecasts. This methodology uses reservoir simulations, not proxy models, to reliably predict field performance. The proposed methodology is efficient, easy to use and compatible with real-time operations, even in complex cases where the computational time is restrictive.
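A toy numeric sketch of the two optimization modes, specialized strategies per RM versus one robust strategy over all RMs, assuming each strategy/RM pair has already been simulated and reduced to an NPV (all numbers illustrative):

```python
# Rows are candidate production strategies, columns are Representative Models.
import numpy as np

npv_table = np.array([
    [1.20, 0.95, 1.40],   # strategy A: NPV under RM1..RM3 (illustrative units)
    [1.10, 1.15, 1.05],   # strategy B
    [0.90, 1.30, 1.25],   # strategy C
])
rm_probs = np.array([0.3, 0.4, 0.3])      # probability weight of each RM

# (1) Specialized: the best strategy for each RM taken individually
specialized = npv_table.argmax(axis=0)    # -> [0, 2, 0]: A, C, A

# (2) Robust: the best expected NPV over all RMs simultaneously
robust = (npv_table @ rm_probs).argmax()  # -> 2: strategy C
```

Note that the robust choice need not coincide with any single specialized solution; in the paper, the specialized runs instead feed the value-of-information and flexibility analyses.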


2021
Author(s):  
Bjørn Egil Ludvigsen ◽  
Mohan Sharma

Abstract Well performance calibration after history matching a reservoir simulation model ensures that the wells give realistic rates during the prediction phase. The calibration involves adjusting well model parameters to match observed production rates at specified backpressure(s). This process is usually very time-consuming: with traditional approaches, calibrating one reservoir model with hundreds of high-productivity wells can take months. The application of uncertainty-centric workflows for reservoir modeling and history matching results in many acceptable matches for phase rates and flowing bottom-hole pressure (BHP). This makes well calibration even more challenging for an ensemble of a large number of simulation models, as the existing approaches are not scalable. It is known that the Productivity Index (PI) integrates reservoir and well performance, and that most of the pressure drop occurs within one to two grid blocks around the well, depending upon the model resolution. A workflow has been set up to fix the history-to-prediction transition by calibrating the PI of each well in a history-matched simulation model. The simulation PI can be modified by changing the permeability-thickness (Kh) or skin, or by applying a PI multiplier as a correction. For a history-matched ensemble with a range in water-cut and gas-oil ratio, the proposed workflow involves running flowing-gradient calculations for each well, at the observed THP and the simulated phase rates, to calculate a target BHP. A PI multiplier is then calculated for that well and model that shifts the simulation BHP to the target BHP, as a local update that reduces the extent of the jump. Across an ensemble of history-matched models with a range in water-cut and gas-oil ratio, the required BHPs are unique to each case. With the well calibration performed correctly, the jump observed in rates while switching from history to prediction can be eliminated or significantly reduced. The prediction thus yields reliable rates if the wells are run on pressure control, and a reliable plateau if the wells are run on group control. This reduces the risk of under- or over-predicting the ultimate hydrocarbon recovery from the field and the project's cashflow. It also allows running sensitivities to backpressure, tubing design, and other equipment constraints to optimize reservoir performance and facilities design. The proposed workflow, which dynamically couples reservoir simulation and well performance modeling, takes a few seconds to run per well, making it fit for purpose for a large ensemble of simulation models with a large number of wells.
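For an inflow relation of the form q = PI * (p_block - BHP), the multiplier that moves the simulated BHP to the target BHP at the same simulated rate follows directly from the ratio of drawdowns. A minimal sketch, with illustrative numbers rather than field data:

```python
# PI correction for an inflow model q = PI * (p_block - BHP): at a fixed rate,
# PI_new / PI_old = (p_block - bhp_sim) / (p_block - bhp_target).
def pi_multiplier(p_block, bhp_sim, bhp_target):
    """Multiplier that shifts the well's flowing BHP from bhp_sim to bhp_target."""
    return (p_block - bhp_sim) / (p_block - bhp_target)

# Example: block pressure 3200 psia, simulated BHP 2500 psia, target BHP 2300 psia
m = pi_multiplier(p_block=3200.0, bhp_sim=2500.0, bhp_target=2300.0)
print(round(m, 2))   # 0.78 -> PI is reduced, so the same rate needs more drawdown
```

The target BHP itself comes from the flowing-gradient calculation at the observed THP and the simulated phase rates, as described above.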


2021
Author(s):  
Mohamed Shams

Abstract This paper presents a field application of the bee colony optimization algorithm in assisted history matching of a real reservoir simulation model. Bee colony optimization is a technique inspired by the natural optimization behavior honeybees exhibit while searching for food. The way honeybees search for food sources in the vicinity of their nest inspired computer science researchers to apply the same principles to create optimization models and techniques. In this work, the bee colony optimization mechanism is used as the optimization algorithm in the assisted history matching workflow applied to a reservoir simulation model of the WD-X field, on production since 2004. The resulting history-matched model is compared with that obtained using one of the most widely applied commercial AHM software tools. The results of this work indicate that using the bee colony algorithm as the optimization technique in the assisted history matching workflow provides a noticeable improvement in both match quality and the time required to achieve a reasonable match.
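A hedged sketch of how a bee-colony-style search can drive an assisted match. Only the employed-bee local-search phase is shown (a full artificial bee colony also includes onlooker and scout phases), and misfit here is a stand-in for a simulation-based error against observed production data:

```python
# Employed-bee phase of an artificial-bee-colony-style search minimizing a
# history-match misfit over normalized parameters in [0, 1]. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def misfit(theta):
    # Stand-in for |simulated - observed| production error from a simulator run
    return float(np.sum((theta - np.array([0.3, 0.7])) ** 2))

n_bees, n_iter, dim = 20, 100, 2
food = rng.uniform(size=(n_bees, dim))        # candidate parameter sets ("food sources")
fit = np.array([misfit(f) for f in food])

for _ in range(n_iter):
    for i in range(n_bees):
        k = rng.integers(n_bees)              # random partner food source
        phi = rng.uniform(-1, 1, dim)
        trial = np.clip(food[i] + phi * (food[i] - food[k]), 0.0, 1.0)
        if (f := misfit(trial)) < fit[i]:     # greedy selection keeps improvements
            food[i], fit[i] = trial, f

best = food[fit.argmin()]                     # best-matching parameter set
```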


2018
Vol 6 (3)
pp. T601-T611
Author(s):  
Juliana Maia Carvalho dos Santos ◽  
Alessandra Davolio ◽  
Denis Jose Schiozer ◽  
Colin MacBeth

Time-lapse (or 4D) seismic attributes are extensively used as inputs to history matching workflows. However, this integration can potentially bring problems if performed incorrectly. Some of the uncertainties regarding seismic acquisition, processing, and interpretation can be inadvertently incorporated into the reservoir simulation model, yielding an erroneous production forecast. Very often, the information provided by 4D seismic is noisy or ambiguous. For this reason, it is necessary to estimate the level of confidence in the data before transferring it to the simulation model. The methodology presented in this paper aims to diagnose which information from 4D seismic we are confident enough to include in the model. Two passes of seismic interpretation are proposed: the first to understand the character and quality of the seismic data, and the second to compare the simulation-to-seismic synthetic response with the observed seismic signal. The methodology is applied to the Norne field benchmark case, in which we find several examples of inconsistencies between the synthetic and real responses and evaluate whether these are caused by simulation model inaccuracy or by uncertainties in the actual observed seismic. After a careful qualitative and semiquantitative analysis, the confidence level of the interpretation is determined. Simulation model updates can be suggested according to the outcome of this analysis. The main contribution of this work is to introduce a diagnostic step that classifies the reliability of the seismic interpretation considering the uncertainties inherent in these data. The results indicate that medium to high interpretation confidence can be achieved even for poorly repeated data.
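The paper does not name a specific metric for the synthetic-versus-observed comparison, so the following is only an illustrative choice: the normalized RMS (NRMS) difference, a standard 4D repeatability measure, used as one semiquantitative screen for interpretation confidence.

```python
# NRMS difference between synthetic and observed 4D traces: 0 for identical
# signals, about 1.41 for uncorrelated noise, 2 for anti-correlated signals.
import numpy as np

def rms(x):
    return np.sqrt(np.mean(x ** 2))

def nrms(synthetic, observed):
    return 2.0 * rms(synthetic - observed) / (rms(synthetic) + rms(observed))

# Toy example: a region where the model tracks the observed signal scores low NRMS
syn = np.sin(np.linspace(0.0, 6.0, 200))
obs = syn + 0.1 * np.random.default_rng(2).standard_normal(200)
print(f"NRMS = {nrms(syn, obs):.2f}")   # small value -> higher interpretation confidence
```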

