Successful Application of Honey-Bee Optimization Technique in Reservoir Engineering Assisted History Matching: Case Study

2021 ◽  
Author(s):  
Mohamed Shams

Abstract This paper provides the field application of the bee colony optimization algorithm in assisting the history match of a real reservoir simulation model. The bee colony optimization algorithm is an optimization technique inspired by the natural foraging behavior of honeybees searching for food. The way honeybees search for food sources in the vicinity of their nest inspired computer science researchers to apply the same principles to create optimization models and techniques. In this work, the bee colony optimization mechanism is used as the optimization algorithm in the assisted history matching workflow applied to a reservoir simulation model of the WD-X field, on production since 2004. The resultant history matched model is compared with those obtained using one of the most widely applied commercial AHM software tools. The results of this work indicate that using the bee colony algorithm as the optimization technique in the assisted history matching workflow provides a noticeable enhancement in match quality and in the time required to achieve a reasonable match.

2021 ◽  
Author(s):  
Mohamed Shams ◽  
Ahmed El-Banbi ◽  
M. Helmy Sayyouh

Abstract Bee colony optimization is a stochastic population-based optimization algorithm inspired by the natural foraging behavior of honey bees searching for food. The bee colony optimization algorithm has been successfully applied to various real-world optimization problems, mostly in the routing, transportation, and scheduling fields. This paper introduces the bee colony optimization method as the optimization technique in the reservoir engineering assisted history matching procedure. The superiority of the proposed optimization algorithm is validated by comparing its performance with that of two other advanced nature-inspired optimization techniques (genetic and particle swarm optimization algorithms) on three synthetic assisted history matching problems. In addition, this paper presents the application of the bee colony optimization technique in assisting the history match of a full field reservoir simulation model of a mature gas-cap reservoir with 28 years of history. The resultant history matched model is compared with those obtained using a manual history matching procedure and using the optimization algorithm most widely applied in commercial assisted history matching software tools. The results of this work indicate that employing the bee colony algorithm as the optimization technique in the assisted history matching workflow yields noticeable enhancement in terms of match quality and time required to achieve a reasonable match.
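The search mechanism described above can be illustrated with a minimal artificial bee colony sketch. The objective function, bounds, colony size, and abandonment limit below are illustrative placeholders standing in for a history-match misfit, not the authors' actual setup; only the employed-bee and scout phases are shown.

```python
import numpy as np

def misfit(x):
    # Stand-in objective: squared deviation from a hypothetical "true" model.
    # In assisted history matching this would be the mismatch between
    # simulated and observed production data.
    return np.sum((x - 0.7) ** 2)

def abc_minimize(n_bees=20, dim=5, limit=10, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = 0.0, 1.0
    food = rng.uniform(lo, hi, (n_bees, dim))    # food sources = candidate models
    fit = np.array([misfit(f) for f in food])
    trials = np.zeros(n_bees, dtype=int)

    for _ in range(iters):
        # Employed-bee phase: perturb each source toward/away from a random partner.
        for i in range(n_bees):
            k = rng.integers(n_bees - 1)
            k += k >= i                          # partner index != i
            j = rng.integers(dim)
            cand = food[i].copy()
            cand[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
            cand = np.clip(cand, lo, hi)
            f = misfit(cand)
            if f < fit[i]:                       # greedy selection
                food[i], fit[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        # Scout phase: abandon sources that failed to improve 'limit' times.
        for i in np.where(trials > limit)[0]:
            food[i] = rng.uniform(lo, hi, dim)
            fit[i] = misfit(food[i])
            trials[i] = 0

    best = np.argmin(fit)
    return food[best], fit[best]

x_best, f_best = abc_minimize()
```

The greedy per-source selection and the scout restarts are what give the method its balance of local exploitation and global exploration in a history-matching parameter space.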


2021 ◽  
Author(s):  
Bjørn Egil Ludvigsen ◽  
Mohan Sharma

Abstract Well performance calibration after history matching a reservoir simulation model ensures that the wells give realistic rates during the prediction phase. The calibration involves adjusting well model parameters to match observed production rates at specified backpressure(s). This process is usually very time consuming: with the traditional approaches, a single reservoir model with hundreds of high-productivity wells can take months to calibrate. The application of uncertainty-centric workflows for reservoir modeling and history matching results in many acceptable matches for phase rates and flowing bottom-hole pressure (BHP). This makes well calibration even more challenging for an ensemble of many simulation models, as the existing approaches are not scalable. The productivity index (PI) integrates reservoir and well performance, where most of the pressure drop occurs in the one or two grid blocks around the well, depending on the model resolution. A workflow has been set up to fix this transition by calibrating the PI of each well in a history matched simulation model. The simulation PI can be modified by changing the permeability-thickness (Kh) or the skin, or by applying a PI multiplier as a correction. For a history matched ensemble with a range in water cut and gas-oil ratio, the proposed workflow runs flowing-gradient calculations for each well, using the observed THP and the simulated phase rates, to calculate a target BHP. A PI multiplier is then calculated for that well and model that shifts the simulated BHP to the target BHP, as a local update that reduces the extent of the jump. Within such an ensemble, the required BHPs vary from case to case. With the well calibration performed correctly, the jump observed in rates when switching from history to prediction can be eliminated or significantly reduced.
The prediction thus yields reliable rates if the wells are run on pressure control, and a reliable plateau if they are run on group control. This reduces the risk of under- or over-predicting the ultimate hydrocarbon recovery from the field and the project's cash flow. It also allows running sensitivities to backpressure, tubing design, and other equipment constraints to optimize reservoir performance and facilities design. The proposed workflow, which dynamically couples reservoir simulation and well performance modeling, takes a few seconds to run per well, making it fit for purpose for a large ensemble of simulation models with a large number of wells.
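The PI-multiplier step can be sketched as follows, assuming a simple linear inflow relation q = PI * (p_res - p_bhp). The function name, the single-phase form, and the example numbers are all illustrative; the paper's workflow obtains the target BHP from a flowing-gradient calculation at the observed THP, which is not reproduced here.

```python
def pi_multiplier(rate, p_res, bhp_sim, bhp_target):
    """Return the PI multiplier that moves the simulated BHP to the target
    BHP while holding the simulated rate fixed (linear inflow assumed)."""
    pi_old = rate / (p_res - bhp_sim)      # PI implied by the simulation
    pi_new = rate / (p_res - bhp_target)   # PI needed at the target drawdown
    return pi_new / pi_old

# Hypothetical example: 2000 STB/D at reservoir pressure 4000 psia; the model
# flows the well at BHP 2500 psia, but the gradient calculation at observed
# THP implies the BHP should be 2800 psia, i.e. less drawdown is needed.
mult = pi_multiplier(2000.0, 4000.0, 2500.0, 2800.0)   # > 1: well needs a higher PI
```

Applying the multiplier as a local correction, rather than changing Kh or skin, leaves the history matched reservoir description untouched while removing the history-to-prediction rate jump.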


2018 ◽  
Vol 6 (3) ◽  
pp. T601-T611
Author(s):  
Juliana Maia Carvalho dos Santos ◽  
Alessandra Davolio ◽  
Denis Jose Schiozer ◽  
Colin MacBeth

Time-lapse (or 4D) seismic attributes are extensively used as inputs to history matching workflows. However, this integration can cause problems if performed incorrectly. Some of the uncertainties in seismic acquisition, processing, and interpretation can be inadvertently incorporated into the reservoir simulation model, yielding an erroneous production forecast. Very often, the information provided by 4D seismic is noisy or ambiguous. For this reason, it is necessary to estimate the level of confidence in the data before transferring it to the simulation model. The methodology presented in this paper aims to diagnose which information from the 4D seismic we are confident enough to include in the model. Two passes of seismic interpretation are proposed: the first to understand the character and quality of the seismic data, and the second to compare the simulation-to-seismic synthetic response with the observed seismic signal. The methodology is applied to the Norne field benchmark case, in which we find several examples of inconsistencies between the synthetic and real responses and evaluate whether these are caused by simulation model inaccuracy or by uncertainties in the observed seismic. After a careful qualitative and semiquantitative analysis, the confidence level of the interpretation is determined, and simulation model updates can be suggested according to the outcome of this analysis. The main contribution of this work is to introduce a diagnostic step that classifies the reliability of the seismic interpretation, considering the uncertainties inherent in these data. The results indicate that medium to high interpretation confidence can be achieved even for poorly repeated data.


SPE Journal ◽  
2008 ◽  
Vol 13 (04) ◽  
pp. 382-391 ◽  
Author(s):  
Vibeke Eilwn J. Haugen ◽  
Geir Naevdal ◽  
Lars-Joergen Natvik ◽  
Geir Evensen ◽  
Aina M. Berg ◽  
...  

Summary This paper applies the ensemble Kalman filter (EnKF) to history match a North Sea field model. This is, as far as we know, one of the first published studies in which the EnKF is applied in a realistic setting using real production data. The reservoir-simulation model has approximately 45,000 active grid cells, and 5 years of production data are assimilated. The estimated parameters consist of the permeability and porosity fields, and the results are compared with a model previously established using a manual history-matching procedure. It was found that the EnKF estimate improved the match to the production data. This study, therefore, supported previous findings when using synthetic models that the EnKF may provide a useful tool for history matching reservoir parameters such as the permeability and porosity fields. Introduction The EnKF developed by Evensen (1994, 2003, 2007) is a statistical method suitable for data assimilation in large-scale nonlinear models. It is a Monte Carlo method, where model uncertainty is represented by an ensemble of realizations. The prediction of the estimate and uncertainty is performed by ensemble integration using the reservoir-simulation model. The method provides error estimates at any time based on information from the ensemble. When production data are available, a variance-minimizing scheme is used to update the realizations. The EnKF provides a general and model-independent formulation and can be used to improve the estimates of both the parameters and variables in the model. The method has previously been applied in a number of applications [e.g., in dynamical ocean models (Haugen and Evensen 2002), in model systems describing the ocean ecosystems (Natvik and Evensen 2003a, 2003b), and in applications within meteorology (Houtekamer et al. 2005)]. This shows that the EnKF is capable of handling different types of complex and nonlinear model systems.
The method was first introduced into the petroleum industry in studies related to well-flow modeling (Lorentzen et al. 2001, 2003). Nævdal et al. (2002) used the EnKF in a reservoir application to estimate model permeability focusing on a near-well reservoir model. They showed that there could be a great benefit from using the EnKF to improve the model through parameter estimation, and that this could lead to improved predictions. Nævdal et al. (2005) showed promising results estimating the permeability as a continuous field variable in a 2D field-like example. Gu and Oliver (2005) examined the EnKF for combined parameter and state estimation in a standardized reservoir test case. Gao et al. (2006) compared the EnKF with the randomized-maximum-likelihood method and pointed out several similarities between the methods. Liu and Oliver (2005a, 2005b) examined the EnKF for facies estimation in a reservoir-simulation model. This is a highly nonlinear problem where the probability-density function for the petrophysical properties becomes multimodal, and it is not clear how the EnKF can best handle this. A method was proposed in which the facies distribution for each ensemble member is represented by two normal distributed Gaussian fields using a method called truncated pluri-Gaussian simulation (Lantuéjoul 2002). Wen and Chen (2006) provided another discussion on the EnKF for estimation of the permeability field in a 2D reservoir-simulation model and examined the effect of the ensemble size. Lorentzen et al. (2005) focused on the sensitivity of the results with respect to the choice of initial ensemble using the PUNQ-S3. Skjervheim et al. (2007) used the EnKF to assimilate seismic 4D data. It was shown that the EnKF can handle these large data sets and that a positive impact could be found despite the high noise level in the data. 
The EnKF has some important advantages when compared to traditional assisted history-matching methods: the result is an ensemble of history-matched models that are all possible model realizations. The data are processed sequentially in time, meaning that new data are easily accounted for when they arrive. The method allows for simultaneous estimation of a huge number of poorly known parameters, such as fields of properties defined in each grid cell. By analyzing the EnKF update equations, it is seen that the actual degrees of freedom in the estimation problem are limited to the ensemble size. One is still able to update the most important features of large-scale models. A limitation of the EnKF is the fact that its computations are based on first- and second-order moments, and there are problems that are difficult to handle, particularly when the probability distributions are multimodal (e.g., when representing a bimodal channel facies distribution). This paper considers the use of the EnKF for estimating dynamic and static parameters, focusing on permeability and porosity, in a field model of a StatoilHydro-operated field in the North Sea. The largest uncertainty in the model is expected to be related to the permeability values, especially in the upper part of the reservoir, where the uncertainty may be as large as 30%.
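The variance-minimizing update described above can be sketched schematically. The toy state size, the linear observation operator H, and the observation values below are illustrative stand-ins; the field application predicts data by running the reservoir simulator, not a linear map, and uses far larger state vectors.

```python
import numpy as np

def enkf_update(X, H, d_obs, r_std, rng):
    """Schematic EnKF analysis step for an ensemble X (n_state x n_ens)."""
    n_ens = X.shape[1]
    # Perturb the observations: one noisy copy per ensemble member.
    D = d_obs[:, None] + r_std * rng.standard_normal((d_obs.size, n_ens))
    Y = H @ X                                    # predicted data per member
    A = X - X.mean(axis=1, keepdims=True)        # state anomalies
    Yp = Y - Y.mean(axis=1, keepdims=True)       # predicted-data anomalies
    C_yy = Yp @ Yp.T / (n_ens - 1) + (r_std**2) * np.eye(d_obs.size)
    C_xy = A @ Yp.T / (n_ens - 1)
    K = C_xy @ np.linalg.inv(C_yy)               # ensemble Kalman gain
    return X + K @ (D - Y)                       # variance-minimizing update

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 100))               # 50 "grid" parameters, 100 members
H = np.zeros((3, 50))
H[0, 0] = H[1, 10] = H[2, 20] = 1.0              # observe three state components
d_obs = np.array([1.5, -0.5, 2.0])
Xa = enkf_update(X, H, d_obs, 0.1, rng)
```

Note that the gain is built entirely from ensemble covariances, which is why the effective degrees of freedom are limited to the ensemble size, and why only the first two moments of the distribution are used.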

