History Matching Time-Lapse Surface-Gravity and Well-Pressure Data With Ensemble Smoother for Estimating Gas Field Aquifer Support—A 3D Numerical Study

SPE Journal
2012
Vol. 17 (04)
pp. 966-980
Author(s):
M. Glegola
P. Ditmar
R.G. Hanea
O. Eiken
F.C. Vossepoel
...

Summary Water influx is an important factor influencing the production of gas reservoirs with an active aquifer. However, aquifer properties such as size, porosity, and permeability are typically uncertain, which makes predictions of field performance challenging. The observed pressure decline is inherently nonunique with respect to water influx, and large uncertainties in the actual reservoir state are common. Time-lapse (4D) gravimetry, which provides a direct measure of subsurface mass redistribution, has the potential to provide valuable information in this context. Recent improvements in instrumentation and in data-acquisition and -processing procedures have made time-lapse gravimetry a mature monitoring technique for both land and offshore applications. However, despite an increasing number of gas fields in which gravimetric monitoring has been applied, little has been published on the added value of gravity data in the broader context of modern, closed-loop reservoir management. The ways in which gravity data can contribute to improved reservoir characterization, production-forecast accuracy, and hydrocarbon-reserves estimation remain to be addressed in many respects. In this paper, we investigate the added value of gravimetric observations for gas-field-production monitoring and aquifer-support estimation. We perform a numerical study with a realistic 3D gas-field model that contains a large and complex aquifer system. The aquifer support and other reservoir parameters (i.e., porosity, permeability, and reservoir top and bottom horizons) are estimated simultaneously using the ensemble smoother (ES). We consider three cases: assimilation of gravity data only, of pressure data only, and of gravity and pressure data jointly. We show that combined estimation of the aquifer support with the permeability field, porosity field, and reservoir structure is a very challenging and nonunique history-matching problem, in which gravity data clearly add value. Pressure data alone may not discriminate between different reservoir scenarios. Combining pressure and gravity data may help to reduce the nonuniqueness and provide not only an improved gas- and water-production forecast and gas-in-place evaluation, but also a more accurate description of the reservoir state.
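For readers unfamiliar with the ES update used in this kind of joint history matching, the following is a minimal sketch, not the authors' implementation: a single-step, perturbed-observation ensemble-smoother update in NumPy, with illustrative ensemble sizes and a stand-in observation vector (stacked gravity and pressure data).

```python
import numpy as np

def ensemble_smoother_update(M, D, d_obs, C_d):
    """One-shot ensemble-smoother update with perturbed observations.
    M: parameter ensemble (n_param x n_ens), D: simulated data (n_data x n_ens),
    d_obs: observed data (n_data,), C_d: measurement-error covariance."""
    n_ens = M.shape[1]
    A_m = M - M.mean(axis=1, keepdims=True)      # parameter anomalies
    A_d = D - D.mean(axis=1, keepdims=True)      # predicted-data anomalies
    C_md = A_m @ A_d.T / (n_ens - 1)             # cross-covariance
    C_dd = A_d @ A_d.T / (n_ens - 1)             # predicted-data covariance
    K = C_md @ np.linalg.inv(C_dd + C_d)         # Kalman-type gain
    d_pert = d_obs[:, None] + np.random.multivariate_normal(
        np.zeros(len(d_obs)), C_d, size=n_ens).T
    return M + K @ (d_pert - D)

# Illustrative sizes: 200 members, 50 parameters (porosity, permeability,
# aquifer multipliers, ...), 30 stacked gravity + pressure observations.
rng = np.random.default_rng(0)
M = rng.normal(size=(50, 200))
D = rng.normal(size=(30, 200))
d_obs = rng.normal(size=30)
C_d = 0.01 * np.eye(30)
M_post = ensemble_smoother_update(M, D, d_obs, C_d)
```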

SPE Journal
2011
Vol. 17 (01)
pp. 163-176
Author(s):
M. Glegola
P. Ditmar
R.G. Hanea
F.C. Vossepoel
R. Arts
...

Summary Water influx into gas fields can reduce recovery factors by 10–40%. Therefore, information about the magnitude and spatial distribution of water influx is essential for efficient management of waterdrive gas reservoirs. Modern geophysical techniques such as gravimetry may provide a direct measure of mass redistribution below the surface, yielding additional and valuable information for reservoir monitoring. In this paper, we investigate the added value of gravimetric observations for monitoring water influx into a gas field. For this purpose, we use data assimilation with the ensemble Kalman filter (EnKF) method. To better understand the limitations of the gravimetric technique, a sensitivity study is performed. For a simplified gas-reservoir model, we assimilate synthetic gravity measurements and estimate reservoir permeability. The updated reservoir model is used to predict the water-front position. We consider a number of possible scenarios, making various assumptions about the level of gravity-measurement noise and the distance from the gravity-observation network to the reservoir formation. The results show that with increasing gravimetric noise and/or distance, the updated model permeability becomes smoother and its variance higher. Finally, we investigate the effect of a combined assimilation of gravity and production data. When only production observations are used, the permeability estimates far from the wells can be erroneous despite a very accurate history match of the data. When production and gravity data are combined within a single data-assimilation framework, we obtain a considerably improved estimate of the reservoir permeability and an improved understanding of the subsurface mass flow. These results illustrate the complementarity of the two types of measurements and, more generally, the experiments clearly show the added value of gravity data for monitoring water influx into a gas field.
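The link between water influx and the gravity signal can be illustrated with a point-mass forward model; this is a simplified sketch of the underlying physics (Newtonian attraction of cell mass changes), not the simulator used in the paper, and the station/cell geometry and densities below are assumed for illustration.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def timelapse_gravity(stations, cell_centers, delta_mass):
    """Vertical gravity change (m/s^2) at surface stations caused by mass
    changes delta_mass (kg) in reservoir cells (point-mass approximation).
    stations: (n_sta, 3), cell_centers: (n_cell, 3); z is positive downward."""
    dg = np.zeros(len(stations))
    for i, s in enumerate(stations):
        r_vec = cell_centers - s                  # vectors from station to cells
        r = np.linalg.norm(r_vec, axis=1)
        dg[i] = G * np.sum(delta_mass * r_vec[:, 2] / r**3)
    return dg

# Illustrative example: water (~1000 kg/m^3) filling a 100 m x 100 m x 10 m
# cell at 2 km depth directly below one station.
station = np.array([[0.0, 0.0, 0.0]])
cell = np.array([[0.0, 0.0, 2000.0]])
dm = np.array([1000.0 * 100 * 100 * 10])          # ~1e8 kg mass increase
print(timelapse_gravity(station, cell, dm))       # ~1.7e-9 m/s^2, i.e. ~0.17 uGal
```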


2007
Vol. 10 (01)
pp. 77-85
Author(s):
Tomomi Yamada
Yoshiyuki Okano

Summary A Tcf-class gas field in Japan has been producing for several decades. The reservoir body comprises stacked rhyolite lava domes erupted in a submarine environment. A porous network developed in each dome, and rapid chilling on contact with seawater deposited hyaloclastite over it. Although the hyaloclastite is also porous in this field, its permeability has been reduced dramatically by the presence of clay minerals. Impermeable basaltic sheets and mudstone seams are also present. Each facies plays a specific role in the pressure system. Stratigraphic correlation originally identified multiple reservoirs. Gas has been produced almost exclusively from the largest one. However, after 10 to 20 years of production, the pressures within the unexploited reservoirs were observed to have declined at a variety of rates. Unusual localized behavior has also been observed. Because seismic data did not prove particularly informative, we decided to remodel the entire system specifically using pressure data. We employed a combination of multipoint geostatistics and the probability perturbation method. This approach successfully captured the curved facies boundaries within the stacked lava domes while accounting for the pressure data by means of history matching, thereby addressing nonstationarity in the real field. Building a suitable training image is commonly a difficult aspect of multipoint methods and poses particular problems for volcanic reservoirs. It was accomplished here by iteratively adjusting the prototype until satisfactory history matching was achieved with a reasonable number of perturbations. Ambiguous reservoir boundaries were represented stochastically by populating a predetermined model space with pay and nonpay pixels. The modeling results closely reproduce the measured pressure histories and appear realistic in terms of both facies distributions and reservoir boundaries. They suggest that the uneven pressure declines between different units are caused by the tortuous flow channels that connect them. The results also account for the unusual smaller-scale pressure behavior observed. The final training image indicates more intensive spatial variation in facies than previously appreciated. Original gas in place (OGIP) estimates made with 20 equiprobable realizations are scattered within ±15% of the mean value. Estimates of the incremental recovery from drilling a step-out well show greater variation than those from installing a booster compressor, which quantifies the higher associated geological risk.


2018
Vol. 6 (3)
pp. T601-T611
Author(s):
Juliana Maia Carvalho dos Santos
Alessandra Davolio
Denis Jose Schiozer
Colin MacBeth

Time-lapse (or 4D) seismic attributes are extensively used as inputs to history-matching workflows. However, this integration can bring problems if performed incorrectly. Some of the uncertainties in seismic acquisition, processing, and interpretation can be inadvertently incorporated into the reservoir simulation model, yielding an erroneous production forecast. Very often, the information provided by 4D seismic is noisy or ambiguous. For this reason, it is necessary to estimate the level of confidence in the data before transferring it to the simulation model. The methodology presented in this paper aims to diagnose which information from 4D seismic we are confident enough to include in the model. Two passes of seismic interpretation are proposed: the first to understand the character and quality of the seismic data, and the second to compare the simulation-to-seismic synthetic response with the observed seismic signal. The methodology is applied to the Norne field benchmark case, in which we find several examples of inconsistencies between the synthetic and real responses and evaluate whether these are caused by inaccuracies in the simulation model or by uncertainties in the observed seismic. After a careful qualitative and semiquantitative analysis, the confidence level of the interpretation is determined. Simulation-model updates can be suggested according to the outcome of this analysis. The main contribution of this work is to introduce a diagnostic step that classifies the reliability of the seismic interpretation considering the uncertainties inherent in these data. The results indicate that a medium to high interpretation confidence can be achieved even for poorly repeated data.


Geophysics
2019
Vol. 85 (1)
pp. M15-M31
Author(s):
Mingliang Liu
Dario Grana

We have developed a time-lapse seismic history-matching framework to assimilate production data and time-lapse seismic data for the prediction of static reservoir models. An iterative data-assimilation method, the ensemble smoother with multiple data assimilation, is adopted to iteratively update an ensemble of reservoir models until their predicted observations match the actual production and seismic measurements, and to quantify the uncertainty of the posterior reservoir models. To address the computational and numerical challenges of applying ensemble-based optimization methods to large seismic data volumes, we develop a deep representation-learning method, namely a deep convolutional autoencoder. This method reduces the data dimensionality by sparsely and approximately representing the seismic data with a set of hidden features that capture the nonlinear and spatial correlations in the data space. Instead of using the entire seismic data set, which would require an extremely large number of models, the ensemble of reservoir models is iteratively updated by conditioning the reservoir realizations on the production data and the low-dimensional hidden features extracted from the seismic measurements. We test our methodology on two synthetic data sets: a simplified 2D reservoir used for method validation and a 3D application with multiple channelized reservoirs. The results indicate that the deep convolutional autoencoder is very efficient at sparsely representing the seismic data and that the reservoir models can be accurately updated according to the production data and the reparameterized time-lapse seismic data.
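As a rough illustration of the dimensionality-reduction step, a minimal convolutional autoencoder is sketched below in PyTorch; the layer sizes, latent dimension, and 64x64 attribute-map shape are assumptions for illustration, not the architecture reported by the authors.

```python
import torch
import torch.nn as nn

class SeismicAutoencoder(nn.Module):
    """Minimal convolutional autoencoder that compresses a single-channel
    64x64 time-lapse seismic attribute map into a low-dimensional code,
    which could then stand in for the seismic data in the ensemble update."""
    def __init__(self, n_latent=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, n_latent),
        )
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(16, 1, 2, stride=2),                # 32 -> 64
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Train on an ensemble of simulated attribute maps, then assimilate the
# latent codes z instead of the full seismic volume.
model = SeismicAutoencoder()
maps = torch.randn(8, 1, 64, 64)           # stand-in for simulated 4D maps
recon, codes = model(maps)
loss = nn.functional.mse_loss(recon, maps)
```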


Geophysics
2015
Vol. 80 (2)
pp. WA69-WA83
Author(s):
Joseph Capriotti
Yaoguo Li

Understanding reservoir properties plays a key role in managing a reservoir's resources and optimizing production, and history matching is an important means of characterizing those properties. We developed a method to invert for the distribution of permeability directly from time-lapse gravity data. In this process, we used fluid flow in a porous medium coupled with forward modeling of the time-lapse gravity response as the forward operator, and we then solved a nonlinear inversion to reconstruct the permeability distribution in the reservoir. Because of the relatively low computational cost of forward modeling time-lapse gravity data, we were able to formulate the deterministic inversion as a Tikhonov regularization problem. The inversion can combine the information from time-lapse gravity and injection-production data sets to determine a static state of the reservoir described by the permeability. The resulting model satisfied all data sets simultaneously while obeying the mechanics of fluid flow through a porous medium. The inverse formulation also enabled estimation of the uncertainty of the constructed permeability model with respect to the data, and our numerical simulations indicated that the information content of the two dynamic data sets is sufficiently high to constrain the recovery of the permeability distribution.
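A minimal sketch of a Tikhonov-regularized inversion loop is given below, assuming a generic forward operator and a simple gradient-descent solver with a finite-difference Jacobian; the authors' coupled flow/gravity forward model and solver details are not reproduced here.

```python
import numpy as np

def tikhonov_invert(forward, m0, d_obs, beta=1.0, n_iter=50, step=0.1, eps=1e-4):
    """Gradient-descent sketch of the Tikhonov-regularized objective
        phi(m) = ||F(m) - d_obs||^2 + beta * ||m - m0||^2,
    where forward(m) maps model parameters (e.g., log-permeability) to the
    simulated time-lapse gravity and injection-production data."""
    m = m0.copy()
    for _ in range(n_iter):
        f0 = forward(m)
        J = np.empty((len(f0), len(m)))
        for j in range(len(m)):             # brute-force Jacobian; small models only
            dm = np.zeros_like(m)
            dm[j] = eps
            J[:, j] = (forward(m + dm) - f0) / eps
        grad = 2 * J.T @ (f0 - d_obs) + 2 * beta * (m - m0)
        m = m - step * grad
    return m

# Illustrative use with a toy linear "reservoir" forward model.
A = np.array([[1.0, 0.5], [0.2, 2.0], [0.7, 0.7]])
m_true = np.array([1.0, -0.5])
d_obs = A @ m_true
m_est = tikhonov_invert(lambda m: A @ m, np.zeros(2), d_obs, beta=1e-3)
```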


SPE Journal
2019
Vol. 25 (01)
pp. 119-138
Author(s):
Yanhui Zhang
Femke C. Vossepoel
Ibrahim Hoteit

Summary An ensemble-based history-matching framework is proposed to enhance the characterization of petroleum reservoirs through the assimilation of crosswell electromagnetic (EM) data. As an advanced technology in reservoir surveillance, crosswell EM tomography can be used to estimate a cross-sectional conductivity map and associated saturation profile at an interwell scale by exploiting the sharp contrast in conductivity between hydrocarbons and saline water. Incorporating this information into reservoir simulation in combination with other available observations is expected to enhance the forecasting capability of reservoir models and to lead to better quantification of uncertainty. The proposed approach applies ensemble-based data-assimilation methods to build a robust and flexible framework in which various sources of available measurements can be integrated. A comparative study evaluates two different implementations of the assimilation of crosswell EM data. The first approach integrates the crosswell EM field components in their original form, which entails forward simulation of the observed EM responses from the simulated reservoir state. In the second approach, formation conductivity is derived from the EM data through inversion and is subsequently assimilated into the reservoir model. An image-oriented distance parameterization of the fluid front assimilates the conductivity field in an efficient and robust manner and overcomes issues with data size, errors, and their correlation. Numerical experiments for different test cases with increasing complexity provide insight into the performance of the two proposed integration schemes. The results demonstrate the efficiency of the developed history-matching workflow and the added value of crosswell EM data in enhancing the reservoir characterization and reliability of dynamic reservoir forecasts.
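The conductivity contrast that crosswell EM exploits is commonly described with Archie's relation; the snippet below is a generic rock-physics sketch with assumed brine conductivity and Archie exponents, not the petrophysical model used in the paper.

```python
import numpy as np

def archie_conductivity(porosity, s_w, sigma_w=5.0, m=2.0, n=2.0):
    """Archie's relation mapping porosity and water saturation to bulk
    formation conductivity (S/m): sigma = sigma_w * phi**m * Sw**n.
    This is the kind of rock-physics link that ties a simulated saturation
    field to the conductivity map sensed by crosswell EM."""
    return sigma_w * porosity**m * s_w**n

# Sweep of a water front at 25% porosity: flooded cells are roughly 20x
# more conductive than cells still near residual water saturation.
phi = 0.25
print(archie_conductivity(phi, s_w=0.2))   # hydrocarbon leg, ~0.0125 S/m
print(archie_conductivity(phi, s_w=0.9))   # flooded zone,    ~0.25 S/m
```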


Energies
2021
Vol. 14 (11)
pp. 3137
Author(s):
Amine Tadjer
Reidar B. Bratvold
Remus G. Hanea

Production forecasting is the basis for decision making in the oil and gas industry and can be quite challenging, especially when it involves complex geological modeling of the subsurface. To help solve this problem, assisted history matching built on ensemble-based analysis, such as the ensemble smoother and the ensemble Kalman filter, is useful for estimating models that preserve geological realism and have predictive capability. These methods tend, however, to be computationally demanding, as they require a large ensemble size for stable convergence. In this paper, we propose a novel method for uncertainty quantification and reservoir-model calibration with much-reduced computation time. The approach sequentially combines nonlinear dimensionality-reduction techniques (t-distributed stochastic neighbor embedding or the Gaussian process latent variable model), K-means clustering, and the ensemble smoother with multiple data assimilation. The cluster analysis with t-distributed stochastic neighbor embedding and the Gaussian process latent variable model is used to reduce the number of initial geostatistical realizations and to select a set of optimal reservoir models whose production performance is similar to that of the reference model. We then apply the ensemble smoother with multiple data assimilation to provide reliable assimilation results. Experimental results based on the Brugge field case data verify the efficiency of the proposed approach.
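A minimal sketch of the realization-selection idea follows, using scikit-learn's t-SNE and K-means on synthetic production responses; the ensemble size, embedding settings, and cluster count are illustrative assumptions, and the ES-MDA step that would follow is omitted.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

# Stand-in for 500 geostatistical realizations, each summarized by its
# simulated production response (e.g., field rates at 40 report times).
rng = np.random.default_rng(1)
responses = rng.normal(size=(500, 40))

# Nonlinear dimensionality reduction of the response space to 2D.
embedding = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(responses)

# Cluster the embedded realizations and keep one representative per cluster,
# so only the reduced set is passed on to the data-assimilation step.
n_clusters = 20
km = KMeans(n_clusters=n_clusters, n_init=10, random_state=1).fit(embedding)
selected = []
for k in range(n_clusters):
    members = np.where(km.labels_ == k)[0]
    dist = np.linalg.norm(embedding[members] - km.cluster_centers_[k], axis=1)
    selected.append(members[np.argmin(dist)])   # realization closest to centroid
print(sorted(selected))
```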


2021
Author(s):
Koki Oikawa
Hirotaka Saito
Seiichiro Kuroda
Kazunori Takahashi

Because an array-antenna ground-penetrating radar (GPR) system electronically switches among antenna combinations in milliseconds, multi-offset gather data, such as common-midpoint (CMP) data, can be acquired almost seamlessly. However, because the antenna offsets cannot be changed flexibly, only a limited number of scans can be obtained. The array GPR system has been used to collect time-lapse GPR data, including CMP data, during a field infiltration experiment (Iwasaki et al., 2016). CMP data obtained by the array GPR are, however, too sparse to yield reliable velocities from a standard velocity analysis such as semblance analysis. We therefore interpolated the sparse CMP data with the projection onto convex sets (POCS) algorithm (Yi et al., 2016), coupled with NMO correction, to automatically determine the optimum EM-wave velocity. Our previous numerical study showed that the proposed method allows the EM-wave velocity to be determined during the infiltration experiment.

The main objective of this study was to evaluate the performance of the proposed method in interpolating sparse array-antenna GPR CMP data collected during an in-situ infiltration experiment at the Tottori sand dunes. The interpolated CMP data were then used in the semblance analysis to determine the EM-wave velocity, which was in turn used to compute the depth of the infiltration front. The estimated infiltration depths agreed well with independently obtained depths. This study demonstrates the feasibility of an automatic velocity analysis based on POCS interpolation coupled with NMO correction for sparse CMP data collected with array-antenna GPR.
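A bare-bones POCS interpolation of a sparsely sampled gather is sketched below, using iterative thresholding in the 2D FFT domain with re-insertion of the recorded traces; the threshold schedule and toy gather are assumptions, and the NMO-correction coupling described in the abstract is not included.

```python
import numpy as np

def pocs_interpolate(data, mask, n_iter=50, p_max=0.99, p_min=0.01):
    """POCS reconstruction of a sparsely sampled CMP gather.
    data: (n_t, n_x) gather with zeros at missing offsets,
    mask: boolean (n_x,), True where a trace was actually recorded.
    Each iteration thresholds the gather in the 2D FFT domain and then
    re-imposes the recorded traces (projection onto the data constraint)."""
    recon = data.copy()
    for it in range(n_iter):
        spec = np.fft.fft2(recon)
        # Linearly decreasing threshold on spectral amplitude
        frac = p_max - (p_max - p_min) * it / (n_iter - 1)
        thresh = frac * np.abs(spec).max()
        spec[np.abs(spec) < thresh] = 0.0
        recon = np.real(np.fft.ifft2(spec))
        recon[:, mask] = data[:, mask]      # honor recorded traces
    return recon

# Toy example: a linear event sampled at only 6 of 24 offsets.
n_t, n_x = 128, 24
t = np.arange(n_t)[:, None]
full = np.exp(-0.5 * ((t - 40 - 2.0 * np.arange(n_x)) / 3.0) ** 2)
mask = np.zeros(n_x, bool)
mask[::4] = True
sparse = full * mask
recovered = pocs_interpolate(sparse, mask)
```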


2021
Author(s):
Boxiao Li
Hemant Phale
Yanfen Zhang
Timothy Tokar
Xian-Huan Wen

Abstract Design of Experiments (DoE) is one of the most commonly employed techniques in the petroleum industry for Assisted History Matching (AHM) and uncertainty analysis of reservoir production forecasts. Although conceptually straightforward, DoE is often misused by practitioners because many of its statistical and modeling principles are not carefully followed. Our earlier paper (Li et al. 2019) detailed the best practices in DoE-based AHM for brownfields. However, to the best of our knowledge, there is a lack of studies that summarize the common caveats and pitfalls in DoE-based production-forecast uncertainty analysis for greenfields and history-matched brownfields. Our objective here is to summarize these caveats and pitfalls to help practitioners apply the correct principles in DoE-based production-forecast uncertainty analysis. Over 60 common pitfalls in all stages of a DoE workflow are summarized. Special attention is paid to the following critical project transitions: (1) from static earth modeling to dynamic reservoir simulation; (2) from AHM to production forecast; and (3) from analyzing subsurface uncertainties to analyzing field-development alternatives. Most pitfalls can be avoided by consistently following the statistical and modeling principles. Some pitfalls, however, can trap even experienced engineers. For example, mistakes made in handling the three abovementioned transitions can yield highly unreliable proxy models and sensitivity analyses; in the representative examples we study, they can lead to a proxy R2 of less than 0.2, versus larger than 0.9 when handled correctly. Two improved experimental designs are created to resolve this challenge. Besides the technical pitfalls that can be avoided through robust statistical workflows, we also highlight the often more severe nontechnical pitfalls that cannot be evaluated by measures such as R2. Thoughts are shared on how they can be avoided, especially during project framing and the three critical transitions.
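One of the simplest safeguards implied here, checking a proxy's R2 on blind simulation runs rather than on the fitting runs alone, can be sketched as follows; the toy simulator, design size, and quadratic proxy are illustrative assumptions, not the workflow from the paper.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Toy "simulator": cumulative production as a nonlinear function of three
# uncertain subsurface parameters scaled to [0, 1].
def simulator(x):
    return 10 * x[:, 0] ** 2 + 5 * np.sin(3 * x[:, 1]) + 2 * x[:, 0] * x[:, 2]

rng = np.random.default_rng(2)
X = rng.uniform(size=(120, 3))        # stand-in for a space-filling design
y = simulator(X)

# Fit a quadratic proxy on part of the design, then verify it on blind runs:
# the proxy should only be trusted if the blind-test R2 stays high.
X_fit, X_blind, y_fit, y_blind = train_test_split(X, y, test_size=0.3, random_state=2)
proxy = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_fit, y_fit)
print("blind-test R2:", r2_score(y_blind, proxy.predict(X_blind)))
```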


2021
Author(s):
Changqing Yao
Hongquan Chen
Akhil Datta-Gupta
Sanjay Mawalkar
Srikanta Mishra
...  

Abstract Geologic CO2 sequestration and CO2 enhanced oil recovery (EOR) have received significant attention from the scientific community as responses to greenhouse-gas-driven climate change. Safe and efficient management of a CO2 injection site requires spatio-temporal tracking of the CO2 plume in the reservoir during geologic sequestration. The goal of this paper is to develop robust modeling and monitoring technologies for imaging and visualizing the CO2 plume using routine pressure/temperature measurements. Streamline-based technology has proven effective and efficient for reconciling geologic models with various types of reservoir dynamic response. In this paper, we first extend the streamline-based data-integration approach to incorporate distributed-temperature-sensing (DTS) data using the concept of thermal tracer travel time. Then, a hierarchical workflow composed of evolutionary and streamline methods is employed to jointly history match the DTS and pressure data. Finally, CO2 saturation and streamline maps are used to visualize the CO2 plume movement during the sequestration process. The power and utility of our approach are demonstrated using both synthetic and field applications. We first validate the streamline-based DTS data inversion using a synthetic example. Next, the hierarchical workflow is applied to a carbon sequestration project in a carbonate reef reservoir within the Northern Niagaran Pinnacle Reef Trend in Michigan, USA. The monitoring data set consists of DTS data acquired at the injection well and a monitoring well, flowing bottomhole-pressure data at the injection well, and time-lapse pressure measurements at several locations along the monitoring well. The history-matching results indicate that the CO2 movement is mostly restricted to the intended injection zones, which is consistent with an independent warmback analysis of the temperature data. The novelty of this work is the streamline-based history-matching method for DTS data and its field application to the Department of Energy regional carbon sequestration project in Michigan.
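As a rough sketch of the thermal-tracer-travel-time concept, the snippet below integrates a retarded front velocity along one streamline, with the retardation factor taken from a standard heat-capacity balance; the fluid and rock properties, segment geometry, and velocities are assumed for illustration and are not the paper's field parameters.

```python
import numpy as np

def thermal_travel_time(ds, velocity, porosity, rho_f=800.0, c_f=2500.0,
                        rho_r=2600.0, c_r=900.0):
    """Thermal-front travel time along one streamline: the interstitial fluid
    velocity is retarded by heat exchange with the rock matrix.
    ds: segment lengths (m), velocity: fluid velocity per segment (m/s),
    porosity: per-segment porosity; rho/c are fluid and rock density (kg/m^3)
    and specific heat (J/kg/K)."""
    heat_fluid = porosity * rho_f * c_f
    heat_total = heat_fluid + (1.0 - porosity) * rho_r * c_r
    v_thermal = velocity * heat_fluid / heat_total    # retarded front velocity
    return np.sum(ds / v_thermal)

# One streamline from the injector to the monitoring well, 100 segments.
ds = np.full(100, 5.0)                  # 500 m total length
v = np.full(100, 1e-4)                  # interstitial velocity, m/s
phi = np.full(100, 0.15)
print(thermal_travel_time(ds, v, phi) / 86400.0, "days")
```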

