Improving Forecast Uncertainty Quantification by Incorporating Production History and Using a Modified Ranking Method of Geostatistical Realizations

2020
Vol 142 (9)
Author(s):
Kazem Monfaredi
Mohammad Emami Niri
Behnam Sedaee

Abstract Most geostatistical realization ranking methods disregard the production history when selecting realizations, because incorporating it requires long simulation run times. They also fail to consider the degree of linear relationship between the "ranks based on the ranking measure" and the "ranks based on the performance parameter" when choosing the ranking measure. To address these concerns, we propose an uncertainty quantification workflow that consists of two sequential stages: history matching and realization selection. In the first stage, production data are incorporated into the uncertainty quantification procedure through a history matching process. A fast simulator is employed to find the realizations whose flow behavior is consistent with the production history in a shorter time than a comprehensive simulator would require. The selected realizations are the input to the second stage of the workflow, which can be any type of realization selection method. In this study, we used the most convenient realization selection method, i.e., ranking of the realizations. To select the most efficient ranking measure, we investigated the degree of linear correlation between the ranks based on several ranking measures and the ranks based on the performance parameter. In addition, because traditional ranking methods have shortcomings in identifying uncertainty quantiles, a modified ranking method is introduced. This modification increases the certainty in the probability of the selected realizations. The results obtained on 3D, close-to-real synthetic reservoir models demonstrate the capability of the modified ranking method to quantify the uncertainty in reservoir performance prediction more accurately.
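The measure selection step described above can be illustrated with a small computation. The sketch below is illustrative only (the measure names, realization count, and values are hypothetical, not from the paper): it ranks a set of realizations by several candidate static measures and by a performance parameter, then keeps the measure whose ranks correlate most strongly with the performance ranks.

```python
# Illustrative sketch (not the authors' code): selecting the ranking measure whose
# realization ranks correlate most strongly with ranks from a performance parameter
# (e.g., cumulative oil production from a fast simulator).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_real = 100  # number of geostatistical realizations (hypothetical)

# Hypothetical static ranking measures computed per realization
measures = {
    "connected_pore_volume": rng.normal(size=n_real),
    "original_oil_in_place": rng.normal(size=n_real),
    "streamline_time_of_flight": rng.normal(size=n_real),
}
# Hypothetical performance parameter obtained from flow simulation
performance = rng.normal(size=n_real)

# Rank realizations by each measure and by the performance parameter,
# then keep the measure with the highest rank correlation.
best_measure, best_rho = None, -np.inf
for name, values in measures.items():
    rho, _ = spearmanr(values, performance)
    if abs(rho) > best_rho:
        best_measure, best_rho = name, abs(rho)

print(f"Most effective ranking measure: {best_measure} (|rho| = {best_rho:.2f})")
```

Spearman's coefficient is simply the Pearson correlation computed on ranks, so it directly expresses the degree of linear relationship between the two rank lists.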

2011
Vol 14 (05)
pp. 621-633
Author(s):
Alireza Kazemi
Karl D. Stephen
Asghar Shams

Summary History matching of a reservoir model is always a difficult task. In some fields, we can use time-lapse (4D) seismic data to detect production-induced changes as a complement to more conventional production data. In seismic history matching, we predict these data and compare them to observations. Observed time-lapse data often consist of relative measures of change, which require normalization. We investigate different normalization approaches, based on predicted 4D data, and assess their impact on history matching. We apply the approach to the Nelson field, in which four surveys are available over 9 years of production. We normalize the 4D signature in a number of ways. First, we use predictions of the 4D signature from vertical wells that match production, and we derive a normalization function. As an alternative, we use crossplots of the full-field prediction against the observation. The normalized observations are used in an automatic-history-matching process in which the model is updated. We analyze the results of the two normalization approaches and compare them against the case of using only production data. The results show that when we use 4D data normalized to wells, we obtain a 49% reduction in misfit along with a 36% improvement in predictions. Over the whole reservoir, 8% and 7% reductions in the 4D seismic misfit are obtained in the history and prediction periods, respectively. When we use only production data, the production history match is improved to a similar degree (45%), but the improvement in predictions is only 25% and the 4D seismic misfit is 10% worse. Finding the unswept areas in the reservoir is always a challenge in reservoir management. By using 4D data in history matching, we can better predict reservoir behavior and identify regions of remaining oil.
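As an illustration of the well-based normalization idea, the sketch below (assumed data and a simple linear form, not the authors' procedure) fits a normalization function from a crossplot of predicted versus observed 4D signatures at well locations and applies it to a full-field observed map.

```python
# Minimal sketch (assumed, not from the paper): deriving a linear normalization
# function for relative 4D seismic attribute changes by cross-plotting predictions
# against observations at history-matched well locations, then applying it field-wide.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predicted 4D signature (physical units) and observed relative change
# (unitless) extracted at wells that already match production.
pred_at_wells = rng.uniform(0.0, 1.0, size=25)
obs_at_wells = 3.2 * pred_at_wells + 0.1 + rng.normal(scale=0.05, size=25)

# Least-squares fit of obs = a * pred + b gives the normalization function.
a, b = np.polyfit(pred_at_wells, obs_at_wells, deg=1)

# Normalize the full-field observed map so it is directly comparable with predictions.
obs_full_field = rng.uniform(0.0, 3.5, size=(50, 50))
obs_normalized = (obs_full_field - b) / a

print(f"Normalization: obs = {a:.2f} * pred + {b:.2f}")
```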


SPE Journal
2017
Vol 22 (05)
pp. 1506-1518
Author(s):
Pedram Mahzari
Mehran Sohrabi

Summary Three-phase flow in porous media during water-alternating-gas (WAG) injection and the associated cycle-dependent hysteresis have been the subject of experimental and theoretical studies. In spite of attempts to develop models and simulation methods for WAG injection and three-phase flow, the current lack of a solid approach for handling hysteresis effects in simulating WAG-injection scenarios has resulted in misinterpretation of simulation outcomes at laboratory and field scales. In this work, using our improved methodology, the first cycle of the WAG experiments (the first waterflood and the subsequent gasflood) was history matched to estimate the two-phase relative permeabilities (krs) (oil/water and gas/oil). For subsequent cycles, the pertinent parameters of the WAG hysteresis model are included in the automatic-history-matching process to reproduce all WAG cycles together. The results indicate that history matching the whole WAG experiment leads to a significantly improved simulation outcome, which highlights the importance of two elements in evaluating WAG experiments: inclusion of the full WAG experiment in history matching and use of a more representative set of two-phase krs, which originates from our new methodology for estimating two-phase krs from the first cycle of a WAG experiment. Because WAG-related parameters should be able to model any three-phase flow irrespective of the WAG scenario, in another exercise, the tuned parameters obtained from a WAG experiment starting with water were used in a similar coreflood test (WAG starting with gas) to assess their predictive capability for simulating three-phase flow in porous media. After identifying the shortcomings of existing models, an improved methodology was used to history match multiple coreflood experiments simultaneously to estimate parameters that can reasonably capture the processes taking place in WAG under different scenarios, that is, starting with water or with gas. The comprehensive simulation study performed here sheds light on a consolidated methodology for estimating saturation functions that can simulate WAG injection under different scenarios.
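As a minimal illustration of how two-phase kr curves can be parameterized as history-matching variables, the sketch below assumes a Corey-type water relative permeability and fits its endpoint and exponent to synthetic points by least squares; in the actual workflow such parameters would be tuned against measured coreflood pressure drop and production from the first WAG cycle, not against kr points directly.

```python
# Minimal sketch (assumed parameterization, not the authors' code): Corey-type
# water relative permeability whose endpoint and exponent act as the tuning
# parameters when the first WAG cycle is history matched.
import numpy as np
from scipy.optimize import curve_fit

def corey_krw(sw, krw_max, n_w, swc=0.2, sor=0.2):
    # Water relative permeability as a function of water saturation (Corey form).
    s_norm = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)
    return krw_max * s_norm ** n_w

# Hypothetical krw points representing values recovered from the first waterflood cycle.
sw_obs = np.linspace(0.25, 0.75, 8)
krw_obs = corey_krw(sw_obs, 0.4, 2.5) + 0.005 * np.random.default_rng(5).normal(size=8)

# Least-squares estimation of the endpoint and exponent (p0 fixes the two free parameters;
# swc and sor keep their assumed defaults).
(krw_max_fit, n_w_fit), _ = curve_fit(corey_krw, sw_obs, krw_obs, p0=[0.5, 2.0])
print(f"krw_max ~ {krw_max_fit:.2f}, Corey exponent ~ {n_w_fit:.2f}")
```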


2013
Vol 50
pp. 4-15
Author(s):
D. Arnold
V. Demyanov
D. Tatum
M. Christie
T. Rojas
...

Energies
2021
Vol 14 (6)
pp. 1557
Author(s):
Amine Tadjer
Reidar B. Bratvold

Carbon capture and storage (CCS) has increasingly looked like a promising strategy to reduce CO2 emissions and meet the Paris Agreement's climate target. To ensure that CCS is safe and successful, an efficient monitoring program must be implemented to prevent storage reservoir leakage and contamination of drinking water in groundwater aquifers. However, the geological properties of geologic CO2 sequestration (GCS) sites are not known with certainty, which makes it difficult to predict the behavior of the injected gases, the CO2-brine leakage rates through wellbores, and the CO2 plume migration. Significant effort is required to observe how CO2 behaves in reservoirs. A key question is: will the CO2 injection and storage behave as expected, and can we anticipate leakages? History matching of reservoir models can reduce this uncertainty and move toward a predictive strategy, but it can prove challenging to develop a set of history-matched models that preserve geological realism. A Bayesian evidential learning (BEL) protocol for uncertainty quantification has recently been introduced in the literature as an alternative to model-space inversion in the history matching approach. Accordingly, an ensemble of prior geological models was generated by Monte Carlo simulation from the prior distribution, followed by direct forecasting (DF) for joint uncertainty quantification. The goal of this work is to use the prior models to identify a statistical relationship between the data variables and the prediction variables across the ensemble, without any explicit model inversion. The paper also introduces a new DF implementation using an ensemble smoother and shows that the new implementation makes the computation more robust than the standard method. The Utsira saline aquifer, west of Norway, is used to exemplify BEL's ability to predict the CO2 mass and leakages and to improve decision support for CO2 storage projects.
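The direct-forecasting idea can be sketched as follows. Everything in this example is an assumption (synthetic prior ensemble, arbitrary dimensions, a plain PCA-plus-linear-regression mapping) rather than the paper's implementation, which additionally uses an ensemble smoother; it only shows how a statistical relation between data and prediction variables learned on prior models can yield a forecast without explicit model inversion.

```python
# Illustrative sketch (assumptions, not the paper's code) of direct forecasting in a
# Bayesian evidential learning setting: learn a statistical relation between simulated
# data variables d and prediction variables h on a prior ensemble, then condition on
# observed data without inverting for the geological model itself.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_prior = 200          # prior Monte Carlo ensemble size (hypothetical)
d_dim, h_dim = 60, 30  # e.g., monitoring time series and CO2-mass forecast (hypothetical)

# Hypothetical prior ensemble of simulated data and prediction variables.
latent = rng.normal(size=(n_prior, 5))
D = latent @ rng.normal(size=(5, d_dim)) + 0.1 * rng.normal(size=(n_prior, d_dim))
H = latent @ rng.normal(size=(5, h_dim)) + 0.1 * rng.normal(size=(n_prior, h_dim))

# Dimension reduction of both spaces, then a regression from reduced d to reduced h.
pca_d, pca_h = PCA(n_components=5), PCA(n_components=5)
Dr, Hr = pca_d.fit_transform(D), pca_h.fit_transform(H)
reg = LinearRegression().fit(Dr, Hr)

# Condition on the "observed" data (here one prior member plays that role) and map
# the forecast back to the original prediction space.
d_obs = D[0:1]
h_forecast = pca_h.inverse_transform(reg.predict(pca_d.transform(d_obs)))
print(h_forecast.shape)
```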


2021
Vol 73 (04)
pp. 60-61
Author(s):
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity-analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability.

Introduction
The concept of rate-transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process.
The simulation history-matching work flow presented includes the following steps:
1 - Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules
2 - Run an initial model
3 - Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics
4 - Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection
5 - Make RTA plots of the real and simulated production data (an illustrative construction of such a plot is sketched after this summary)
6 - Use the motifs presented in the paper to identify possible production mechanisms in the real data
7 - Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data
8 - Iterate Steps 5 through 7 to obtain a match in RTA trends
9 - Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions
In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code.

Matching Fracturing Data
The complete paper focuses on matching production data, assisted by RTA, not specifically on the matching of fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are very briefly summarized in this section. Effective fracture toughness is the most important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture lengths and higher net pressures that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection-pressure data while simultaneously limiting fracture length. This scale-dependent toughness parameter is the most important parameter in determining fracture size.
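The RTA diagnostic referenced in Step 5 can be illustrated with a short script. The data below are synthetic, and the plot construction (rate-normalized pressure versus the square root of time) is a generic linear-flow diagnostic, not the specific motifs of SPE 199149.

```python
# Minimal sketch (not from SPE 199149): an RTA plot of rate-normalized pressure drop
# versus square-root time, the kind of diagnostic used to compare real and simulated
# production before adjusting history-matching parameters.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical daily production data: flowing pressure (psi) and oil rate (STB/D).
t_days = np.arange(1, 366, dtype=float)
p_initial = 6000.0
p_wf = 3000.0 + 500.0 * np.exp(-t_days / 120.0)
q_o = 800.0 * t_days ** -0.5

# Rate-normalized pressure and square-root-time axis; a straight line on this plot is
# the classic signature of transient linear flow into the hydraulic fractures.
rnp = (p_initial - p_wf) / q_o
sqrt_t = np.sqrt(t_days)

plt.plot(sqrt_t, rnp, ".", ms=3)
plt.xlabel("sqrt(time) [day^0.5]")
plt.ylabel("(p_i - p_wf) / q_o [psi/(STB/D)]")
plt.title("RTA diagnostic: rate-normalized pressure vs sqrt(t)")
plt.tight_layout()
plt.savefig("rta_diagnostic.png")
```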


2021
pp. 1-29
Author(s):
Eric Sonny Mathew
Moussa Tembely
Waleed AlAmeri
Emad W. Al-Shalabi
Abdul Ravoof Shaik

Two of the most critical properties for multiphase flow in a reservoir are relative permeability (Kr) and capillary pressure (Pc). Determining these parameters requires careful interpretation of coreflooding and centrifuge experiments. In this work, a machine learning (ML) technique was incorporated to assist in determining these parameters quickly and synchronously for steady-state drainage coreflooding experiments. A state-of-the-art framework was developed in which a large database of Kr and Pc curves was generated based on existing mathematical models. This database was used to perform thousands of coreflood simulation runs representing oil/water drainage steady-state experiments. The results obtained from the corefloods, including the pressure drop and water saturation profile, along with other conventional core analysis data, were fed as features into the ML model. The data set was split into 70% for training, 15% for validation, and the remaining 15% for blind testing of the model. The training portion teaches the model to capture the fluid-flow behavior inside the core; the validation portion was used to validate the trained model and to optimize the hyperparameters of the ML algorithm; and the remaining 15% was used to test the model and assess its performance scores. In addition, a K-fold split technique was applied to the 15% testing data set to provide an unbiased estimate of the final model performance. The trained and tested model was then used to estimate Kr and Pc curves from available experimental results. The coefficient of determination (R2) was used to assess the accuracy and efficiency of the developed model. The respective crossplots indicate that the model is capable of making accurate predictions, with an error of less than 2% when history matching experimental data. This implies that the artificial-intelligence- (AI-) based model is capable of determining Kr and Pc curves. The present work could be an alternative approach to existing methods for interpreting Kr and Pc curves. In addition, the ML model can be adapted to produce results that include multiple options for the Kr and Pc curves, from which the best solution can be determined using engineering judgment. This is unlike solutions from some of the existing commercial codes, which usually provide only a single solution. The model currently focuses on the prediction of Kr and Pc curves for drainage steady-state experiments; however, the work can be extended to capture the imbibition cycle as well.
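A minimal sketch of the data-splitting and scoring scheme described above is given below, assuming synthetic features and a generic multi-output regressor in place of the paper's actual ML model and coreflood-derived inputs.

```python
# Sketch under assumptions (synthetic features, generic regressor): the 70/15/15
# train/validation/test split and R^2 scoring described for the Kr/Pc ML model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_runs, n_features, n_targets = 5000, 20, 8  # hypothetical coreflood simulation database

# Hypothetical features (pressure-drop/saturation summaries plus core analysis data)
# and targets (parameters describing the Kr and Pc curves).
X = rng.normal(size=(n_runs, n_features))
W = rng.normal(size=(n_features, n_targets))
y = X @ W + 0.05 * rng.normal(size=(n_runs, n_targets))

# 70% train, 15% validation (hyperparameter tuning), 15% blind test.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.30, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.50, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("validation R2:", r2_score(y_val, model.predict(X_val)))
print("blind-test R2:", r2_score(y_test, model.predict(X_test)))
```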


2021
Author(s):
Ali Al-Turki
Obai Alnajjar
Majdi Baddourah
Babatunde Moriwawon

Abstract Algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to MOGA to perform the field-level history match. Data misfits between the field historical data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or no further improvement is obtained. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce the well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of MOGA were assessed during the field (global) history matching. The effectiveness of the streamline-based inversion was also evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to the historical production and injection data. In general, the workflow results in enhanced history-match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
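The generation-by-generation misfit reduction can be illustrated with a deliberately simplified evolutionary loop. The sketch below is not the authors' MOGA implementation: a quadratic toy function stands in for the reservoir simulator, the low-rank parameter vector stands in for GCT coefficients, and the two misfit objectives are simply summed instead of being handled by Pareto ranking.

```python
# Highly simplified sketch (not the authors' implementation): an evolutionary loop over a
# low-rank parameter vector that reduces a two-part misfit (pressure and rate) generation
# by generation. A quadratic toy misfit replaces the reservoir simulator.
import numpy as np

rng = np.random.default_rng(4)
n_params, pop_size, n_gen = 10, 40, 30
true_params = rng.normal(size=n_params)  # hypothetical "truth" the toy misfit targets

def misfit(theta):
    # Stand-in for pressure and production-rate misfits from a reservoir simulator.
    pressure_misfit = np.sum((theta[:5] - true_params[:5]) ** 2)
    rate_misfit = np.sum((theta[5:] - true_params[5:]) ** 2)
    return pressure_misfit + rate_misfit

population = rng.normal(size=(pop_size, n_params))
for gen in range(n_gen):
    scores = np.array([misfit(ind) for ind in population])
    parents = population[np.argsort(scores)[: pop_size // 2]]   # truncation selection
    children = parents + 0.2 * rng.normal(size=parents.shape)   # Gaussian mutation
    population = np.vstack([parents, children])

best = population[np.argmin([misfit(ind) for ind in population])]
print("final misfit:", misfit(best))
```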

