Dynamic Modeling of the Gas Discharge of a Mine in the Karaganda Coal Basin Under High Uncertainty Using a Multiple Realization Approach

2021 ◽  
Author(s):  
Asfandiyar Bigeldiyev ◽  
Assem Batu ◽  
Aidynbek Berdibekov ◽  
Dmitry Kovyazin ◽  
Dmitry Sidorov ◽  
...  

Abstract The current work demonstrates the application of a multiple realization approach to produce a strategic development plan for one of the mines in the Karaganda coal basin. The presented workflow uses a comprehensive reservoir simulator for history matching of coal pillars on a detailed 3D grid and applies sensitivity and uncertainty analyses to produce a probabilistic forecast. The suggested workflow differs significantly from the standard approaches previously implemented in the Karaganda Basin. First, a dynamic model has been constructed based on an integrated algorithm of petrophysical interpretation and a full cycle of geological modeling. Secondly, for the first time in the region, dynamic modeling has been performed via a combination of history matching to the observed degassing data and multiple-realization uncertainty analysis. Thirdly, the model parameters with their defined ranges of uncertainty have been incorporated into the forecasting of degassing efficiency in the mine using different well completion technologies. From the hydrodynamic modeling point of view, the coal seam gas (CSG) reservoir is represented as a dual-porosity medium: a coal matrix containing adsorbed gas and a network of natural fractures (cleats) initially saturated with water. This approach has allowed a proper description of the dynamic processes occurring in CSG reservoirs. Gas production from coal is governed by gas diffusion in the coal micropores, the degree of fracture intensity, and the fracture permeability. By tuning these parameters within reasonable ranges, we have been able to history match the model to the observed data. Moreover, the uncertainty analysis has produced a range of output parameters (P10, P50, and P90) bracketing the historically observed values. A full cycle of CSG dynamic modeling, including history matching, sensitivity, and uncertainty analyses, has been performed to create a robust model with predictive power. Based on the obtained results, different optimization technologies have been simulated for fast and efficient degassing through a multiple-realization probabilistic approach. The coal reservoir presented in this work is characterized by very low effective permeability, and the final degassing efficiency depends on the well-reservoir contact surface. Decreasing the well spacing led to a proportional increase in gas recovery, which is very similar to the behavior of unconventional reservoirs. Therefore, vertical and horizontal wells with hydraulic fractures are concluded to be the most efficient way to develop coal seams with low effective permeability in the secondary medium.
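
As a concrete illustration of the multiple-realization probabilistic forecast, the sketch below samples the three history-matching parameters named in the abstract and reads off P10/P50/P90 bands. The parameter ranges and the toy recovery proxy `simulate_gas_rate` are hypothetical placeholders, not the simulator or values used in the study.

```python
# A minimal sketch of a multiple-realization probabilistic forecast.
import numpy as np

rng = np.random.default_rng(42)
n_real = 200  # number of equally probable realizations

# Sample the uncertain dynamic parameters within plausible (assumed) ranges.
diffusion = rng.uniform(1e-7, 1e-5, n_real)          # gas diffusion coefficient, m2/s
frac_perm = rng.lognormal(np.log(0.5), 0.8, n_real)  # fracture permeability, mD
frac_intensity = rng.uniform(0.5, 5.0, n_real)       # fracture intensity, 1/m

def simulate_gas_rate(d, k, i, t):
    """Toy proxy for cumulative gas recovery (fraction) vs. time; placeholder physics."""
    return 1.0 - np.exp(-np.sqrt(d * 1e6 * k * i) * t)

t = np.linspace(0.0, 10.0, 50)  # years
forecasts = np.array([simulate_gas_rate(d, k, i, t)
                      for d, k, i in zip(diffusion, frac_perm, frac_intensity)])

# Percentiles across realizations give the probabilistic forecast band.
# In the oil-and-gas exceedance convention, P10 is the optimistic (high) case,
# i.e. the 90th percentile of the realization ensemble.
p10, p50, p90 = np.percentile(forecasts, [90, 50, 10], axis=0)
```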

1996 ◽  
Vol 33 (2) ◽  
pp. 79-90 ◽  
Author(s):  
Jian Hua Lei ◽  
Wolfgang Schilling

Physically based urban rainfall-runoff models are mostly applied without parameter calibration. Given some preliminary estimates of the uncertainty of the model parameters, the associated model output uncertainty can be calculated. Monte-Carlo simulation followed by multi-linear regression is used for this analysis. The calculated model output uncertainty can be compared to the uncertainty estimated by comparing model output and observed data. Based on this comparison, systematic or spurious errors can be detected in the observation data, the validity of the model structure can be confirmed, and the most sensitive parameters can be identified. If the calculated model output uncertainty is unacceptably large, the most sensitive parameters should be calibrated to reduce the uncertainty. Observation data for which systematic and/or spurious errors have been detected should be discarded from the calibration data. This procedure is referred to as preliminary uncertainty analysis; it is illustrated with an example. The HYSTEM program is applied to predict the runoff volume from an experimental catchment with a total area of 68 ha and an impervious area of 20 ha. Based on the preliminary uncertainty analysis, for 7 of 10 events the measured runoff volume is within the calculated uncertainty range, i.e., less than or equal to the calculated model predictive uncertainty. The remaining 3 events most likely include systematic or spurious errors in the observation data (either in the rainfall or the runoff measurements). These events are therefore discarded from further analysis. After calibrating the model, the predictive uncertainty of the model is estimated.
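
A minimal sketch of the preliminary uncertainty analysis, assuming NumPy: Monte-Carlo sampling of preliminary parameter estimates, a toy runoff model standing in for HYSTEM (not its actual interface), and multi-linear regression to rank sensitivities. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # Monte-Carlo runs

# Preliminary (assumed) estimates of parameter uncertainty.
params = {
    "imperviousness": rng.normal(0.29, 0.03, n),    # fraction of total area
    "depression_storage": rng.normal(1.5, 0.4, n),  # mm
    "runoff_coefficient": rng.normal(0.85, 0.05, n),
}
X = np.column_stack(list(params.values()))

def runoff_volume(x):
    """Toy runoff model: volume for a 20 mm event on a 68 ha catchment (m3)."""
    imperv, storage, coeff = x
    return 680000.0 * imperv * coeff * max(20.0 - storage, 0.0) / 1000.0

y = np.apply_along_axis(runoff_volume, 1, X)

# Multi-linear regression of model output on standardized inputs;
# the standardized coefficients identify the most sensitive parameters.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), y, rcond=None)
print(dict(zip(params, beta[1:])))                  # sensitivity ranking
print("calculated model output uncertainty (std, m3):", y.std())
```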


Energies ◽  
2020 ◽  
Vol 13 (17) ◽  
pp. 4290
Author(s):  
Dongmei Zhang ◽  
Yuyang Zhang ◽  
Bohou Jiang ◽  
Xinwei Jiang ◽  
Zhijiang Kang

Reservoir history matching is a well-known inverse problem for production prediction, in which numerous uncertain parameters of a reservoir numerical model are optimized by minimizing the misfit between simulated and historical production data. The Gaussian Process (GP) has shown promising performance for assisted history matching because it is an efficient nonparametric and nonlinear model with few parameters to be tuned automatically. The recently introduced approach combining Gaussian Process proxy models with Variogram Analysis of Response Surface-based sensitivity analysis (GP-VARS) uses forward and inverse GP-based proxy models with VARS-based sensitivity analysis to optimize the high-dimensional reservoir parameters. However, the inverse GP solution (GPIS) in GP-VARS is unsatisfactory, especially for numerous reservoir parameters, because the mapping from low-dimensional misfits to high-dimensional uncertain reservoir parameters can be poorly modeled by a GP. To improve the performance of GP-VARS, in this paper we propose Gaussian Process proxy models with Latent Variable Models and VARS-based sensitivity analysis (GPLVM-VARS), in which a Gaussian Process Latent Variable Model (GPLVM)-based inverse solution (GPLVMIS), with the inputs and outputs of GPIS reversed, replaces the GP-based GPIS. The experimental results demonstrate the effectiveness of the proposed GPLVM-VARS in terms of accuracy and complexity. The source code of the proposed GPLVM-VARS is available at https://github.com/XinweiJiang/GPLVM-VARS.
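
The forward/inverse proxy idea can be sketched with scikit-learn's GP regressor. Note that this uses a plain GP for the inverse map, which is exactly the step the paper replaces with a GPLVM, so it only illustrates the setup, not the GPLVM-VARS method itself; see the linked repository for the actual implementation.

```python
# Sketch only: plain GP regression stands in for both proxies.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
n, d = 100, 20                                # training runs, uncertain parameters
theta = rng.uniform(0.0, 1.0, (n, d))         # reservoir parameters (toy)
misfit = ((theta - 0.3) ** 2).sum(axis=1, keepdims=True)  # toy data misfit

# Forward proxy: parameters -> misfit.
gp_fwd = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(theta, misfit)

# Inverse proxy: misfit -> parameters, i.e. inputs and outputs reversed.
# Mapping a 1-D misfit to 20 parameters is exactly where a plain GP struggles,
# which is what motivates the latent-variable (GPLVM) formulation in the paper.
gp_inv = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(misfit, theta)
theta_candidate = gp_inv.predict(np.array([[0.0]]))  # parameters near zero misfit
est_misfit = gp_fwd.predict(theta_candidate)         # sanity-check the candidate
```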


Author(s):  
Geir Evensen

Abstract It is common to formulate the history-matching problem using Bayes' theorem. From Bayes' theorem, the conditional probability density function (pdf) of the uncertain model parameters is proportional to the prior pdf of the model parameters multiplied by the likelihood of the measurements. The static model parameters are random variables characterizing the reservoir model, while the observations include, e.g., historical rates of oil, gas, and water produced from the wells. The reservoir prediction model is assumed perfect, and there are no errors besides those in the static parameters. However, this formulation is flawed. The historical rate data only approximately represent the real production of the reservoir and contain errors. History-matching methods usually take these errors into account in the conditioning but neglect them when forcing the simulation model by the observed rates during the historical integration. Thus, the model prediction depends on some of the same data used in the conditioning. The paper presents a formulation of Bayes' theorem that considers the data dependency of the simulation model. In the new formulation, one must update both the poorly known model parameters and the rate-data errors. The result is an improved posterior ensemble of prediction models that better covers the observations with more substantial and realistic uncertainty. The implementation accounts correctly for correlated measurement errors and demonstrates the critical role of these correlations in reducing the magnitude of the update. The paper also shows the consistency of the subspace inversion scheme by Evensen (Ocean Dyn. 54, 539–560, 2004) in the case with correlated measurement errors and demonstrates its accuracy when using a "larger" ensemble of perturbations to represent the measurement error covariance matrix.
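
To make the contrast concrete, the two formulations can be sketched as below. The notation (x for the static model parameters, e for the rate-data errors, d for the observed rates) is chosen here for illustration and need not match the paper's equations.

```latex
% Standard formulation: the posterior of the parameters alone; the simulator
% is forced by the observed rates as if they were exact.
f(x \mid d) \propto f(d \mid x)\, f(x)

% Revised formulation (sketch): the rate-data errors e are updated jointly
% with the parameters, so the errors in the forcing data enter the posterior.
f(x, e \mid d) \propto f(d \mid x, e)\, f(x)\, f(e)
```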


2021 ◽  
Author(s):  
Guohua Gao ◽  
Jeroen Vink ◽  
Fredrik Saaf ◽  
Terence Wells

Abstract When formulating history matching within the Bayesian framework, we may quantify the uncertainty of model parameters and production forecasts using conditional realizations sampled from the posterior probability density function (PDF). Sampling such a posterior PDF is quite challenging: some methods, e.g., Markov chain Monte Carlo (MCMC), are very expensive, while others are cheaper but may generate biased samples. In this paper, we propose an unconstrained Gaussian Mixture Model (GMM) fitting method to approximate the posterior PDF and investigate new strategies to further enhance its performance. To reduce the CPU time of handling bound constraints, we reformulate the GMM fitting problem such that an unconstrained optimization algorithm can be applied to find the optimal solution for the unknown GMM parameters. To obtain a sufficiently accurate GMM approximation with the smallest number of Gaussian components, we generate random initial guesses, remove components with very small or very large mixture weights after each GMM fitting iteration, and prevent their reappearance using a dedicated filter. To prevent overfitting, we add a new Gaussian component only if the quality of the GMM approximation on a (large) set of blind-test data improves sufficiently. The unconstrained GMM fitting method with the new strategies proposed in this paper is validated using nonlinear toy problems and then applied to a synthetic history matching example. It can construct a GMM approximation of the posterior PDF that is comparable to the MCMC method, and it is significantly more efficient than the constrained GMM fitting formulation, e.g., reducing the CPU time by a factor of 800 to 7300 for the problems we tested, which makes it quite attractive for large-scale history matching problems.
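
A minimal sketch of the core idea, assuming scikit-learn: fit GMMs of increasing size to (stand-in) posterior samples and only grow the mixture when blind-test likelihood improves. The paper's unconstrained fitting formulation and component filter are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Stand-in posterior samples (e.g., from a pilot run): bimodal in 2-D.
samples = np.vstack([rng.normal([-2.0, 0.0], 0.5, (500, 2)),
                     rng.normal([2.0, 1.0], 0.8, (500, 2))])
rng.shuffle(samples)
train, blind = samples[:800], samples[800:]

best, best_ll = None, -np.inf
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(train)
    ll = gmm.score(blind)            # held-out log-likelihood guards overfitting
    if ll > best_ll + 1e-3:          # only grow k if blind-test quality improves
        best, best_ll = gmm, ll

# The fitted GMM is cheap to sample, e.g. for probabilistic forecasting.
conditional_realizations = best.sample(1000)[0]
```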


2021 ◽  
Author(s):  
Boxiao Li ◽  
Hemant Phale ◽  
Yanfen Zhang ◽  
Timothy Tokar ◽  
Xian-Huan Wen

Abstract Design of Experiments (DoE) is one of the most commonly employed techniques in the petroleum industry for Assisted History Matching (AHM) and uncertainty analysis of reservoir production forecasts. Although conceptually straightforward, DoE is often misused by practitioners because many of its statistical and modeling principles are not carefully followed. Our earlier paper (Li et al. 2019) detailed the best practices in DoE-based AHM for brownfields. However, to the best of our knowledge, there is a lack of studies that summarize the common caveats and pitfalls in DoE-based production forecast uncertainty analysis for greenfields and history-matched brownfields. Our objective here is to summarize these caveats and pitfalls to help practitioners apply the correct principles for DoE-based production forecast uncertainty analysis. Over 60 common pitfalls in all stages of a DoE workflow are summarized. Special attention is paid to the following critical project transitions: (1) from static earth modeling to dynamic reservoir simulation; (2) from AHM to production forecast; and (3) from analyzing subsurface uncertainties to analyzing field-development alternatives. Most pitfalls can be avoided by consistently following the statistical and modeling principles. Some pitfalls, however, can trap even experienced engineers. For example, mistakes made in handling the three abovementioned transitions can yield strongly unreliable proxies and sensitivity analyses: for the representative examples we study, they can lead to a proxy R2 of less than 0.2, versus larger than 0.9 if handled correctly. Two improved experimental designs are created to resolve this challenge. Besides the technical pitfalls that are avoidable via robust statistical workflows, we also highlight the often more severe non-technical pitfalls that cannot be evaluated by measures like R2. Thoughts are shared on how they can be avoided, especially during project framing and the three critical transitions.
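
One recurring pitfall named above, trusting a proxy's training fit, can be illustrated with a short sketch: build a quadratic proxy on a Latin hypercube design and score it on blind simulation runs. The design size, response function, and proxy type are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from scipy.stats import qmc

rng = np.random.default_rng(3)
d = 5                                                    # uncertain subsurface factors
design = qmc.LatinHypercube(d=d, seed=3).random(n=60)    # space-filling design

def simulator(x):
    """Toy cumulative-production response with an interaction term."""
    return 10.0 + 4.0 * x[0] + 2.0 * x[1] * x[2] + rng.normal(0.0, 0.1)

y = np.array([simulator(x) for x in design])

# Fit a quadratic proxy on the design runs.
poly = PolynomialFeatures(degree=2)
proxy = LinearRegression().fit(poly.fit_transform(design), y)

# Score on blind runs: a low blind-test R2 flags an unreliable proxy even
# when the training fit looks excellent.
blind = qmc.LatinHypercube(d=d, seed=99).random(n=30)
y_blind = np.array([simulator(x) for x in blind])
r2 = r2_score(y_blind, proxy.predict(poly.transform(blind)))
print(f"blind-test R2 = {r2:.2f}")
```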


2019 ◽  
Vol 23 (6) ◽  
pp. 1331-1347 ◽  
Author(s):  
Miguel Alfonzo ◽  
Dean S. Oliver

Abstract It is common in ensemble-based methods of history matching to evaluate the adequacy of the initial ensemble of models through visual comparison between actual observations and data predictions prior to data assimilation. If the model is appropriate, then the observed data should look plausible when compared to the distribution of realizations of simulated data. The principle of data coverage alone is, however, not an effective method for model criticism, as coverage can often be obtained by increasing the variability in a single model parameter. In this paper, we propose a methodology for determining the suitability of a model before data assimilation, aimed particularly at real cases with large numbers of model parameters, large amounts of data, and correlated observation errors. This model diagnostic is based on an approximation of the Mahalanobis distance between the observations and the ensemble of predictions in high-dimensional spaces. We applied our methodology to two different examples: a Gaussian example, which shows that our shrinkage estimate of the covariance matrix is a better discriminator of outliers than the pseudo-inverse and a diagonal approximation of this matrix; and an example using data from the Norne field. In this second test, we used actual production, repeat-formation-tester, and inverted seismic data to evaluate the suitability of the initial reservoir simulation model and seismic model. Despite the good data coverage, our model diagnostic suggested that model improvement was necessary. After modifying the model, it was validated against the observations and is now ready for history matching to production and seismic data. This shows that the proposed methodology for evaluating the adequacy of the model is suitable for large realistic problems.
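
A minimal sketch of such a diagnostic, assuming scikit-learn's Ledoit-Wolf estimator as a concrete shrinkage choice (the paper develops its own shrinkage estimate): compute a Mahalanobis distance between the observations and the ensemble of simulated data.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(5)
n_ens, n_data = 100, 500                       # ensemble size << number of data
D = rng.normal(0.0, 1.0, (n_ens, n_data))      # ensemble of predicted data (toy)
d_obs = rng.normal(0.0, 1.0, n_data)           # observations (toy)

# Shrinkage covariance of the predictions; in practice the observation-error
# covariance should be added to this before inversion.
lw = LedoitWolf().fit(D)

diff = d_obs - D.mean(axis=0)
# Squared Mahalanobis distance; the shrinkage estimate keeps the inverse
# well-behaved even when n_data greatly exceeds the ensemble size.
m2 = diff @ np.linalg.solve(lw.covariance_, diff)
print("squared Mahalanobis distance:", m2)
# Comparing m2 against its expected scale (~ n_data for an adequate model)
# flags an initial ensemble that needs improvement before data assimilation.
```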


SPE Journal ◽  
2020 ◽  
Vol 25 (02) ◽  
pp. 951-968 ◽  
Author(s):  
Minjie Lu ◽  
Yan Chen

Summary Owing to the complex nature of hydrocarbon reservoirs, the numerical model constructed by geoscientists is always a simplified version of reality: for example, it might lack resolution from discretization and lack accuracy in modeling some physical processes. This flaw in the model, which causes mismatch between actual observations and simulated data when "perfect" model parameters are used as model inputs, is known as "model error". Even in a situation when the model is a perfect representation of reality, the inputs to the model are never completely known. During a typical model calibration procedure, only a subset of model inputs is adjusted to improve the agreement between model responses and historical data. The remaining model inputs that are not calibrated and are likely fixed at incorrect values result in model error in a similar manner to the imperfect-model scenario. Assimilation of data without accounting for model error can result in incorrect adjustment of model parameters, underestimation of prediction uncertainties, and bias in forecasts. In this paper, we investigate the benefit of recognizing and accounting for model error when an iterative ensemble smoother is used to assimilate production data. The correlated "total error" (a combination of model error and observation error) is estimated from the data residual after a standard history match using the Levenberg-Marquardt form of the iterative ensemble smoother (LM-EnRML). This total error is then used in further data assimilations to improve the estimation of model parameters and the quantification of prediction uncertainty. We first illustrate the method using a synthetic 2D five-spot example, where some model errors are deliberately introduced, and the results are closely examined against the known "true" model. Then, the Norne field case is used to further evaluate the method. The Norne model has previously been history-matched using the LM-EnRML (Chen and Oliver 2014), where cell-by-cell properties (permeability, porosity, net-to-gross, vertical transmissibility) and parameters related to fault transmissibility, depths of water/oil contacts, and the relative permeability function are adjusted to honor historical data. In this previous study, the authors highlighted the importance of including large numbers of model parameters, the proper use of localization, and heuristic adjustment of data noise to account for modeling error. In this paper, we improve the last aspect by quantitatively estimating model error using residual analysis.
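
The residual-based estimation of a correlated total error can be sketched as follows; the ensemble and residuals are synthetic, and the snippet is not the LM-EnRML implementation.

```python
import numpy as np

rng = np.random.default_rng(11)
n_ens, n_data = 100, 200
# Residuals (observed minus simulated) for each posterior ensemble member
# after a standard history match with a diagonal observation-error covariance.
# The shared term mimics a systematic, correlated model error.
residuals = rng.normal(0.0, 1.0, (n_ens, n_data)) + 0.5 * rng.normal(size=n_data)

mean_res = residuals.mean(axis=0)        # systematic part of the total error
A = residuals - mean_res
C_total = A.T @ A / (n_ens - 1)          # correlated total-error covariance
C_total += 1e-6 * np.eye(n_data)         # light regularization for invertibility

# In the next assimilation pass, C_total replaces the diagonal observation-error
# covariance in the Kalman-gain-like update, K = C_md @ inv(C_dd + C_total),
# damping parameter updates along the correlated error directions.
```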


2015 ◽  
Vol 18 (04) ◽  
pp. 481-494 ◽  
Author(s):  
Siavash Nejadi ◽  
Juliana Y. Leung ◽  
Japan J. Trivedi ◽  
Claudio Virues

Summary Advancements in horizontal-well drilling and multistage hydraulic fracturing have enabled economically viable gas production from tight formations. Reservoir-simulation models play an important role in production forecasting and field-development planning. To enhance their predictive capabilities and to capture the uncertainties in model parameters, one should calibrate stochastic reservoir models to both geologic and flow observations. In this paper, a novel approach to characterization and history matching of hydrocarbon production from a hydraulically fractured shale is presented. This new methodology includes generating multiple discrete-fracture-network (DFN) models, upscaling the models for numerical multiphase-flow simulation, and updating the DFN-model parameters with dynamic-flow responses. First, measurements from hydraulic-fracture treatment, petrophysical interpretation, and in-situ stress data are used to estimate the initial probability distribution of hydraulic-fracture and induced-microfracture parameters, and multiple initial DFN models are generated. Next, the DFN models are upscaled into an equivalent continuum dual-porosity model with analytical techniques. The upscaled models are subjected to flow simulation, and their production performances are compared with the actual responses. Finally, an assisted-history-matching algorithm is implemented to assess the uncertainties of the DFN-model parameters. Hydraulic-fracture parameters including half-length and transmissivity are updated, and the length, transmissivity, intensity, and spatial distribution of the induced fractures are also estimated. The proposed methodology is applied to facilitate characterization of fracture parameters of a multifractured shale-gas well in the Horn River basin. Fracture parameters and stimulated reservoir volume (SRV) derived from the updated DFN models are in agreement with estimates from microseismic interpretation and rate-transient analysis. The key advantage of this integrated assisted-history-matching approach is that uncertainties in fracture parameters are represented by the multiple equally probable DFN models and their upscaled flow-simulation models, which honor the hard data and match the dynamic production history. This work highlights the significance of uncertainties in SRV and hydraulic-fracture parameters. It also provides insight into the value of microseismic data when integrated into a rigorous production-history-matching workflow.
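
A minimal sketch of the multiple-DFN-model idea, assuming a cubic-law upscaling of a single parallel fracture set and a toy rate proxy in place of the flow simulator and the paper's analytical upscaling: sample fracture parameters from priors, upscale, and retain the realizations that match history.

```python
import numpy as np

rng = np.random.default_rng(13)
n_models = 500

# Priors from fracture-treatment data and petrophysics (ranges assumed here).
half_length = rng.uniform(50.0, 200.0, n_models)   # m
intensity = rng.uniform(0.05, 0.5, n_models)       # fractures per metre
aperture = rng.uniform(1e-4, 5e-4, n_models)       # m

# Cubic-law upscaling of one parallel fracture set to an equivalent
# continuum permeability (m2) for the dual-porosity flow model.
k_frac = intensity * aperture**3 / 12.0

def proxy_rate(xf, k):
    """Toy early-time gas-rate proxy (placeholder for the flow simulator)."""
    return xf * np.sqrt(k) * 1e7

observed = 30.0                                    # toy observed rate
mismatch = np.abs(proxy_rate(half_length, k_frac) - observed)

# Keep the best-matching realizations as equally probable history-matched DFNs;
# their spread expresses the remaining uncertainty in fracture parameters.
accepted = np.argsort(mismatch)[:50]
p90, p50, p10 = np.percentile(half_length[accepted], [10, 50, 90])
print("half-length P90/P50/P10 (m):", p90, p50, p10)
```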

