Global-Search Distributed-Gauss-Newton Optimization Method and Its Integration With the Randomized-Maximum-Likelihood Method for Uncertainty Quantification of Reservoir Performance

SPE Journal ◽  
2018 ◽  
Vol 23 (05) ◽  
pp. 1496-1517 ◽  
Author(s):  
Chaohui Chen ◽  
Guohua Gao ◽  
Ruijian Li ◽  
Richard Cao ◽  
Tianhong Chen ◽  
...  

Summary Although it is possible to apply traditional optimization algorithms together with the randomized-maximum-likelihood (RML) method to generate multiple conditional realizations, the computational cost is high. This paper presents a novel method to enhance the global-search capability of the distributed-Gauss-Newton (DGN) optimization method and integrates it with the RML method to generate multiple realizations conditioned to production data synchronously. RML generates samples from an approximate posterior by minimizing a large ensemble of perturbed objective functions in which the observed data and prior mean values of uncertain model parameters have been perturbed with Gaussian noise. Rather than performing these minimizations in isolation, using large sets of simulations to evaluate the finite-difference approximations of the gradients used to optimize each perturbed realization, we use a concurrent implementation in which simulation results are shared among different minimization tasks whenever those results help a specific task converge toward its global minimum. To improve the sharing of results, we relax the accuracy of the finite-difference gradient approximations by accepting more widely spaced simulation results. To avoid becoming trapped in local optima, a novel method to enhance the global-search capability of the DGN algorithm is developed and integrated seamlessly with the RML formulation. In this way, we can improve the quality of RML conditional realizations that sample the approximate posterior. The proposed work flow is first validated with a toy problem and then applied to a real-field unconventional asset. Numerical results indicate that the new method is significantly more efficient than traditional methods. Hundreds of data-conditioned realizations can be generated in parallel within 20 to 40 iterations. The computational cost (central-processing-unit usage) is reduced significantly compared with the traditional RML approach. The real-field case studies involve a history-matching study to generate history-matched realizations with the proposed method and an uncertainty quantification of production forecasting using those conditioned models. All conditioned models generate production forecasts that are consistent with real production data in both the history-matching period and the blind-test period. Therefore, the new approach can enhance the confidence level of the estimated-ultimate-recovery (EUR) assessment using production-forecasting results generated from all conditional realizations, resulting in significant business impact.
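For reference, the perturbed objective that each RML minimization task solves is standard in the literature; a sketch in my own notation (not copied from the paper) is:

```latex
% j-th RML realization: minimize a perturbed least-squares objective
% m:      vector of uncertain model parameters
% m_{pr,j} ~ N(m_prior, C_M): perturbed prior mean
% d_{obs,j} ~ N(d_obs, C_D):  perturbed observations
% g(m):   simulated production data
O_j(m) = \tfrac{1}{2}\,(m - m_{\mathrm{pr},j})^{\mathsf{T}} C_M^{-1}\,(m - m_{\mathrm{pr},j})
       + \tfrac{1}{2}\,\bigl(g(m) - d_{\mathrm{obs},j}\bigr)^{\mathsf{T}} C_D^{-1}\,\bigl(g(m) - d_{\mathrm{obs},j}\bigr)
```

Each conditional realization is the minimizer of one such O_j, so the concurrent DGN scheme described above amortizes simulation results across all j minimization tasks.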

2005 ◽  
Vol 8 (03) ◽  
pp. 214-223 ◽  
Author(s):  
Fengjun Zhang ◽  
Jan-Arild Skjervheim ◽  
Albert C. Reynolds ◽  
Dean S. Oliver

Summary The Bayesian framework allows one to integrate production and static data into an a posteriori probability density function (pdf) for reservoir variables (model parameters). The problem of generating realizations of the reservoir variables for the assessment of uncertainty in reservoir description or predicted reservoir performance then becomes a problem of sampling this a posteriori pdf to obtain a suite of realizations. Generation of a realization by the randomized-maximum-likelihood method requires the minimization of an objective function that includes production-data misfit terms and a model-misfit term that arises from a prior model constructed from static data. Minimization of this objective function with an optimization algorithm is equivalent to the automatic history matching of production data, with a prior model constructed from static data providing regularization. Because of the computational cost of computing sensitivity coefficients and the need to solve matrix problems involving the covariance matrix for the prior model, this approach has not been applied to problems in which the number of data and the number of reservoir-model parameters are both large and the forward problem is solved by a conventional finite-difference simulator. In this work, we illustrate that these computational-efficiency problems can be overcome by using a scaled limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) algorithm to minimize the objective function and by using approximate computational stencils for the multiplication of a vector by the prior covariance matrix or its inverse. Implementation of the LBFGS method requires only the gradient of the objective function, which can be obtained from a single solution of the adjoint problem; individual sensitivity coefficients are not needed. We apply the overall process to two examples. The first is a true field example in which a realization of log permeabilities at 26,019 gridblocks is generated by the automatic history matching of pressure data, and the second is a pseudo field example that provides a very rough approximation to a North Sea reservoir, in which a realization of log permeabilities at 9,750 gridblocks is computed by the automatic history matching of gas/oil ratio (GOR) and pressure data. Introduction Bayes' theorem provides a general framework for updating a pdf as new data or information on the model becomes available. The Bayesian setting offers a distinct advantage: if one can generate a suite of realizations that represent a correct sampling of the a posteriori pdf, then the suite of samples provides an assessment of the uncertainty in reservoir variables. Moreover, by predicting future reservoir performance under proposed operating conditions for each realization, one can characterize the uncertainty in future performance predictions by constructing statistics for the set of outcomes. Liu and Oliver have recently presented a comparison of methods for sampling the a posteriori pdf. Their results indicate that the randomized-maximum-likelihood method is adequate for evaluating uncertainty with a relatively limited number of samples. In this work, we consider the case in which a prior geostatistical model constructed from static data is available and is represented by a multivariate Gaussian pdf.
Then, the a posteriori pdf conditional to production data is such that calculation of the maximum a posteriori estimate, or generation of a realization by the randomized-maximum-likelihood method, is equivalent to the minimization of an appropriate objective function. History-matching problems of interest to us involve a few thousand to tens of thousands of reservoir variables and a few hundred to a few thousand production data. Thus, an optimization algorithm suitable for large-scale problems is needed. Our belief is that nongradient-based algorithms such as simulated annealing and the genetic algorithm are not competitive with gradient-based algorithms in terms of computational efficiency. Classical gradient-based algorithms such as Gauss-Newton and Levenberg-Marquardt typically converge fairly quickly and have been applied successfully to automatic history matching for both single-phase- and multiphase-flow problems. No multiphase-flow example considered in these papers involved more than 1,500 reservoir variables. For single-phase-flow problems, He et al. and Reynolds et al. have generated realizations of models involving up to 12,500 reservoir variables by automatic history matching of pressure data. However, they used a procedure based on their generalization of the method of Carter et al. to calculate sensitivity coefficients; this method assumes that the partial-differential equation solved by reservoir simulation is linear and thus does not apply to multiphase-flow problems.
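For context, the core of any LBFGS implementation is the two-loop recursion, which applies the implicit inverse-Hessian approximation using only stored parameter and gradient differences; a minimal sketch (not the authors' scaled variant) follows:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the limited-memory inverse-Hessian
    approximation to the gradient, using only stored parameter steps
    s_k = x_{k+1} - x_k and gradient changes y_k = g_{k+1} - g_k."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        q -= a * y
        alphas.append((rho, a))
    # Initial Hessian approximation H0 = gamma * I (standard scaling choice).
    gamma = s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1]) if s_list else 1.0
    r = gamma * q
    for (rho, a), s, y in zip(reversed(alphas), s_list, y_list):  # oldest first
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r   # descent direction; combine with a line search
```

The single adjoint-gradient requirement noted in the abstract is exactly why this recursion is attractive: no Hessian or sensitivity matrix is ever formed.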


SPE Journal ◽  
2010 ◽  
Vol 16 (02) ◽  
pp. 307-317 ◽  
Author(s):  
Yanfen Zhang ◽  
Dean S. Oliver

Summary The increased use of optimization in reservoir management has placed greater demands on the application of history matching to produce models that not only reproduce the historical production behavior but also preserve geological realism and quantify forecast uncertainty. Geological complexity and limited access to the subsurface typically result in a large uncertainty in reservoir properties and forecasts. However, there is a systematic tendency to underestimate such uncertainty, especially when rock properties are modeled using Gaussian random fields. In this paper, we address one important source of uncertainty, the uncertainty in regional trends, by introducing stochastic trend coefficients. The multiscale parameters, including trend coefficients and heterogeneities, can be estimated using the ensemble Kalman filter (EnKF) for history matching. Multiscale heterogeneities are often important, especially in deepwater reservoirs, but are generally poorly represented in history matching. In this paper, we describe a method for representing and updating multiple scales of heterogeneity in the EnKF. We tested our method for updating these variables using production data from a deepwater field whose reservoir model has more than 200,000 unknown parameters. The match of reservoir-simulator forecasts to real field data using a standard application of the EnKF had not been entirely satisfactory because it was difficult to match the water cut of a main producer in the reservoir; none of the realizations of the reservoir exhibited water breakthrough using the standard parameterization method. By adding uncertainty in large-scale trends of reservoir properties, the ability to match the water cut and other production data was improved substantially. The results indicate that an improvement in the generation of the initial ensemble and in the variables describing the property fields gives an improved history match with plausible geology. The multiscale parameterization of property fields reduces the tendency to underestimate uncertainty while still providing reservoir models that match data.
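As background for the update step described above, a minimal ensemble-Kalman-filter analysis in generic notation can be sketched as follows (this sketch is mine and omits the paper's multiscale parameterization):

```python
import numpy as np

def enkf_update(M, D_sim, d_obs, C_D, rng):
    """One EnKF analysis step with perturbed observations.
    M:     (n_params, n_ens) ensemble of model parameters
    D_sim: (n_data, n_ens) simulated data, one column per member
    d_obs: (n_data,) observed data
    C_D:   (n_data, n_data) observation-error covariance"""
    n_ens = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)          # parameter anomalies
    dD = D_sim - D_sim.mean(axis=1, keepdims=True)  # data anomalies
    C_MD = dM @ dD.T / (n_ens - 1)                  # cross-covariance
    C_DD = dD @ dD.T / (n_ens - 1)                  # simulated-data covariance
    K = C_MD @ np.linalg.inv(C_DD + C_D)            # Kalman gain
    # Perturb the observations per member so the ensemble keeps its spread.
    D_obs = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(d_obs.size), C_D, size=n_ens).T
    return M + K @ (D_obs - D_sim)
```

In the paper's setting, the parameter vector in M would stack both the stochastic trend coefficients and the gridblock heterogeneities, so a single gain update adjusts both scales at once.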


2012 ◽  
Vol 52 (2) ◽  
pp. 648
Author(s):  
Bingxiang Xu ◽  
Manouchehr Haghighi ◽  
D Cooke

The Eagle Ford Shale in South Texas is one of the more recent shale plays in the US; its development began in late 2008. To evaluate reservoir performance and forecast production for this reservoir, one multi-stage fractured horizontal well was modelled and history matched against the available 250 days of production data. Two flow models, dual-porosity and multi-porosity, were examined. In the multi-porosity model, both instant and time-dependent sorption were investigated. Two approaches, negative skin and transverse fractures, were used to model the effect of hydraulic fracturing. All models successfully matched the early production data; however, their production forecasts differ. Comparing 10-year production forecasts, the multi-porosity model forecasts 14% more than the dual-porosity model. This is because the dual-porosity model considers only free porosity and assumes no adsorbed gas in micropores, whereas the multi-porosity model keeps both macroporosity and microporosity active in the shale gas reservoir. It is concluded that early production data alone are not reliable for validating the simulation or forecasting production. This is because, early in production, all gas is produced from the fracture system and the matrix contribution is either insignificant or has not yet begun. Furthermore, the effect of matrix subdivision on the simulation was studied: free gas in the matrix contributes to production more quickly as the number of matrix sub-cells increases.
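The adsorbed gas that distinguishes the two models is conventionally described by a Langmuir isotherm; a standard form (my notation, not necessarily the exact sorption model used in the paper) is:

```latex
% Adsorbed gas volume per unit rock mass at pore pressure p:
V(p) = \frac{V_L\, p}{p_L + p}
% V_L: Langmuir volume, the adsorbed volume as p -> infinity
% p_L: Langmuir pressure, the pressure at which V = V_L / 2
```

Roughly speaking, instant sorption assumes the matrix stays in equilibrium with the local pressure on this curve, while time-dependent sorption adds desorption kinetics, which is one reason the models diverge in long-term forecasts.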


2016 ◽  
Vol 19 (04) ◽  
pp. 683-693 ◽  
Author(s):  
Zhaoqi Fan ◽  
Yin Zhang ◽  
Daoyong Yang

Summary In this paper, a modified ensemble randomized-maximum-likelihood (EnRML) algorithm has been developed to estimate three-phase relative permeabilities, with consideration of the hysteresis effect, by reproducing the actual production data. Ensemble-based history matching uses an ensemble of realizations to construct Monte Carlo approximations of the mean and covariance of the model variables, so the gradient information can be acquired from the correlations provided by the ensemble. A power-law model is first used to represent the three-phase relative permeabilities, the coefficients of which are automatically adjusted until the production history is matched. A damping factor is introduced as an adjustment to the step length because a reduced step length is commonly required when an inverse problem is strongly nonlinear. A recursive approach for determining the damping factor has been developed to reduce the number of iterations and the computational load of the EnRML algorithm. Restarting reservoir simulations to reduce their cost is of particular importance for the EnRML algorithm, where iterations are inevitable. By comparing a direct-restart method and an indirect-restart method for numerical simulations, we identify the restart method best suited to a specific problem. Subsequently, we validate the proposed methodology with a synthetic water-alternating-gas (WAG) displacement experiment and then extend it to match laboratory experiments. The proposed technique has proved to efficiently determine the three-phase relative permeabilities for WAG processes with consideration of the hysteresis effect, and the history-matching results improve gradually as more production data are taken into account. The synthetic scenarios demonstrate that the recursive approach saves 33.7% of the computational expense compared with the trial-and-error method when the maximum number of iterations is 14. Also, consistency between the production data and the model variables has been well maintained during the updating processes with the direct-restart method, whereas the indirect-restart method fails to minimize the uncertainties associated with the model variables representing the three-phase relative permeabilities.
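The recursive damping rule itself is not given in the abstract; a common pattern for adapting a step-length damping factor in iterative ensemble methods, sketched here under my own assumptions rather than as the authors' formula, is:

```python
def adapt_damping(beta, obj_new, obj_old,
                  grow=2.0, shrink=0.5,
                  beta_min=1e-3, beta_max=1.0):
    """Adjust the step-length damping factor beta within [beta_min, beta_max].
    If the data mismatch decreased, trust the linearization more and
    enlarge the next step; otherwise shrink the step and retry."""
    if obj_new < obj_old:
        return min(beta * grow, beta_max), True    # accept iteration
    return max(beta * shrink, beta_min), False     # reject, re-damp

# Inside an EnRML loop the damped update would then look like
#   m_next = m + beta * dm_gauss_newton
# with beta refreshed each iteration from the objective value.
```

The point of automating this adjustment, as the abstract notes, is to avoid the trial-and-error tuning of beta that otherwise multiplies the number of simulator runs.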


SPE Journal ◽  
2019 ◽  
Vol 24 (04) ◽  
pp. 1452-1467 ◽  
Author(s):  
Rolf J. Lorentzen ◽  
Xiaodong Luo ◽  
Tuhin Bhakta ◽  
Randi Valestrand

Summary In this paper, we use a combination of acoustic impedance and production data for history matching the full Norne Field. The purpose of the paper is to illustrate a robust and flexible work flow for assisted history matching of large data sets. We apply an iterative ensemble-based smoother, and the traditional approach for assisted history matching is extended to include updates of additional parameters representing rock clay content, which has a significant effect on seismic data. Further, for seismic data it is a challenge to specify the measurement noise properly, because both the noise level and the spatial correlation of the noise are unknown. For this purpose, we apply a method based on image denoising to estimate the spatially correlated (colored) noise level in the data. For the best possible evaluation of the work-flow performance, all data in this study are synthetically generated. We assimilate production data and seismic data sequentially. First, the production data are assimilated using traditional distance-based localization, and the resulting ensemble of reservoir models is then used when assimilating seismic data. This procedure is suitable for real field applications, because production data are usually available before seismic data. If both production data and seismic data were assimilated simultaneously, the large number of seismic data might dominate the overall history-matching performance. The noise estimation for seismic data involves transforming the observations to a discrete wavelet domain. However, the resulting data do not have a clear spatial position, so the traditional distance-based localization schemes used to avoid spurious correlations and underestimated uncertainty (both consequences of a limited ensemble size) cannot be applied. Instead, we use a localization scheme based on correlations between observations and parameters that does not rely on a physical position for model variables or data. This method automatically adapts to each observation and iteration. The results show that we reduce the data mismatch for both production and seismic data, and that the use of seismic data reduces estimation errors for porosity, permeability, and net-to-gross ratio (NTG). Such improvements can provide useful information for reservoir management and the planning of additional drainage strategies.
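A minimal sketch of correlation-based localization in the spirit described, using a plain magnitude threshold rather than the adaptive, iteration-dependent scheme the paper applies (all names are mine), might look like:

```python
import numpy as np

def correlation_mask(dM, dD, threshold=0.3):
    """Localization mask built from ensemble correlations.
    dM: (n_params, n_ens) parameter anomalies (ensemble mean removed)
    dD: (n_data, n_ens) simulated-data anomalies (ensemble mean removed)
    Entries of the parameter-data correlation matrix whose magnitude
    falls below `threshold` are tapered to zero, suppressing spurious
    correlations without any reference to physical position."""
    n_ens = dM.shape[1]
    cov = dM @ dD.T / (n_ens - 1)                     # cross-covariance
    corr = cov / (dM.std(axis=1, ddof=1, keepdims=True)
                  * dD.std(axis=1, ddof=1, keepdims=True).T)
    return (np.abs(corr) >= threshold).astype(float)  # (n_params, n_data)
```

Because the mask depends only on ensemble statistics, it applies equally to wavelet-domain seismic observations that have no spatial coordinates, which is exactly the situation the paper faces.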


2016 ◽  
Vol 2016 ◽  
pp. 1-13 ◽  
Author(s):  
Shiqiang Wang ◽  
Jianchun Xing ◽  
Ziyan Jiang ◽  
Juelong Li

A decentralized control structure is introduced into the heating, ventilation, and air-conditioning (HVAC) system to address the high maintenance and labor costs encountered in actual engineering. Based on this new control structure, a decentralized optimization method is presented for sensor-fault repair and optimal group control of HVAC equipment. The convergence properties of the method are analyzed theoretically for both convex and nonconvex systems with constraints. In this decentralized control system, each traditional device is fitted with a control chip, turning it into a smart device. The smart devices can communicate and operate collaboratively with one another to accomplish designated tasks. The effectiveness of the presented method is verified by simulations and hardware tests.
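The abstract does not give the algorithm itself; purely as an illustration of the decentralized pattern, here each smart device updates its own setpoint from its local cost gradient plus a shared coupling term. All names and the cost model are hypothetical:

```python
def decentralized_step(setpoints, local_costs, coupling, lr=0.1):
    """One round of decentralized coordinate updates.
    setpoints:   list of each device's current control value
    local_costs: list of callables c_i(x_i) -> (cost, gradient)
    coupling:    callable g(setpoints, i) -> gradient of the shared
                 term (e.g., a total-load constraint) w.r.t. device i
    Each device uses only its own cost and values exchanged with its
    neighbors, so no central coordinator is required."""
    new = list(setpoints)
    for i, cost in enumerate(local_costs):
        _, grad_local = cost(new[i])
        new[i] -= lr * (grad_local + coupling(new, i))
    return new
```

Repeating such rounds until the setpoints stop changing is the generic shape of decentralized group control; the paper's contribution is a specific method of this kind with proven convergence under constraints.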


2013 ◽  
Author(s):  
Marko Maucec ◽  
Ajay Pratap Singh ◽  
Gustavo A Carvajal ◽  
Seyed Mohammad Mirzadeh ◽  
Steven Patton Knabe ◽  
...  

2021 ◽  
Author(s):  
Elizabeth Ruiz ◽  
Brandon Thibodeaux ◽  
Christopher Dorion ◽  
Herman Mukisa ◽  
Majid Faskhoodi ◽  
...  

Abstract Optimized geomodeling and history matching of production data are presented by use of an integrated rock-and-fluid workflow. Facies identification is performed with image logs and other geological information. In addition, image logs are used to help define structural geodynamic processes that occurred in the reservoir. Methods of reservoir fluid geodynamics are used to assess the extent of fluid compositional equilibrium, especially of the asphaltenes, and thereby the extent of connectivity between these facies. Geochemical determinations are shown to be consistent with measurements of compositional thermodynamic equilibrium. The ability to develop the geo-scenario of the reservoir, the coherent evolution of rock and contained fluids over geologic time, improves the robustness of the geomodel. In particular, the sequence of oil charge, compositional equilibrium, fault-block throw, and primary biogenic gas charge is established in this middle Pliocene reservoir, with implications for production, field extension, and local basin exploration. History matching of production data proves the accuracy of the geomodel; nevertheless, refinements to the geomodel and improved history matching were obtained by expanded deterministic property estimation from wireline-log and other data. The early connection of fluid data, both thermodynamic and geochemical, with the relevant facies and the determination of their properties enables a more facile method to incorporate these data into the geomodel. Logging data from future wells in the field can be imported into the geomodel, allowing deterministic optimization of this model long after production has commenced. While each reservoir is unique with its own idiosyncrasies, the workflow presented here is generally applicable to all reservoirs and always improves reservoir understanding.
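Asphaltene-equilibrium checks of the kind described are commonly based on the Flory-Huggins-Zuo equation of state; its gravity-only (Boltzmann) term, shown here in my own simplified notation as an illustration rather than the authors' exact formulation, predicts the vertical asphaltene gradient:

```latex
% Ratio of asphaltene volume fractions phi at depths h_1, h_2:
\frac{\phi(h_2)}{\phi(h_1)}
  = \exp\!\left[\frac{v_a\, g\,(\rho_m - \rho_a)\,(h_2 - h_1)}{k_B\, T}\right]
% v_a: asphaltene particle (nanoaggregate or cluster) volume
% rho_a, rho_m: asphaltene and bulk-oil densities
% g: gravitational acceleration; k_B T: thermal energy
```

Measured gradients that follow this equilibrium distribution across fault blocks are the usual evidence for connectivity, while departures from it suggest compartmentalization or ongoing fluid geodynamic processes.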

