Using Gradients To Refine Parameterization in Field-Case History-Matching Projects

2007 ◽  
Vol 10 (03) ◽  
pp. 233-240 ◽  
Author(s):  
Alberto Cominelli ◽  
Fabrizio Ferdinandi ◽  
Pierre Claude de Montleau ◽  
Roberto Rossi

Summary Reservoir management is based on the prediction of reservoir performance by means of numerical-simulation models. Reliable predictions require that the numerical model mimic the production history. Therefore, the numerical model is modified to match the production data. This process is termed history matching (HM). From a mathematical viewpoint, HM is an optimization problem, where the target is to minimize an objective function quantifying the misfit between observed and simulated production data. One of the main problems in HM is the choice of an effective parameterization—a set of reservoir properties that can be plausibly altered to get a history-matched model. This issue is known as a parameter-identification problem, and its solution usually represents a significant step in HM projects. In this paper, we propose a practical implementation of a multiscale approach aimed at identifying effective parameterizations in real-life HM problems. The approach requires the availability of gradient simulators capable of providing the user with derivatives of the objective function with respect to the parameters at hand. Objective-function derivatives can then be used in a multiscale setting to define a sequence of richer and richer parameterizations. At each step of the sequence, the matching of the production data is improved by means of a gradient-based optimization. The methodology was validated on a synthetic case and was applied to history match the simulation model of a North Sea oil reservoir. The proposed methodology can be considered a practical solution for parameter-identification problems in many real cases until sound methodologies (primarily adaptive multiscale estimation of parameters) become available in commercial software programs. Introduction Predictions of reservoir behavior require the definition of subsurface properties at the scale of the simulation grid cells.
At this scale, a reliable description of the porous media requires us to build a reservoir model by integrating all the available sources of data. By their nature, we can categorize the data as prior and production data. Prior data can be seen as "direct" measures or representations of the reservoir properties. Production data include flow measures collected at wells [e.g., water cut, gas/oil ratio (GOR) and shut-in pressure, and time-lapse seismic data]. Prior data are directly incorporated in the setup of the reservoir model, typically in the framework of well-established reservoir-characterization workflows.
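The gradient-driven refinement described in this first abstract can be sketched in a few lines: given objective-function derivatives from a gradient simulator, candidate parameters are ranked by gradient magnitude to decide which one to add to the parameterization next. This is a minimal sketch, not the authors' implementation; the parameter names and gradient values below are hypothetical.

```python
import numpy as np

def select_refinement(candidate_params, grad):
    """Rank candidate parameters by the magnitude of the objective-function
    gradient: the parameter whose perturbation most changes the misfit is
    the most promising refinement of the current parameterization."""
    order = np.argsort(-np.abs(grad))  # descending |dJ/dtheta|
    return [candidate_params[i] for i in order]

# Hypothetical candidates: multipliers on regional permeabilities,
# with dJ/dtheta values as a gradient simulator might report them.
params = ["perm_region_A", "perm_region_B", "perm_region_C"]
grads = np.array([0.2, -1.5, 0.7])
ranked = select_refinement(params, grads)
```

Here `ranked[0]` is `"perm_region_B"`, the parameter with the largest gradient magnitude, so it would be the first candidate to enrich the parameterization.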

2001 ◽  
Vol 41 (1) ◽  
pp. 679
Author(s):  
S. Reymond ◽  
E. Matthews ◽  
B. Sissons

This case study illustrates how 3D generalised inversion of seismic facies for reservoir parameters can be successfully applied to image and laterally predict reservoir parameters in laterally discontinuous turbiditic depositional environments where hydrocarbon pools are located in complex combined stratigraphic-structural traps. Such conditions mean that structural mapping is inadequate to define traps and to estimate reserves in place. Conventional seismic amplitude analysis has been used to aid definition but was not sufficient to guarantee presence of economic hydrocarbons in potential reservoir pools. The Ngatoro Field in Taranaki, New Zealand has been producing for nine years. Currently the field is producing 1,000 bopd from seven wells at three surface locations, down from a peak of over 1,500 bopd. The field has been analysed using new techniques in 3D seismic imaging to locate bypassed oil and identify undrained pools. To define the objectives of the study, three questions were asked: (1) Can we image reservoir pools in a complex stratigraphic and structural environment where conventional grid-based interpretation is not applicable due to lack of lateral continuity in reservoir properties? (2) Can we distinguish fluids within each reservoir pool? (3) Can we extrapolate reservoir parameters observed at drilled locations to the entire field using 3D seismic data to build a 3D reservoir model? The use of new 3D seismic attributes such as bright-spot indicators, attenuation, and edge-enhancing volumes, coupled with six AVO (Amplitude Versus Offset) volumes integrated into a single class cube of reservoir properties, made the mapping of reservoir pools possible over the entire data set. In addition, four fluid types, as observed in more than 20 reservoir pools, were validated by final inverted results to allow lateral prediction of fluid contents in undrilled reservoir targets.
Well production data and 3D seismic inverted volume were later integrated to build a 3D reservoir model to support updated volumetrics reserves computation and to define additional targets for exploration drilling, additional well planning and to define a water injection plan for pools already in production.


SPE Journal ◽  
2006 ◽  
Vol 11 (04) ◽  
pp. 464-479 ◽  
Author(s):  
B. Todd Hoffman ◽  
Jef K. Caers ◽  
Xian-Huan Wen ◽  
Sebastien B. Strebelle

Summary This paper presents an innovative methodology to integrate prior geologic information, well-log data, seismic data, and production data into a consistent 3D reservoir model. Furthermore, the method is applied to a real channel reservoir from the African coast. The methodology relies on the probability-perturbation method (PPM). Perturbing probabilities rather than actual petrophysical properties guarantees that the conceptual geologic model is maintained and that any history-matching-related artifacts are avoided. Reservoir models that match all types of data are likely to have more predictive power than those built by methods in which some data are not honored. The first part of the paper reviews the details of the PPM, and the next part of this paper describes the additional work that is required to history-match real reservoirs using this method. Then, a geological description of the reservoir case study is provided, and the procedure to build 3D reservoir models that are only conditioned to the static data is covered. Because of the character of the field, the channels are modeled with a multiple-point geostatistical method. The channel locations are perturbed in a manner such that the oil, water, and gas rates from the reservoir more accurately match the rates observed in the field. Two different geologic scenarios are used, and multiple history-matched models are generated for each scenario. The reservoir has been producing for approximately 5 years, but the models are matched only to the first 3 years of production. Afterward, to check predictive power, the matched models are run for the last 1½ years, and the results compare favorably with the field data. Introduction Reservoir models are constructed to better understand reservoir behavior and to better predict reservoir response. Economic decisions are often based on the predictions from reservoir models; therefore, such predictions need to be as accurate as possible.
To achieve this goal, the reservoir model should honor all sources of data, including well-log, seismic, geologic information, and dynamic (production rate and pressure) data. Incorporating dynamic data into the reservoir model is generally known as history matching. History matching is difficult because it poses a nonlinear inverse problem in the sense that the relationship between the reservoir model parameters and the dynamic data is highly nonlinear and multiple solutions are available. Therefore, history matching is often done with a trial-and-error method. In real-world applications of history matching, reservoir engineers manually modify an initial model provided by geoscientists until the production data are matched. The initial model is built based on geological and seismic data. While attempts are usually made to honor these other data as much as possible, often the history-matched models are unrealistic from a geological (and geophysical) point of view. For example, permeability is often altered to increase or decrease flow in areas where a mismatch is observed; however, the permeability alterations usually come in the form of box-shaped or pipe-shaped geometries centered around wells or between wells and tend to be devoid of any geological considerations. The primary focus lies in obtaining a history match.


1999 ◽  
Vol 2 (05) ◽  
pp. 470-477 ◽  
Author(s):  
Daniel Rahon ◽  
Paul Francis Edoa ◽  
Mohamed Masmoudi

Summary This paper discusses a method which helps identify the geometry of geological features in an oil reservoir by history matching of production data. Following an initial study on single-phase flow and applied to well tests (Rahon, D., Edoa, P. F., and Masmoudi, M.: "Inversion of Geological Shapes in Reservoir Engineering Using Well Tests and History Matching of Production Data," paper SPE 38656 presented at the 1997 SPE Annual Technical Conference and Exhibition, San Antonio, Texas, 5–8 October.), the research presented here was conducted in a multiphase flow context. This method provides information on the limits of a reservoir being explored, the position and size of faults, and the thickness and dimensions of channels. The approach consists in matching numerical flow simulation results with production measurements. This is achieved by modifying the geometry of the geological model. The identification of geometric parameters is based on the solution of an inverse problem and boils down to minimizing an objective function integrating the production data. The minimization algorithm is rendered very efficient by calculating the gradients of the objective function with respect to perturbations of these geometric parameters. This leads to a better characterization of the shape, the dimension, and the position of sedimentary bodies. Several examples are presented in this paper, in particular, an application of the method in a two-phase water/oil case. Introduction A number of semiautomatic history matching techniques have been developed in recent years to assist the reservoir engineer in his reservoir characterization task. These techniques are generally based on the resolution of an inverse problem by the minimization of an objective function and require the use of a numerical simulator. 
The matching parameters of the inverse problem comprise two types of properties: petrophysical (porosity and permeability) and geometric (position, shape, and size of the sedimentary bodies present in the reservoir). To be efficient, minimization algorithms require the calculation of simulated production gradients with respect to matching parameters. Such gradients are usually calculated by deriving discrete state equations solved in the numerical simulator1–5 or by using a so-called adjoint-state method.6,7 Therefore, most of these gradient-based methods only allow the identification of petrophysical parameters which appear explicitly in the discrete equations of state. The case of geometric parameters is much more complex, as the gradients of the objective function with respect to these parameters cannot be determined directly from the flow equation. Recent works8–10 have handled this problem by defining geological objects using mathematical functions to describe porosity or permeability fields. But, generalizing these solutions to complex geological models remains difficult. The method proposed in this paper is well suited to complex geometries and heterogeneous environments. The history matching parameters are the geometric elements that describe the geological objects generated, for example, with a geomodeling tool. A complete description of the method with the calculation of the sensitivities was presented in Ref. 11, within the particular framework of single-phase flow adapted to well-test interpretations. In this paper we will introduce an extension of the method to multiphase equations in order to match production data. Several examples are presented, illustrating the efficiency of this technique in a two-phase context. Description of the Method The objective is to develop an automatic or semiautomatic history matching method which allows identification of geometric parameters that describe geological shapes using a numerical simulator.
To be efficient, the optimization process requires the calculation of objective-function gradients with respect to the parameters. With usual fluid-flow simulators using a regular grid or corner-point geometry, the conventional methods for calculating well-response gradients on discrete equations are not readily usable when dealing with geometric parameters. These geometric parameters do not appear explicitly in the model equations. With these kinds of structured models the solution is to determine the expression of the sensitivities of the objective function in the continuous problem using mathematical theory and then to calculate a discrete set of gradients. Sensitivity Calculation. Here, we present a sensitivity calculation to the displacement of a geological body in a two-phase water/oil flow context. State Equations. Let Ω be a two- or three-dimensional spatial domain with boundary Γ, and let ]0,T[ be the time interval covering the pressure history. We assume that the capillary pressure is negligible. The pressure p and the water saturation S corresponding to a two-phase flow in the domain Ω are governed by the following equations:

$$\frac{\partial\,[\phi(p)(1-S)]}{\partial t}-\nabla\cdot\!\left(\frac{k\,k_{ro}(S)}{\mu_o}\,\nabla(p+\rho_o g z)\right)=\frac{q_o}{\rho_o},$$
$$\frac{\partial\,[\phi(p)S]}{\partial t}-\nabla\cdot\!\left(\frac{k\,k_{rw}(S)}{\mu_w}\,\nabla(p+\rho_w g z)\right)=\frac{q_w}{\rho_w},\qquad (x,y,z)\in\Omega,\;t\in\,]0,T[,\qquad(1)$$

with a no-flux boundary condition on Γ and an initial equilibrium condition
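As a toy illustration of the gradient machinery this kind of method relies on (not the authors' continuous sensitivity calculation for geometric parameters), a least-squares misfit and a central finite-difference gradient can be checked against the analytic gradient of a hypothetical linear stand-in "simulator":

```python
import numpy as np

def objective(theta, d_obs, forward):
    """Least-squares misfit between simulated and observed data."""
    r = forward(theta) - d_obs
    return 0.5 * float(r @ r)

def fd_gradient(theta, d_obs, forward, h=1e-6):
    """Central finite-difference gradient of the objective, a brute-force
    fallback when analytic sensitivities (e.g., with respect to geometric
    parameters) are hard to derive from the discrete flow equations."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += h
        tm[i] -= h
        g[i] = (objective(tp, d_obs, forward) - objective(tm, d_obs, forward)) / (2 * h)
    return g

# Hypothetical linear "simulator" standing in for the flow model.
A = np.array([[1.0, 2.0], [0.5, 1.0], [2.0, 0.0]])
fwd = lambda t: A @ t
theta = np.array([1.0, 1.0])
d = np.array([3.5, 1.0, 2.5])
g_fd = fd_gradient(theta, d, fwd)
g_an = A.T @ (fwd(theta) - d)  # analytic gradient, valid for the linear case
```

For the real problem the finite-difference route is too expensive (one simulation pair per parameter), which is exactly why the paper derives sensitivities in the continuous problem instead.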


Author(s):  
M. Syafwan

This paper presents a fit-for-purpose approach to mitigate zonal production data allocation uncertainty during history matching of a reservoir simulation model due to limited production logging data. To avoid propagating perforation/production zone allocation uncertainty at commingled wells into the history matched reservoir model, only well-level production data from historical periods when production was from a single zone were used to calibrate reservoir properties that determine initial volumetrics. Then, during periods of the history with commingled production, average reservoir pressure measurements were integrated into the model to allocate fluid production to the target reservoir. Last, the periods constrained by dedicated well-level fluid production and average reservoir pressure were merged over the forty-eight-year history to construct a single history matched reservoir model in preparation for waterflood performance forecasting. This innovative history matching approach, which mitigates the impacts of production allocation uncertainty by using different intervals of the historical data to calibrate model saturations and model pressures, has provided a new interpretation of OOIP and current recovery factor, as well as drive mechanisms including aquifer strength and capillary pressure. Fluid allocation from the target reservoir in the history matched model is 85% lower than previously estimated. The history matched model was used as a quantitative forecasting and optimization tool to expand the recent waterflood with improved production forecast reliability. The remaining mobile oil saturation map and streamline-based waterflood diagnostics have improved understanding of injector-producer connectivity and swept pore volumes, e.g., current swept volumes are minor and well-centric with limited indication of breakthrough at adjacent producers resulting in high remaining mobile oil saturation.
Accordingly, the history matched model provides a foundation to select new injection points, determine dedicated producer locations and support optimized injection strategies to improve recovery.


2006 ◽  
Vol 9 (05) ◽  
pp. 502-512 ◽  
Author(s):  
Arne Skorstad ◽  
Odd Kolbjornsen ◽  
Asmund Drottning ◽  
Havar Gjoystdal ◽  
Olaf K. Huseby

Summary Elastic seismic inversion is a tool frequently used in analysis of seismic data. Elastic inversion relies on a simplified seismic model and generally produces 3D cubes for compressional-wave velocity, shear-wave velocity, and density. By applying rock-physics theory, such volumes may be interpreted in terms of lithology and fluid properties. Understanding the robustness of forward and inverse techniques is important when deciding the amount of information carried by seismic data. This paper suggests a simple method to update a reservoir characterization by comparing 4D-seismic data with flow simulations on an existing characterization conditioned on the base-survey data. The ability to use results from a 4D-seismic survey in reservoir characterization depends on several aspects. To investigate this, a loop that performs independent forward seismic modeling and elastic inversion at two time stages has been established. In the workflow, a synthetic reservoir is generated from which data are extracted. The task is to reconstruct the reservoir on the basis of these data. By working on a realistic synthetic reservoir, full knowledge of the reservoir characteristics is achieved. This strengthens the evaluation of questions regarding the fundamental dependency between the seismic and petrophysical domains. The synthetic reservoir is an ideal case, where properties are known to an accuracy never achieved in an applied situation. It can therefore be used to investigate the theoretical limitations of the information content in the seismic data. The deviations in water and oil production between the reference and predicted reservoir were significantly decreased by use of 4D-seismic data in addition to the 3D inverted elastic parameters. Introduction It is well known that the information in seismic data is limited by the bandwidth of the seismic signal.
4D seismics give information on the changes between base and monitor surveys and are consequently an important source of information regarding the principal flow in a reservoir. Because of its limited resolution, the presence of a thin thief zone can be observed only as a consequence of flow, and the exact location will not be found directly. This paper addresses the question of how much information there is in the seismic data, and how this information can be used to update the model for petrophysical reservoir parameters. Several methods for incorporating 4D-seismic data in the reservoir-characterization workflow for improving history matching have been proposed earlier. The 4D-seismic data and the corresponding production data are not on the same scale, but they need to be combined. Huang et al. (1997) proposed a simulated annealing method for conditioning these data, while Lumley and Behrens (1997) describe a workflow loop in which the 4D-seismic data are compared with those computed from the reservoir model. Gosselin et al. (2003) give a short overview of the use of 4D-seismic data in reservoir characterization and propose using gradient-based methods for history matching the reservoir model on seismic and production data. Vasco et al. (2004) show that 4D data contain information of large-scale reservoir-permeability variations, and they illustrate this in a Gulf of Mexico example.


SPE Journal ◽  
2007 ◽  
Vol 12 (04) ◽  
pp. 408-419 ◽  
Author(s):  
Baoyan Li ◽  
Francois Friedmann

Summary History matching is an inverse problem in which an engineer calibrates key geological/fluid flow parameters by fitting a simulator's output to the real reservoir production history. It has no unique solution because of insufficient constraints. History-match solutions are obtained by searching for minima of an objective function below a preselected threshold value. Experimental design and response surface methodologies provide an efficient approach to build proxies of objective functions (OF) for history matching. The search for minima can then be easily performed on the proxies of OF as long as their accuracy is acceptable. In this paper, we first introduce a novel experimental design methodology for semi-automatically selecting the sampling points, which are used to improve the accuracy of constructed proxies of the nonlinear OF. This method is based on derivatives of constructed proxies. We propose an iterative procedure for history matching, applying this new design methodology. To obtain the global optima, the proxies of an objective function are initially constructed on the global parameter space. They are iteratively improved until adequate accuracy is achieved. We locate subspaces in the vicinity of the optima regions using a clustering technique to improve the accuracy of the reconstructed OF in these subspaces. We test this novel methodology and history-matching procedure with two waterflooded reservoir models. One model is the Imperial College fault model (Tavassoli et al. 2004). It contains a large bank of simulation runs. The other is a modified version of the SPE9 (Killough 1995) benchmark problem. We demonstrate the efficiency of this newly developed history-matching technique. Introduction History matching (Eide et al. 1994; Landa and Güyagüler 2003) is an inverse problem in which an engineer calibrates key geological/fluid flow parameters of reservoirs by fitting a reservoir simulator's output to the real reservoir production history.
It has no unique solution because of insufficient constraints. Traditional history matching is performed with a semi-empirical approach, which is based on the engineer's understanding of the field production behavior. Usually, the model parameters are adjusted using a one-factor-at-a-time approach. History matching can be very time consuming, because many simulation runs may be required to obtain good fitting results. Attempts have been made to automate the history-matching process by using optimal control theory (Chen et al. 1974) and gradient techniques (Gomez et al. 2001). Also, design of experiment (DOE) and response surface methodologies (RSM) (Eide et al. 1994; Box and Wilson 1987; Montgomery 2001; Box and Hunter 1957; Box and Wilson 1951; Damsleth et al. 1992; Egeland et al. 1992; Friedmann et al. 2003) were introduced in the late 1990s to guide automatic history matching. The goal of these automatic methods is to achieve history-matching techniques that are substantially faster than the traditional method. History matching is an optimization problem. The objective is to find the best of all possible sets of geological/fluid flow parameters to fit the production data of reservoirs. To assess the quality of the match, we define an OF (Atallah 1999). For history-matching problems, an objective function is usually defined as a distance (Landa and Güyagüler 2003) between a simulator's output and reservoir production data. History-matching solutions are obtained by searching for minima of the objective function. Experimental design and response surface methodologies provide an efficient approach to build up hypersurfaces (Kecman 2001) of objective functions (i.e., proxies of objective functions) with a limited number of simulation runs for history matching. The search for minima can then be easily performed on these proxies as long as their accuracy is acceptable. The efficiency of this technique depends on constructing adequately accurate objective functions.
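The response-surface idea can be sketched minimally: sample the expensive objective at a few design points, fit a polynomial proxy, and minimize the proxy instead of the objective itself. The one-dimensional objective below is a hypothetical stand-in for a simulator-based misfit, used only to make the sketch self-contained.

```python
import numpy as np

# Hypothetical "expensive" objective standing in for a simulator-based
# misfit; each evaluation would normally cost one reservoir simulation.
of = lambda x: (x - 2.0) ** 2 + 1.0

# Design points: the sampling plan an experimental design would provide.
xs = np.array([0.0, 1.0, 3.0, 4.0])
ys = of(xs)

# Quadratic response surface fitted to the samples acts as the proxy OF.
c2, c1, c0 = np.polyfit(xs, ys, 2)

# Minimizing the proxy is cheap: for a quadratic, the minimum is closed-form.
x_min = -c1 / (2.0 * c2)
```

In the paper's iterative procedure, additional sampling points (chosen from the proxy's derivatives) would be added near `x_min` and the proxy refitted until its accuracy near the optimum is acceptable; here the toy objective is itself quadratic, so the proxy recovers the true minimizer at x = 2 immediately.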


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Qianqian Wang ◽  
Minan Tang ◽  
Aimin An ◽  
Jiawei Lu ◽  
Yingying Zhao

Impurity removal is an important part of the zinc hydrometallurgy process, and the quality of products and the stability of the whole process are directly affected by its control effect. The application of a dynamic model is of great significance to the prediction of key indexes and the optimization of process control. In this paper, considering the complex coupling relationships of the stage II purification process, a hybrid modeling method combining mechanism modeling and parameter-identification modeling was proposed on the basis of not changing the actual production process of a lead-zinc smeltery. Firstly, the overall nonlinear dynamic mechanism model was established, and then the deviation between the theoretical value and the actually detected outlet ion concentration was taken as the objective function to establish the parameter-identification optimization model. Since the built model is nonlinear, it may pose implementation problems. On the premise of deriving the gradient vector and Hessian matrix of the objective function with respect to the parameter vector, an optimization algorithm based on the steepest descent method and the Newton method is proposed. Finally, using the historical production data of a lead-zinc smeltery in China, the model parameters were accurately inverted. An intensive simulation validation and analysis of the dynamic characteristics of the whole model shows the accuracy and the potential of the model, also from the perspective of practical implementation, which provides a basis for the optimal control of system output and guidance for the optimal control of zinc-powder addition.
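The identification step described above can be sketched with a Gauss-Newton iteration, a common stand-in for the steepest-descent/Newton combination the paper proposes, using the gradient vector and a Hessian approximation built from the model Jacobian. The toy forward model below is hypothetical, not the purification-process model.

```python
import numpy as np

def identify(theta0, d_obs, forward, jac, n_iter=20):
    """Gauss-Newton parameter identification for the least-squares
    objective J(theta) = 0.5 * ||forward(theta) - d_obs||^2, a stand-in
    for the outlet-concentration deviation used as the objective."""
    theta = theta0.copy()
    for _ in range(n_iter):
        r = forward(theta) - d_obs       # residual
        J = jac(theta)                   # model Jacobian
        g = J.T @ r                      # gradient vector of the objective
        H = J.T @ J                      # Gauss-Newton Hessian approximation
        theta -= np.linalg.solve(H, g)   # Newton-type step
    return theta

# Hypothetical nonlinear forward model with true parameters (2, 3):
f = lambda t: np.array([t[0] ** 2, t[0] * t[1]])
jf = lambda t: np.array([[2 * t[0], 0.0], [t[1], t[0]]])
theta_hat = identify(np.array([1.0, 1.0]), np.array([4.0, 6.0]), f, jf)
```

A practical implementation would add a line search or fall back to a steepest-descent step when the Hessian approximation is ill-conditioned, which is essentially the hybrid the paper describes.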


2005 ◽  
Vol 8 (03) ◽  
pp. 214-223 ◽  
Author(s):  
Fengjun Zhang ◽  
Jan-Arild Skjervheim ◽  
Albert C. Reynolds ◽  
Dean S. Oliver

Summary The Bayesian framework allows one to integrate production and static data into an a posteriori probability density function (pdf) for reservoir variables (model parameters). The problem of generating realizations of the reservoir variables for the assessment of uncertainty in reservoir description or predicted reservoir performance then becomes a problem of sampling this a posteriori pdf to obtain a suite of realizations. Generation of a realization by the randomized-maximum-likelihood method requires the minimization of an objective function that includes production-data misfit terms and a model misfit term that arises from a prior model constructed from static data. Minimization of this objective function with an optimization algorithm is equivalent to the automatic history matching of production data, with a prior model constructed from static data providing regularization. Because of the computational cost of computing sensitivity coefficients and the need to solve matrix problems involving the covariance matrix for the prior model, this approach has not been applied to problems in which the number of data and the number of reservoir-model parameters are both large and the forward problem is solved by a conventional finite-difference simulator. In this work, we illustrate that computational efficiency problems can be overcome by using a scaled limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) algorithm to minimize the objective function and by using approximate computational stencils to approximate the multiplication of a vector by the prior covariance matrix or its inverse. Implementation of the LBFGS method requires only the gradient of the objective function, which can be obtained from a single solution of the adjoint problem; individual sensitivity coefficients are not needed. We apply the overall process to two examples.
The first is a true field example in which a realization of log permeabilities at 26,019 gridblocks is generated by the automatic history matching of pressure data, and the second is a pseudo field example that provides a very rough approximation to a North Sea reservoir in which a realization of log permeabilities at 9,750 gridblocks is computed by the automatic history matching of gas/oil ratio (GOR) and pressure data. Introduction Bayes' theorem provides a general framework for updating a pdf as new data or information on the model becomes available. The Bayesian setting offers a distinct advantage. If one can generate a suite of realizations that represent a correct sampling of the a posteriori pdf, then the suite of samples provides an assessment of the uncertainty in reservoir variables. Moreover, by predicting future reservoir performance under proposed operating conditions for each realization, one can characterize the uncertainty in future performance predictions by constructing statistics for the set of outcomes. Liu and Oliver have recently presented a comparison of methods for sampling the a posteriori pdf. Their results indicate that the randomized-maximum-likelihood method is adequate for evaluating uncertainty with a relatively limited number of samples. In this work, we consider the case in which a prior geostatistical model constructed from static data is available and is represented by a multivariate Gaussian pdf. Then, the a posteriori pdf conditional to production data is such that calculation of the maximum a posteriori estimate or generation of a realization by the randomized-maximum-likelihood method is equivalent to the minimization of an appropriate objective function. History-matching problems of interest to us involve a few thousand to tens of thousands of reservoir variables and a few hundred to a few thousand production data. Thus, an optimization algorithm suitable for large-scale problems is needed.
Our belief is that nongradient-based algorithms such as simulated annealing and the genetic algorithm are not competitive with gradient-based algorithms in terms of computational efficiency. Classical gradient-based algorithms such as the Gauss-Newton and Levenberg-Marquardt typically converge fairly quickly and have been applied successfully to automatic history matching for both single-phase- and multiphase-flow problems. No multiphase-flow example considered in these papers involved more than 1,500 reservoir variables. For single-phase-flow problems, He et al. and Reynolds et al. have generated realizations of models involving up to 12,500 reservoir variables by automatic history matching of pressure data. However, they used a procedure based on their generalization of the method of Carter et al. to calculate sensitivity coefficients; this method assumes that the partial-differential equation solved by reservoir simulation is linear and does not apply for multiphase-flow problems.
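The gradient-only requirement of LBFGS can be illustrated with SciPy on a small quadratic objective combining a prior-model misfit and a data misfit. This is a toy stand-in for the a posteriori objective described in the abstract; the matrices and data values below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for J(m) = 0.5 (m - m_pr)^T Cm^{-1} (m - m_pr)
#                       + 0.5 ||G m - d||^2 / s2
m_pr = np.array([0.0, 0.0])                  # prior model (hypothetical)
Cm_inv = np.array([[2.0, 0.0], [0.0, 2.0]])  # inverse prior covariance
G = np.array([[1.0, 1.0]])                   # linearized "simulator"
d = np.array([2.0])                          # observed data
s2 = 0.5                                     # data-error variance

def J_and_grad(m):
    """Objective and its gradient; LBFGS needs only these, not the
    individual sensitivity coefficients."""
    rm = m - m_pr
    rd = G @ m - d
    J = 0.5 * rm @ Cm_inv @ rm + 0.5 * (rd @ rd) / s2
    g = Cm_inv @ rm + G.T @ rd / s2
    return J, g

res = minimize(J_and_grad, np.zeros(2), jac=True, method="L-BFGS-B")
```

For this quadratic toy the minimizer can be checked by hand (setting the gradient to zero gives m = (2/3, 2/3)); in the paper's setting the gradient would come from one adjoint solve per iteration, which is what makes the approach feasible at tens of thousands of variables.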


SPE Journal ◽  
2010 ◽  
Vol 16 (02) ◽  
pp. 307-317 ◽  
Author(s):  
Yanfen Zhang ◽  
Dean S. Oliver

Summary The increased use of optimization in reservoir management has placed greater demands on the application of history matching to produce models that not only reproduce the historical production behavior but also preserve geological realism and quantify forecast uncertainty. Geological complexity and limited access to the subsurface typically result in a large uncertainty in reservoir properties and forecasts. However, there is a systematic tendency to underestimate such uncertainty, especially when rock properties are modeled using Gaussian random fields. In this paper, we address one important source of uncertainty: the uncertainty in regional trends by introducing stochastic trend coefficients. The multiscale parameters including trend coefficients and heterogeneities can be estimated using the ensemble Kalman filter (EnKF) for history matching. Multiscale heterogeneities are often important, especially in deepwater reservoirs, but are generally poorly represented in history matching. In this paper, we describe a method for representing and updating multiple scales of heterogeneity in the EnKF. We tested our method for updating these variables using production data from a deepwater field whose reservoir model has more than 200,000 unknown parameters. The match of reservoir simulator forecasts to real field data using a standard application of EnKF had not been entirely satisfactory because it was difficult to match the water cut of a main producer in the reservoir. None of the realizations of the reservoir exhibited water breakthrough using the standard parameterization method. By adding uncertainty in large-scale trends of reservoir properties, the ability to match the water cut and other production data was improved substantially. The results indicate that an improvement in the generation of the initial ensemble and in the variables describing the property fields gives an improved history match with plausible geology. 
The multiscale parameterization of property fields reduces the tendency to underestimate uncertainty while still providing reservoir models that match data.
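A bare-bones version of the EnKF analysis step used for this kind of history matching can be sketched as follows (perturbed-observation variant; the one-parameter usage at the bottom is a hypothetical toy, not the 200,000-parameter field case):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(M, d_obs, H, sigma_d):
    """One EnKF analysis step. M is an (n_params, n_ens) ensemble of
    parameter vectors, H maps the ensemble to predicted data, and
    sigma_d is the observation-error standard deviation."""
    n_ens = M.shape[1]
    D = H(M)                                   # predicted data, (n_data, n_ens)
    Mdev = M - M.mean(axis=1, keepdims=True)
    Ddev = D - D.mean(axis=1, keepdims=True)
    Cmd = Mdev @ Ddev.T / (n_ens - 1)          # parameter-data cross-covariance
    Cdd = Ddev @ Ddev.T / (n_ens - 1) + sigma_d ** 2 * np.eye(D.shape[0])
    K = Cmd @ np.linalg.inv(Cdd)               # Kalman gain
    # Perturbed observations: one noisy copy of the data per member.
    Dp = d_obs[:, None] + sigma_d * rng.standard_normal(D.shape)
    return M + K @ (Dp - D)

# Toy usage: one scalar parameter, observed directly with small error.
prior = rng.standard_normal((1, 2000))         # prior ensemble, mean near 0
posterior = enkf_update(prior, np.array([5.0]), lambda M: M, 0.1)
```

In the paper's multiscale parameterization, `M` would stack both the stochastic trend coefficients and the gridblock heterogeneities, so a single gain matrix updates every scale consistently from the same production data.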

