Seismic History Matching of Nelson Using Time-Lapse Seismic Data: An Investigation of 4D Signature Normalization

2011 ◽  
Vol 14 (05) ◽  
pp. 621-633 ◽  
Author(s):  
Alireza Kazemi ◽  
Karl D. Stephen ◽  
Asghar Shams

Summary History matching of a reservoir model is always a difficult task. In some fields, we can use time-lapse (4D) seismic data to detect production-induced changes as a complement to more conventional production data. In seismic history matching, we predict these data and compare them to observations. Observed time-lapse data often consist of relative measures of change, which require normalization. We investigate different normalization approaches, based on predicted 4D data, and assess their impact on history matching. We apply the approach to the Nelson field, for which four surveys are available over 9 years of production. We normalize the 4D signature in a number of ways. First, we use predictions of the 4D signature from vertical wells that match production, and we derive a normalization function from them. As an alternative, we use crossplots of the full-field prediction against the observation. The normalized observations are used in an automatic-history-matching process in which the model is updated. We analyze the results of the two normalization approaches and compare them against the case of using production data alone. When we use 4D data normalized to wells, we obtain a 49% reduction in misfit along with a 36% improvement in predictions. Over the whole reservoir, the 4D-seismic misfit is reduced by 8 and 7% in the history and prediction periods, respectively. When we use only production data, the production history match improves to a similar degree (45%), but predictions improve by only 25% and the 4D-seismic misfit worsens by 10%. Finding unswept areas in the reservoir is always a challenge in reservoir management. By using 4D data in history matching, we can better predict reservoir behavior and identify regions of remaining oil.
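A crossplot-based normalization like the one described above can be sketched as an ordinary least-squares line fit between the predicted and observed 4D signatures. The function and data below are illustrative assumptions, not the authors' actual workflow, which derives its normalization function from well-based predictions.

```python
import numpy as np

def normalize_4d_signature(observed, predicted):
    """Map a relative observed 4D signature onto the predicted scale.

    Fits a straight line to the crossplot of prediction vs. observation
    (ordinary least squares) and applies it to the observations.
    Hypothetical helper; the paper's normalization is derived from
    well-based predictions as well as full-field crossplots.
    """
    slope, intercept = np.polyfit(observed, predicted, deg=1)
    return slope * observed + intercept

# Illustrative relative observations and absolute-scale predictions
obs = np.array([0.1, 0.4, 0.6, 0.9])
pred = np.array([1.0, 2.5, 3.4, 5.1])
obs_norm = normalize_4d_signature(obs, pred)
```

Because the fit is least squares, the normalized observations land on the same mean level as the predictions, which is what makes a quantitative misfit between the two meaningful.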

SPE Journal ◽  
2010 ◽  
Vol 15 (04) ◽  
pp. 1077-1088 ◽  
Author(s):  
F. Sedighi ◽  
K.D. Stephen

Summary Seismic history matching is the process of modifying a reservoir simulation model to reproduce the observed production data in addition to information gained through time-lapse (4D) seismic data. The search for good predictions requires that many models be generated, particularly if there is an interaction between the properties that we change and their effect on the misfit to observed data. In this paper, we introduce a method of improving search efficiency by estimating such interactions and partitioning the set of unknowns into noninteracting subspaces. We use regression analysis to identify the subspaces, which are then searched separately but simultaneously with an adapted version of the quasiglobal stochastic neighborhood algorithm. We have applied this approach to the Schiehallion field, located on the UK continental shelf. The field model, supplied by the operator, contains a large number of barriers that affect flow at different times during production, and their transmissibilities are highly uncertain. We find that we can successfully represent the misfit function as a second-order polynomial dependent on changes in barrier transmissibility. First, this enables us to identify the most important barriers, and, second, we can modify their transmissibilities efficiently by searching subgroups of the parameter space. Once the regression analysis has been performed, we reduce the number of models required to find a good match by an order of magnitude. By using 4D seismic data to condition saturation and pressure changes in history matching effectively, we have gained a greater insight into reservoir behavior and have been able to predict flow more accurately with an efficient inversion tool. We can now determine unswept areas and make better business decisions.
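The second-order polynomial representation of the misfit described above can be sketched as a least-squares fit with linear, quadratic, and pairwise cross terms; large cross-term coefficients indicate barrier transmissibilities that interact and must be searched in the same subspace. Names and setup here are illustrative, not the authors' implementation.

```python
import numpy as np
from itertools import combinations

def fit_quadratic_misfit(X, misfit):
    """Fit misfit ~ second-order polynomial in parameter changes.

    X: (n_models, n_params) barrier-transmissibility multipliers.
    Returns coefficients ordered as [constant, linear terms,
    squared terms, pairwise cross terms]; near-zero cross terms
    mark parameters that can be searched in separate subspaces.
    """
    n, p = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(p)]
    cols += [X[:, i] ** 2 for i in range(p)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(p), 2)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, misfit, rcond=None)
    return coef

# Recover a known separable (non-interacting) misfit surface
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
misfit = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] + 0.5 * X[:, 0] ** 2
coef = fit_quadratic_misfit(X, misfit)
```

In this toy case the fitted cross term (last coefficient) is essentially zero, so the two parameters could be assigned to independent search subspaces.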


2006 ◽  
Vol 9 (05) ◽  
pp. 502-512 ◽  
Author(s):  
Arne Skorstad ◽  
Odd Kolbjørnsen ◽  
Åsmund Drottning ◽  
Håvar Gjøystdal ◽  
Olaf K. Huseby

Summary Elastic seismic inversion is a tool frequently used in analysis of seismic data. Elastic inversion relies on a simplified seismic model and generally produces 3D cubes for compressional-wave velocity, shear-wave velocity, and density. By applying rock-physics theory, such volumes may be interpreted in terms of lithology and fluid properties. Understanding the robustness of forward and inverse techniques is important when deciding the amount of information carried by seismic data. This paper suggests a simple method to update a reservoir characterization by comparing 4D-seismic data with flow simulations on an existing characterization conditioned on the base-survey data. The ability to use results from a 4D-seismic survey in reservoir characterization depends on several aspects. To investigate this, a loop that performs independent forward seismic modeling and elastic inversion at two time stages has been established. In the workflow, a synthetic reservoir is generated from which data are extracted. The task is to reconstruct the reservoir on the basis of these data. By working on a realistic synthetic reservoir, full knowledge of the reservoir characteristics is achieved. This strengthens the evaluation of questions regarding the fundamental dependency between the seismic and petrophysical domains. The synthetic reservoir is an ideal case, in which properties are known to an accuracy never achieved in an applied situation. It can therefore be used to investigate the theoretical limitations of the information content in the seismic data. The deviations in water and oil production between the reference and predicted reservoirs were significantly decreased by use of 4D-seismic data in addition to the 3D inverted elastic parameters. Introduction It is well known that the information in seismic data is limited by the bandwidth of the seismic signal. 
4D seismics give information on the changes between base and monitor surveys and are consequently an important source of information regarding the principal flow in a reservoir. Because of its limited resolution, the presence of a thin thief zone can be observed only as a consequence of flow, and the exact location will not be found directly. This paper addresses the question of how much information there is in the seismic data, and how this information can be used to update the model for petrophysical reservoir parameters. Several methods for incorporating 4D-seismic data in the reservoir-characterization workflow for improving history matching have been proposed earlier. The 4D-seismic data and the corresponding production data are not on the same scale, but they need to be combined. Huang et al. (1997) proposed a simulated annealing method for conditioning these data, while Lumley and Behrens (1997) describe a workflow loop in which the 4D-seismic data are compared with those computed from the reservoir model. Gosselin et al. (2003) give a short overview of the use of 4D-seismic data in reservoir characterization and propose using gradient-based methods for history matching the reservoir model on seismic and production data. Vasco et al. (2004) show that 4D data contain information of large-scale reservoir-permeability variations, and they illustrate this in a Gulf of Mexico example.


Author(s):  
B. T. Ojo ◽  
M. T. Olowokere ◽  
M. I. Oladapo

Poor or low data quality usually has an adverse effect on the quantitative use of 4D seismic data for accurate analysis. Repeatability of a 4D seismic (time-lapse) survey is considered a vital requirement for effective monitoring of reservoir productivity. Inconsistency between time-lapse data sets will greatly affect the accuracy and outcome of a study when two or more seismic surveys with low repeatability are compared. Correlation is a statistical procedure that measures the linear relation between all points of two variables. Errors due to acquisition and processing must be checked before interpretation in order to minimize exploration failure and the number of dry holes drilled. The seismic data available for this study comprise 779 crosslines and 494 inlines. The 4D seismic data, consisting of the base seismic shot in 1998 before production and the monitor seismic shot in 2010 at a later stage of hydrocarbon production, were cross-correlated to ascertain repeatability between the two vintages. A global average matching process was applied, while phase and time shifts were estimated using the Russell-Liang technique. Two-pass full shaping filters were applied for the phase matching. Maximum and minimum cross-correlations are 0.85 (85%) and 0.60 (60%), respectively. Statistics of the cross-correlation shift show a standard deviation of 0.3, a variance of 0.12, and a root mean square of 0.78. For high repeatability and maximum correlation, the requested correlation threshold is 0.7, but values of 1 and 0.99 were obtained for the first and second matching, respectively. Conclusively, the overall results show that there is high repeatability between the 4D seismic data used, and the data can be employed conveniently for accurate time-lapse monitoring of future production on the field.
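The zero-lag normalized cross-correlation used to quantify repeatability between base and monitor traces, and the summary statistics quoted above, can be sketched as follows. This is a generic illustration of the measure, not the study's processing flow.

```python
import numpy as np

def trace_correlation(base, monitor):
    """Normalized zero-lag cross-correlation of two co-located traces.

    Values near 1 indicate high 4D repeatability; the study uses a
    correlation threshold of 0.7.
    """
    b = base - base.mean()
    m = monitor - monitor.mean()
    return float(np.dot(b, m) / (np.linalg.norm(b) * np.linalg.norm(m)))

def correlation_stats(corrs):
    """Standard deviation, variance, and RMS of a set of correlations."""
    c = np.asarray(corrs, dtype=float)
    return {"std": float(c.std()),
            "var": float(c.var()),
            "rms": float(np.sqrt(np.mean(c ** 2)))}

# A perfectly repeated trace correlates at exactly 1.0
t = np.sin(np.linspace(0.0, 10.0, 200))
perfect = trace_correlation(t, t)
```

Applied trace by trace over the survey, the first function yields the correlation volume whose extremes (0.85 and 0.60 here) and statistics summarize overall repeatability.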


1998 ◽  
Vol 17 (10) ◽  
pp. 1430-1433 ◽  
Author(s):  
Xuri Huang ◽  
Laurent Meister ◽  
Rick Workman

Geophysics ◽  
2019 ◽  
Vol 85 (1) ◽  
pp. M15-M31 ◽  
Author(s):  
Mingliang Liu ◽  
Dario Grana

We have developed a time-lapse seismic history matching framework to assimilate production data and time-lapse seismic data for the prediction of static reservoir models. An iterative data-assimilation method, the ensemble smoother with multiple data assimilation, is adopted to iteratively update an ensemble of reservoir models until their predicted observations match the actual production and seismic measurements, and to quantify the model uncertainty of the posterior reservoir models. To address the computational and numerical challenges of applying ensemble-based optimization methods to large seismic data volumes, we develop a deep representation-learning method, namely the deep convolutional autoencoder. This method reduces the data dimensionality by sparsely and approximately representing the seismic data with a set of hidden features that capture the nonlinear and spatial correlations in the data space. Instead of using the entire seismic data set, which would require an extremely large number of models, the ensemble of reservoir models is iteratively updated by conditioning the reservoir realizations on the production data and the low-dimensional hidden features extracted from the seismic measurements. We test our methodology on two synthetic data sets: a simplified 2D reservoir used for method validation and a 3D application with multiple channelized reservoirs. The results indicate that the deep convolutional autoencoder is extremely efficient in sparsely representing the seismic data and that the reservoir models can be accurately updated according to the production data and the reparameterized time-lapse seismic data.
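One iteration of the ensemble smoother with multiple data assimilation described above (condition the ensemble on perturbed observations with an inflated data-error covariance) can be sketched as follows. The variable names and the linear toy forward model are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def esmda_update(M, D, d_obs, Cd, alpha, seed=0):
    """One ensemble-smoother-with-multiple-data-assimilation update.

    M: (n_param, n_ens) ensemble of reservoir-model parameters.
    D: (n_data, n_ens)  corresponding predicted data (production data
       plus, in the paper, the autoencoder's low-dimensional features).
    d_obs: (n_data,) observed data; Cd: data-error covariance;
    alpha: covariance-inflation factor for this iteration.
    """
    n_ens = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cmd = dM @ dD.T / (n_ens - 1)   # parameter-data cross-covariance
    Cdd = dD @ dD.T / (n_ens - 1)   # data auto-covariance
    rng = np.random.default_rng(seed)
    noise = rng.multivariate_normal(np.zeros(len(d_obs)),
                                    alpha * Cd, n_ens).T
    K = Cmd @ np.linalg.inv(Cdd + alpha * Cd)   # Kalman-type gain
    return M + K @ (d_obs[:, None] + noise - D)

# Toy linear forward model: the update pulls predictions toward d_obs
G = np.array([[1.0, 0.0], [0.0, 2.0]])
rng = np.random.default_rng(42)
M0 = rng.normal(loc=5.0, size=(2, 100))
D0 = G @ M0
d_obs = np.zeros(2)
M1 = esmda_update(M0, D0, d_obs, Cd=1e-4 * np.eye(2), alpha=1.0)
```

In the full method this update is repeated over several iterations with inflation factors whose reciprocals sum to one, rerunning the flow simulator between iterations.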


1997 ◽  
Author(s):  
Xuri Huang ◽  
Laurent Meister ◽  
Rick Workman

SPE Journal ◽  
2017 ◽  
Vol 22 (04) ◽  
pp. 1261-1279 ◽  
Author(s):  
Shingo Watanabe ◽  
Jichao Han ◽  
Gill Hetz ◽  
Akhil Datta-Gupta ◽  
Michael J. King ◽  
...  

Summary We present an efficient history-matching technique that simultaneously integrates 4D repeat seismic surveys with well-production data. This approach is particularly well-suited for the calibration of the reservoir properties of high-resolution geologic models because the seismic data are areally dense but sparse in time, whereas the production data are finely sampled in time but spatially averaged. The joint history matching is performed by use of streamline-based sensitivities derived from either finite-difference or streamline-based flow simulation. For the most part, earlier approaches have focused on the role of saturation changes, but the effects of pressure have largely been ignored. Here, we present a streamline-based semianalytic approach for computing model-parameter sensitivities, accounting for both pressure and saturation effects. The novelty of the method lies in the semianalytic sensitivity computations, making it computationally efficient for high-resolution geologic models. The approach is implemented by use of a finite-difference simulator incorporating the detailed physics. Its efficacy is demonstrated by use of both synthetic and field applications. For both the synthetic and the field cases, the advantages of incorporating the time-lapse variations are clear, seen through the improved estimation of the permeability distribution, the pressure profile, the evolution of the fluid saturation, and the swept volumes.
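Given a sensitivity matrix of the kind derived from streamline tracing, a damped least-squares model update used in such joint inversions can be sketched as follows. This is a generic Gauss-Newton-style step; the paper's semianalytic computation of the sensitivities themselves is not reproduced here.

```python
import numpy as np

def damped_lsq_update(S, residual, damping=0.1):
    """One damped least-squares update of model parameters.

    S: (n_data, n_param) sensitivity matrix (derivatives of the
       combined production and 4D-seismic data with respect to
       reservoir properties, e.g. obtained along streamlines).
    residual: (n_data,) observed minus simulated data.
    Solves (S^T S + damping * I) dm = S^T residual.
    """
    n = S.shape[1]
    A = S.T @ S + damping * np.eye(n)
    return np.linalg.solve(A, S.T @ residual)

# With an identity sensitivity and negligible damping, the update
# essentially reproduces the data residual
S = np.eye(3)
r = np.array([0.5, -1.0, 2.0])
dm = damped_lsq_update(S, r, damping=1e-12)
```

The damping term stabilizes the inversion when the seismic data are areally dense but the production data are spatially averaged, so that S is ill-conditioned.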

