Quantitative 4D Seismic Assisted History Matching Using Ensemble-Based Methods on the Vilje Field

2020 ◽  
Author(s):  
Konrad Wojnar ◽  
Jon Sætrom ◽  
Tore Felix Munck ◽  
Martha Stunell ◽  
Stig Sviland-Østre ◽  
...  

Abstract The aim of the study was to create an ensemble of equiprobable models that could be used to improve the reservoir management of the Vilje field. Qualitative and quantitative workflows were developed to systematically and efficiently screen, analyze, and history match an ensemble of reservoir simulation models to production and 4D seismic data. The goal of developing the workflows was to increase the utilization of data from 4D seismic surveys for reservoir characterization. The qualitative and quantitative workflows are presented, and their benefits and challenges are described. The data conditioning produced a set of history-matched reservoir models that could be used in the field-development decision-making process. The proposed workflows allowed for the identification of outlying prior and posterior models based on key features for which the observed data were not covered by the synthetic 4D seismic realizations. As a result, suggestions for a more robust parameterization of the ensemble were made to improve data coverage. The existing history-matching workflow integrated efficiently with the quantitative 4D seismic history-matching workflow, allowing the reservoir models to be conditioned to both production and 4D data. Thus, the predictive ability of the models was improved. This paper proposes a systematic and efficient workflow using ensemble-based methods to simultaneously screen, analyze, and history match production and 4D seismic data. The proposed workflow improves the usability of 4D seismic data for reservoir characterization and, in turn, for reservoir management and decision-making.
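The abstract does not detail the specific ensemble-based update used on Vilje, but the core of such workflows is typically an ensemble-smoother-type update that conditions an ensemble of model parameters to production and 4D seismic data simultaneously. Below is a minimal sketch of a single ES-MDA-style update step; the array names and the diagonal observation-error covariance are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ensemble_smoother_update(M, D, d_obs, C_d, alpha=1.0, seed=0):
    """One ensemble-smoother update (ES-MDA style, inflation factor alpha).

    M     : (n_params, n_ens)  prior parameter ensemble
    D     : (n_data, n_ens)    simulated data (production + 4D attributes) per member
    d_obs : (n_data,)          observed data vector
    C_d   : (n_data,)          observation-error variances (diagonal)
    """
    rng = np.random.default_rng(seed)
    n_data, n_ens = D.shape

    # Ensemble anomalies (deviations from the ensemble mean).
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)

    # Cross-covariance and data auto-covariance estimated from the ensemble.
    C_md = dM @ dD.T / (n_ens - 1)
    C_dd = dD @ dD.T / (n_ens - 1)

    # Perturb observations with inflated noise (ES-MDA convention).
    E = rng.normal(0.0, np.sqrt(alpha * C_d)[:, None], size=(n_data, n_ens))
    D_obs = d_obs[:, None] + E

    # Kalman-type gain and parameter update.
    K = C_md @ np.linalg.pinv(C_dd + alpha * np.diag(C_d))
    return M + K @ (D_obs - D)
```

In practice several such updates are chained with inflation factors whose inverses sum to one, and the 4D seismic data are usually reduced to map-based attributes and localized before assimilation.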

2009 ◽  
Vol 12 (03) ◽  
pp. 446-454 ◽  
Author(s):  
Frode Georgsen ◽  
Anne R. Syversveen ◽  
Ragnar Hauge ◽  
Jan I. Tollefsrud ◽  
Morten Fismen

Summary The possibility of updating reservoir models with new well information is important for good reservoir management. The process from drilling a new well through updating the static model to history matching the new model is often time-consuming. This paper presents new algorithms that allow rapid updating of object-based facies models by further development of already existing models. An existing facies realization is adjusted to match new well observations by changing objects locally or by adding/removing objects if required. Parts of the realization that are not influenced by the new wells are not changed. A local update of a specified region of the reservoir can be performed, leaving the rest of the reservoir unchanged, or changed only minimally, by the new wells. The main focus of this method is the algorithm implemented to fulfill well conditioning. The effect of this algorithm on different object models is presented through several case studies. These studies show how the local update consistently includes new information while leaving the rest of the realization unperturbed, thereby preserving the good history match.

Introduction Rapid updating of static and dynamic reservoir models is important for reservoir management. Continual maintenance of history-matched models allows for right-time decisions to optimize reservoir performance. The process from drilling a new well through updating the static model and history matching the new model is often time-consuming. Static reservoir models and history matches are updated only intermittently, and there is typically a 1- to 2-year delay between the drilling of a new well and the generation of a reliable history-matched model that incorporates the new information. This paper presents new algorithms that allow rapid updating of static reservoir models when new wells are drilled. The static-model update is designed to keep as much of the existing history match as possible by locally adjusting the existing static model to the new well data. As the name implies, object models use a set of facies objects to generate a facies realization. Stochastic object-modeling algorithms have been developed to improve the representation of facies architectures in complex heterogeneous reservoirs and, thereby, to obtain more-realistic dynamic behavior of the reservoir models. We consider the main advantages of object models to be the ability to create geologically realistic facies elements (objects) and to control the interaction between them, the ability to correlate observations between wells (connectivity) explicitly, and the possibility of applying intraobject petrophysical trends.
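As an illustration of the local-conditioning idea described above (adjust only objects near the new well and leave the rest of the realization untouched), here is a heavily simplified sketch with circular objects and binary facies picks; the class and function names are hypothetical and this is not the paper's algorithm.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class FaciesObject:
    center: np.ndarray   # (x, y) object center
    radius: float        # simplified circular sand body

    def contains(self, point):
        return np.linalg.norm(self.center - np.asarray(point)) <= self.radius

def local_update(objects, well_picks, region_center, region_radius, seed=0):
    """Adjust an object-based facies realization to honor new well picks.

    objects    : list of FaciesObject making up the existing realization
    well_picks : list of (point, sand_present) observations from the new well
    region_*   : only objects intersecting this region may be changed
    """
    rng = np.random.default_rng(seed)
    region_center = np.asarray(region_center)
    in_region = lambda o: np.linalg.norm(o.center - region_center) <= region_radius + o.radius

    frozen  = [o for o in objects if not in_region(o)]   # untouched part of the realization
    movable = [o for o in objects if in_region(o)]

    # Remove movable objects that contradict "no sand" observations.
    movable = [o for o in movable
               if not any(o.contains(p) and not present for p, present in well_picks)]

    # Add objects until every "sand observed" pick is covered by some object.
    for p, present in well_picks:
        if present and not any(o.contains(p) for o in frozen + movable):
            movable.append(FaciesObject(center=np.asarray(p, dtype=float),
                                        radius=rng.uniform(0.5, 1.5)))

    return frozen + movable
```

The key design point this sketch preserves is that objects outside the update region are returned unchanged, so the existing history match away from the new well is not disturbed.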


SPE Journal ◽  
2006 ◽  
Vol 11 (04) ◽  
pp. 464-479 ◽  
Author(s):  
B. Todd Hoffman ◽  
Jef K. Caers ◽  
Xian-Huan Wen ◽  
Sebastien B. Strebelle

Summary This paper presents an innovative methodology to integrate prior geologic information, well-log data, seismic data, and production data into a consistent 3D reservoir model. Furthermore, the method is applied to a real channel reservoir from the African coast. The methodology relies on the probability-perturbation method (PPM). Perturbing probabilities rather than actual petrophysical properties guarantees that the conceptual geologic model is maintained and that history-matching-related artifacts are avoided. Reservoir models that match all types of data are likely to have more predictive power than models in which some data are not honored. The first part of the paper reviews the details of the PPM, and the next part describes the additional work that is required to history match real reservoirs using this method. Then, a geological description of the reservoir case study is provided, and the procedure to build 3D reservoir models that are conditioned only to the static data is covered. Because of the character of the field, the channels are modeled with a multiple-point geostatistical method. The channel locations are perturbed in a manner such that the oil, water, and gas rates from the reservoir more accurately match the rates observed in the field. Two different geologic scenarios are used, and multiple history-matched models are generated for each scenario. The reservoir has been producing for approximately 5 years, but the models are matched only to the first 3 years of production. Afterward, to check predictive power, the matched models are run for the last 1½ years, and the results compare favorably with the field data.

Introduction Reservoir models are constructed to better understand reservoir behavior and to better predict reservoir response. Economic decisions are often based on the predictions from reservoir models; therefore, such predictions need to be as accurate as possible. To achieve this goal, the reservoir model should honor all sources of data, including well-log, seismic, and geologic information as well as dynamic (production rate and pressure) data. Incorporating dynamic data into the reservoir model is generally known as history matching. History matching is difficult because it poses a nonlinear inverse problem in the sense that the relationship between the reservoir model parameters and the dynamic data is highly nonlinear and multiple solutions are available. Therefore, history matching is often done with a trial-and-error method. In real-world applications of history matching, reservoir engineers manually modify an initial model provided by geoscientists until the production data are matched. The initial model is built based on geological and seismic data. While attempts are usually made to honor these other data as much as possible, the history-matched models are often unrealistic from a geological (and geophysical) point of view. For example, permeability is often altered to increase or decrease flow in areas where a mismatch is observed; however, the permeability alterations usually come in the form of box-shaped or pipe-shaped geometries centered around wells or between wells and tend to be devoid of any geological considerations; the primary focus lies in obtaining a history match.
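The PPM itself rests on a simple rule: the conditional facies probability used when regenerating a realization is a blend of the current realization's indicator and the prior facies probability, controlled by a perturbation parameter r in [0, 1], and r is then optimized in an outer loop against the flow-simulation misfit. A minimal sketch of that idea follows; `simulate_facies` and `flow_misfit` are hypothetical placeholders for the geostatistical resimulation and the flow-simulation/misfit evaluation, and the search strategy is illustrative rather than the paper's.

```python
import numpy as np

def perturbed_probability(i_current, p_prior, r):
    """Probability-perturbation rule: blend the current realization's facies
    indicator with the prior facies probability.

    i_current : 0/1 indicator array of the current facies realization
    p_prior   : prior facies probability (scalar or array, e.g. from seismic)
    r         : perturbation parameter in [0, 1]; r=0 reproduces the current
                model, r=1 draws an essentially independent new realization
    """
    return (1.0 - r) * i_current + r * p_prior

def history_match_ppm(simulate_facies, flow_misfit, i_init, p_prior, n_outer=10):
    """Outer loop: simple 1D search over r, keeping the best model found."""
    i_best, f_best = i_init, flow_misfit(i_init)
    for _ in range(n_outer):
        for r in np.linspace(0.1, 1.0, 10):
            prob = perturbed_probability(i_best, p_prior, r)
            i_new = simulate_facies(prob)      # geostatistical resimulation
            f_new = flow_misfit(i_new)         # run flow simulation, compare to data
            if f_new < f_best:
                i_best, f_best = i_new, f_new
    return i_best, f_best
```

Because only probabilities are perturbed, every candidate realization is still drawn from the multiple-point geostatistical model, which is what keeps the history-matched models geologically consistent.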


2006 ◽  
Vol 9 (05) ◽  
pp. 502-512 ◽  
Author(s):  
Arne Skorstad ◽  
Odd Kolbjørnsen ◽  
Åsmund Drottning ◽  
Håvar Gjøystdal ◽  
Olaf K. Huseby

Summary Elastic seismic inversion is a tool frequently used in the analysis of seismic data. Elastic inversion relies on a simplified seismic model and generally produces 3D cubes for compressional-wave velocity, shear-wave velocity, and density. By applying rock-physics theory, such volumes may be interpreted in terms of lithology and fluid properties. Understanding the robustness of forward and inverse techniques is important when deciding how much information is carried by seismic data. This paper suggests a simple method to update a reservoir characterization by comparing 4D-seismic data with flow simulations on an existing characterization conditioned on the base-survey data. The ability to use results from a 4D-seismic survey in reservoir characterization depends on several aspects. To investigate this, a loop that performs independent forward seismic modeling and elastic inversion at two time stages has been established. In the workflow, a synthetic reservoir is generated from which data are extracted. The task is to reconstruct the reservoir on the basis of these data. By working on a realistic synthetic reservoir, full knowledge of the reservoir characteristics is achieved. This strengthens the evaluation of questions regarding the fundamental dependency between the seismic and petrophysical domains. The synthetic reservoir is an ideal case, in which properties are known to an accuracy never achieved in an applied situation. It can therefore be used to investigate the theoretical limitations of the information content in the seismic data. The deviations in water and oil production between the reference and predicted reservoir were significantly decreased by use of 4D-seismic data in addition to the 3D inverted elastic parameters.

Introduction It is well known that the information in seismic data is limited by the bandwidth of the seismic signal. 4D seismic gives information on the changes between base and monitor surveys and is consequently an important source of information regarding the principal flow in a reservoir. Because of its limited resolution, the presence of a thin thief zone can be observed only as a consequence of flow, and its exact location will not be found directly. This paper addresses the question of how much information there is in the seismic data and how this information can be used to update the model of petrophysical reservoir parameters. Several methods for incorporating 4D-seismic data in the reservoir-characterization workflow to improve history matching have been proposed earlier. The 4D-seismic data and the corresponding production data are not on the same scale, but they need to be combined. Huang et al. (1997) proposed a simulated-annealing method for conditioning to these data, while Lumley and Behrens (1997) describe a workflow loop in which the 4D-seismic data are compared with those computed from the reservoir model. Gosselin et al. (2003) give a short overview of the use of 4D-seismic data in reservoir characterization and propose using gradient-based methods for history matching the reservoir model on seismic and production data. Vasco et al. (2004) show that 4D data contain information on large-scale reservoir-permeability variations, and they illustrate this with a Gulf of Mexico example.
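The summary refers to a rock-physics link between the petrophysical and elastic domains but does not reproduce it. A standard choice for the fluid part of such a link is Gassmann fluid substitution, sketched below for the 4D effect of a water-saturation change on P-wave velocity; the moduli, densities, and saturations in the example are illustrative values, not the paper's petro-elastic model.

```python
import numpy as np

def gassmann_vp(k_dry, g_dry, k_min, phi, rho_dry, s_w,
                k_w=2.8e9, k_o=0.8e9, rho_w=1000.0, rho_o=800.0):
    """P-wave velocity of a rock with water saturation s_w (Gassmann).

    k_dry, g_dry : dry-rock bulk and shear moduli [Pa]
    k_min        : mineral (grain) bulk modulus [Pa]
    phi          : porosity [fraction]
    rho_dry      : dry-rock density [kg/m3]
    """
    # Reuss (harmonic) average fluid modulus and arithmetic fluid density.
    k_fl = 1.0 / (s_w / k_w + (1.0 - s_w) / k_o)
    rho_fl = s_w * rho_w + (1.0 - s_w) * rho_o

    # Gassmann saturated bulk modulus; the shear modulus is unchanged by fluid.
    b = 1.0 - k_dry / k_min
    k_sat = k_dry + b**2 / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)

    rho = rho_dry + phi * rho_fl
    return np.sqrt((k_sat + 4.0 * g_dry / 3.0) / rho)

# Example: the 4D effect of a waterflood sweeping s_w from 0.2 to 0.8.
vp_base    = gassmann_vp(6e9, 5e9, 37e9, 0.25, 1900.0, s_w=0.2)
vp_monitor = gassmann_vp(6e9, 5e9, 37e9, 0.25, 1900.0, s_w=0.8)
print(f"dVp = {vp_monitor - vp_base:.1f} m/s")
```

A forward-modeling loop of the kind described in the paper would apply such a substitution cell by cell at the base and monitor times before computing synthetic seismic responses.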


2019 ◽  
Vol 7 (3) ◽  
pp. SE123-SE130
Author(s):  
Yang Xue ◽  
Mariela Araujo ◽  
Jorge Lopez ◽  
Kanglin Wang ◽  
Gautam Kumar

Time-lapse (4D) seismic is widely deployed in offshore operations to monitor improved oil recovery methods including water flooding, yet its value for enhanced well and reservoir management is not fully realized due to the long cycle times required for quantitative 4D seismic data assimilation into dynamic reservoir models. To shorten the cycle, we have designed a simple inversion workflow to estimate reservoir property changes directly from 4D attribute maps using machine-learning (ML) methods. We generated tens of thousands of training samples by Monte Carlo sampling from the rock-physics model within reasonable ranges of the relevant parameters. Then, we applied ML methods to build the relationship between the reservoir property changes and the 4D attributes, and we used the learnings to estimate the reservoir property changes given the 4D attribute maps. The estimated reservoir property changes (e.g., water saturation changes) can be used to analyze injection efficiency, update dynamic reservoir models, and support reservoir management decisions. We can reduce the turnaround time from months to days, allowing early engagements with reservoir engineers to enhance integration. This accelerated data assimilation removes a deterrent for the acquisition of frequent 4D surveys.
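The authors' rock-physics model and 4D attributes are not specified in the abstract, so the sketch below only illustrates the shape of the workflow they describe: Monte Carlo sampling of reservoir property changes, a forward model producing 4D attributes, and a regression model learning the inverse mapping. The toy forward function, the parameter ranges, and the attribute definitions are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n_samples = 20_000

# Monte Carlo sampling of reservoir property changes within plausible ranges.
d_sw = rng.uniform(-0.1, 0.6, n_samples)      # change in water saturation
d_p  = rng.uniform(-5.0, 5.0, n_samples)      # change in pore pressure [MPa]
phi  = rng.uniform(0.15, 0.30, n_samples)     # porosity

def forward_4d_attributes(d_sw, d_p, phi, noise=0.02):
    """Toy stand-in for a rock-physics forward model: maps property changes
    to two 4D attributes (e.g., relative changes in acoustic impedance and
    in travel time), plus measurement noise."""
    d_ai = 0.08 * d_sw * (1 - phi) - 0.01 * d_p / 5.0
    d_tt = -0.03 * d_sw + 0.02 * d_p / 5.0
    return np.column_stack([d_ai, d_tt]) + rng.normal(0, noise, (d_sw.size, 2))

X = forward_4d_attributes(d_sw, d_p, phi)
y = d_sw

# Learn the inverse mapping: 4D attributes -> water-saturation change.
model = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(X, y)

# Apply to an observed 4D attribute map (flattened to one row per map cell).
observed_maps = rng.normal(0, 0.03, (1000, 2))      # placeholder for real data
d_sw_estimate = model.predict(observed_maps)
```

Because the training set is generated synthetically from the rock-physics model, retraining for a new survey is cheap, which is what enables the reduction in turnaround time the authors report.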


2005 ◽  
Author(s):  
Ricardo Cunha Mattos Portella ◽  
Alexandre Anoze Emerick

2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract Algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to MOGA to perform the field-level history match. Data misfits between the historical field data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or there are no further improvements. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of MOGA were assessed during the field (global) history matching. Also, the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history-matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history-match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
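A minimal sketch of the optimization skeleton described above (a multi-objective GA searching low-rank parameter coefficients against field-level misfit objectives) is given below. Here `simulate` is a user-supplied wrapper around the reservoir simulator, and the blend crossover, mutation scale, and population sizes are illustrative choices rather than the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def misfits(params, simulate, obs_pressure, obs_rates):
    """Two field-level misfit objectives: pressure match and rate match."""
    sim_pressure, sim_rates = simulate(params)
    return np.array([np.mean((sim_pressure - obs_pressure) ** 2),
                     np.mean((sim_rates - obs_rates) ** 2)])

def pareto_front(F):
    """Indices of non-dominated individuals (minimization of all objectives)."""
    return [i for i in range(len(F))
            if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                       for j in range(len(F)))]

def moga(simulate, obs_pressure, obs_rates, n_params, pop_size=40, n_gen=30):
    # Individuals are low-rank parameter coefficients (GCT-style basis weights).
    pop = rng.normal(size=(pop_size, n_params))
    for _ in range(n_gen):
        F = np.array([misfits(p, simulate, obs_pressure, obs_rates) for p in pop])
        elite = pop[pareto_front(F)][: pop_size // 2]     # keep non-dominated models
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = elite[rng.integers(len(elite), size=2)]
            w = rng.uniform(size=n_params)
            # Blend crossover plus Gaussian mutation.
            children.append(w * a + (1 - w) * b + rng.normal(scale=0.05, size=n_params))
        pop = np.vstack([elite, children])
    F = np.array([misfits(p, simulate, obs_pressure, obs_rates) for p in pop])
    return pop, F
```

The surviving Pareto set plays the role of the multiple realizations mentioned in the abstract: it carries the remaining uncertainty forward to the well-level, streamline-based fine-tuning step.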


2016 ◽  
Vol 19 (03) ◽  
pp. 391-402
Author(s):  
Sunday Amoyedo ◽  
Emmanuel Ekut ◽  
Rasaki Salami ◽  
Liliana Goncalves-Ferreira ◽  
Pascal Desegaulx

Summary This paper presents case studies focused on the interpretation and integration of seismic reservoir monitoring from several fields in the conventional offshore and deepwater Niger Delta. The fields are characterized by different geological settings and development-maturity stages. We show different applications varying from qualitative to quantitative use of time-lapse (4D) seismic information. In the first case study, which is in shallow water, the field has specific reservoir-development challenges, simple geology, and is in phased development. On this field, the 4D seismic, which was acquired several years ago, is characterized by poor seismic repeatability. Nevertheless, we show that because of improvements from seismic reprocessing, 4D seismic makes qualitative contributions to the ongoing field development. In the second case study, the field is characterized by a complex geological setting. The 4D seismic is affected by an overburden with strong lateral variations in velocity and by steeply dipping structure (up to 40°). Prestack-depth-migrated (PSDM) 4D seismic is used in a more-qualitative manner to monitor gas injection, validate the geologic/reservoir models, optimize infill-injector placement, and, consequently, enhance field-development economics. The third case study presents a deep offshore field characterized by a complex depositional system for some reservoirs. In this example, good 4D-seismic repeatability (sum of source- and receiver-placement differences between surveys, dS+dR) is achieved, leading to an increased quantitative use of 4D monitoring for the assessment of sand/sand communication, mapping of the oil/water-contact (OWC) front, pressure evolution, and dynamic calibration of the petro-elastic model (PEM), and also as a seismic-based production-logging tool. In addition, 4D seismic is used to update the seismic interpretation and provide a better understanding of the internal architecture of the reservoir units, thereby yielding a more-robust reservoir model. The 4D seismic in this field is a key tool for field-development optimization and reservoir management. The last case study illustrates the need for seismic-feasibility studies to detect 4D responses related to production. In addition to assessing the impact of the field environment on the 4D-seismic signal, these studies also help in choosing the optimum seismic-survey type, design, and acquisition parameters. These studies may also lead to the adoption of new technologies such as broadband-streamer or node acquisition in the near future.
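For reference, the dS+dR repeatability metric mentioned in the third case study is purely geometric and can be computed directly from navigation data; a small sketch with synthetic positions follows (array layouts and the example numbers are illustrative assumptions).

```python
import numpy as np

def ds_plus_dr(base_src, base_rcv, mon_src, mon_rcv):
    """Per-trace geometric repeatability dS + dR: distance between base- and
    monitor-survey source positions plus distance between the corresponding
    receiver positions (smaller values indicate better repeatability)."""
    d_s = np.linalg.norm(mon_src - base_src, axis=1)
    d_r = np.linalg.norm(mon_rcv - base_rcv, axis=1)
    return d_s + d_r

# Example with synthetic positioning differences of a few metres per trace.
rng = np.random.default_rng(0)
n = 5
base_src = rng.uniform(0.0, 1000.0, (n, 2))
base_rcv = rng.uniform(0.0, 1000.0, (n, 2))
mon_src  = base_src + rng.normal(0.0, 5.0, (n, 2))
mon_rcv  = base_rcv + rng.normal(0.0, 5.0, (n, 2))
print(ds_plus_dr(base_src, base_rcv, mon_src, mon_rcv))
```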

