Reservoir Description by Integration of Well Test Data and Spatial Statistics

1995 ◽  
Vol 10 (04) ◽  
pp. 267-274 ◽  
Author(s):  
R.K. Sagar ◽  
M.G. Kelkar ◽  
L.G. Thompson

2000 ◽  
Vol 3 (04) ◽  
pp. 325-334 ◽  
Author(s):  
J.L. Landa ◽  
R.N. Horne ◽  
M.M. Kamal ◽  
C.D. Jenkins

Summary
In this paper we present a method to integrate well-test, production, shut-in pressure, log, core, and geological data to obtain a reservoir description for the Pagerungan field, offshore Indonesia. The method computes spatial distributions of permeability and porosity and generates a pressure response for comparison with field data. The technique produced a good match with well-test data from three wells and seven shut-in pressures. The permeability and porosity distributions also provide a reasonable explanation of the observed effects of a nearby aquifer on individual wells. As a final step, the method is compared with an alternative technique (object modeling) that represents the reservoir as a two-dimensional channel.

Introduction
The Pagerungan field has been under commercial production since 1994. This field was chosen to test a method of integrating dynamic well data with reservoir description data because the reservoir has produced only single-phase gas, one zone is responsible for most of the production, and good-quality well-test, core, and log data are available for most wells. The inversion for the spatial distributions of permeability and porosity uses a parameter estimation technique that calculates the gradients of the computed reservoir pressure response with respect to the permeability and porosity in each cell of a reservoir simulation grid. The method is a derivative of the gradient simulator approach1 and is described in Appendices A and B. The objective is to find distributions of permeability and porosity such that the calculated response of the reservoir closely matches the pressure measurements. In addition, the distributions of permeability and porosity must satisfy constraints given by the geological model and by other information known about the reservoir.

Statement of Theory and Definitions
Obtaining a reservoir description involves using a great amount of data from different sources. It is generally agreed that a reservoir description is more complete and reliable when it is the outcome of a process that can use the maximum possible amount of data from different sources; this is usually referred to in the literature as "data integration." Reservoir data can be classified as "static" or "dynamic" depending on their connection to the movement or flow of fluids in the reservoir. Data originating from geology, logs, core analysis, seismic, and geostatistics can generally be classified as static, whereas information originating from well testing and the production performance of the reservoir can be classified as dynamic. So far, most of the success in data integration has been obtained with static information. Remarkably, it has not yet become common to completely or systematically integrate dynamic data with static data. A number of researchers2–5 are studying this problem at present. This work represents one step in that direction.

Well Testing as a Tool for Reservoir Description
Traditional well-test analysis provides good insight into the average properties of the reservoir in the vicinity of a well. Well testing can also identify the major features of relatively simple reservoirs, such as faults, fractures, double porosity, channels, and pinchouts, in the near-well area. The difficulties with this approach begin when the well-test data must be used on a larger scale, such as in the context of obtaining a reservoir description. One of the main reasons for these difficulties is that traditional well-test analysis handles transient pressure data collected at a single well at a time and is restricted to a small time range. As a result, traditional well-test analysis does not make use of "pressure" events separated in historical time. The use of several single- and multiple-well tests to describe reservoir heterogeneity has been reported in the literature;6 however, this approach is not applied commonly because of the extensive effort needed to obtain a reservoir description. The method presented in this paper uses a numerical model of the reservoir to overcome these shortcomings. It will be shown that pressure transients can be used effectively to infer reservoir properties at the scale of a reservoir description. Well-test data, both complete tests and occasional spot pressure measurements, are used to this effect. The well-test information allows us to infer properties close to the wells and, when combined with the shut-in pressures (spot pressures), boundary information, and permeability-porosity correlations, provides the larger-scale description.

General Description of the Method
The proposed method is similar to other parameter estimation methods and thus consists of the following major items: the mathematical model, the objective function, and the minimization algorithm.

Mathematical Model
Because of the complexity of the reservoir description, the reservoir response must be computed numerically. Therefore, the pressure response is found using a numerical simulator. The reservoir is discretized into blocks. The objective is to find a suitable permeability-porosity distribution so that values of these parameters can be assigned to each of the blocks.
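To illustrate the parameter-estimation loop outlined above, the following is a minimal sketch in Python. The toy forward model, grid size, observation pressures, and prior weight are hypothetical placeholders; the paper's actual forward model is a full numerical reservoir simulator whose gradients come from the gradient-simulator approach, and the geological constraints are far richer than the smoothness penalty used here.

```python
# Minimal sketch of a gradient-based inversion for permeability and porosity.
# All values and the forward model are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize

n_cells = 25                                          # small 5x5 grid for illustration
p_obs = np.array([3010.0, 2985.0, 2990.0])            # hypothetical shut-in pressures, psia

def forward_model(log_k, phi):
    """Placeholder for the numerical simulator: returns calculated pressures
    at the observation wells for a given permeability (log) and porosity field."""
    k = np.exp(log_k)
    return np.array([
        3000.0 + 5.0 * np.mean(k[:5])    - 100.0 * np.mean(phi[:5]),
        3000.0 + 5.0 * np.mean(k[10:15]) - 100.0 * np.mean(phi[10:15]),
        3000.0 + 5.0 * np.mean(k[20:])   - 100.0 * np.mean(phi[20:]),
    ])

def objective(x):
    """Least-squares pressure mismatch plus a smoothness penalty that stands
    in for the geological (prior) constraints."""
    log_k, phi = x[:n_cells], x[n_cells:]
    residual = forward_model(log_k, phi) - p_obs
    misfit = np.sum(residual ** 2)
    prior = 0.1 * (np.sum(np.diff(log_k) ** 2) + np.sum(np.diff(phi) ** 2))
    return misfit + prior

x0 = np.concatenate([np.full(n_cells, np.log(50.0)),   # 50 md initial guess
                     np.full(n_cells, 0.20)])          # 20% porosity initial guess
result = minimize(objective, x0, method="L-BFGS-B")
log_k_est, phi_est = result.x[:n_cells], result.x[n_cells:]
print("estimated mean permeability, md:", np.exp(log_k_est).mean())
```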


SPE Journal ◽  
1996 ◽  
Vol 1 (02) ◽  
pp. 145-154 ◽  
Author(s):  
Dean S. Oliver

2021 ◽  
Author(s):  
Mohamad Mustaqim Mokhlis ◽  
Nurdini Alya Hazali ◽  
Muhammad Firdaus Hassan ◽  
Mohd Hafiz Hashim ◽  
Afzan Nizam Jamaludin ◽  
...  

Abstract
In this paper we present a streamlined process for well-test validation that integrates data between different database systems, incorporates well models, and leverages real-time data to present a full scope of well-test analysis and enhance the capability for assessing well-test performance. The workflow demonstrates an intuitive and effective way to analyze and validate a production well test via an interactive digital visualization. This approach has elevated the quality and integrity of the well-test data and improved process-cycle efficiency, helping field surveillance engineers keep track of well-test compliance guidelines through efficient well-test tracking in the digital interface. The workflow involves five primary steps, all conducted via a digital platform (a sketch of the stability check in step 3 follows below):
1. Well-test compliance: planning and executing the well test.
2. Data management and integration.
3. Well-test analysis and validation: verification of the well test through historical trending, stability period checks, and well-model analysis.
4. Model validation: correcting the well test and calibrating the well model before finalizing the validity of the well test.
5. Well-test re-testing: submitting the rejected well test for re-testing and, as the final step, integrating with the corporate database system for production allocation.
This business process improves the quality of the well test, which in turn raises petroleum engineers' confidence in analyzing well performance and delivering accurate well-production forecasts. A well-test validation workflow in a digital ecosystem streamlines the flow of data and system integration, as well as the way engineers assess and validate well-test data, minimizing errors and increasing overall work efficiency.
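As an illustration of the stability period check mentioned in step 3, here is a minimal sketch. The column names (oil_rate, whp), window length, and tolerance are assumptions; a real implementation would take these from the real-time historian and the field's test-compliance guidelines.

```python
# Minimal sketch of a stability period check on a production well test.
# Column names, window length, and tolerance are hypothetical.
import pandas as pd

def is_stable(test_data: pd.DataFrame, window_hours: float = 2.0,
              max_rel_spread: float = 0.05) -> bool:
    """Return True if oil rate and wellhead pressure stay within a relative
    spread over the final stabilization window of the test.
    Expects a DataFrame indexed by timestamp."""
    cutoff = test_data.index.max() - pd.Timedelta(hours=window_hours)
    tail = test_data.loc[test_data.index >= cutoff]
    for col in ("oil_rate", "whp"):
        spread = (tail[col].max() - tail[col].min()) / tail[col].mean()
        if spread > max_rel_spread:
            return False
    return True

# Hypothetical usage: flag a test for re-testing if it fails the check.
# test_df = load_test_from_historian(well="A-12", test_id=1042)
# status = "validated" if is_stable(test_df) else "re-test"
```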


2021 ◽  
Author(s):  
Nagaraju Reddicharla ◽  
Subba Ramarao Rachapudi ◽  
Indra Utama ◽  
Furqan Ahmed Khan ◽  
Prabhker Reddy Vanam ◽  
...  

Abstract
Well testing is one of the vital processes in reservoir performance monitoring. As a field matures and the well stock increases, testing becomes a tedious job in terms of resources (MPFMs and test separators), and this affects delivery of the production quota. In addition, test data validation and approval follow a business process that needs up to 10 days to accept or reject a well test. The volume of well tests conducted was almost 10,000, and around 10 to 15% of them were rejected per year. The objective of this paper is to develop a methodology to reduce well-test rejections and to raise a timely flag for operator intervention to recommence the well test. The case study was applied in a mature field that has been producing for 40 years and has a good volume of historical well-test data. This paper discusses the development of a data-driven well-test data analyzer and optimizer, supported by artificial intelligence (AI), for wells tested with MPFMs, using a two-stage approach. The motivating idea is to ingest historical and real-time data together with the well-model performance curve and to flag the quality of the well-test data to the operator in real time. The ML predictions help testing operations and can reduce the test-acceptance turnaround drastically, from 10 days to hours. In the second stage, an unsupervised model built on historical data helps identify the parameters that drive well-test rejection, for example test duration, choke size, and GOR. The outcome of the modeling will be incorporated into updates of the well-test procedure and testing philosophy. The approach is under evaluation in one of the ADNOC Onshore assets. The results are expected to reduce well-test rejections by at least 5%, which further optimizes the resources required and improves the back-allocation process. Furthermore, real-time flagging of test quality will help reduce the validation cycle from 10 days to hours and improve the well-testing cycle. This methodology improves integrated reservoir-management compliance with well-testing requirements in assets where resources are limited, and it is envisioned to be integrated with a full-field digital oilfield implementation. This is a novel application of machine learning and artificial intelligence to well testing. It maximizes the utilization of real-time data to create an advisory system that improves test-data quality monitoring and supports timely decision-making to reduce well-test rejections.
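A minimal sketch of the two-stage idea described above, using scikit-learn: a supervised model that flags test quality in near real time, followed by unsupervised grouping of rejected tests to see which parameters drive rejection. The feature list, training table, file name, and acceptance threshold are assumptions for illustration, not the actual ADNOC Onshore implementation.

```python
# Stage 1: supervised quality flag; Stage 2: unsupervised grouping of rejections.
# Feature names and the training file are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

FEATURES = ["test_duration_hr", "choke_size", "gor", "whp", "water_cut"]

# Stage 1: train on historical accepted/rejected tests, then advise the operator.
history = pd.read_csv("historical_well_tests.csv")        # hypothetical file
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(history[FEATURES], history["accepted"])            # 1 = accepted, 0 = rejected

def flag_test(live_test: pd.DataFrame) -> str:
    """Advisory flag for the operator while the test is still running."""
    prob_accept = clf.predict_proba(live_test[FEATURES])[:, 1].mean()
    return "likely valid" if prob_accept > 0.5 else "intervene / re-test"

# Stage 2: cluster rejected tests to surface the parameters behind rejection
# (duration, choke size, GOR, ...).
rejected = history[history["accepted"] == 0]
scaled = StandardScaler().fit_transform(rejected[FEATURES])
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(rejected.assign(cluster=clusters).groupby("cluster")[FEATURES].mean())
```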


2008 ◽  
Author(s):  
Danila Gulyaev ◽  
Andrey Ivanovich Ipatov ◽  
Nataliya Chernoglazova ◽  
Maxim Fedoseev

2021 ◽  
Vol 134 (3) ◽  
pp. 35-38
Author(s):  
A. M. Svalov

Horner’s traditional method of processing well-test data can be improved by a special transformation of the pressure curves, which reduces the time required for the transformed curves to reach the asymptotic regimes needed for processing these data. In this case, to account for the skin factor and the wellbore storage effect, a more complete asymptotic expansion of the exact solution of the pressure diffusivity equation at large values of time must be used. At the same time, this approach does not completely eliminate the influence of the wellbore, because the asymptotic expansion of the solution for small values of time is limited by the existence of a singular point in whose vicinity the expansion ceases to be valid. To solve this problem, a new method of processing well-test data is proposed that completely eliminates the influence of the wellbore. The method is based on the introduction of a modified inflow function to the well, which includes a component of the boundary condition corresponding to the influence of the wellbore.
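For context, the conventional Horner analysis that the article sets out to improve can be sketched as follows. The buildup data, producing time, rate, and fluid/rock properties below are hypothetical, and oilfield units (psi, hours, STB/D, cp, ft, md) are assumed.

```python
# Minimal sketch of a conventional Horner buildup analysis (hypothetical data).
import numpy as np

t_p = 72.0                                                  # producing time before shut-in, hr
dt = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])            # shut-in times, hr
p_ws = np.array([2664.0, 2693.0, 2731.0, 2759.0, 2784.0, 2805.0])  # buildup pressures, psi

# Late-time points follow a straight line on a semilog plot:
# p_ws = p* - m * log10((t_p + dt) / dt)
horner_time = (t_p + dt) / dt
slope, intercept = np.polyfit(np.log10(horner_time), p_ws, 1)
m = -slope                                                  # Horner slope, psi per log cycle
p_star = intercept                                          # extrapolated pressure at Horner time = 1

# Permeability from the semilog slope (radial flow, oilfield units):
q, B, mu, h = 500.0, 1.2, 0.8, 30.0                         # hypothetical rate and fluid/rock data
k = 162.6 * q * B * mu / (m * h)
print(f"slope = {m:.1f} psi/cycle, p* = {p_star:.0f} psi, k = {k:.1f} md")
```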

