Integrated Assisted History Matching and Forecast Optimisation Under Uncertainty for More Robust Mature Field Redevelopment Project

2021 ◽  
Author(s):  
Simon Berry ◽  
Zahid Khan ◽  
Diego Corbo ◽  
Tom Marsh ◽  
Alexandra Kidd ◽  
...  

Abstract Redevelopment of a mature field enables reassessment of the current field understanding to maximise its economic return. However, the redevelopment process is associated with several challenges: 1) analysis of large data sets is a time-consuming process, 2) extrapolation of the existing data to new areas is associated with significant uncertainties, and 3) screening multiple potential scenarios can be tedious. Traditional workflows have not addressed these challenges in an efficient manner. In this work, an integrated approach that combines static and dynamic uncertainties is adopted to streamline the evaluation of multiple possible scenarios, while quantifying the associated uncertainties to improve reservoir history matching and forecasting. A fully integrated, automated workflow that includes the geological and fluid models is used to perform Assisted History Matching (AHM), allowing different parameter combinations to be screened while calibrating to the historical data. An ensemble of history-matched models is then selected using dimensionality reduction and clustering techniques. The selected ensemble is used for reservoir predictions and represents a spread of possible solutions accounting for uncertainty. Finally, well location optimisation under uncertainty is performed to find the optimal well location across multiple equiprobable scenarios simultaneously. The suggested workflow was applied to the Northern Area Claymore (NAC) field. NAC is a structurally complex, Lower Cretaceous stacked turbidite composed of three reservoirs, which have produced ~170 MMbbls of oil since 1978 from an estimated STOIIP of ~500 MMstb. The integrated workflow helps to streamline the redevelopment project by allowing geoscientists and engineers to work together, account for multiple scenarios and quantify the associated uncertainties. Working with static and dynamic variables simultaneously gives better insight into how different properties and property combinations can contribute to a history match. Powerful hardware, cloud computing and fully parallel software make it possible to evaluate a range of possible solutions and work with an ensemble of equally probable matched models. As the ultimate outcome of the redevelopment project, several prediction profiles have been produced in a time-efficient manner, aiming to improve field recovery while accounting for the associated uncertainty. The project shows the value of the integrated approach applied to a real case to overcome the shortcomings of the traditional approach. The collaboration of experts with different backgrounds on a common project permits the assessment of multiple hypotheses in an efficient manner and helps to build a deeper understanding of the reservoir. Finally, the project provides evidence that working with an ensemble of models makes it possible to evaluate a range of possible solutions and account for potential risks, providing more robust predictions for future field redevelopment.
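
The abstract does not specify which dimensionality-reduction and clustering algorithms were used to select the ensemble; the sketch below illustrates one common combination (PCA followed by k-means) for picking a representative subset of history-matched models. All names (misfit_vectors, n_clusters, etc.) and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: one common way to select a representative ensemble
# of history-matched models via dimensionality reduction and clustering
# (PCA + k-means). The study's actual algorithms are not given in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def select_representative_models(misfit_vectors, n_clusters=10, n_components=5, seed=0):
    """misfit_vectors: (n_models, n_features) array, e.g. per-well data
    mismatches or simulated responses for each history-matched model."""
    # Project the high-dimensional model responses onto a few principal components.
    reduced = PCA(n_components=n_components).fit_transform(misfit_vectors)

    # Group similar models and keep the member closest to each cluster centre.
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit(reduced)
    selected = []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(reduced[members] - km.cluster_centers_[c], axis=1)
        selected.append(members[np.argmin(dists)])
    return np.array(selected)  # indices of models carried into forecasting

# Example with synthetic data: 200 candidate models, 50 response features.
rng = np.random.default_rng(0)
ensemble_ids = select_representative_models(rng.normal(size=(200, 50)))
```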

2002 ◽  
Vol 44 (6) ◽  
Author(s):  
Frank Reck ◽  
Günther Greiner

Particle tracing is a widely used method to analyze and interpret results of a flow simulation. In addition, it is a preliminary step for more advanced techniques of flow visualization, e.g. line integral convolution. For interactive exploration of large data sets, a very efficient and reliable particle tracing method is needed. For data on unstructured grids, at the data sizes that appear in the simulation of wind channel experiments (e.g. automotive industry) and flight simulation (e.g. aircraft industry), the traditional approach, based on numerical integration methods for ordinary differential equations, does not allow sufficiently accurate path calculation at the speed required for interactive use. Traditional integration techniques require small step sizes in order to achieve sufficient accuracy. This results in many cell search operations, which, especially for unstructured grids, are the bottleneck of the whole procedure. In [5], Nielson and Jung proposed a new method, referred to in the following as the locally exact method, which gives sufficient accuracy and traverses each cell in a single step. In this note we extend the approach of Nielson and Jung in such a way that it can be performed in real time. This is achieved by a sophisticated preprocessing step that allows fast execution of the locally exact integration method, so that interactive particle tracing in large data sets becomes feasible. We describe the procedure, compare it with Nielson's original approach as well as with the traditional method based on numerical integration, and report on the performance of the different methods.
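
For reference, here is a minimal sketch of the traditional approach the note compares against: fixed-step Runge-Kutta integration of dx/dt = v(x), where every step requires locating the cell that contains the current position. The find_cell and interpolate_velocity routines are hypothetical placeholders for a grid's point-location and interpolation code; the locally exact method of Nielson and Jung is not reproduced here.

```python
# Minimal sketch of the *traditional* particle-tracing approach: fixed-step RK4
# integration of dx/dt = v(x). Each step needs a cell search, which is the
# bottleneck on unstructured grids. `find_cell` and `interpolate_velocity` are
# hypothetical placeholders, and using one cell for all four RK stages is a
# simplification for illustration.
import numpy as np

def trace_particle(seed_point, find_cell, interpolate_velocity,
                   step_size=1e-2, max_steps=10_000):
    path = [np.asarray(seed_point, dtype=float)]
    x = path[0]
    for _ in range(max_steps):
        cell = find_cell(x)              # expensive on unstructured grids
        if cell is None:                 # particle left the domain
            break
        v = lambda p: interpolate_velocity(cell, p)
        # Classical fourth-order Runge-Kutta step.
        k1 = v(x)
        k2 = v(x + 0.5 * step_size * k1)
        k3 = v(x + 0.5 * step_size * k2)
        k4 = v(x + step_size * k3)
        x = x + (step_size / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path.append(x)
    return np.array(path)
```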


SPE Journal ◽  
2019 ◽  
Vol 24 (04) ◽  
pp. 1452-1467 ◽  
Author(s):  
Rolf J. Lorentzen ◽  
Xiaodong Luo ◽  
Tuhin Bhakta ◽  
Randi Valestrand

Summary In this paper, we use a combination of acoustic impedance and production data for history matching the full Norne Field. The purpose of the paper is to illustrate a robust and flexible workflow for assisted history matching of large data sets. We apply an iterative ensemble-based smoother, and the traditional approach for assisted history matching is extended to include updates of additional parameters representing rock clay content, which has a significant effect on seismic data. Further, for seismic data it is a challenge to properly specify the measurement noise, because the noise level and the spatial correlation of the measurement noise are unknown. For this purpose, we apply a method based on image denoising for estimating the spatially correlated (colored) noise level in the data. For the best possible evaluation of the workflow performance, all data are synthetically generated in this study. We assimilate production data and seismic data sequentially. First, the production data are assimilated using traditional distance-based localization, and the resulting ensemble of reservoir models is then used when assimilating seismic data. This procedure is suitable for real field applications, because production data are usually available before seismic data. If both production data and seismic data are assimilated simultaneously, the high number of seismic data might dominate the overall history-matching performance. The noise estimation for seismic data involves transforming the observations to a discrete wavelet domain. However, the resulting data do not have a clear spatial position, and the traditional distance-based localization schemes, used to avoid spurious correlations and underestimated uncertainty caused by limited ensemble size, cannot be applied. Instead, we use a localization scheme based on correlations between observations and parameters, which does not rely on a physical position for model variables or data. This method automatically adapts to each observation and iteration. The results show that we reduce data mismatch for both production and seismic data, and that the use of seismic data reduces estimation errors for porosity, permeability, and net-to-gross ratio (NTG). Such improvements can provide useful information for reservoir management and planning of additional drainage strategies.
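
The paper's iterative ensemble smoother and its adaptive correlation-based localization are not reproduced here; the sketch below shows only a plain, single-step ensemble-smoother update with a crude correlation-threshold taper, to make the structure of such an update concrete. Array shapes, variable names, and the 0.3 threshold are assumptions for illustration.

```python
# Rough sketch of a single ensemble-smoother update with a simple
# correlation-based localization taper (zeroing weak parameter-data
# correlations instead of using distance-based localization).
import numpy as np

def ensemble_smoother_update(m, d_sim, d_obs, obs_err_std, corr_threshold=0.3, seed=0):
    """m:           (n_params, n_ens) prior parameter ensemble
       d_sim:       (n_obs,   n_ens)  simulated data for each member
       d_obs:       (n_obs,)          observed data
       obs_err_std: (n_obs,)          observation error standard deviations"""
    rng = np.random.default_rng(seed)
    n_obs, n_ens = d_sim.shape

    dm = m - m.mean(axis=1, keepdims=True)
    dd = d_sim - d_sim.mean(axis=1, keepdims=True)

    c_md = dm @ dd.T / (n_ens - 1)                                # param-data covariance
    c_dd = dd @ dd.T / (n_ens - 1) + np.diag(obs_err_std ** 2)    # data covariance + noise

    # Correlation-based taper: suppress weak parameter-data correlations.
    corr = c_md / (dm.std(axis=1, ddof=1)[:, None] * dd.std(axis=1, ddof=1)[None, :] + 1e-12)
    c_md = np.where(np.abs(corr) >= corr_threshold, c_md, 0.0)

    kalman_gain = c_md @ np.linalg.inv(c_dd)
    perturbed_obs = d_obs[:, None] + rng.normal(size=(n_obs, n_ens)) * obs_err_std[:, None]
    return m + kalman_gain @ (perturbed_obs - d_sim)              # updated ensemble
```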


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets that are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10at%Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80x80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
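
A small sketch of the two per-pixel operations described above, assuming the 1 eV offset corresponds to 20 channels at the stated dispersion of 20 channels/eV and that the second spectrum is the one shifted to higher energy; the exact alignment, gain, and artifact corrections of [1] are not reproduced.

```python
# Sketch of the two per-pixel operations described above, applied to a pair of
# 1024-channel spectra acquired with a 1 eV energy offset (20 channels at
# 20 channels/eV). Which spectrum is offset, and all calibration details,
# are assumptions for illustration.
import numpy as np

CHANNELS_PER_EV = 20
OFFSET_CHANNELS = 1 * CHANNELS_PER_EV  # 1 eV offset

def difference_spectrum(spec_a, spec_b):
    """Artifact-corrected first-difference spectrum: channel-gain artifacts
    common to both acquisitions cancel in the subtraction."""
    return spec_a - spec_b

def normal_spectrum(spec_a, spec_b):
    """Numerically remove the known energy offset by shifting one spectrum,
    then sum; only the overlapping channel range is returned."""
    n = spec_a.size
    shifted_b = spec_b[OFFSET_CHANNELS:]              # align to spec_a's energy axis
    return spec_a[: n - OFFSET_CHANNELS] + shifted_b

# Example on synthetic count data:
rng = np.random.default_rng(1)
a = rng.poisson(100, 1024).astype(float)
b = rng.poisson(100, 1024).astype(float)
diff, summed = difference_spectrum(a, b), normal_spectrum(a, b)
```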


Author(s):  
Thomas W. Shattuck ◽  
James R. Anderson ◽  
Neil W. Tindale ◽  
Peter R. Buseck

Individual particle analysis involves the study of tens of thousands of particles using automated scanning electron microscopy and elemental analysis by energy-dispersive x-ray emission spectroscopy (EDS). EDS produces large data sets that must be analyzed using multivariate statistical techniques. A complete study uses cluster analysis, discriminant analysis, and factor or principal components analysis (PCA). The three techniques are used in the study of particles sampled during the FeLine cruise to the mid-Pacific Ocean in the summer of 1990. The mid-Pacific aerosol provides information on long-range particle transport, iron deposition, sea salt ageing, and halogen chemistry.

Aerosol particle data sets suffer from a number of difficulties for pattern recognition using cluster analysis. There is a great disparity in the number of observations per cluster and in the range of the variables in each cluster. The variables are not normally distributed, they are subject to considerable experimental error, and many values are zero because of finite detection limits. Many of the clusters show considerable overlap because of natural variability, agglomeration, and chemical reactivity.
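
The abstract does not give the study's preprocessing recipe; the sketch below shows one common way of coping with the problems it lists (non-normal variables, zeros at the detection limit, very different variable ranges) before running cluster analysis: a log transform with a small offset, standardization, then hierarchical clustering. The offset value, linkage method, and cluster count are assumptions.

```python
# One common way of coping with the difficulties listed above before cluster
# analysis: log-transform with a small offset (to handle zeros from detection
# limits and skewed distributions), standardize (to equalize variable ranges),
# then cluster. The study's actual preprocessing is not described in the abstract.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_particles(composition, n_clusters=8, detection_offset=1e-3):
    """composition: (n_particles, n_elements) array of EDS peak intensities
    or elemental fractions, possibly containing zeros."""
    x = np.log(composition + detection_offset)            # tame skewed distributions
    x = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)    # equalize variable ranges
    z = linkage(x, method="ward")                         # hierarchical clustering
    return fcluster(z, t=n_clusters, criterion="maxclust")

# Example with synthetic particle data: 500 particles, 10 elements.
rng = np.random.default_rng(2)
labels = cluster_particles(rng.lognormal(0.0, 1.0, size=(500, 10)))
```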


2014 ◽  
Author(s):  
G. A. Carvajal ◽  
M. Maucec ◽  
A. Singh ◽  
A. Mahajan ◽  
J. Dhar ◽  
...  
