Dynamic data integration preserving the properties of the fine-scale geostatistical model and its link with the fluid flow simulation model

Author(s):  
T. Schaaf

SPE Journal ◽  
2008 ◽  
Vol 13 (01) ◽  
pp. 99-111 ◽  
Author(s):  
Vegard R. Stenerud ◽  
Vegard Kippe ◽  
Knut-Andreas Lie ◽  
Akhil Datta-Gupta

Summary

A particularly efficient reservoir simulator can be obtained by combining a recent multiscale mixed finite-element flow solver with a streamline method for computing fluid transport. This multiscale-streamline method has been shown to be a promising approach for fast flow simulations on high-resolution geologic models with multimillion grid cells. The multiscale method solves the pressure equation on a coarse grid while preserving important fine-scale details in the velocity field. Fine-scale heterogeneity is accounted for through a set of generalized, heterogeneous basis functions that are computed numerically by solving local flow problems. When included in the coarse-grid equations, the basis functions ensure that the global equations are consistent with the local properties of the underlying differential operators. The multiscale method offers a substantial gain in computation speed, without significant loss of accuracy, when basis functions are updated infrequently throughout a dynamic simulation. In this paper, we propose to combine the multiscale-streamline method with a recent "generalized travel-time inversion" method to derive a fast and robust method for history matching high-resolution geocellular models. A key point in the new method is the use of sensitivities that are calculated analytically along streamlines with little computational overhead. The sensitivities are used in the travel-time inversion formulation to give a robust quasilinear method that typically converges in a few iterations and generally avoids much of the time-consuming trial-and-error seen in manual history matching. Moreover, the sensitivities are used to ensure that basis functions are adaptively updated only in areas with relatively large sensitivity to the production response.
The sensitivity-based adaptive approach allows us to selectively update only a fraction of the total number of basis functions, which gives substantial savings in computation time for the forward flow simulations. We demonstrate the power and utility of our approach using a simple 2D model and a highly detailed 3D geomodel. The 3D simulation model consists of more than 1,000,000 cells with 69 producing wells. Using our proposed approach, history matching over a period of 7 years is accomplished in less than 20 minutes on an ordinary workstation PC.

Introduction

It is well known that geomodels derived from static data only—such as geological, seismic, well-log, and core data—often fail to reproduce the production history. Reconciling geomodels to the dynamic response of the reservoir is critical for building reliable reservoir models. In the past few years, there have been significant developments in the area of dynamic data integration through the use of inverse modeling. Streamline methods have shown great promise in this regard (Vasco et al. 1999; Wang and Kovscek 2000; Milliken et al. 2001; He et al. 2002; Al-Harbi et al. 2005; Cheng et al. 2006). Streamline-based methods have the advantages that they are highly efficient "forward" simulators and allow production-response sensitivities to be computed analytically using a single flow simulation (Vasco et al. 1999; He et al. 2002; Al-Harbi et al. 2005; Cheng et al. 2006). Sensitivities describe the change in production responses caused by small perturbations in reservoir properties such as porosity and permeability and are a vital part of many methods for integrating dynamic data. Even though streamline simulators provide fast forward simulation compared with a full finite-difference simulation in 3D, the forward simulation is still the most time-consuming part of the history-matching process.
A streamline simulation consists of two steps that are repeated: (1) solution of a 3D pressure equation to compute flow velocities; and (2) solution of 1D transport equations for evolving fluid compositions along representative sets of streamlines, followed by a mapping back to the underlying pressure grid. The first step is referred to as the "pressure step" and is often the most time-consuming. Consequently, history matching and flow simulation are usually performed on upscaled simulation models, which imposes the need for a subsequent downscaling if the dynamic data are to be integrated in the geomodel. Upscaling and downscaling may result in loss of important fine-scale information.
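The streamline quantities underlying travel-time inversion can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it computes the time of flight along one discretized streamline, and a per-segment travel-time sensitivity under the simplifying assumption that local velocity magnitude scales linearly with cell permeability (function names and the discretization are assumptions).

```python
import numpy as np

def time_of_flight(porosity, velocity_magnitude, ds):
    """Time of flight along a discretized streamline:
    tau = sum_i (phi_i / |v_i|) * ds_i, with per-segment arrays."""
    slowness = porosity / velocity_magnitude  # phi / |v| per segment
    return float(np.sum(slowness * ds))

def travel_time_sensitivity(porosity, velocity_magnitude, ds):
    """Per-segment sensitivity of tau to a relative permeability change,
    d tau / d ln k_i = -(phi_i / |v_i|) * ds_i, assuming |v| scales
    linearly with k along the streamline (a stated simplification)."""
    return -(porosity / velocity_magnitude) * ds
```

Because the sensitivity is just the (negated) time-of-flight integrand, it comes essentially for free from a single forward simulation, which is the property the inversion method exploits.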


2020 ◽  
Author(s):  
Mickaele Le Ravalec ◽  
Véronique Gervais ◽  
Frédéric Roggero

<p>Production forecasting is central to the oil and gas industry: it contributes to generating improvements in operations.</p><p>A key tool for tackling this problem is the building of reservoir models that describe the properties of underground hydrocarbon reservoirs. Clearly, the value of such models strongly depends on their ability to accurately predict the displacement of fluids within reservoirs. This is why it is essential that reservoir models reproduce at least the data already collected: data-consistent models are more reliable.</p><p>The data considered are split into two groups: static and dynamic data. Static data do not vary with time. They include, for instance, measurements on core samples extracted from wells, or logs used to describe electrofacies and petrophysical variations along wells. However, such direct measurements of geological and petrophysical properties are very sparse and sample only a small reservoir volume. They have to be supplemented by indirect measurements, mainly 3D seismic. The second group comprises dynamic data, i.e., data that vary with time because they depend on fluid flow. They mainly consist of production data measured at wells, such as bottomhole pressures, oil production rates, gas-oil ratios, tracer concentrations, etc. Even so, we end up with little information about the spatial distributions of facies, porosity, or permeability within the targeted hydrocarbon reservoirs. These facies/petrophysical properties can be considered as realizations of random functions. They are very specific because of two essential features: they involve a huge number of unknown values, and they have a spatial structure.</p><p>The purpose of reservoir modeling is to identify facies and petrophysical realizations that make it possible to numerically reproduce the dynamic data while still respecting the static ones. 
Different approaches can be envisioned.</p><p>A first possibility consists in randomly generating realizations, then simulating fluid flow for each of them to see whether or not they reproduce the required data. The process is repeated until a suitable set of facies/petrophysical realizations is identified. A second approach is quite similar: the idea is still to screen the realization space, but without performing any fluid flow simulation to check the suitability of the realizations. This strongly depends on defining a meaningful criterion that characterizes the dynamic behavior of the considered set of realizations without running flow simulations. We may also randomly generate a starting set of facies/petrophysical realizations and run an optimization process aiming to minimize an objective function by adjusting the realizations. A key issue is then how to simultaneously adjust so many parameters while preserving consistency with respect to the static data. This has motivated much research over the last 20 years, resulting in the development of several parameterization techniques. One of the very first was the pilot point method introduced by de Marsily (1984). Since then, variants and other parameterization techniques have been proposed. We aim to review some of them and focus on how useful they are depending on the problem at hand.</p>
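The objective function minimized in the optimization approach described above is typically a weighted least-squares misfit between simulated and observed production data. A minimal sketch of that form (the exact weighting used by any given method is an assumption here):

```python
import numpy as np

def data_misfit(d_sim, d_obs, sigma):
    """Weighted least-squares misfit J = 0.5 * sum(((d_sim - d_obs)/sigma)**2),
    the generic objective minimized when adjusting reservoir realizations;
    sigma holds the per-datum measurement standard deviations."""
    r = (np.asarray(d_sim, float) - np.asarray(d_obs, float)) / np.asarray(sigma, float)
    return 0.5 * float(np.dot(r, r))
```

Parameterization techniques such as pilot points then restrict how the realization may be perturbed during the minimization, so that each candidate stays consistent with the static data.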


2012 ◽  
Vol 43 (1-2) ◽  
pp. 54-63 ◽  
Author(s):  
Baohong Lu ◽  
Huanghe Gu ◽  
Ziyin Xie ◽  
Jiufu Liu ◽  
Lejun Ma ◽  
...  

Stochastic simulation is widely applied for estimating the design flood of various hydrosystems. The design flood at a reservoir site should account for the impact of upstream reservoirs, along with any development of hydropower. This paper investigates and applies a stochastic simulation approach for determining the design flood of a complex cascade of reservoirs in the Longtan watershed, southern China. The design flood magnitude obtained when the impact of the upstream reservoirs is considered is smaller than that obtained without considering them. In particular, the stochastic simulation model takes into account both systematic and historical flood records. As the reliability of the frequency analysis increases with more representative samples, it is desirable to incorporate historical flood records, if available, into the stochastic simulation model. This study shows that the design values from the stochastic simulation method with historical flood records are higher than those without historical flood records. The paper demonstrates the advantages of adopting a stochastic flow simulation approach to address design-flood-related issues for a complex cascade reservoir system.
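The core idea of pooling systematic and historical records in a stochastic simulation can be illustrated with a toy Monte Carlo estimator of the T-year design flood. This is a sketch only: the log-normal distribution choice and the function names are assumptions, not the model used in the paper.

```python
import numpy as np

def design_flood(systematic, historical, T=100, n_sim=100_000, seed=0):
    """Illustrative stochastic-simulation estimate of the T-year design
    flood: fit a log-normal to the pooled systematic + historical annual
    peaks and read the 1 - 1/T quantile off simulated samples."""
    log_peaks = np.log(np.concatenate([systematic, historical]))
    mu, sigma = log_peaks.mean(), log_peaks.std(ddof=1)
    rng = np.random.default_rng(seed)
    sim = rng.lognormal(mu, sigma, n_sim)  # synthetic annual maxima
    return float(np.quantile(sim, 1.0 - 1.0 / T))
```

Including large historical floods in the pooled sample raises the fitted upper tail, which is consistent with the paper's finding that design values with historical records are higher than without them.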


Author(s):  
Seyed Kourosh Mahjour ◽  
Antonio Alberto Souza Santos ◽  
Manuel Gomes Correia ◽  
Denis José Schiozer

Abstract

The simulation process under uncertainty requires numerous reservoir models, which can be very time-consuming to evaluate. Hence, selecting representative models (RMs) that span the uncertainty space of the full ensemble is required. In this work, we compare two scenario reduction techniques: (1) distance-based clustering with the simple matching coefficient (DCSMC), applied before the simulation process using reservoir static data; and (2) a metaheuristic algorithm (the RMFinder technique), applied after the simulation process using reservoir dynamic data. We use these two methods as samples to investigate the effect of static and dynamic data usage on the accuracy and speed of the scenario reduction process, focusing on field development purposes. A synthetic benchmark case named UNISIM-II-D, which considers flow-unit modelling, is used. The results showed that both scenario reduction methods are reliable for selecting the RMs from a specific production strategy. However, the RMs obtained from a given strategy using the DCSMC method can be applied to other strategies while preserving the representativeness of the models, whereas the strategy type plays a substantial role in selecting the RMs with the metaheuristic method, so that each strategy has its own set of RMs. Because of the field development workflow in which the metaheuristic algorithm is used, the number of required flow simulation models and the computational time are greater than in the workflow in which the DCSMC method is applied. Hence, it can be concluded that using static reservoir data in the scenario reduction process can be more reliable during the field development phase.
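The simple matching coefficient at the heart of the DCSMC step is just the fraction of positions where two categorical vectors agree; a distance matrix of 1 − SMC then feeds the clustering. A minimal sketch (the per-cell facies-code encoding is an assumption for illustration):

```python
import numpy as np

def simple_matching_coefficient(a, b):
    """Fraction of positions where two categorical vectors agree,
    e.g. per-cell facies codes of two static reservoir models."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.mean(a == b))

def smc_distance_matrix(models):
    """Pairwise distance 1 - SMC between models: the input to
    distance-based clustering used to pick representative models."""
    n = len(models)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = 1.0 - simple_matching_coefficient(models[i], models[j])
    return D
```

Because this distance uses only static data, it can be computed before any flow simulation is run, which is exactly the computational advantage the abstract highlights.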


IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 31236-31245
Author(s):  
Luis Burbano ◽  
Luis Francisco Combita ◽  
Nicanor Quijano ◽  
Sandra Rueda

Geofluids ◽  
2019 ◽  
Vol 2019 ◽  
pp. 1-19 ◽  
Author(s):  
Miller Zambrano ◽  
Alan D. Pitts ◽  
Ali Salama ◽  
Tiziano Volatili ◽  
Maurizio Giorgioni ◽  
...  

Fluid flow through a single fracture is traditionally described by the cubic law, which is derived from the Navier-Stokes equation for the flow of an incompressible fluid between two smooth parallel plates. Thus, the permeability of a single fracture depends only on the so-called hydraulic aperture, which differs from the mechanical aperture (the separation between the two fracture wall surfaces). This difference is mainly related to the roughness of the fracture walls, which previous works have accounted for either by including a friction factor in the permeability equation or by directly deriving the hydraulic aperture. However, these methodologies may lack adequate precision to provide valid results. This work presents a complete protocol for fracture surface mapping, roughness evaluation, fracture modeling, fluid flow simulation, and permeability estimation of an individual fracture (open or sheared joint/pressure-solution seam). The methodology includes laboratory-based high-resolution structure from motion (SfM) photogrammetry of fracture surfaces, power spectral density (PSD) surface evaluation, synthetic fracture modeling, and fluid flow simulation using the Lattice-Boltzmann method. This work evaluates the respective controls on permeability exerted by the fracture displacement (perpendicular and parallel to the fracture walls), surface roughness, and surface pair mismatch. The results may contribute to defining a more accurate equation of hydraulic aperture and permeability of single fractures, which represents a pillar for the modeling and upscaling of the hydraulic properties of a geofluid reservoir.
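For reference, the parallel-plate relations the abstract builds on can be written out directly: the cubic law gives a volumetric flow rate Q = (w b³ / 12μ) (ΔP/L), and the implied slot permeability is k = b²/12, with b the hydraulic aperture. A minimal sketch in SI units (function names are illustrative):

```python
def cubic_law_flow_per_gradient(aperture, width, viscosity):
    """Cubic law for flow between smooth parallel plates:
    returns Q per unit pressure gradient, w * b**3 / (12 * mu),
    with aperture b [m], width w [m], viscosity mu [Pa s]."""
    return width * aperture**3 / (12.0 * viscosity)

def fracture_permeability(hydraulic_aperture):
    """Intrinsic permeability implied by the cubic law, k = b**2 / 12 [m^2]."""
    return hydraulic_aperture**2 / 12.0
```

The gap between this idealized k and measured values is exactly what the roughness, mismatch, and displacement effects studied in the paper are meant to capture.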


1973 ◽  
Author(s):  
Richard C. Grinold ◽  
Robert M. Oliver
