Stimulated Reservoir Volume Characterization and Optimum Lateral Well Spacing Study of Two-Well Pad: Midland Basin Case Study

Geofluids ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-18
Author(s):  
Jaeyoung Park ◽  
Candra Janova

This paper presents a flow-simulation-based reservoir modeling study of a two-well pad with a long production history and identical completion parameters in the Midland Basin. The study includes building a geologic model, history matching, predicting well performance, and finding the optimum lateral well spacing in terms of oil volume and economic metrics. The reservoir model was constructed from a geologic model that integrates well logs and core data near the target area. Next, a sensitivity analysis was performed on the reservoir simulation model to better understand which parameters most influence the simulation results. History matching was then conducted to a satisfactory quality, with less than 10% global error, and after model calibration the ranges of the history-matching parameters were substantially reduced. The population-based history-matching algorithm provides an ensemble of history-matched models, and the top 50 models were selected to predict the range of Estimated Ultimate Recovery (EUR), showing that the P50 oil EUR falls within the acceptable range of the deterministic EUR estimates. With the best history-matched model, we investigated the lateral well spacing sensitivity of the pad in terms of maximum recovery volume and economic benefit. The results show that, given the current completion design, well spacing tighter than the current practice in the area is less effective for oil volume recovery. However, the economic metrics suggest that additional monetary value can be realized at 150% of the current development assumption. The presented workflow provides a systematic approach to finding the optimum lateral well spacing, in terms of both volume and economic metrics per section, under given economic assumptions, and it can be readily repeated to evaluate spacing optimization in other acreage.
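A minimal sketch of the ensemble step described above: ranking a population of history-matched models by global error, keeping the 50 best, and reading probabilistic oil EUR off the retained set. The data, cutoff, and units here are hypothetical placeholders, not the study's values.

```python
import numpy as np

# Hypothetical ensemble: each model has a global history-match error (%)
# and an oil EUR forecast (MMbbl) from the simulator.
rng = np.random.default_rng(0)
errors = rng.uniform(2.0, 25.0, size=500)         # global match error per model
eur = rng.normal(1.2, 0.15, size=500)             # oil EUR per model, MMbbl

# Keep only models meeting the quality cutoff, then the 50 best matches.
quality = errors < 10.0                           # "< 10% global error" criterion
idx = np.argsort(errors[quality])[:50]
top_eur = eur[quality][idx]

# Percentiles of the retained ensemble; note the oilfield convention that
# P50 is the median while P10/P90 are the high/low cases respectively.
p10, p50, p90 = np.percentile(top_eur, [90, 50, 10])
print(f"Oil EUR P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f} MMbbl")
```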

2021 ◽  
Author(s):  
Aymen Alhemdi ◽  
Ming Gu

Abstract Slickwater-sand fracturing designs are widely employed in the Marcellus Shale. Slickwater-sand creates long, skinny fractures and maximizes the stimulated reservoir volume (SRV). However, because sand settles quickly in water, many of the upper and deeper areas of the fracture are not sufficiently propped, while reducing the sand size can lead to insufficient fracture conductivity. This study proposes using three candidate ultra-lightweight proppants (ULWPs) to enhance fractured-well performance in unconventional reservoirs. In step 1, the current sand pumping design is input into an in-house P3D fracture propagation simulator to estimate the fracture geometry and proppant concentrations. In step 2, the proppant concentration distribution is converted to conductivity and then to fracture permeability. In step 3, the fracture permeability from the second step is input into a reservoir simulator to predict cumulative production for history matching and calibration. In step 4, the three ULWPs replace the sand in the fracture simulator to obtain new fracture geometry and conductivity distributions, which are then imported into the reservoir model for production evaluation. Before this study, the three ULWPs had already been tested in the lab to obtain their long-term conductivities under in-situ stress conditions. The conductivity distribution and production performance are analyzed and investigated. The induced fracture size and the location of the produced layer for the current target well have a fundamental effect on ultra-lightweight proppant productivity. The average conductivity of ULWPs with mesh 40/70 is larger and symmetric along the fracture except in a few places, whereas ULWPs with mesh 100 generate low average conductivity and create peak conductivity only in limited areas. ULW-3 tends to give less cumulative production than the other ULWPs. For this Marcellus Shale study, the advantages of ultra-lightweight proppant are restricted because the upward fracture height growth is enormous and the hydrocarbon layer lies at the bottom of the fracture, so a large proportion of the ULWPs occupies unproductive areas. The current study provides guidance for operators in the Marcellus Shale to determine (1) whether a ULWP can benefit a current shale well treated with sand, (2) what type of ULWP should be used, and (3) given a certain type of ULWP, what the optimum pumping schedule and staging/perforating design are to maximize well productivity. A similar workflow can be extended to evaluate the economic potential of different ULWPs in any other unconventional field.
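A minimal sketch of the step-2 conversion described above, under assumed units and a hypothetical lab-derived conductivity table: proppant areal concentration is interpolated to long-term conductivity, then converted to fracture permeability via k_f = C_f / w_f.

```python
import numpy as np

# Hypothetical lab table: long-term conductivity (md-ft) vs. areal proppant
# concentration (lb/ft^2) at in-situ stress, for one ULWP candidate.
conc_pts = np.array([0.1, 0.25, 0.5, 1.0, 2.0])       # lb/ft^2
cond_pts = np.array([5.0, 30.0, 120.0, 400.0, 900.0])  # md-ft

def concentration_to_permeability(conc, width_ft):
    """Interpolate conductivity from the lab table, then convert to
    fracture permeability via k_f = C_f / w_f."""
    cond = np.interp(conc, conc_pts, cond_pts)         # md-ft
    return cond / width_ft                             # md

# Example: a grid of concentrations from the fracture simulator, 0.01-ft width.
conc_grid = np.array([[0.2, 0.8], [1.5, 0.05]])
k_frac = concentration_to_permeability(conc_grid, width_ft=0.01)
print(k_frac)  # fracture permeability per cell, md
```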


2006 ◽  
Vol 9 (01) ◽  
pp. 15-23 ◽  
Author(s):  
Ajay K. Samantray ◽  
Qasem M. Dashti ◽  
Eddie Ma ◽  
Pradeep S. Kumar

Summary Nine multimillion-cell geostatistical earth models of the Marrat reservoir in Magwa field, Kuwait, were upscaled for streamline (SL) screening and finite-difference (FD) flow simulation. The scaleup strategy consisted of (1) maintaining square areal blocks over the oil column, (2) upscaling to the largest areal-block size (200 x 200 m) compatible with 125-acre well spacing, (3) upscaling to less than 1 million gridblocks for SL screening, and (4) upscaling to less than 250,000 gridblocks for FD flow simulation. Chevron's in-house scaleup software program, SCP, was used for scaleup. SCP employs a single-phase flow-based process for upscaling nonuniform 3D grids. Several iterations of scaleup were made to optimize the result. Sensitivity tests suggest that a uniform scaled-up grid overestimates breakthrough time compared to the fine model, and the post-breakthrough fractional flow also remains higher than in the fine model. However, preserving high-flow-rate layers in a nonuniform scaled-up model was key to matching the front-tracking behavior of the fine model. The scaled-up model was coarsened in areas of low average layer flow because less refinement is needed in these areas to still match the flow behavior of the fine model. The final ratio of pre- to post-scaleup grid sizes was 6:1 for SL and 21:1 for FD simulation. Several checks were made to verify the accuracy of scaleup. These include comparison of pre- and post-scaleup fractional-flow curves in terms of breakthrough time and post-breakthrough curve shape, cross-sectional permeabilities, global porosity histograms, porosity/permeability clouds, visual comparison of heterogeneity, and earth-model and scaled-up volumetrics. The scaled-up models were screened using the 3D SL technique. The results helped bracket the flow behavior of the different earth models and identify the model that best tracks the historical performance data. By initiating the full-field history-matching process with the geologic model that most closely matched the field performance in the screening stage, the amount of history matching was minimized, and the time and effort required were reduced. The application of unrealistic changes to the geologic model to match production history was also avoided. The study suggests that single realizations of "best-guess" geostatistical models are not guaranteed to offer the best history match and performance prediction. Multiple earth models must be built to capture the range of heterogeneity and assess its impact on reservoir flow behavior. Introduction The widespread use of geostatistics during the last decade has offered us both opportunities and challenges. It has been possible to capture vertical and areal heterogeneities measured by well logs and inferred from the depositional environments at a very fine scale, with 0.1- to 0.3-m vertical and 20- to 100-m areal resolution (Hobbet et al. 2000; Dashti et al. 2002; Aly et al. 1999; Haldorsen and Damsleth 1990; Haldorsen and Damsleth 1993). It has also been possible to generate a large number of realizations to assess the uncertainty in reservoir descriptions and performance predictions (Sharif and MacDonald 2001). These multiple realizations variously account for uncertainties in structure, stratigraphy, and petrophysical properties. Although impressive, the fine-scale geological models usually run into several millions of cells, and current computing technology prevents us from simulating such multimillion-cell models on practical time scales.
This requires translating the detailed grids to a coarser, computationally manageable level without compromising the gross flow behavior of the original fine-scale model and the anticipated reservoir performance. This translation is commonly referred to as upscaling (Christie 1996; Durlofsky et al. 1996; Chawathe and Taggart 2001; Ates et al. 2003). The other challenge is to quantify the uncertainty while keeping the number of realizations manageable. This requires identifying the uncertainties with the greatest potential impact and arriving at an optimal combination that captures the extremes. Further, these models require a screening and ranking process to assess their relative ability to track historical field performance and to help minimize the number of models considered for comprehensive flow simulations (Milliken et al. 2001; Samier et al. 2002; Chakravarty et al. 2000; Lolomari et al. 2000; Albertão et al. 2001; Baker et al. 2001; Ates et al. 2003). In some situations, a single realization of the best-guess geostatistical model is carried forward for conventional flow simulation and uncertainties are quantified with parametric techniques such as Monte Carlo evaluations (Hobbet et al. 2000; Dashti et al. 2002). Using the case study of this Middle Eastern carbonate reservoir, the paper describes the upscaling, uncertainty management, and SL screening process used to arrive at a single reference model that optimally combines the uncertainties and provides the best history match and performance forecast from full-field flow simulation. Fig. 1 presents the details of the workflow used.
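As a simplified illustration of the upscaling step (not Chevron's SCP, which uses a single-phase flow-based method), a common analytic approximation coarsens permeability by harmonic averaging across layers (flow in series) and arithmetic averaging within layers (flow in parallel). The grid dimensions and permeability field below are hypothetical.

```python
import numpy as np

def upscale_kz(k_fine, factor):
    """Coarsen vertical permeability by harmonic averaging within each
    group of `factor` fine layers (flow in series). Arithmetic averaging
    would be used instead for horizontal permeability (flow in parallel)."""
    nz = k_fine.shape[0] - k_fine.shape[0] % factor   # drop incomplete group
    k = k_fine[:nz].reshape(-1, factor, *k_fine.shape[1:])
    return factor / np.sum(1.0 / k, axis=1)           # harmonic mean per group

# Example: a 12-layer fine model coarsened 3:1 vertically.
rng = np.random.default_rng(1)
k_fine = rng.lognormal(mean=3.0, sigma=1.0, size=(12, 4, 4))  # md
k_coarse = upscale_kz(k_fine, factor=3)               # shape (4, 4, 4)
print(k_coarse.shape, k_coarse.mean())
```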


2008 ◽  
Vol 2008 ◽  
pp. 1-13 ◽  
Author(s):  
Tina Yu ◽  
Dave Wilkinson ◽  
Alexandre Castellini

Reservoir modeling is a critical step in the planning and development of oil fields. Before a reservoir model can be accepted for forecasting future production, it has to be updated with historical production data, a process called history matching. History matching requires computer flow simulation, which is very time-consuming. As a result, only a small number of simulation runs are conducted, and the history-matching results are often unsatisfactory. This is particularly evident when the reservoir has a long production history and the quality of the production data is poor. The inadequacy of the history-matching results frequently leads to high uncertainty in production forecasting. To enhance the quality of the history-matching results and improve the confidence of production forecasts, we introduce a methodology that uses genetic programming (GP) to construct proxies for reservoir simulators. Acting as surrogates for the computer simulators, the "cheap" GP proxies can evaluate a large number (millions) of reservoir models within a very short time frame. With such a large sampling size, the reservoir history-matching results are more informative and the production forecasts are more reliable than those based on a small number of simulation models. We have developed a workflow that incorporates two GP proxies into the history matching and production forecast process, and we conducted a case study to demonstrate the effectiveness of this approach. The study revealed useful reservoir information and delivered more reliable production forecasts, all without introducing new computer simulation runs.
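The paper's own GP proxies are not reproduced here; as a hedged illustration of the general idea, the open-source gplearn library can evolve a symbolic-regression proxy from a handful of expensive simulator runs and then score a very large sample of candidate reservoir models cheaply. All data, dimensions, and parameter choices below are hypothetical.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(2)

# Hypothetical training set: 40 expensive simulator runs.
# X = reservoir-model parameters, y = history-match error from the simulator.
X_train = rng.uniform(0.0, 1.0, size=(40, 4))
y_train = (X_train[:, 0] - 0.3) ** 2 + 0.5 * X_train[:, 1]  # stand-in response

# Evolve an analytic proxy for the simulator response.
proxy = SymbolicRegressor(population_size=1000, generations=20,
                          function_set=('add', 'sub', 'mul', 'div'),
                          random_state=0)
proxy.fit(X_train, y_train)

# The cheap proxy can now screen a million candidate models in seconds.
X_candidates = rng.uniform(0.0, 1.0, size=(1_000_000, 4))
scores = proxy.predict(X_candidates)
best = X_candidates[np.argsort(scores)[:100]]   # most promising models
```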


Energies ◽  
2019 ◽  
Vol 12 (5) ◽  
pp. 932 ◽  
Author(s):  
Wei Yu ◽  
Xiaohu Hu ◽  
Malin Liu ◽  
Weihong Wang

The influence of complex natural fractures on the performance of multiple shale-gas wells with varying well spacing is poorly understood. It is difficult to apply traditional local grid refinement with structured or unstructured gridding techniques to accurately and efficiently handle complex natural fractures. In this study, we introduced a powerful non-intrusive embedded discrete fracture model (EDFM) technology to overcome the limitations of existing methods. Through this unique technology, complex fracture configurations can be easily and explicitly embedded into structured matrix blocks. We set up a field-scale two-phase reservoir model to history match field production data and predict long-term recovery from the Marcellus Shale. The effective fracture properties were determined through history matching. In addition, we extended the single-well model to two horizontal wells with and without natural fractures. The effects of different numbers of natural fractures on two-well performance were examined at well spacings of 200 m, 300 m, and 400 m. The simulation results illustrate that gas productivity increases almost linearly with the number of two-set natural fractures. Furthermore, the difference in well performance between different well spacings increases with natural fracture density. A larger well spacing is preferred for economically developing shale-gas reservoirs with a larger natural fracture density. The findings of this study provide key insights into understanding the effect of natural fractures on well performance and well spacing optimization.
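A minimal sketch of the core EDFM bookkeeping: a fracture segment embedded in a structured matrix cell contributes a matrix-fracture connection whose transmissibility (in one common form) is built from half-transmissibilities of the form T = k A_f / <d>, where <d> is the volume-averaged normal distance from the cell to the fracture plane. The geometry and permeabilities below are hypothetical and units are assumed consistent.

```python
import numpy as np

def matrix_fracture_transmissibility(area_f, d_avg, k_m, k_f):
    """Matrix-fracture transmissibility for one embedded fracture segment:
    harmonic combination of the matrix- and fracture-side half-
    transmissibilities, each of form T = k * A_f / <d>."""
    t_m = k_m * area_f / d_avg
    t_f = k_f * area_f / d_avg
    return t_m * t_f / (t_m + t_f)

def avg_normal_distance(cell_points, plane_point, plane_normal):
    """Volume-averaged normal distance <d> from a matrix cell (sampled by
    interior points) to the fracture plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return np.mean(np.abs((cell_points - plane_point) @ n))

# Example: 10 m cube sampled on a lattice, fracture plane bisecting the cell.
g = np.arange(10) + 0.5
pts = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
d = avg_normal_distance(pts, plane_point=np.array([5.0, 5.0, 5.0]),
                        plane_normal=np.array([1.0, 0.0, 0.0]))
T = matrix_fracture_transmissibility(area_f=100.0, d_avg=d, k_m=1e-4, k_f=1e4)
print(d, T)  # <d> = 2.5 m for a plane bisecting the cell
```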


2020 ◽  
Author(s):  
Mickaele Le Ravalec ◽  
Véronique Gervais ◽  
Frédéric Roggero

Production forecasting is central to the existence of the oil and gas industry: it drives improvements in operations.

A key tool for this task is the building of reservoir models that describe the properties of underground hydrocarbon reservoirs. Clearly, the value of such models depends strongly on their ability to accurately predict the displacement of fluids within reservoirs. This is why it is essential that reservoir models reproduce at least the data already collected; data-consistent models are more reliable.

The data considered are split into two groups: static and dynamic data. Static data do not vary with time. They include, for instance, measurements on core samples extracted from wells, or logs used to describe electrofacies and petrophysical variations along wells. However, such direct measurements of geological and petrophysical properties are very sparse and sample only a small reservoir volume. They have to be supplemented by indirect measurements, mainly 3D seismic. The second group comprises dynamic data, i.e., data that vary with time because they depend on fluid flow. They mainly comprise production data measured at wells, such as bottomhole pressures, oil production rates, gas-oil ratios, tracer concentrations, etc. Even so, we end up with only a little information about the spatial distributions of facies, porosity, or permeability within the targeted hydrocarbon reservoirs. These facies/petrophysical properties can be considered realizations of random functions. They are very specific because of two essential features: they involve a huge number of unknown values, and they have a spatial structure.

The purpose of reservoir modeling is to identify facies and petrophysical realizations that make it possible to numerically reproduce the dynamic data while still respecting the static data. Different approaches can be envisioned.

A first possibility consists of randomly generating realizations, then simulating fluid flow for each of them to see whether they reproduce the required data, repeating the process until a suitable set of facies/petrophysical realizations is identified. The second approach is quite close: the idea is still to screen the realization space, but without performing any fluid flow simulation to check the suitability of the realizations. This depends strongly on defining a meaningful criterion that characterizes the dynamic behavior of a set of realizations without running flow simulations. We may also randomly generate a starting set of facies/petrophysical realizations and run an optimization process that minimizes an objective function by adjusting the realizations. A key issue is then how to adjust so many parameters simultaneously while preserving consistency with the static data. This has motivated many research works over the last 20 years, resulting in the development of several parameterization techniques. One of the very first was the pilot point method introduced by de Marsily (1984). Since then, variants and other parameterization techniques have been proposed. We aim to review some of them and focus on how useful they are depending on the problem at hand.
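A minimal sketch of the optimization-based approach above, in the spirit of the pilot point method: the objective function is minimized by adjusting values at a few pilot points that are interpolated over the grid. The "flow simulator," the interpolation (an RBF stand-in for kriging), and all data below are hypothetical, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

# Hypothetical setup: calibrate a 2D log-permeability field to "observed"
# dynamic data by adjusting values at a few pilot points (de Marsily, 1984).
rng = np.random.default_rng(3)
grid = np.array(np.meshgrid(np.arange(20), np.arange(20))).reshape(2, -1).T
pilots = rng.uniform(0, 19, size=(6, 2))           # pilot-point locations

def field_from_pilots(values):
    """Interpolate pilot-point values over the grid (kriging stand-in)."""
    return RBFInterpolator(pilots, values)(grid)

def forward(field):
    """Toy 'flow simulator': responses are local averages of the field."""
    return np.array([field[::57].mean(), field[13::41].mean()])

obs = np.array([2.0, 3.5])                         # 'measured' dynamic data

def objective(values):
    return np.sum((forward(field_from_pilots(values)) - obs) ** 2)

res = minimize(objective, x0=np.zeros(6), method='Nelder-Mead')
print(res.fun, res.x)                              # misfit and pilot values
```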


2021 ◽  
Author(s):  
Rohan Sakhardande ◽  
Deepak Devegowda

Abstract The analysis of parent-child well performance is a complex problem that depends on the interplay between timing, completion design, formation properties, direct frac-hits, and well spacing. Assessing the impact of well spacing on parent or child well performance is therefore challenging. A naïve approach that is purely observational does not control for completion design or formation properties; it can compromise well spacing decisions and economics and perhaps lead to non-intuitive results. Using concepts from causal inference in randomized clinical trials, we quantify the impact of well spacing decisions on parent and child well performance. The fundamental concept behind causal inference is that causality facilitates prediction, but being able to predict does not imply causality, because prediction can arise from mere association between variables. In this study, we work with a large dataset of over 3,000 wells in a large oil-bearing province in Texas. The dataset includes several covariates, such as completion design (proppant/fluid volumes, frac stages, lateral length, cluster spacing, clusters per stage, and others), formation properties (mechanical and petrophysical), and downhole location. We evaluate the impact of well spacing on 6-month and 1-year cumulative oil in four groups associated with different ranges of parent-child spacing. By assessing the statistical balance of the covariates between the parent and child well groups (controlling for completion and formation properties), we estimate the causal impact of well spacing on parent and child well performance. We compare our analysis with the routine naïve approach, which gives non-intuitive results. In each of the four groups associated with different ranges of parent-child well spacing, the causal workflow quantifies the production loss associated with the parent and child wells. This degradation in performance decreases with increasing well spacing, and we provide an optimal well spacing value for this specific multi-bench unconventional play that has been validated in the field. The naïve analysis based on simply assessing association or correlation, on the contrary, shows increasing child well degradation with increasing well spacing, which is simply not supported by the data. The routinely applied correlative analysis between the outcome (cumulative oil) and the predictor (well spacing) fails simply because it controls for neither variations in completion design over the years nor variations in formation properties. To our knowledge, there is no other paper in the petroleum engineering literature that applies causal inference. It is a fundamental precept in medicine for assessing drug efficacy while controlling for age, sex, habits, and other covariates. The same workflow can easily be generalized to assess well spacing decisions and parent-child well performance across multi-generational completion designs and spatially variant formation properties.
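A minimal sketch of the covariate-balance check that underpins such a causal workflow: standardized mean differences (SMDs) between tightly and widely spaced well groups for each covariate, computed before estimating the spacing effect. The column names, data, and the 0.1 balance rule of thumb are illustrative, not the authors' exact procedure.

```python
import numpy as np
import pandas as pd

# Hypothetical well dataset: completion/formation covariates plus spacing group.
rng = np.random.default_rng(4)
n = 3000
df = pd.DataFrame({
    "proppant_lb_ft": rng.normal(1800, 300, n),
    "lateral_len_ft": rng.normal(9000, 1500, n),
    "porosity": rng.normal(0.08, 0.02, n),
    "tight_spacing": rng.integers(0, 2, n),       # 1 = spacing below median
})

def smd(x, g):
    """Standardized mean difference between groups g == 1 and g == 0."""
    x1, x0 = x[g == 1], x[g == 0]
    pooled = np.sqrt((x1.var(ddof=1) + x0.var(ddof=1)) / 2)
    return (x1.mean() - x0.mean()) / pooled

covariates = ["proppant_lb_ft", "lateral_len_ft", "porosity"]
balance = {c: smd(df[c].to_numpy(), df["tight_spacing"].to_numpy())
           for c in covariates}
print(balance)  # |SMD| < 0.1 is a common rule of thumb for adequate balance
```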


2021 ◽  
Author(s):  
Mokhles Mezghani ◽  
Mustafa AlIbrahim ◽  
Majdi Baddourah

Abstract Reservoir simulation is a key tool for predicting the dynamic behavior of a reservoir and optimizing its development. Fine-scale, CPU-demanding simulation grids are necessary to improve the accuracy of the simulation results. We propose a hybrid modeling approach that minimizes the use of the full-physics model by dynamically building and updating an artificial intelligence (AI) based model. The AI model can quickly mimic the full-physics (FP) model. The proposed methodology starts by running the FP model; an associated AI model is systematically updated using the newly performed FP runs. Once the mismatch between the two models falls below a predefined cutoff, the FP model is switched off and only the AI model is used. The FP model is switched on at the end of the exercise either to confirm the AI model's decision and stop the study, or to reject that decision (high mismatch between the FP and AI models) and upgrade the AI model. The proposed workflow was applied to a synthetic reservoir model, where the objective is to match the average reservoir pressure. For this study, to better account for reservoir heterogeneity, a fine-scale simulation grid (approximately 50 million cells) is necessary to improve the accuracy of the reservoir simulation results. Reservoir simulation using the FP model and 1,024 CPUs requires approximately 14 hours. During this history-matching exercise, six parameters were selected for the optimization loop. Therefore, Latin hypercube sampling (LHS) with seven FP runs is used to initiate the hybrid approach and build the first AI model. During history matching, only the AI model is used. At the convergence of the optimization loop, a final FP run is performed either to confirm convergence for the FP model or to reiterate the same approach starting from an LHS around the converged solution. The subsequent AI model is then updated using all the FP simulations performed in the study. This approach achieves a history match of very acceptable quality with far fewer computational resources and much less CPU time. CPU-intensive, multimillion-cell simulation models are commonly used in reservoir development, and completing a reservoir study in an acceptable timeframe is a real challenge in such situations. New concepts and techniques are needed to complete such studies successfully, and the hybrid approach we propose shows very promising results in handling this challenge.
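A minimal sketch of the hybrid loop described above, with a toy function standing in for the FP simulator and a small neural network standing in for the AI model (the abstract does not specify the AI architecture or the optimizer; those choices here are assumptions).

```python
import numpy as np
from scipy.stats import qmc
from sklearn.neural_network import MLPRegressor

# Stand-in for the expensive full-physics (FP) simulator: maps six history-
# matching parameters to an average-reservoir-pressure misfit.
def fp_model(x):
    return float(np.sum((x - 0.4) ** 2))

# Initialize with a Latin hypercube of 7 FP runs over the 6 parameters.
sampler = qmc.LatinHypercube(d=6, seed=0)
X = sampler.random(n=7)
y = np.array([fp_model(x) for x in X])

ai = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
ai.fit(X, y)

# Optimize on the cheap AI model only (random search as a stand-in optimizer).
cand = qmc.LatinHypercube(d=6, seed=1).random(n=100_000)
x_best = cand[np.argmin(ai.predict(cand))]

# Final FP run confirms or rejects the AI decision; if the mismatch is high,
# the FP run would be added to the training set and the loop repeated.
mismatch = abs(fp_model(x_best) - float(ai.predict(x_best[None])[0]))
print(x_best, mismatch)
```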


Author(s):  
Елизавета Алексеевна Тихомирова

Oil saturation is one of the most important properties of a reservoir, and its modeled distribution is one of the main parameters for reserves assessment and subsequent flow simulation. The paper reviews approaches to building an oil saturation cube that incorporate prior information in the form of well log interpretation results, capillarimetry tests, and 3D trends, as well as the algorithms implementing these methods in the IRAP RMS reservoir modeling software. Geological models are built using the described approaches.
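As an illustration of one common way such capillarimetry data enters a saturation cube (not necessarily this paper's exact method), a Leverett J-function saturation-height model converts capillary pressure into water saturation as a function of height above the free water level, porosity, and permeability. The fit coefficients below are hypothetical.

```python
import numpy as np

def sw_from_j_function(h_ft, k_md, phi, a=0.6, b=-0.5, swirr=0.15,
                       drho_psi_ft=0.11, sigma_costheta=26.0):
    """Water saturation from a Leverett J-function fit Sw* = a * J**b, with
    J = 0.21645 * Pc / (sigma*cos(theta)) * sqrt(k/phi) and Pc = dRho * h.
    Coefficients a, b, swirr are hypothetical capillarimetry-fit values."""
    pc = drho_psi_ft * h_ft                          # capillary pressure, psi
    j = 0.21645 * pc / sigma_costheta * np.sqrt(k_md / phi)
    sw_star = np.clip(a * j ** b, 0.0, 1.0)          # normalized saturation
    return swirr + (1.0 - swirr) * sw_star           # de-normalized Sw

# Example: saturation vs. height above free water level for one rock type.
h = np.array([5.0, 20.0, 50.0, 150.0])               # ft above FWL
print(sw_from_j_function(h, k_md=50.0, phi=0.18))    # Sw decreases with height
```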

