Characterization of Arbuckle-basement wastewater disposal system, Payne County, Northern Oklahoma

2019 ◽  
Vol 7 (4) ◽  
pp. SL19-SL36
Author(s):  
Gabriel L. Machado ◽  
Garrett J. Hickman ◽  
Maulin P. Gogri ◽  
Kurt J. Marfurt ◽  
Matthew J. Pranter ◽  
...  

Over the past eight years, north-central Oklahoma has experienced a significant increase in seismicity. Although the disposal of large volumes of wastewater into the Arbuckle Group basement system has been statistically correlated to this increased seismicity, our understanding of the actual mechanisms involved remains superficial. To address this shortcoming, we initiated an integrated study to characterize and model the Arbuckle-basement system to increase our understanding of the subsurface dynamics during the wastewater-disposal process. We constructed a 3D geologic model that integrates 3D seismic data, well logs, core measurements, and injection data. Poststack-data conditioning and seismic attributes provided images of faults and the rugose top of the basement, whereas a modified-Hall analysis provided insights into the injection behavior of the wells. Using a Pareto-based history-matching technique, we calibrated the 3D models using the injection rate and pressure data. The history-matching process showed the dominant parameters to be formation-water properties, permeability, porosity, and horizontal anisotropy of the Arbuckle Group. Based on the pressure buildup responses from the calibrated models, we identified sealing and conductive characteristics of the key faults. Our analysis indicates the average porosity and permeability of the Arbuckle Group to be approximately 7% and 10 mD, respectively. The simulation models also showed pockets of nonuniform and large pressure buildups in these formations, indicating that faults play an important role in fluid movement within the Arbuckle Group basement system. As one of the first integrated investigations conducted to understand the potential hydraulic coupling between the Arbuckle Group and the underlying basement, we evaluate the need for improved data recording and additional data collection. 
In particular, we recommend that operators wishing to pursue this type of analysis record their injection data on a daily rather than on an averaged basis. A more quantitative estimation of reservoir properties requires the acquisition of P-wave and dipole sonic logs in addition to the commonly acquired triple-combo logs. Finally, to better quantify flow units with the disposal reservoir, we recommend that operators acquire sufficient core to characterize the reservoir heterogeneity.
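The modified-Hall analysis the authors mention can be illustrated with a short sketch: the Hall integral (bottomhole pressure above reservoir pressure, integrated over time) plotted against cumulative injection is roughly linear under stable injection, and a change in slope flags an injectivity change (plugging, fracturing, or fault interaction). The daily data and pressure values below are synthetic assumptions for illustration, not the authors' implementation or field data.

```python
# Modified-Hall sketch: accumulate the Hall integral and cumulative
# injection from daily records; the slope of one against the other
# is proportional to 1/injectivity.

def hall_series(p_wf, p_res, rates, dt_days=1.0):
    """Return (cumulative injection [bbl], Hall integral [psi-day]) series."""
    hall, cum_inj = [], []
    h = w = 0.0
    for p, q in zip(p_wf, rates):
        h += (p - p_res) * dt_days   # psi-day above reservoir pressure
        w += q * dt_days             # bbl injected
        hall.append(h)
        cum_inj.append(w)
    return cum_inj, hall

# constant injectivity -> constant Hall slope
p_wf = [2500.0] * 10                 # assumed bottomhole pressures [psi]
rates = [5000.0] * 10                # assumed daily rates [bbl/d]
w, h = hall_series(p_wf, 2000.0, rates)
slope = (h[-1] - h[0]) / (w[-1] - w[0])   # psi-day per bbl
```

This is also why the authors recommend daily rather than averaged injection records: slope breaks in the Hall plot are smeared out by monthly averaging.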

2021 ◽  
Author(s):  
Thomas J. Hampton ◽  
Mohamed El-Mandouh ◽  
Stevan Weber ◽  
Tirth Thaker ◽  
K. Patel ◽  
...  

Abstract Mathematical models are needed to aid in defining, analyzing, and quantifying solutions for designing and managing steam floods. This paper discusses two main modeling methods: analytical and numerical simulation. Deciding which method to use, and when, requires an understanding of the assumptions, strengths, and limitations of each. This paper presents advantages and disadvantages through a comparison of analytical models versus simulation as reservoir characterization becomes progressively more complex (dip, layering, heterogeneity between injector and producer, and reservoir thickness). While there are many analytical models, three are used in this paper: Marx & Langenheim, Modified Neuman, and Jeff Jones. The simulator used was CMG STARS on single 5-spot and 9-spot patterns and, for Case 6, on nine 5-spot patterns. Results were obtained for six cases of varying reservoir properties based on the Marx & Langenheim, Modified Neuman, and Jeff Jones models. Simulation was also performed on each of the six cases, using Modified Neuman steam rates and then Jeff Jones steam rates on 9-spot and 5-spot patterns. This was done on a predictive basis from the inputs provided, without adjusting or history matching to analog or historical performance. Optimization runs using Particle Swarm Optimization were applied to one case to minimize SOR and maximize NPV. The conclusion from comparing the cases is that simulation is needed for complex geology, heterogeneity, and changes in layering. Simulation can also be used to maximize economics using an AI-based optimization tool. Within their limitations, the analytical models are good for quick looks such as screening, scoping design, some surveillance, and conceptual understanding of a basic steam flood with uniform geologic properties. 
This paper is innovative in its comparison of analytical models and simulation modeling. Results quantifying the differences in oil rate, SOR, and injection rates (Neuman and Jeff Jones) and their impact on recovery factors are presented.
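Of the three analytical models compared, Marx & Langenheim (1959) is the classic heated-area screening model: injected heat grows a heated zone while losing heat by conduction to over- and underburden. A minimal sketch follows; the property values in the docstring units are illustrative assumptions, not the paper's six cases.

```python
import math

# Marx & Langenheim heated-area model for constant heat-injection rate.
# A(t) = [q M_res h / (4 k_ob M_ob dT)] * G(t_D), with
# G(t_D) = exp(t_D) erfc(sqrt(t_D)) + 2 sqrt(t_D/pi) - 1 and
# t_D = 4 k_ob M_ob t / (M_res h)^2.

def heated_area(t, q_inj, h, dT, M_res, M_ob, k_ob):
    """Heated area A(t) [m2].

    t: time [s]; q_inj: heat-injection rate [W]; h: net thickness [m];
    dT: steam minus initial reservoir temperature [K];
    M_res, M_ob: volumetric heat capacities [J/m3/K];
    k_ob: overburden thermal conductivity [W/m/K].
    """
    t_d = 4.0 * k_ob * M_ob * t / (M_res * h) ** 2
    g = (math.exp(t_d) * math.erfc(math.sqrt(t_d))
         + 2.0 * math.sqrt(t_d / math.pi) - 1.0)
    return q_inj * M_res * h / (4.0 * k_ob * M_ob * dT) * g

# illustrative values: 10 MW into a 10 m sand, 200 K temperature rise
a_1day = heated_area(86400.0, 1e7, 10.0, 200.0, 2.5e6, 2.3e6, 1.7)
```

At early time the result approaches the no-loss limit q·t/(M_res·h·dT), and it always stays below it, which is the thermal-efficiency behavior the analytical screening relies on.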


Author(s):  
Margarita A. Smetkina ◽  
◽  
Oleg A. Melkishev ◽  
Maksim A. Prisyazhnyuk ◽  
◽  
...  

Reservoir simulation models are used to design oil field developments, estimate the efficiency of geological and engineering operations, and perform prediction calculations of long-term development performance. A method has been developed to adjust the permeability-cube values during reservoir-model history-matching subject to the core-derived dependence between rock petrophysical properties. The method was implemented using the example of the Bobrikovian formation (a terrigenous reservoir) of a field in the Solikamskian depression. A statistical analysis of the Bobrikovian formation porosity and permeability properties was conducted based on the well-logging interpretation results and reservoir modelling data. We analysed differences between the initial permeability obtained after upscaling the geological model and the permeability obtained after reservoir-model history-matching. The analysis revealed divergences between the statistical characteristics of the permeability values based on the well-logging data interpretation and the reservoir model, as well as substantial differences between the adjusted and initial permeability cubes. It was established that the initial permeability was significantly modified by manual adjustments in the process of history-matching. Extreme permeability values were identified and corrected based on the core-derived petrophysical dependence KPR = f(KP), subject to ranges of porosity and permeability ratios. Using the modified permeability cube, calculations were performed to reproduce the formation production history. According to the calculation results, we achieved convergence with the actual data, with deviations within the accuracy requirements for model history-matching. Thus, this method of permeability-cube adjustment following manual history-matching prevents gross overestimation or underestimation of permeability in reservoir model cells.
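The correction described above can be sketched as clamping each cell's history-matched permeability to a band around the core-derived poro-perm trend KPR = f(KP). The power-law coefficients and the 3x band below are hypothetical stand-ins, not the paper's core fit.

```python
# Clamp history-matched permeability to a band around a core trend.

def core_trend_md(phi):
    """Hypothetical core-derived permeability [mD] vs porosity [fraction]."""
    return 0.002 * (100.0 * phi) ** 3.2   # assumed power law

def clamp_perm(phi_cube, k_cube, band=3.0):
    """Limit each cell's perm to within a factor 'band' of the core trend."""
    out = []
    for phi, k in zip(phi_cube, k_cube):
        k_trend = core_trend_md(phi)
        out.append(min(max(k, k_trend / band), k_trend * band))
    return out

phi = [0.10, 0.20, 0.25]
k_matched = [500.0, 30.0, 0.1]   # manual matching left two outlier cells
k_fixed = clamp_perm(phi, k_matched)
```

Cells consistent with the trend pass through unchanged; only the extreme values that manual matching introduced are pulled back toward the petrophysical dependence.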


2020 ◽  
pp. 3252-3265
Author(s):  
Nagham Jasim ◽  
Sameera M. Hamd-Allah ◽  
Hazim Abass

Increasing hydrocarbon recovery from tight reservoirs has been an essential goal of the oil industry in recent years. Building realistic dynamic simulation models and selecting and designing suitable development strategies for such reservoirs fundamentally require an accurate structural static model. The uncertainties in building 3D reservoir models are a real challenge for such micro- to nano-pore-scale structures. Based on data from 24 wells distributed throughout the Sadi tight formation, this study presents an application of building a 3D static model for a tight limestone oil reservoir in Iraq. The most common uncertainties confronted while building the model are illustrated, such as accurate estimation of permeability and porosity cut-off values. These values directly affect the calculation of net pay thickness for each layer in the reservoir and consequently the estimate of reservoir initial oil in place (IOIP). A further challenge in the static modeling of such reservoirs is dealing with tight-reservoir characteristics, which cause major reservoir heterogeneity and complexities that are problematic for reservoir simulation modeling. Twenty-seven porosity and permeability measurements from the Sadi/Tanuma reservoir were used to validate log-interpretation data for model construction. The results of the history-matching process for the constructed dynamic model are also presented, including data related to oil production, reservoir pressure, and well flowing pressure from the available production history.
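The cut-off sensitivity the study highlights can be sketched as a net-pay calculation: layers only count toward pay (and hence IOIP) if they pass both porosity and permeability cut-offs. The cut-off values and example layers below are illustrative assumptions, not the Sadi formation's.

```python
# Net-pay sketch: sum thickness of layers passing both cut-offs.

PHI_CUT = 0.05   # porosity cut-off [fraction], assumed
K_CUT = 0.01     # permeability cut-off [mD], assumed for a tight limestone

def net_pay(layers, phi_cut=PHI_CUT, k_cut=K_CUT):
    """layers: iterable of (thickness_m, porosity_frac, perm_mD)."""
    return sum(h for h, phi, k in layers if phi >= phi_cut and k >= k_cut)

layers = [(2.0, 0.08, 0.05),    # passes both cut-offs
          (3.0, 0.03, 0.002),   # fails both
          (1.5, 0.06, 0.02)]    # passes both
pay = net_pay(layers)
```

Raising either cut-off even slightly removes marginal layers, which is exactly why cut-off uncertainty propagates directly into the IOIP estimate in micro- to nano-pore rocks.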


1976 ◽  
Vol 16 (06) ◽  
pp. 337-350 ◽  
Author(s):  
G.R. Gavalas ◽  
P.C. Shah ◽  
J.H. Seinfeld

Abstract The estimation of reservoir properties is inherently an underdetermined problem (one having a nonunique solution) because of the large number of unknown parameters relative to the available data. The common zonation approach to reducing the number of parameters introduces considerable modeling error by insisting that reservoir properties are uniform within each zone and by assigning the boundaries of these zones more or less arbitrarily. In this paper, Bayesian estimation theory is applied to history matching as an alternative to zonation. By using a priori statistical information on the unknown parameters, the problem becomes statistically better determined. Bayesian estimation and zonation are applied to the problem of porosity and permeability estimation in a one-dimensional, one-phase reservoir. Introduction The estimation of parameters such as porosity and permeability in a reservoir model using well production and pressure data is commonly referred to as history matching. Although an inhomogeneous reservoir is in principle specified by an infinite number of parameters, a computational reservoir model can only contain a finite number. The most detailed description is obtained by allowing porosity and permeability to vary independently at each block of the spatial grid used in the finite-difference solution. While minimizing the modeling error, this approach entails a great deal of uncertainty because of the large number of unknowns compared with the limited data available. Thus, in a given problem, many different sets of property estimates may provide satisfactory and essentially indistinguishable data fits. 
Some of these parameter estimates can be grossly in error with respect to the actual properties, and as a result can lead to erroneous prediction of future reservoir behavior. To reduce the statistical uncertainty one must either decrease the number of unknowns or utilize additional information. A commonly used procedure for reducing the number of unknown parameters is zonation; the reservoir is divided into a small number of zones, in each of which the properties are treated as uniform. A modeling error is thus introduced through the assumption of uniform properties within each zone and through the more or less arbitrary assignment of the zone boundaries. As the number of zones is decreased, the error due to statistical uncertainty decreases while the modeling error increases. The total error passes through a minimum at some intermediate number of zones. The specification of this optimum level of description, which has been briefly considered in past work, will be treated in detail in a future report. An alternative to decreasing the statistical uncertainty by reducing the number of unknown parameters is the utilization of additional information. This information need not be limited to measurements on the reservoir under study, but can be based on prior geological information about property variability in reservoirs of the same type. This paper examines this alternative method of reducing statistical uncertainty. The prior geological information is utilized by a formulation akin to classical Bayesian estimation. The Bayesian estimation is illustrated and is compared with the zonation approach for the case of a hypothetical, one-dimensional reservoir with variable porosity and permeability. 
The numerical simulations are used to investigate questions such as the optimum number of parameters in zonation and the effect of erroneous prior statistics in Bayesian estimation, and to compare the two methods. Considerable attention is also given to computational aspects such as convergence rate and computer time required by two of the most commonly used minimization algorithms, Marquardt's and the conjugate gradient. NATURE OF PRIOR GEOLOGICAL INFORMATION The application of probabilistic models in geology is the subject of a recent review.
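The contrast the paper draws can be shown on a toy linear problem d = Gm with more unknowns than data: zonation restricts the solution to a few uniform zones, while the Bayesian (MAP) estimate keeps all unknowns and regularizes with a prior mean and covariance. Everything below (sizes, prior values, the noise-free data) is an illustrative assumption, not the paper's reservoir example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_param, n_data = 10, 4                      # underdetermined: 10 unknowns, 4 data
G = rng.normal(size=(n_data, n_param))       # toy sensitivity matrix
m_true = 0.2 + 0.05 * rng.normal(size=n_param)   # properties near the prior mean
d = G @ m_true                               # noise-free synthetic data

# Zonation: two uniform zones -> only 2 unknowns (basis matrix Z)
Z = np.zeros((n_param, 2))
Z[:5, 0] = Z[5:, 1] = 1.0
m_zone = Z @ np.linalg.lstsq(G @ Z, d, rcond=None)[0]

# Bayesian MAP estimate with prior mean m0, prior covariance Cm, data covariance Cd:
# m = m0 + Cm G^T (G Cm G^T + Cd)^-1 (d - G m0)
m0 = 0.2 * np.ones(n_param)
Cm = 0.05**2 * np.eye(n_param)
Cd = 1e-6 * np.eye(n_data)
m_map = m0 + Cm @ G.T @ np.linalg.solve(G @ Cm @ G.T + Cd, d - G @ m0)
```

The zonation estimate is forced uniform within each zone (its modeling error), while the MAP estimate fits the data closely and stays near the prior elsewhere, which is how the prior statistics make the problem better determined.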


2017 ◽  
Vol 5 (3) ◽  
pp. SJ21-SJ30 ◽  
Author(s):  
Ryan Michael Williams ◽  
Enric Pascual-Cebrian ◽  
Jon Charles Gutmanis ◽  
Gaynor Suzanne Paton

The “seismic resolution gap” has been an area of ambiguity ever since the results of 3D seismic interpretation were first used as inputs for modeling purposes, because many important structural features such as fractures are at or below seismic resolution, which can impinge on reservoir properties such as porosity and permeability. Having the means to map these features accurately and with confidence has always been a challenge. More often than not, localized mapping of these features at borehole conditions can be achieved by core or image-log analysis. Seismic-derived attributes have assisted in improving the interwell geologic understanding in a lateral sense, but they are always hampered by vertical resolution. Enhanced imaging, such as cyan-magenta-yellow blending of attributes, has helped improve the lateral understanding of fracture patterns and networks, as shown in this workflow, but the challenge with vertical resolution still persists. However, by combining borehole and seismic data studies in a distinct workflow, it has become possible to identify overlaps and misalignments, which in turn has assisted in the identification of discrete structural patterns not previously recognized because of the seismic resolution gap. These results will then be used to improve the confidence of structural interpretation and static fracture models, which all goes toward improving reservoir simulation models and geologic understanding.
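The cyan-magenta-yellow blending mentioned above can be sketched simply: three attributes are each normalized to [0, 1], used as the C, M, and Y channels, and converted to RGB for display, so a sample that is strong in all three attributes plots dark. The tiny synthetic attribute lists below are assumptions standing in for full seismic volumes.

```python
# CMY attribute blending sketch: three normalized attributes -> RGB.

def normalize(vals):
    """Scale a list of attribute values to [0, 1]."""
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in vals]

def cmy_blend(attr_c, attr_m, attr_y):
    """Return per-sample (r, g, b) triples from three attributes."""
    c, m, y = (normalize(a) for a in (attr_c, attr_m, attr_y))
    return [(1 - ci, 1 - mi, 1 - yi) for ci, mi, yi in zip(c, m, y)]

# e.g. coherence-, curvature-, and amplitude-type attributes (synthetic)
rgb = cmy_blend([0.0, 5.0, 10.0], [1.0, 2.0, 3.0], [0.2, 0.2, 0.9])
```

A sample weak in all three attributes renders white (background), while co-located anomalies render black, which is what makes overlapping fracture indicators stand out laterally.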


2019 ◽  
Author(s):  
Esmail Ansari ◽  
◽  
Tandis S. Bidgoli ◽  
Andrew Michael Hollenbach

2021 ◽  
Author(s):  
Mokhles Mezghani ◽  
Mustafa AlIbrahim ◽  
Majdi Baddourah

Abstract Reservoir simulation is a key tool for predicting the dynamic behavior of a reservoir and optimizing its development. Fine-scale, CPU-demanding simulation grids are necessary to improve the accuracy of the simulation results. We propose a hybrid modeling approach that minimizes the use of the full-physics model by dynamically building and updating an artificial-intelligence (AI) based model. The AI model can be used to quickly mimic the full-physics (FP) model. The methodology we propose starts with running the FP model; an associated AI model is then systematically updated using the newly performed FP runs. Once the mismatch between the two models is below a predefined cutoff, the FP model is switched off and only the AI model is used. The FP model is switched on at the end of the exercise, either to confirm the AI model decision and stop the study, or to reject this decision (high mismatch between the FP and AI models) and upgrade the AI model. The proposed workflow was applied to a synthetic reservoir model, where the objective is to match the average reservoir pressure. For this study, to better account for reservoir heterogeneity, a fine-scale simulation grid (approximately 50 million cells) was necessary to improve the accuracy of the reservoir simulation results. Reservoir simulation using the FP model and 1,024 CPUs requires approximately 14 hours. During this history-matching exercise, six parameters were selected to be part of the optimization loop. Therefore, Latin hypercube sampling (LHS) using seven FP runs was used to initiate the hybrid approach and build the first AI model. During history matching, only the AI model is used. At the convergence of the optimization loop, a final FP run is performed, either to confirm convergence for the FP model or to reiterate the same approach starting from an LHS around the converged solution. The subsequent AI model is then updated using all the FP simulations performed in the study. 
This approach achieves the history match with a very acceptable quality of match, using far less computational resources and CPU time. CPU-intensive, multimillion-cell simulation models are commonly utilized in reservoir development. Completing a reservoir study in an acceptable timeframe is a real challenge in such a situation, and the development of new concepts and techniques is a real need for successfully completing a reservoir study. The hybrid approach that we propose shows very promising results in handling such a challenge.
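The switching logic described above can be sketched in a few lines: expensive FP runs train a cheap surrogate; once the surrogate's mismatch on known FP runs falls below a cutoff, the optimizer queries only the surrogate, and a final FP run confirms the answer. Here the "FP model" is a stand-in one-parameter function and the surrogate is a simple quadratic fit; both are assumptions for illustration, not the authors' 50-million-cell model or AI architecture.

```python
import numpy as np

def fp_model(x):
    """Stand-in for an expensive full-physics run (objective vs parameter)."""
    return (x - 0.3) ** 2 + 0.1

def fit_surrogate(xs, ys):
    """Cheap proxy: quadratic fit to the FP samples seen so far."""
    coeffs = np.polyfit(xs, ys, deg=2)
    return lambda x: np.polyval(coeffs, x)

# initial design: a handful of FP runs (1D stand-in for the 7-run LHS)
xs = np.linspace(0.0, 1.0, 7)
ys = np.array([fp_model(x) for x in xs])
proxy = fit_surrogate(xs, ys)

cutoff = 1e-6
mismatch = max(abs(proxy(x) - fp_model(x)) for x in xs)
use_proxy = mismatch < cutoff            # switch the FP model off?

# "history matching" on the proxy only, then one confirming FP run
x_best = min(np.linspace(0.0, 1.0, 101), key=proxy)
confirmed = abs(fp_model(x_best) - proxy(x_best)) < cutoff
```

If the final FP check failed (high mismatch), the loop would resample around `x_best`, retrain the proxy on all FP runs performed so far, and iterate, which mirrors the reject-and-upgrade branch in the abstract.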


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity-analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability. 
Introduction The concept of rate transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process. The simulation history-matching work flow presented includes the following steps: 1 - Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules. 2 - Run an initial model. 3 - Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics. 4 - Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection. 5 - Make RTA plots of the real and simulated production data. 6 - Use the motifs presented in the paper to identify possible production mechanisms in the real data. 7 - Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data. 8 - Iterate Steps 5 through 7 to obtain a match in RTA trends. 9 - Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions. In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code. 
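The square-root-of-time diagnostic behind the RTA plots in Step 5 can be sketched briefly: during linear flow into hydraulic fractures, rate-normalized drawdown grows linearly with the square root of time, and the slope scales inversely with fracture surface area times the square root of permeability. The synthetic data below is an assumed illustration, not the paper's field case.

```python
import math

def rnp_slope(times, p_wf, rates, p_init):
    """Least-squares slope of (p_init - p_wf)/q versus sqrt(t)."""
    x = [math.sqrt(t) for t in times]
    y = [(p_init - p) / q for p, q in zip(p_wf, rates)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# synthetic linear-flow response: drawdown per unit rate = 2.0 * sqrt(t)
times = [1.0, 4.0, 9.0, 16.0]           # days
rates = [100.0] * 4                      # bbl/d, constant for simplicity
p_wf = [5000.0 - 2.0 * 100.0 * math.sqrt(t) for t in times]
slope = rnp_slope(times, p_wf, rates, 5000.0)
```

Comparing the real well's slope against slopes from simulated motifs is the cue for which parameter (matrix permeability, fracture conductivity, boundary effects, etc.) to adjust in Step 7.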
Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on the matching of fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are very briefly summarized in this section. Effective fracture toughness is the most important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture length and higher net pressure values that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection pressure data while simultaneously limiting fracture length. This scale-dependent toughness scaling parameter is the most important parameter in determining fracture size.


2021 ◽  
Author(s):  
Yifei Xu ◽  
Priyesh Srivastava ◽  
Xiao Ma ◽  
Karan Kaul ◽  
Hao Huang

Abstract In this paper, we introduce an efficient method to generate reservoir simulation grids and modify the fault juxtaposition on the generated grids. Both processes are based on a mapping method that displaces vertices of a grid to desired locations without changing the grid topology. In the gridding process, a grid that can capture stratigraphic complexity is first generated in an unfaulted space. The vertices of the grid are then displaced back to the original faulted space to become a reservoir simulation grid. The resulting reversely mapped grid has a mapping structure that allows fast and easy fault-juxtaposition modification. This feature avoids the process of updating the structural framework and regenerating the reservoir properties, which may be time-consuming. To facilitate juxtaposition updates within an assisted history matching workflow, several parameterized fault-throw adjustment methods are introduced. Grid examples are given for reservoirs with Y-faults, an overturned bed, and complex channel-lobe systems.
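The vertex-mapping idea can be sketched in two dimensions: vertices move between faulted and unfaulted space by adding or removing a fault-throw field, so cell connectivity (topology) never changes, and the throw can be rescaled cheaply during assisted history matching. The step-function throw field below is an assumed illustration, not the paper's parameterization.

```python
# Vertex mapping sketch: displace (x, z) vertices across a fault.

def throw(x, fault_x=5.0, max_throw=20.0):
    """Hypothetical throw field: hanging wall (x > fault_x) is dropped."""
    return max_throw if x > fault_x else 0.0

def to_unfaulted(vertices):
    """Remove the throw so horizons become continuous for gridding."""
    return [(x, z + throw(x)) for x, z in vertices]

def to_faulted(vertices, throw_scale=1.0):
    """Map back with an adjustable throw multiplier for juxtaposition updates."""
    return [(x, z - throw_scale * throw(x)) for x, z in vertices]

# a single horizon sampled on both sides of the fault at x = 5
grid = [(0.0, 100.0), (4.0, 100.0), (6.0, 80.0), (9.0, 80.0)]
flat = to_unfaulted(grid)                        # continuous in unfaulted space
refaulted = to_faulted(flat, throw_scale=0.5)    # try a smaller fault throw
```

Because only vertex positions change, properties painted on the grid in unfaulted space carry over unchanged when the throw parameter is varied, which is what makes juxtaposition updates cheap inside a history-matching loop.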

