Calibration of facies proportions through a history-matching process

2009 ◽  
Vol 180 (5) ◽  
pp. 387-397 ◽  
Author(s):  
Catherine Ponsot-Jacquin ◽  
Frédéric Roggero ◽  
Guillaume Enchery

Abstract The facies-proportion calibration method is a new history-matching technique that modifies facies proportions within a fine geological/geostatistical model until a good match of the field data is reached. The initial facies proportions in the model are usually constrained locally, for example by well data, but their spatial trends may be unreliable in some parts of the reservoir. The algorithm presented in this paper introduces average proportion ratios between facies groups in order to calculate new facies proportions while taking their initial values into account. It can be applied locally on specific regions or globally on the whole reservoir, for stationary or non-stationary facies distributions. The proportion ratios can be adjusted manually or computed iteratively through an optimization process. The method has been successfully applied to a real field case.
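As a concrete illustration of the update step described above, the sketch below rescales a group of facies by an average proportion ratio and renormalizes each cell so that proportions still sum to one. This is a minimal toy version under assumed conventions, not the authors' algorithm; the function name and array layout are hypothetical.

```python
import numpy as np

def calibrate_proportions(p_init, group_a, ratio):
    """Rescale the facies in `group_a` of an (n_cells x n_facies) proportion
    matrix by `ratio`, then renormalize each cell so proportions sum to 1.
    Toy sketch only; the paper's method works with average ratios between
    facies groups and preserves the initial spatial tendencies."""
    p = p_init.copy()
    p[:, group_a] *= ratio                 # boost (or reduce) group-A facies
    p /= p.sum(axis=1, keepdims=True)      # renormalize cell by cell
    return p

# Toy example: 3 cells, 3 facies; double the weight of facies 0 and 1
p0 = np.array([[0.5, 0.3, 0.2],
               [0.2, 0.4, 0.4],
               [0.1, 0.1, 0.8]])
print(calibrate_proportions(p0, group_a=[0, 1], ratio=2.0))
```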

Author(s):  
Amitabh Kumar ◽  
Brian McShane ◽  
Mark McQueen

A large oil and gas pipeline gathering system is commonly used to transport processed oil and gas from an offshore platform to an onshore receiving facility. High reliability and integrity for continuous operation of these systems are crucial to ensure a constant supply of hydrocarbons to the onshore processing facility and, eventually, to market. When such a system is exposed to a series of complex environmental loadings, it is often difficult to predict the response path and in-situ condition, and therefore the system's ability to withstand subsequent loading scenarios. In order to continue operating the pipeline after a significant environmental event, an overall approach needs to be developed to (a) understand the system loading and the associated integrity, (b) develop a series of criteria staging the sequence of actions following an event that will verify the pipeline integrity, and (c) ensure that the integrity-management solution is simple and easy to understand so that it can be implemented consistently. For a complex loading scenario, one of the main challenges is predicting the controlling parameter(s) that drive the global integrity of these systems. In such scenarios, the presence of numerous parameters makes technical modeling and prediction arduous. To address such scenarios, it is first of all crucial to understand the baseline environmental data and the other critical design inputs. If the "design environmental baseline" has shifted from its original condition (due to large events such as storms), the dynamics of the system change. To address this problem, thorough modeling and assessment of the in-situ condition are essential, and a robust calibration method is required to predict the future response path and, therefore, the expected pipeline condition. The study further compares the planned integrity-management solutions with field data to validate the efficiency of the predicted scenarios. By including real field-data feedback in the modeling method, balanced integrity solutions can be achieved, and the quantification of risks becomes more practical and actionable.


2002 ◽  
Vol 5 (02) ◽  
pp. 126-134 ◽  
Author(s):  
R.O. Baker ◽  
F. Kuppe ◽  
S. Chugh ◽  
R. Bora ◽  
S. Stojanovic ◽  
...  

Summary Modern streamline-based reservoir simulators are able to account for actual field conditions such as 3D multiphase-flow effects, reservoir heterogeneity, gravity, and changing well conditions. A streamline simulator was used to model four field cases, with approximately 400 wells and 150,000 gridblocks. History-match run times were approximately 1 CPU hour per run, and the final history matches were completed in approximately one month per field. In all field cases, a high percentage of wells were history matched within the first two to three runs. Streamline simulation not only enables a rapid turnaround time for studies, but also serves as a complementary tool for resolving each studied field's unique characteristics. The primary reasons for faster history matching of permeability fields with 3D streamline technology, compared to conventional finite-difference (FD) techniques, are as follows:

- Streamlines clearly identify which producer-injector pairs communicate strongly (flow visualization).
- Streamlines allow the use of a very large number of wells, thereby substantially reducing the uncertainty associated with outer-boundary conditions.
- Streamline flow paths indicate that idealized drainage patterns do not exist in real fields. It is therefore unrealistic to extract symmetric elements out of a full field.
- The speed and efficiency of the method allow the solution of fine-scale and/or full-field models with hundreds of wells.
- The streamline simulator honors the historical total fluid injection and production volumes exactly, because there are no drawdown constraints for incompressible problems.
- The technology allows for easy identification of regions that require modifications to achieve a history match.
- Streamlines provide new flow information (i.e., well connectivity, drainage volumes, and well allocation factors) that cannot be derived from conventional simulation methods.

Introduction In the past, streamline-based flow simulation was quite limited in its application to field data. Emanuel and Milliken [1] showed how hybrid streamtube models were used to rapidly history match field data, arriving at both an updated geologic model and a current oil-saturation distribution for input to FD simulations. FD simulators were then used in forecast mode. Recent advances in streamline-based flow simulators have overcome many of the limitations of previous streamline and streamtube methods [2-6]. Streamline-based simulators are now fully 3D and account for multiphase gravity and fluid-mobility effects as well as compressibility effects. Another key improvement is that the simulator can now account for changing well conditions due to rate changes, infill drilling, producer-injector conversions, and well abandonments. With these advances, the technique is rapidly becoming a common tool for modeling and forecasting field cases. As the technology has matured, it has become available to a larger group of engineers and is no longer confined to research centers. Published case studies using streamline simulators now appear from a broad range of sources [7-12]. Because of the increasing interest in this technology, our first intent in this paper is to outline a methodology for where and how streamline-based simulation fits in the reservoir engineering toolbox. Our second objective is to provide insight into why we think the method works so well in some cases. Finally, we demonstrate the application of the technology to everyday field situations useful to mainstream exploitation or reservoir engineers, as opposed to specialized or research applications.

The Streamline Simulation Method For a more detailed mathematical description of the streamline method, please refer to the Appendix and the references. In brief, the streamline simulation method solves a 3D problem by decoupling it into a series of 1D problems, each solved along a streamline. Unlike FD simulation, streamline simulation transports fluids along a dynamically changing streamline-based flow grid rather than along the underlying Cartesian grid. The result is that large timestep sizes can be taken without numerical instabilities, giving the streamline method near-linear scaling of CPU efficiency vs. model size [6]. For very large models, streamline-based simulators can be one to two orders of magnitude faster than FD methods. The timestep size in streamline methods is not limited by a classic grid-throughput (CFL) condition but by how far fluids can be transported along the current streamline grid before the streamlines need to be updated. Factors that influence this limit include nonlinear effects such as mobility, gravity, and well-rate changes [5]. In real field displacements, historical well effects have a far greater impact on streamline-pattern changes than do mobility and gravity. Thus, the key is determining how much historical data can be upscaled without significantly affecting simulation results. For all cases considered here, 1-year timestep sizes were more than adequate to capture changes in historical data, gravity, and mobility effects. It is worth noting that upscaling historical data would also benefit run times for FD simulations. Where possible, both streamline (SL) and FD methods would then require similar simulation times. However, only for very coarse grids and specific problems is it possible to take 1-year timestep sizes with FD methods. As the grid becomes finer, CFL limitations begin to dictate the timestep size, which is much smaller than necessary to honor the nonlinearities. This is why streamline methods exhibit larger speed-up factors over FD methods as the number of grid cells increases.
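The decoupling described above reduces the flow problem on each streamline to a 1D transport equation in time-of-flight coordinates. The sketch below shows one explicit upwind step for such a 1D problem with a simple quadratic fractional-flow function; it is an illustrative toy under assumed forms, not the scheme of any particular commercial simulator.

```python
import numpy as np

def transport_step(sw, f_w, dt, d_tau, sw_inj=1.0):
    """One explicit upwind step of dSw/dt + dF(Sw)/dtau = 0 along a single
    streamline, where tau is the time of flight. Illustrative toy only."""
    flux = f_w(sw)
    sw_new = sw.copy()
    sw_new[1:] -= dt / d_tau * (flux[1:] - flux[:-1])   # interior upwind update
    sw_new[0] -= dt / d_tau * (flux[0] - f_w(sw_inj))   # injector boundary
    return np.clip(sw_new, 0.0, 1.0)

# Fractional flow from quadratic relative permeabilities (assumed form)
f_w = lambda s: s**2 / (s**2 + (1.0 - s)**2)

sw = np.full(100, 0.2)            # initial water saturation along the line
for _ in range(200):              # march the displacement front forward
    sw = transport_step(sw, f_w, dt=0.004, d_tau=0.01)
```

In the full method, the 3D pressure field is solved periodically, streamlines are retraced, and saturations are mapped between the streamline grid and the Cartesian grid; the large allowable timestep comes from the fact that this 1D update is not bound by the Cartesian-grid CFL limit.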


Fuels ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 286-303
Author(s):  
Vuong Van Pham ◽  
Ebrahim Fathi ◽  
Fatemeh Belyadi

The success of machine learning (ML) techniques implemented in different industries relies heavily on operator expertise and domain knowledge, which are used to manually choose an algorithm and set its parameters for a given problem. Because model selection and parameter tuning are manual, the quality of this process cannot be quantified or evaluated, which in turn limits the ability to perform comparison studies between different algorithms. In this study, we propose a new hybrid approach for developing machine learning workflows that automates algorithm selection and hyperparameter optimization. The proposed approach provides a robust, reproducible, and unbiased workflow that can be quantified and validated using different scoring metrics. We used the most common workflows implemented in applications of artificial intelligence (AI) and ML to engineering problems, including grid/random search, Bayesian search and optimization, and genetic programming, and compared them with our new hybrid approach, which integrates the Tree-based Pipeline Optimization Tool (TPOT) with Bayesian optimization. The performance of each workflow is quantified using scoring metrics such as R2 correlation and mean square error (MSE). For this purpose, actual field data obtained from 1567 gas wells in the Marcellus Shale, with 121 features spanning reservoir, drilling, completion, stimulation, and operation, were tested using the proposed workflows. The new hybrid workflow was then used to evaluate the type well for Marcellus shale gas production. In conclusion, our automated hybrid approach showed significant improvement over the other workflows on both scoring metrics. It provides a practical tool that supports automated model and hyperparameter selection, tested on real field data, and can be applied to different engineering problems using AI and ML. The new hybrid model was also tested in a real field and compared with conventional type wells developed by field engineers. The type well of the field is very close to the P50 predictions for the field, which reflects the success of the field's completion design by the field engineers. The analysis also indicates that average field production could have been improved by 8% if shorter cluster spacing and higher proppant loading per cluster had been used during the frac jobs.
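The two-stage idea (automated pipeline selection by genetic programming, followed by Bayesian hyperparameter refinement) can be sketched as below, assuming the open-source tpot and scikit-optimize packages. The data are synthetic placeholders, and the gradient-boosting model in stage two merely stands in for whatever pipeline TPOT selects; this is not the paper's exact workflow.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from tpot import TPOTRegressor          # genetic-programming AutoML
from skopt import BayesSearchCV         # Bayesian hyperparameter search

# Placeholder data standing in for the wells-by-features table
X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Stage 1: automated algorithm/pipeline selection
tpot = TPOTRegressor(generations=5, population_size=20, scoring="r2",
                     random_state=0)
tpot.fit(X_tr, y_tr)

# Stage 2: Bayesian refinement of a candidate model's hyperparameters
# (a gradient-boosting regressor stands in for TPOT's winning pipeline)
search = BayesSearchCV(
    GradientBoostingRegressor(),
    {"n_estimators": (50, 500),
     "learning_rate": (1e-3, 0.3, "log-uniform"),
     "max_depth": (2, 8)},
    n_iter=25, cv=5, scoring="r2",
)
search.fit(X_tr, y_tr)
print("held-out R2:", search.score(X_te, y_te))
```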


2021 ◽  
Vol 11 (11) ◽  
pp. 5025
Author(s):  
David González-Peña ◽  
Ignacio García-Ruiz ◽  
Montserrat Díez-Mediavilla ◽  
Mª. Isabel Dieste-Velasco ◽  
Cristina Alonso-Tristán

Prediction of energy production is crucial for the design and installation of PV plants. In this study, five free and commercial software tools for predicting photovoltaic energy production are evaluated: RETScreen, Solar Advisor Model (SAM), PVGIS, PVSyst, and PV*SOL. The evaluation compares monthly and annual predictions of energy supplied to the national grid with real field data collected from three operating PV plants. The systems, all located in Castile and Leon (Spain), use three different mounting configurations: fixed mounting, horizontal-axis tracking, and dual-axis tracking. Operating data from 2008 to 2020 are used in the evaluation. Although the commercial software tools were easier to use and allowed the installations to be described in more detail, their results were not appreciably superior. Agreement in annual totals hid poor estimates within the year, because overestimates in some months were compensated by underestimates in others. This is reflected in the monthly results: the software overestimated production during the colder months and gave better estimates during the warmer months. In most cases the deviation was below 10% when annual results were analyzed. The accuracy of the software also decreased as the complexity of the installation increased from fixed mounting to dual-axis solar tracking.
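The compensation effect described above can be made concrete with a small sketch that compares signed monthly deviations against the annual figure; the numbers below are hypothetical, not the plants' data.

```python
import pandas as pd

# Hypothetical monthly data: software prediction vs. metered grid energy (kWh)
df = pd.DataFrame({
    "month": range(1, 13),
    "predicted": [310, 340, 420, 470, 520, 560, 590, 570, 480, 400, 320, 290],
    "measured":  [350, 360, 430, 450, 490, 530, 560, 550, 470, 410, 350, 320],
})

# Signed monthly deviation: overestimates (+) and underestimates (-) can cancel
df["dev_pct"] = 100 * (df["predicted"] - df["measured"]) / df["measured"]

annual_dev = (100 * (df["predicted"].sum() - df["measured"].sum())
              / df["measured"].sum())
print(df[["month", "dev_pct"]].round(1))
print(f"annual deviation: {annual_dev:.1f}% (small even when months are poor)")
```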


SPE Journal ◽  
2017 ◽  
Vol 22 (05) ◽  
pp. 1506-1518 ◽  
Author(s):  
Pedram Mahzari ◽  
Mehran Sohrabi

Summary Three-phase flow in porous media during water-alternating-gas (WAG) injection and the associated cycle-dependent hysteresis have been the subject of experimental and theoretical studies. In spite of attempts to develop models and simulation methods for WAG injection and three-phase flow, the current lack of a solid approach for handling hysteresis effects in simulating WAG-injection scenarios has led to misinterpretation of simulation outcomes at laboratory and field scales. In this work, using our improved methodology, the first cycle of the WAG experiments (the first waterflood and the subsequent gasflood) was history matched to estimate the two-phase relative permeabilities (krs) for oil/water and gas/oil. For subsequent cycles, the pertinent parameters of the WAG hysteresis model are included in the automatic-history-matching process to reproduce all WAG cycles together. The results indicate that history matching the whole WAG experiment leads to a significantly improved simulation outcome, which highlights the importance of two elements in evaluating WAG experiments: inclusion of the full WAG experiment in history matching, and use of a more representative set of two-phase krs, which originated from our new methodology for estimating two-phase krs from the first cycle of a WAG experiment. Because WAG-related parameters should be able to model any three-phase flow irrespective of the WAG scenario, in another exercise the tuned parameters obtained from a WAG experiment starting with water were used in a similar coreflood test (WAG starting with gas) to assess their predictive capability for simulating three-phase flow in porous media. After identifying shortcomings of existing models, the improved methodology was used to history match multiple coreflood experiments simultaneously to estimate parameters that can reasonably capture the processes taking place in WAG under different scenarios, that is, starting with water or with gas. The comprehensive simulation study performed here sheds some light on a consolidated methodology for estimating saturation functions that can simulate WAG injection under different scenarios.
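Estimating two-phase krs from the first WAG cycle is, at its core, a parameter-estimation problem. The sketch below fits a Corey-type water relative-permeability curve to synthetic "observed" points with a least-squares misfit; it stands in for the full coreflood history match, and all values, endpoints, and the functional form are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def corey_krw(sw, krw_max, n_w, swc=0.2, sor=0.2):
    """Corey-type water relative permeability on normalized saturation.
    Endpoints swc/sor are assumed, not measured values."""
    s = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)
    return krw_max * s**n_w

# "Observed" points standing in for data inferred from the first WAG cycle
rng = np.random.default_rng(0)
sw_obs = np.linspace(0.25, 0.75, 8)
krw_obs = corey_krw(sw_obs, krw_max=0.4, n_w=2.5) + 0.005 * rng.standard_normal(8)

# History-match-style fit: minimize misfit between model and observations
res = least_squares(
    lambda p: corey_krw(sw_obs, p[0], p[1]) - krw_obs,
    x0=[0.5, 2.0], bounds=([0.01, 1.0], [1.0, 6.0]),
)
print("fitted krw_max, n_w:", res.x)
```

In the paper's workflow, the forward model is a coreflood simulation rather than an analytic curve, and the WAG hysteresis parameters are added to the same misfit minimization for the later cycles.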


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, "Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator," by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17-19 March. The paper has not been peer reviewed. This paper presents a step-by-step workflow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity-analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term "motifs" that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability. Introduction The concept of rate-transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process.
The simulation history-matching workflow presented includes the following steps:

1. Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules.
2. Run an initial model.
3. Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics.
4. Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection.
5. Make RTA plots of the real and simulated production data (see the sketch after this summary).
6. Use the motifs presented in the paper to identify possible production mechanisms in the real data.
7. Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data.
8. Iterate Steps 5 through 7 to obtain a match in RTA trends.
9. Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions.

In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code. Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on matching fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are summarized very briefly in this section. Effective fracture toughness is the most important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture lengths and higher net pressures that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection-pressure data while simultaneously limiting fracture length. This scale-dependent toughness parameter is the most important parameter in determining fracture size.
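The RTA diagnostic referenced in Step 5 is commonly a rate-normalized-pressure plot against material-balance time, on which linear flow appears as a characteristic slope and departures hint at boundary effects or pressure-dependent permeability (the paper's "motifs"). The sketch below builds such a plot from hypothetical production data; it is a generic RTA construction, not the paper's simulator output.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical production history (the paper's data are not reproduced here)
t = np.linspace(1, 400, 200)          # producing time, days
q = 800 * t**-0.5                     # declining oil rate, STB/D
p_i, p_wf = 6000.0, 2500.0            # initial and flowing pressure, psi

rnp = (p_i - p_wf) / q                      # rate-normalized pressure
t_mb = np.cumsum(q) * (t[1] - t[0]) / q     # material-balance time, days

# Linear flow traces a half-slope line on this log-log plot; flattening or
# steepening at late time suggests boundary or depletion effects
plt.loglog(t_mb, rnp, label="RNP vs. material-balance time")
plt.xlabel("material-balance time, days")
plt.ylabel("RNP, psi/(STB/D)")
plt.legend()
plt.show()
```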


2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract Algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to MOGA to perform the field-level history match. Data misfits between the historical field data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or no further improvement is achieved. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of MOGA were assessed during the field (global) history matching, and the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history-matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history-match quality in a shorter turnaround time, and the geological realism of the model is retained for robust prediction and development planning.
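The generation-by-generation misfit reduction described above can be illustrated with a toy, single-objective genetic loop over a low-rank parameter vector (standing in for GCT coefficients). A real MOGA adds Pareto ranking across multiple objectives and a full reservoir simulator as the forward model; everything below, including the forward model, is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(theta):
    """Toy forward model mapping a 2-parameter vector to a 'production'
    series; a stand-in for a reservoir simulation run."""
    t = np.linspace(0.0, 1.0, 50)
    return theta[0] * t + theta[1] * np.sin(6.0 * t)

obs = forward(np.array([2.0, 0.5])) + 0.05 * rng.standard_normal(50)

def misfit(theta):
    """Data misfit between simulated and observed series."""
    return np.sum((forward(theta) - obs) ** 2)

# Toy genetic loop: rank by misfit, keep elites, mutate to form children
pop = rng.normal(0.0, 1.0, size=(40, 2))
for gen in range(30):
    scores = np.array([misfit(th) for th in pop])
    elites = pop[np.argsort(scores)[:10]]                     # selection
    children = elites[rng.integers(0, 10, 30)] + 0.1 * rng.normal(size=(30, 2))
    pop = np.vstack([elites, children])                       # next generation
print("best misfit:", min(misfit(th) for th in pop))
```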


SPE Journal ◽  
2018 ◽  
Vol 23 (05) ◽  
pp. 1496-1517 ◽  
Author(s):  
Chaohui Chen ◽  
Guohua Gao ◽  
Ruijian Li ◽  
Richard Cao ◽  
Tianhong Chen ◽  
...  

Summary Although it is possible to apply traditional optimization algorithms together with the randomized-maximum-likelihood (RML) method to generate multiple conditional realizations, the computational cost is high. This paper presents a novel method to enhance the global-search capability of the distributed-Gauss-Newton (DGN) optimization method and integrates it with the RML method to generate multiple realizations conditioned to production data synchronously. RML generates samples from an approximate posterior by minimizing a large ensemble of perturbed objective functions in which the observed data and the prior mean values of the uncertain model parameters have been perturbed with Gaussian noise. Rather than performing these minimizations in isolation, using large sets of simulations to evaluate the finite-difference approximations of the gradients used to optimize each perturbed realization, we use a concurrent implementation in which simulation results are shared among different minimization tasks whenever those results help a given task converge to its global minimum. To improve the sharing of results, we relax the accuracy of the finite-difference approximations of the gradients by using more widely spaced simulation results. To avoid becoming trapped in local optima, a novel method to enhance the global-search capability of the DGN algorithm is developed and integrated seamlessly with the RML formulation. In this way, we improve the quality of the RML conditional realizations that sample the approximate posterior. The proposed workflow is first validated with a toy problem and then applied to a real-field unconventional asset. Numerical results indicate that the new method is very efficient compared with traditional methods: hundreds of data-conditioned realizations can be generated in parallel within 20 to 40 iterations, and the computational cost (central-processing-unit usage) is reduced significantly compared with the traditional RML approach. The real-field case studies involve a history-matching study to generate history-matched realizations with the proposed method and an uncertainty quantification of production forecasting using those conditioned models. All conditioned models generate production forecasts that are consistent with real production data in both the history-matching period and the blind-test period. Therefore, the new approach can enhance the confidence level of the estimated-ultimate-recovery (EUR) assessment using production forecasts generated from all conditional realizations, resulting in significant business impact.
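For reference, the perturbed objective that each RML minimization task solves has the standard form below; the notation (g for the forward model, C_D and C_M for the data- and model-covariance matrices) is assumed here for illustration, not quoted from the paper.

```latex
O_i(m) = \tfrac{1}{2}\big(g(m)-d_{\mathrm{uc},i}\big)^{\top} C_D^{-1}\big(g(m)-d_{\mathrm{uc},i}\big)
       + \tfrac{1}{2}\big(m-m_{\mathrm{uc},i}\big)^{\top} C_M^{-1}\big(m-m_{\mathrm{uc},i}\big),
```

where the perturbed data and prior are drawn as d_uc,i = d_obs + ε_i with ε_i ~ N(0, C_D), and m_uc,i = m_pr + η_i with η_i ~ N(0, C_M). Each minimizer of O_i is one conditional realization; the paper's contribution is sharing simulation results across these minimizations and enhancing the DGN global search so the ensemble converges in far fewer simulations.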


2021 ◽  
Author(s):  
Son Hoang ◽  
Tung Tran ◽  
Tan Nguyen ◽  
Tu Truong ◽  
Duy Pham ◽  
...  

Abstract This paper reports a successful case study of applying machine learning to improve the history-matching process, making it easier, less time-consuming, and more accurate, by determining whether Local Grid Refinement (LGR) with a transmissibility multiplier is needed to history match gas-condensate wells producing from geologically complex reservoirs, and by determining the required LGR setup for those gas-condensate producers. History matching the Hai Thach gas-condensate production wells is extremely challenging due to the combined effects of condensate banking, a sub-seismic fault network, complex reservoir distribution and connectivity, uncertain hydrocarbon initially in place (HIIP), and a lack of PVT data for most reservoirs. In fact, for some wells, many trial simulation runs were conducted before it became clear that LGR with a transmissibility multiplier was required to obtain a good history match. To minimize this time-consuming trial-and-error process, machine learning was applied in this study to analyze production data using synthetic samples generated by a very large number of compositional sector models, so that the need for LGR could be identified before the history-matching process begins. The machine learning application could also determine the required LGR setup. The method helped provide better models in a much shorter time and greatly improved the efficiency and reliability of the dynamic-modeling process. More than 500 synthetic samples were generated using compositional sector models and divided into separate training and test sets. Multiple classification algorithms, such as logistic regression, Gaussian Naive Bayes, Bernoulli Naive Bayes, multinomial Naive Bayes, linear discriminant analysis, support vector machines, K-nearest neighbors, and decision trees, as well as artificial neural networks, were applied to predict whether LGR was used in the sector models. The best algorithm was found to be the decision-tree classifier, with 100% accuracy on the training set and 99% accuracy on the test set. The LGR setup (size of the LGR area and range of the transmissibility multiplier) was also predicted best by the decision-tree classifier, with 91% accuracy on the training set and 88% accuracy on the test set. The machine learning model was validated using actual production data and the dynamic models of history-matched wells. Finally, using the machine learning predictions on wells with poor history-matching results, their dynamic models were updated and significantly improved.
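The classification setup described above can be sketched with scikit-learn's decision-tree classifier. The features, labels, and the synthetic decision rule below are placeholders for the paper's sector-model samples, which are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for ~500 sector-model samples: 8 production-derived
# features (e.g., rate-decline shape, CGR trend, flowing pressure behavior)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # 1 = "LGR required" (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)

print("train accuracy:", accuracy_score(y_tr, clf.predict(X_tr)))
print("test  accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

The same pattern extends to the second task in the paper (predicting the LGR area size and transmissibility-multiplier range) by swapping in multi-class labels.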

