Machine Learning for Capillary Pressure Estimation

2021 ◽  
pp. 1-20
Author(s):  
A. A. Kasha ◽  
A. Sakhaee-Pour ◽  
I. A. Hussein

Summary Capillary pressure plays an essential role in controlling multiphase flow in porous media and is often difficult to estimate at subsurface conditions. The Leverett capillary pressure function J provides a convenient tool to address this shortcoming; however, its performance remains poor where there is large scatter in the scaled data. Our aim, therefore, was to reduce the gaps between J curves and to develop a method that allows accurate scaling of capillary pressure. We developed two mathematical expressions based on the permeability and porosity values of 214 rock samples taken from North America and the Middle East. Using these values as grouping features, we applied pattern-recognition algorithms from machine learning to cluster the original data into different groups. At each wetting-phase saturation, we quantified the gaps between the J curves by determining the ratio of the maximum J to the minimum J. Graphical maps were developed to identify the corresponding group for a new rock sample, after which the capillary pressure is estimated using the average J curve of the identified group and the permeability and porosity values of the rock sample. This method also performs better than the flow zone indicator (FZI) approach. The proposed technique was validated on six rock types and successfully generated average capillary pressure curves that capture the trends and values of data measured experimentally by mercury injection. Moreover, the proposed methodology provides an advanced, machine-learning-oriented approach for rock typing. In this paper, we provide a reliable and easy-to-use method for capillary pressure estimation in the absence of experimentally measured mercury injection data.
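A minimal sketch of this scaling-and-grouping workflow, assuming Python with NumPy and scikit-learn; k-means stands in for the paper's unspecified pattern-recognition algorithm, and the mercury-air interfacial term and unit-conversion constant are illustrative assumptions, not values taken from the paper:

```python
import numpy as np
from sklearn.cluster import KMeans

# Assumed mercury-air interfacial term: sigma = 485 mN/m, theta = 140 deg,
# so |sigma * cos(theta)| ~ 367 mN/m (illustrative, not from the paper).
SIGMA_COS_THETA = 367.0

def leverett_j(pc_psi, k_md, phi):
    """Dimensionless Leverett J; 0.2166 reconciles psi, md, and mN/m units."""
    return 0.2166 * (pc_psi / SIGMA_COS_THETA) * np.sqrt(k_md / phi)

def cluster_samples(k_md, phi, n_groups=4):
    """Group samples by permeability and porosity, the paper's grouping features."""
    X = np.column_stack([np.log10(k_md), phi])
    return KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(X)

def gap_ratio(j_curves):
    """Scatter at each saturation point: max(J) / min(J) across curves."""
    j = np.asarray(j_curves)  # shape (n_samples, n_saturation_points)
    return j.max(axis=0) / j.min(axis=0)

def pc_from_group_average(j_avg, k_md, phi):
    """Un-scale a group-average J curve back to Pc for a new sample."""
    return j_avg * SIGMA_COS_THETA / (0.2166 * np.sqrt(k_md / phi))
```

A gap_ratio that is smaller within each group than across the whole dataset is what would indicate the grouping has reduced the scatter in the scaled data.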

2021 ◽  
Author(s):  
Carlos Esteban Alfonso ◽  
Frédérique Fournier ◽  
Victor Alcobia

Abstract The determination of petrophysical rock types often lacks the inclusion of measured multiphase flow properties such as relative permeability curves. This is either the consequence of a limited number of SCAL relative permeability experiments, or due to the difficulty of linking relative permeability characteristics to standard rock types stemming from porosity, permeability, and capillary pressure. However, as soon as the number of relative permeability curves is significant, they can be processed with the machine learning methodology presented in this paper. The process leads to an automatic definition of relative-permeability-based rock types, from a precise and objective characterization of the curve shapes that would not be achieved with a manual process. It improves the characterization of petrophysical rock types prior to their use in static and dynamic modeling. The machine learning approach analyzes the shapes of the curves for their automatic classification. It develops a pattern-recognition process combining principal component analysis with an unsupervised clustering scheme. Beforehand, the set of relative permeability curves is pre-processed (normalized, integrating the irreducible water and residual oil saturations for the SCAL relative permeability samples from an imbibition experiment) and converted into fractional flow curves. Fractional flow curves proved to be an effective way to unify the relative permeabilities of the two fluid phases into a single curve that characterizes the displacement efficiency in the pore space of each rock sample. The methodology has been tested on a real dataset from a carbonate reservoir with a significant number of relative permeability curves available for the study, in addition to capillary pressure, porosity, and permeability data. The results evidenced the successful grouping of the relative permeability samples according to their fractional flow curves, which allowed the classification of the rocks from poorest to best displacement efficiency. This demonstrates the feasibility of the machine learning process for automatically defining rock types from relative permeability data. The fractional flow rock types were compared to rock types obtained from capillary pressure analysis. The results indicated a lack of correspondence between the two series of rock types, which testifies to the additional information brought by the relative permeability data in a rock-typing study. Our results also highlight the importance of having good-quality SCAL experiments, with an accurate characterization of the saturation end points, which are used for the normalization of the curves, and consistent sampling for both capillary pressure and relative permeability measurements.
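A hedged sketch of the curve-processing pipeline described above; the fluid viscosities, number of principal components, and number of rock types are placeholders, and k-means stands in for the unspecified unsupervised clustering scheme:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def normalized_sw(sw, swi, sor):
    """Map Sw onto [0, 1] between the irreducible-water and residual-oil end points."""
    return (sw - swi) / (1.0 - swi - sor)

def fractional_flow(krw, kro, mu_w=0.5, mu_o=2.0):
    """Water fractional flow, fw = 1 / (1 + (kro/krw) * (mu_w/mu_o)).
    Viscosities (cp) are placeholders; capillary and gravity terms omitted."""
    return 1.0 / (1.0 + (kro / np.maximum(krw, 1e-12)) * (mu_w / mu_o))

def fractional_flow_rock_types(fw_curves, n_components=3, n_types=4):
    """PCA on fw curves resampled to a common saturation grid,
    then unsupervised clustering of the principal-component scores."""
    scores = PCA(n_components=n_components).fit_transform(np.asarray(fw_curves))
    return KMeans(n_clusters=n_types, n_init=10, random_state=0).fit_predict(scores)
```

Working on principal-component scores rather than raw curves is what makes the classification depend on curve shape rather than on any single saturation point.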


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Olugbenga Falode ◽  
Edo Manuel

An understanding of the mechanisms by which oil is displaced from porous media requires knowledge of the role of wettability and capillary forces in the displacement process. The determination of representative capillary pressure (Pc) data and the wettability index of a reservoir rock is needed to predict the fluid distribution in the reservoir: the initial water saturation and the volume of reserves. This study shows how wettability alteration of an initially water-wet reservoir rock to oil-wet affects the properties that govern multiphase flow in porous media, that is, capillary pressure, relative permeability, and irreducible saturation. Initially water-wet reservoir core samples with porosities ranging from 23 to 33%, absolute air permeabilities of 50 to 233 md, and initial brine saturations of 63 to 87% were first tested as water-wet samples under an air-brine system. This yielded irreducible wetting-phase saturations of 19 to 21%. The samples were then tested after modifying their wettability to oil-wet using a surfactant obtained from glycerophthalic paint; the results yielded irreducible wetting-phase saturations of 25 to 34%. These experiments show that changing the wettability of the samples to oil-wet improved the recovery of the wetting phase.


2017 ◽  
Vol 21 (2) ◽  
pp. 1063-1076 ◽  
Author(s):  
James E. McClure ◽  
Amanda L. Dye ◽  
Cass T. Miller ◽  
William G. Gray

Abstract. As a tool for addressing problems of scale, we consider an evolving approach known as the thermodynamically constrained averaging theory (TCAT), which has broad applicability to hydrology. We consider the case of modeling two-fluid-phase flow in porous media, and we focus on issues of scale as they relate to various measures of pressure, capillary pressure, and state equations needed to produce solvable models. We apply TCAT to perform physics-based data assimilation to understand how the internal behavior influences the macroscale state of two-fluid porous medium systems. A microfluidic experimental method and a lattice Boltzmann simulation method are used to examine a key deficiency associated with standard approaches. In a hydrologic process such as evaporation, the water content will ultimately be reduced below the irreducible wetting-phase saturation determined from experiments. This is problematic since the derived closure relationships cannot predict the associated capillary pressures for these states. We demonstrate that the irreducible wetting-phase saturation is an artifact of the experimental design, caused by the fact that the boundary pressure difference does not approximate the true capillary pressure. Using averaging methods, we compute the true capillary pressure for fluid configurations at and below the irreducible wetting-phase saturation. Results of our analysis include a state function for the capillary pressure expressed as a function of fluid saturation and interfacial area.
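The closing result, capillary pressure as a state function of saturation and interfacial area, can be illustrated with a simple least-squares surface fit; the low-order polynomial form below is an assumption for illustration, not the closure derived in the paper:

```python
import numpy as np

def fit_pc_state_function(sw, awn, pc, degree=2):
    """Least-squares fit of Pc ~ f(Sw, a_wn) as a low-order polynomial surface.
    sw: wetting-phase saturation; awn: specific interfacial area; pc: averaged Pc."""
    sw, awn, pc = (np.asarray(v, dtype=float) for v in (sw, awn, pc))
    cols = [np.ones_like(sw)]
    for total in range(1, degree + 1):
        for j in range(total + 1):
            cols.append(sw ** (total - j) * awn ** j)  # all monomials of this order
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, pc, rcond=None)
    return coef  # evaluate predictions against the same monomial basis
```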


2021 ◽  
Vol 73 (01) ◽  
pp. 44-45
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper IPTC 19854, “Modeling and Prediction of Resistivity, Capillary Pressure, and Relative Permeability Using Artificial Neural Network,” by Mustafa Ba alawi, SPE, King Fahd University of Petroleum and Minerals; Salem Gharbi, SPE, Saudi Aramco; and Mohamed Mahmoud, King Fahd University of Petroleum and Minerals, prepared for the 2020 International Petroleum Technology Conference, Dhahran, Saudi Arabia, 13–15 January. The paper has not been peer reviewed. Copyright 2020 International Petroleum Technology Conference. Reproduced by permission.

Capillary pressure and relative permeability are essential measurements that directly affect multiphase fluid flow in porous media. The processes of measuring these parameters, however, are both time-consuming and expensive. Artificial-intelligence methods have achieved promising results in modeling extremely complicated phenomena in the industry. In the complete paper, the authors generate a model by using an artificial-neural-network (ANN) technique to predict both capillary pressure and relative permeability from resistivity.

Capillary Pressure and Resistivity

Capillary pressure and resistivity are two of the most significant parameters governing fluid flow in oil and gas reservoirs. Capillary pressure, the pressure difference over the interface of two immiscible fluids, traditionally is measured in the laboratory. The difficulty of its calculation is related to the challenges of maintaining reservoir conditions and the complexity of dealing with low-permeability and strongly heterogeneous samples. Moreover, unless the core materials are both available and representative, a restricted number of core plugs will lead to inadequate reservoir description. On the other hand, resistivity can be obtained either by laboratory analysis or through typical and routine well-logging tools in real time. Both parameters have similar attributes, given their dependence on wetting-phase saturation. Despite the many studies in the literature that are reviewed in the complete paper, improvement of capillary pressure and resistivity modeling remains an open research area.

Artificial Intelligence in Petroleum Engineering

In addition to labor and expense concerns, conventional methods to measure resistivity, capillary pressure, and relative permeability depend primarily, with the exception of resistivity from well logs, on the availability of core samples of the desired reservoir. The literature provides several attempts to model these parameters in order to avoid many of the requirements of measurement. However, the performance of many of these models is restricted by assumptions and constraints that require further processing. For example, the accuracy of prediction of capillary pressure from resistivity is highly dependent on the tested core permeability, which must be measured as well to achieve the full potential of the model.
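A minimal sketch of the kind of ANN mapping the paper describes, using scikit-learn's MLPRegressor; the input features, targets, and network size are assumptions for illustration, not the authors' architecture:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical features and targets: X rows are [log10(resistivity), Sw],
# y rows are [log10(Pc), krw, kro] from core measurements.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), activation="relu",
                 max_iter=5000, random_state=0),
)
# model.fit(X_train, y_train)
# pc_and_kr = model.predict(X_new)  # predictions for unseen resistivity data
```

Log-transforming resistivity and Pc is a common choice here because both span several orders of magnitude; it is a modeling assumption, not something stated in the article.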


2016 ◽  
Author(s):  
J. McClure ◽  
A. Dye ◽  
C. Miller ◽  
W. Gray

Abstract. The career of Professor Eric F. Wood has focused on the resolution of problems of scale in hydrologic systems. Within this context, we consider an evolving approach known as the thermodynamically constrained averaging theory (TCAT), which has broad applicability to hydrology. Specifically, we consider the case of modeling two-fluid-phase flow in porous media. Two-fluid flow processes in the subsurface are fundamentally important for a wide range of hydrologic processes, including the transport of water and air in the vadose zone and geological carbon sequestration. Mathematical models that describe these complex processes have long relied on empirical approaches that neglect important aspects of the system behavior. New data sources make it possible to access the true geometry of geologic materials and directly measure previously inaccessible quantities. This information can be exploited to support a new generation of theoretical models that are constructed based on rigorous multiscale principles for thermodynamics and continuum mechanics. The challenges to constructing a mature model are shown to involve issues of scale, consistency requirements, appropriate representation of operative physical mechanisms at the target scale of the model, and a robust structure to support model evaluation, validation, and refinement. We apply TCAT to perform physics-based data assimilation to understand how the internal behavior influences the macroscale state of two-fluid porous medium systems. Examples of a microfluidic experimental method and a lattice Boltzmann simulation method are used to examine a key deficiency associated with standard approaches. In a hydrologic process such as evaporation, the water content will ultimately be reduced below the irreducible wetting-phase saturation determined from experiments. This is problematic since the derived closure relationships cannot predict the associated capillary pressures for these states. In this work, we demonstrate that the irreducible wetting-phase saturation is an artifact of the experimental design, caused by the fact that the boundary pressure difference does not approximate the true capillary pressure. Using averaging methods, we measure the true capillary pressure for fluid configurations at and below the irreducible wetting-phase saturation. Results of our analysis include a state function for the capillary pressure expressed as a function of fluid saturation and interfacial area.
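A sketch of the averaging idea behind the "true" capillary pressure, assuming a segmented pore-scale pressure field such as a lattice Boltzmann result; the array conventions are illustrative:

```python
import numpy as np

def averaged_capillary_pressure(pressure, phase_id, wetting=0, nonwetting=1):
    """Intrinsic phase-averaged pressures over a segmented pore-scale field:
    Pc = <p_n>^n - <p_w>^w, which need not equal the boundary pressure
    difference, especially at and below the apparent irreducible saturation."""
    p = np.asarray(pressure, dtype=float).ravel()
    ids = np.asarray(phase_id).ravel()
    p_n = p[ids == nonwetting].mean()
    p_w = p[ids == wetting].mean()
    return p_n - p_w
```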


Author(s):  
Samir Bandyopadhyay Sr ◽  
Shawni Dutta

BACKGROUND In recent days, the Covid-19 coronavirus has had an immense impact on social and economic life around the world. The objective of this study is to determine whether it is feasible to use machine learning to evaluate how closely prediction results match the original data on confirmed, negative, released, and death cases of Covid-19. For this purpose, this paper proposes a verification method built on deep neural networks. In this framework, long short-term memory (LSTM) and gated recurrent unit (GRU) layers are combined to train on the dataset, and the predictions are tallied against the results reported by clinical doctors. The prediction results are validated against the original data using predefined metrics. The experimental results show that the proposed approach generates suitable results for this critical disease outbreak, and it helps doctors further verify cases of the virus. The outbreak of the coronavirus grows exponentially, making it difficult to control with limited clinical staff handling a huge number of patients within a reasonable time. It is therefore necessary to build an automated model, based on a machine learning approach, as a corrective measure applied after the decisions of clinical doctors. It could be a promising supplementary confirmation method for frontline clinical doctors. The proposed method has a high prediction rate and works fast for probable accurate identification of the disease. The performance analysis shows that the proposed method achieves a high rate of accuracy.

OBJECTIVE Validation of COVID-19 disease.

METHODS Machine learning.

RESULTS 90%.

CONCLUSIONS The combined LSTM-GRU-based RNN model provides comparatively better results for the prediction of confirmed, released, negative, and death cases on the data. This paper presented a novel method that can automatically recheck reported cases of COVID-19. The data-driven RNN-based model provides an automated tool for confirming and estimating the current position of the pandemic, assessing its severity, and assisting government and health workers in sound decision-making policy. It could be a promising supplementary rechecking method for frontline clinical doctors and is essential for improving the accuracy of the detection process.

CLINICALTRIAL 2020-04-03 3:22:36 PM
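A minimal sketch of a combined LSTM-GRU recurrent model in Keras, consistent with the architecture named in the conclusions; the window length, layer sizes, and training setup are assumptions, since the abstract does not specify them:

```python
from tensorflow import keras
from tensorflow.keras import layers

WINDOW, N_SERIES = 14, 4  # 14-day windows over the four daily case series

model = keras.Sequential([
    layers.Input(shape=(WINDOW, N_SERIES)),
    layers.LSTM(32, return_sequences=True),  # long-range temporal memory
    layers.GRU(16),                          # lighter recurrent summary
    layers.Dense(N_SERIES),                  # next-day value for each series
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X_windows, y_next_day, epochs=100, validation_split=0.2)
```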


2021 ◽  
Vol 60 (6) ◽  
pp. 5779-5796
Author(s):  
Nashat Maher ◽  
G.A. Elsheikh ◽  
W.R. Anis ◽  
Tamer Emara

Author(s):  
Dhamanpreet Kaur ◽  
Matthew Sobiesk ◽  
Shubham Patil ◽  
Jin Liu ◽  
Puran Bhagat ◽  
...  

Abstract

Objective: This study seeks to develop a fully automated method of generating synthetic data from a real dataset that could be employed by medical organizations to distribute health data to researchers, reducing the need for access to real data. We hypothesize the application of Bayesian networks will improve upon the predominant existing method, medBGAN, in handling the complexity and dimensionality of healthcare data.

Materials and Methods: We employed Bayesian networks to learn probabilistic graphical structures and simulated synthetic patient records from the learned structure. We used the University of California Irvine (UCI) heart disease and diabetes datasets as well as the MIMIC-III diagnoses database. We evaluated our method through statistical tests, machine learning tasks, preservation of rare events, disclosure risk, and the ability of a machine learning classifier to discriminate between the real and synthetic data.

Results: Our Bayesian network model outperformed or equaled medBGAN in all key metrics. Notable improvement was achieved in capturing rare variables and preserving association rules.

Discussion: Bayesian networks generated data sufficiently similar to the original data with minimal risk of disclosure, while offering additional transparency, computational efficiency, and capacity to handle more data types in comparison to existing methods. We hope this method will allow healthcare organizations to efficiently disseminate synthetic health data to researchers, enabling them to generate hypotheses and develop analytical tools.

Conclusion: We conclude the application of Bayesian networks is a promising option for generating realistic synthetic health data that preserves the features of the original data without compromising data privacy.
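A hedged sketch of the structure-learn-then-sample approach described in Materials and Methods, assuming the pgmpy library and discrete (or discretized) records; the paper's actual implementation is not specified in the abstract:

```python
import pandas as pd
from pgmpy.estimators import BicScore, HillClimbSearch, MaximumLikelihoodEstimator
from pgmpy.models import BayesianNetwork
from pgmpy.sampling import BayesianModelSampling

def synthesize(df: pd.DataFrame, n_records: int) -> pd.DataFrame:
    # Learn a directed graphical structure from the real records...
    dag = HillClimbSearch(df).estimate(scoring_method=BicScore(df))
    model = BayesianNetwork(dag.edges())
    # ...fit its conditional probability tables...
    model.fit(df, estimator=MaximumLikelihoodEstimator)
    # ...then sample synthetic patients from the learned joint distribution.
    return BayesianModelSampling(model).forward_sample(size=n_records)
```

Because the synthetic records are drawn from the learned joint distribution rather than copied from real patients, the approach trades a small loss of fidelity for the reduced disclosure risk the abstract reports.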

