Theory-Guided Data Science: A Petrophysical Case Study from the Diyab Formation

2021
Author(s):  
Nicolas Leseur
Alfredo Mendez
Muhammad Zeeshan Baig
Pierre-Olivier Goiran

Abstract A practical example of a theory-guided data science case study is presented to evaluate the potential of the Diyab formation, an Upper Jurassic interval and source rock of some of the largest reservoirs in the Arabian Peninsula. A workflow based on a three-step approach combining the physics of logging-tool response with a probabilistic machine-learning algorithm was undertaken to evaluate four wells of the prospect. First, a core-calibrated multi-mineral model was established on a concept well for which an extensive suite of logs and core measurements had been acquired. To transfer the knowledge gained from this physics-driven interpretation onto the other, data-scarce wells, the relationship between the output rock and fluid volumes and the input log responses was then learned by means of a Gaussian Process Regression (GPR). Finally, once trained on the key well, the probabilistic algorithm was deployed on the three remaining wells to predict reservoir properties, quantify resource potential, and estimate volumetric-related uncertainties. The physics-informed machine-learning approach introduced in this work was found to provide results that match the majority of the available core data, while discrepancies could generally be explained by the occurrence of laminations whose thickness is below the resolution of nuclear logs. Overall, the GPR approach seems to enable an efficient transfer of knowledge from data-rich key wells to other, data-scarce wells. As opposed to a more conventional formation evaluation process, which is carried out independently of the key well, the present approach ensures that the final petrophysical interpretation reflects and benefits from the insights and the physics-driven coherency achieved at the key-well location.
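As an illustration of the transfer step, the following is a minimal sketch, assuming the key-well multi-mineral outputs and log responses are available as arrays (the names logs_key, volumes_key, and logs_offset are placeholders, not from the paper) and using scikit-learn's GaussianProcessRegressor as a stand-in for the GPR described in the abstract:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.preprocessing import StandardScaler

# Hypothetical inputs: log responses (e.g. GR, density, neutron, resistivity)
# and multi-mineral model outputs (rock/fluid volumes) at the key well.
logs_key = np.random.rand(500, 4)       # placeholder for the key-well log suite
volumes_key = np.random.rand(500)       # placeholder for one output volume (e.g. kerogen)
logs_offset = np.random.rand(300, 4)    # placeholder for a data-scarce offset well

scaler = StandardScaler().fit(logs_key)

# Train a GPR (one per output volume) on the physics-driven multi-mineral interpretation.
kernel = RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=1e-2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(scaler.transform(logs_key), volumes_key)

# Deploy on the offset well: the posterior mean gives the predicted volume,
# the posterior standard deviation feeds the volumetric uncertainty estimate.
mean, std = gpr.predict(scaler.transform(logs_offset), return_std=True)
```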

Author(s):  
Wang Han
Xiaoling Zhang
Xiesi Huang
Haiqing Li

This paper presents a time-dependent reliability estimation method for engineering systems based on machine learning and simulation. Due to the stochastic nature of environmental loads and internal excitations, the physics of failure for a mechanical system is complex, and it is challenging to include uncertainties in the physical modeling of failure over the engineered system's life cycle. In this paper, an efficient time-dependent reliability assessment framework for mechanical systems is proposed using a machine learning algorithm that considers stochastic dynamic loads in the mechanical system. Firstly, the stochastic external loads of the mechanical system are analyzed and the finite element model is established. Secondly, the physics-of-failure mode of the mechanical system at a given time is analyzed, and the distribution of realizations over time under each load condition is calculated. Then, the distribution of fatigue life can be obtained based on high-cycle fatigue theory. To reduce the calculation cost, a machine learning algorithm is utilized for physical modeling of failure by integrating uniform design and Gaussian process regression. The probabilistic fatigue life of the gear transmission system under different load conditions can be calculated, and the time-varying reliability of the mechanical system is further evaluated. Finally, numerical examples and the fatigue reliability estimation of a gear transmission system are presented to demonstrate the effectiveness of the proposed method.
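The surrogate-modelling step can be sketched as follows, assuming the expensive finite-element fatigue-life evaluation is wrapped in a placeholder function fe_fatigue_life and a Latin hypercube stands in for the uniform design mentioned in the abstract; all names, bounds, and distributions are illustrative:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fe_fatigue_life(loads):
    """Placeholder for the expensive FE + high-cycle fatigue evaluation."""
    return 1e6 / (1.0 + loads[:, 0] ** 2 + 0.5 * loads[:, 1] ** 2)

# Space-filling design over the stochastic load space (e.g. torque, speed),
# keeping the number of expensive FE runs small.
sampler = qmc.LatinHypercube(d=2, seed=0)
design = qmc.scale(sampler.random(30), l_bounds=[0.5, 0.5], u_bounds=[2.0, 2.0])
life = fe_fatigue_life(design)

# Cheap GPR surrogate of fatigue life as a function of the load condition.
gpr = GaussianProcessRegressor(ConstantKernel() * RBF([1.0, 1.0]), normalize_y=True)
gpr.fit(design, np.log(life))            # log-life is usually better behaved

# Monte Carlo over the load distribution: sample loads, query the surrogate,
# and estimate the reliability at a design life t as P(life > t).
loads_mc = np.random.lognormal(mean=0.0, sigma=0.2, size=(10000, 2))
life_mc = np.exp(gpr.predict(np.clip(loads_mc, 0.5, 2.0)))
t_design = 2e5
reliability = np.mean(life_mc > t_design)
print("estimated reliability at t =", t_design, ":", reliability)
```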


Author(s):  
Sachin Dev Suresh
Ali Qasim
Bhajan Lal
Syed Muhammad Imran
Khor Siak Foo

The production of oil and natural gas contributes a significant amount of revenue to Malaysia, thereby strengthening the country's economy. The flow assurance industry faces several impediments to the smooth operation of transmission pipelines, among which gas hydrate formation is the most important: it disrupts normal pipeline operation by plugging the line. Under high-pressure and low-temperature conditions, gas hydrates form as crystalline structures consisting of a network of hydrogen bonds between host molecules of water and guest molecules of the incoming gases. Industry uses different types of chemical inhibitors in pipelines to suppress hydrate formation. To help manage this problem, machine learning algorithms have been introduced as part of risk management strategies. The objective of this paper is to utilize a Machine Learning (ML) model, namely Gaussian Process Regression (GPR), a new approach being applied to mitigate the growth of gas hydrates. The input parameters are the concentration and pressure of Carbon Dioxide (CO2) and Methane (CH4) gas hydrates, whereas the output parameter is the Average Depression Temperature (ADT). The parameter values are taken from available data sets, which enables the GPR predictions to be assessed in terms of the Coefficient of Determination (R2) and Mean Squared Error (MSE). The results show that the GPR model provided the highest R2 values for the training and testing data, 97.25% and 96.71%, respectively. The MSE values for GPR were also the lowest for the training and testing data, 0.019 and 0.023, respectively.
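A minimal sketch of this setup, assuming a small synthetic data set in place of the published hydrate data (the feature ranges, train/test split, and kernel are assumptions, not taken from the paper), with scoring that mirrors the reported R2 and MSE metrics:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical dataset: [concentration (wt%), pressure (MPa)] -> ADT (K).
X = np.random.rand(200, 2) * [5.0, 10.0]
y = 0.8 * X[:, 0] + 0.1 * X[:, 1] + np.random.normal(0, 0.1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

gpr = GaussianProcessRegressor(Matern(length_scale=[1.0, 1.0]) + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)

# Report R2 and MSE on both splits, as in the abstract.
for name, Xs, ys in [("train", X_train, y_train), ("test", X_test, y_test)]:
    pred = gpr.predict(Xs)
    print(name, "R2 =", r2_score(ys, pred), "MSE =", mean_squared_error(ys, pred))
```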


2020
Vol 10 (1)
pp. 1-12
Author(s):  
Noura A. AlSomaikhi
Zakarya A. Alzamil

Microblogging platforms, such as Twitter, have become popular interaction media used widely for different daily purposes, such as communication and knowledge sharing. Understanding the behaviors and interests of these platforms' users has become a challenge whose solution can help in different areas such as recommendation and filtering. In this article, an approach is proposed for classifying Twitter users with respect to their interests based on their Arabic tweets. A Multinomial Naïve Bayes machine learning algorithm is used for this classification. The proposed approach has been developed as a web-based software system that is integrated with Twitter through the Twitter API. An experimental study on Arabic tweets has been conducted with the proposed system as a case study.
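A minimal sketch of the classification step, assuming tweets have already been collected via the Twitter API and labelled by interest (the example tweets and categories are placeholders); TF-IDF features feed scikit-learn's MultinomialNB, which corresponds to the Multinomial Naïve Bayes algorithm named above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Placeholder labelled Arabic tweets (interest categories are illustrative).
tweets = ["مباراة كرة القدم اليوم", "وصفة طبخ سهلة وسريعة", "نتائج الدوري هذا الأسبوع"]
labels = ["sports", "cooking", "sports"]

# Bag-of-words style features with a Multinomial Naive Bayes classifier.
clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(tweets, labels)

print(clf.predict(["أفضل وصفات الحلويات"]))  # expected: ['cooking']
```

In practice the Arabic text would first be normalized (diacritics removal, letter normalization, stop-word filtering) before vectorization.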


Author(s):  
Simen Eldevik
Stian Sætre
Erling Katla
Andreas B. Aardal

Abstract Operators of offshore floating drilling units have limited time to decide whether a drilling operation can continue as planned or needs to be postponed or aborted due to oncoming bad weather. With day-rates of several hundred thousand USD, small delays in the original schedule can amount to considerable costs. On the other hand, pushing the limits of the load capacity of the riser stack and wellhead may compromise the integrity of the well itself, and such a failure is not an option. Advanced simulation techniques may reduce uncertainty about how different weather scenarios influence the system's integrity, and thus increase the acceptable weather window considerably. However, real-time simulations are often not feasible, and the stochastic behavior of wave loads makes it difficult to simulate all relevant weather scenarios prior to the operation. This paper outlines and demonstrates an approach that utilizes probabilistic machine learning techniques to effectively reduce uncertainty. More specifically, we use Gaussian process regression to enable fast approximation of the relevant structural response from complex simulations. The probabilistic nature of the method adds the benefit of an estimated uncertainty in the prediction, which can be utilized to optimize how the initial set of relevant simulation scenarios is selected and to provide real-time estimates of the utilization and its uncertainty when combined with current weather forecasts. This enables operators to have an up-to-date forecast of the system's utilization, as well as sufficient time to trigger additional scenario-specific simulation(s) to reduce the uncertainty of the current situation. As a result, it reduces unnecessary conservatism and gives clear decision support for critical situations.
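The idea can be sketched as follows, assuming each weather scenario is parameterized by significant wave height and peak period and that the expensive riser/wellhead simulation is wrapped in a hypothetical function simulate_utilization; the GPR posterior standard deviation is what drives both the initial scenario selection and the real-time uncertainty estimate:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def simulate_utilization(scenarios):
    """Placeholder for the expensive time-domain riser/wellhead simulation."""
    hs, tp = scenarios[:, 0], scenarios[:, 1]
    return 0.1 * hs ** 1.5 * (1.0 + 0.05 * np.abs(tp - 8.0))

# Initial set of simulated weather scenarios (Hs in m, Tp in s).
scenarios = np.column_stack([np.random.uniform(1, 8, 25), np.random.uniform(4, 16, 25)])
utilization = simulate_utilization(scenarios)

gpr = GaussianProcessRegressor(RBF([1.0, 1.0]) + WhiteKernel(1e-3), normalize_y=True)
gpr.fit(scenarios, utilization)

# Real-time use: query the surrogate with the current weather forecast.
forecast = np.array([[5.5, 11.0]])
mean, std = gpr.predict(forecast, return_std=True)
print("predicted utilization:", mean[0], "+/-", std[0])

# If the predicted uncertainty is too large for a go/no-go decision, trigger
# an additional scenario-specific simulation and refit the surrogate.
if std[0] > 0.05:
    new_util = simulate_utilization(forecast)
    gpr.fit(np.vstack([scenarios, forecast]), np.append(utilization, new_util))
```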


2019
Vol 9 (15)
pp. 3037
Author(s):  
Isaac Machorro-Cano
Giner Alor-Hernández
Mario Andrés Paredes-Valverde
Uriel Ramos-Deonati
José Luis Sánchez-Cervantes
...  

Overweight and obesity are affecting productivity and quality of life worldwide. The Internet of Things (IoT) makes it possible to interconnect, detect, identify, and process data between objects or services to fulfill a common objective. The main advantages of IoT in healthcare are the monitoring, analysis, diagnosis, and control of conditions such as overweight and obesity and the generation of recommendations to prevent them. However, the objects used in the IoT have limited resources, so it has become necessary to consider other alternatives, such as machine learning, to analyze the data generated by monitoring, analysis, diagnosis, control, and recommendation generation. This work presents PISIoT: a machine learning and IoT-based smart health platform for the prevention, detection, treatment, and control of overweight and obesity, and other associated conditions or health problems. The Weka API and the J48 machine learning algorithm were used to identify critical variables and classify patients, while Apache Mahout and RuleML were used to generate medical recommendations. Finally, to validate the PISIoT platform, we present a case study on the prevention of myocardial infarction in elderly patients with obesity by monitoring biomedical variables.
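The abstract names Weka's J48 (a C4.5 decision-tree implementation); a rough Python analogue using scikit-learn's DecisionTreeClassifier with the entropy criterion is sketched below, with biomedical variables and class labels that are purely illustrative, not taken from the platform:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Placeholder biomedical variables monitored by the platform:
# [BMI, systolic blood pressure (mmHg), heart rate (bpm)]
X = np.array([[22.0, 118, 72], [31.5, 142, 88], [27.8, 130, 80], [35.2, 150, 95]])
y = ["normal", "at_risk", "overweight", "at_risk"]   # illustrative classes

# Entropy-based tree, analogous to the information-gain splitting used by J48.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["bmi", "sys_bp", "heart_rate"]))

# A classified patient can then be matched against recommendation rules
# (the platform itself uses Apache Mahout and RuleML for that step).
print(tree.predict([[33.0, 147, 90]]))
```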


Author(s):  
Satwik P M and Dr. Meenatchi Sundram

In this research article, we present a new approach for flood prediction using an advanced machine learning algorithm from the neural network class that is well suited to data-intensive operations and predictive analytics. The article discusses the flood-occurrence prediction and evaluation process in detail. We compare the proposed approach with many existing algorithms, and the work is contrasted with a range of related research approaches. Compared with previous research, the Neural Turing networks were observed to predict rainfall- and flood-based disasters over consecutive year counts of 10, 15, and 20 with 93.8% accuracy. The research is analyzed with various parameters and compared with other studies implemented with other machine learning algorithms, and the idea of the research is described and evaluated with different evaluation parameters, including the number of iterations or epochs.
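As a loose illustration only: the sketch below uses a plain LSTM classifier rather than the memory-augmented (Neural Turing) network the authors describe, with synthetic rainfall windows standing in for the real data, to show how the epoch-wise accuracy of a flood-occurrence predictor would be tracked:

```python
import numpy as np
from tensorflow import keras

# Placeholder data: rainfall windows of length 10 -> flood occurrence (0/1).
X = np.random.rand(500, 10, 1)
y = (X.sum(axis=(1, 2)) > 5.0).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(10, 1)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Accuracy is tracked per epoch, mirroring the epoch-wise evaluation mentioned above.
history = model.fit(X, y, epochs=20, validation_split=0.2, verbose=0)
print("final validation accuracy:", history.history["val_accuracy"][-1])
```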

