Studies on production, optimization and machine learning-based prediction of biosurfactant from Debaryomyces hansenii CBS767

Author(s):  
J. Kaur ◽  
S. Kaur ◽  
M. Kumar ◽  
P. Krishnan ◽  
J. Kaur ◽  
...  
2021 ◽  
Author(s):  
Subba Ramarao Rachapudi Venkata ◽  
Nagaraju Reddicharla ◽  
Shamma Saeed Alshehhi ◽  
Indra Utama ◽  
Saber Mubarak Al Nuimi ◽  
...  

Abstract Mature hydrocarbon fields deteriorate continuously, and the selection of well interventions becomes a critical task when the objective is to achieve higher business value. Time-consuming simulation models and the classical decision-making approach make it difficult to rapidly identify the best underperforming wells and potential rig and rig-less candidates. The objective of this paper is therefore to demonstrate an automated solution with data-driven machine learning (ML) and AI-assisted workflows that prioritizes intervention opportunities capable of delivering a higher sustainable oil rate and profitability. The solution consists of establishing a customized database using inputs from various sources, including production and completion data, flat files, and simulation models. Data gathering and the associated technical and economic calculations were automated to eliminate repetitive, low-value tasks. The second layer of the solution comprises tailor-made workflows that analyze well performance, logs, and output from simulation models (static reservoir model, well models) together with historical events. These workflows combine current best practices for an integrated assessment of subsurface opportunities through analytical computations with machine-learning-driven techniques that rank the well intervention opportunities while accounting for implementation complexity. The outcome of the automated process is a comprehensive list of future well intervention candidates, such as well conversion to gas lift, water shutoff, stimulation, and nitrogen kick-off opportunities. The opportunity ranking is completed with an AI-assisted scoring system that takes technical, financial, and implementation-risk scores as input. In addition, intuitive dashboards were built and tailored with the involvement of management and engineering departments to track the opportunity maturation process. The advisory system has been implemented and tested in a giant mature field with over 300 wells. The solution identified more techno-economically feasible opportunities within hours instead of weeks or months, with a reduced risk of failure and consequently an improved economic success rate. The first set of opportunities is under implementation, with an expected gain of 2.5 MM$ within the first year and recurring gains expected in subsequent years. The ranked opportunities are incorporated into the business plan, RMP plans, and the drilling and workover schedule in accordance with field development targets. The advisory system helps maximize profitability, minimize CAPEX and OPEX, and increases the utilization of production optimization models by 30%. The system is currently implemented in one ADNOC Onshore field and is expected to be scaled to other fields based on consistent value creation. A hybrid approach combining physics- and machine-learning-based solutions led to the development of automated workflows that identify and rank inactive strings, well conversion to gas lift candidates, and underperforming candidates, resulting in successful cost optimization and production gains.
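The scoring-and-ranking idea described above can be illustrated with a minimal sketch, assuming a simple weighted combination of normalized technical, financial, and implementation-risk scores; the weights, column names, and candidate data below are hypothetical and not taken from the paper.

```python
# Minimal sketch of an opportunity-ranking score. The weights and columns
# (technical_score, financial_score, risk_score) are illustrative assumptions;
# the paper's actual AI-assisted scoring system is not disclosed.
import pandas as pd

def rank_opportunities(df: pd.DataFrame,
                       w_technical: float = 0.4,
                       w_financial: float = 0.4,
                       w_risk: float = 0.2) -> pd.DataFrame:
    """Combine technical, financial and implementation-risk scores
    (each assumed normalized to 0-1) into a single ranking score."""
    df = df.copy()
    # Higher technical/financial scores raise the rank; higher risk lowers it.
    df["composite_score"] = (w_technical * df["technical_score"]
                             + w_financial * df["financial_score"]
                             - w_risk * df["risk_score"])
    return df.sort_values("composite_score", ascending=False)

# Hypothetical candidate list: gas-lift conversion, water shutoff, stimulation.
candidates = pd.DataFrame({
    "well": ["W-101", "W-214", "W-330"],
    "intervention": ["gas_lift_conversion", "water_shutoff", "stimulation"],
    "technical_score": [0.8, 0.6, 0.9],
    "financial_score": [0.7, 0.9, 0.5],
    "risk_score": [0.3, 0.5, 0.2],
})
print(rank_opportunities(candidates))
```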


2021 ◽  
Author(s):  
Jimmy Thatcher ◽  
Abdul Rehman ◽  
Ivan Gee ◽  
Morgan Eldred

Abstract Oil and gas extraction companies devote a vast amount of capital and expertise to production optimization. The scale and diversity of the information required for analysis is massive, often forcing the teams involved to trade precision against time. This paper presents a success story of how artificial intelligence (AI) is used to dynamically and efficiently optimize and predict the production of gas wells. In particular, we focus on the application of unsupervised machine learning to identify, under different potential constraints, the optimal production parameter settings that lead to maximum production. The machine learning model is supported by a decision support system that can enhance future drilling operations and help answer important questions, such as why a particular well or group of wells produces differently from others of the same type, or which parameters work on different wells under different conditions. The model can be extended to optimize within field constraints such as facility handling capacity, quotas, budget, or emissions. The methods used combine similarity measures with unsupervised machine learning techniques and proved effective in identifying wells and clusters of wells with similar production and behavioral profiles. The clusters were then used to identify the process path (specific drilling and completion processes, choke size, chemicals, etc.) most likely to result in optimal production, and to identify the variables with the greatest impact on production rate or cumulative production via an additional clustering of the principal characteristics of the wells. The data sets used to build these models include, but are not limited to, gas production data (daily volume), drilling data (well logs, fluid summary, etc.), completion data (frac, cement bond logs), and pre-production testing data (choke, pressure, etc.). Initial results indicate that the approach is feasible, on par in accuracy with traditional methods, and represents a novel, data-driven way of identifying the optimal parameter settings for desired production levels, with the ability to run forecasts and optimization scenarios at run-time. Using machine learning for production forecasting and optimization at run-time has immense value in its ability to augment domain expertise and produce detailed studies in a fraction of the time typically required by traditional approaches. Building on the same approach to optimize the field for maximum reliability or efficiency against a chosen parameter will be an invaluable feature for overall asset optimization.
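A minimal sketch of the unsupervised step described above, assuming standardized well features and k-means as the clustering technique (the abstract does not name a specific algorithm); all feature names and values are illustrative.

```python
# Minimal sketch: group wells with similar production and completion profiles,
# then inspect which variables separate the clusters. Data are synthetic.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
wells = pd.DataFrame({
    "cum_gas_90d": rng.uniform(50, 500, 60),          # MMscf, hypothetical
    "choke_size": rng.uniform(16, 64, 60),            # 1/64 inch
    "frac_stages": rng.integers(10, 40, 60),
    "initial_pressure": rng.uniform(3000, 8000, 60),  # psi
})

# Standardize so that a Euclidean similarity measure treats features equally.
X = StandardScaler().fit_transform(wells)

# Cluster wells into behavioral groups (k chosen for illustration only).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
wells["cluster"] = labels

# Per-cluster means hint at which parameters drive the production differences.
print(wells.groupby("cluster").mean().round(1))
```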


2021 ◽  
Author(s):  
Anton Gryzlov ◽  
Sergey Safonov ◽  
Muhammad Arsalan

Abstract Monitoring of production rates is essential for reservoir management, history matching, and production optimization. Traditionally, such information is provided by multiphase flow meters or test separators. The growing availability of data, combined with the rapid development of computational resources, has enabled digital techniques that estimate oil, gas, and water rates indirectly. This paper discusses the application of continuous deep learning models capable of reproducing multiphase flow dynamics for production monitoring purposes. The technique combines the time-evolution properties of a dynamical system with the ability of neural networks to quantitatively describe poorly understood multiphase phenomena, and can be considered a hybrid between data-driven and mechanistic approaches. The continuous latent ordinary differential equation (Latent ODE) approach is compared to other well-known machine learning methods, such as linear regression, an ensemble-based model, and a recurrent neural network. In this work, the application of latent ordinary differential equations to the problem of multiphase flow rate estimation is introduced. The considered example refers to a scenario in which the topside oil, gas, and water flow rates are estimated from the data of several downhole pressure sensors. The predictive capabilities of different machine learning and deep learning instruments are explored using simulated production data from a multiphase flow simulator. The results demonstrate the satisfactory performance of the continuous deep learning models relative to the other machine learning methods, with a normalized root mean squared error (RMSE) and mean absolute error (MAE) of prediction below 5%. Although the Latent ODE requires significant time to train, it outperforms the other methods on irregularly sampled time series, which makes it especially attractive for forecasting multiphase rates.
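A minimal sketch of a Latent ODE rate estimator in the spirit of the approach above, assuming PyTorch and the torchdiffeq package; the architecture, latent dimensions, and sensor counts are illustrative assumptions rather than the authors' exact model.

```python
# Minimal sketch: downhole pressures are encoded into a latent state whose
# continuous-time evolution is integrated by an ODE solver and decoded into
# oil, gas and water rates. Dimensions and layers are placeholders.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class LatentDynamics(nn.Module):
    """dz/dt = f(z), parameterized by a small MLP."""
    def __init__(self, latent_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, 32), nn.Tanh(),
                                 nn.Linear(32, latent_dim))

    def forward(self, t, z):
        return self.net(z)


class LatentODERateModel(nn.Module):
    def __init__(self, n_sensors=4, latent_dim=8, n_rates=3):
        super().__init__()
        self.encoder = nn.GRU(n_sensors, latent_dim, batch_first=True)
        self.dynamics = LatentDynamics(latent_dim)
        self.decoder = nn.Linear(latent_dim, n_rates)  # oil, gas, water

    def forward(self, pressures, t_eval):
        # pressures: (batch, time, n_sensors); t_eval can be irregularly spaced.
        _, h = self.encoder(pressures)
        z0 = h.squeeze(0)                        # initial latent state
        z_t = odeint(self.dynamics, z0, t_eval)  # (n_steps, batch, latent_dim)
        return self.decoder(z_t)                 # (n_steps, batch, n_rates)


# Hypothetical usage: 2 wells, 50 historical pressure samples, 4 sensors.
model = LatentODERateModel()
pressures = torch.randn(2, 50, 4)
t_eval = torch.tensor([0.0, 0.5, 1.3, 2.0])      # irregular forecast times
rates = model(pressures, t_eval)
print(rates.shape)  # torch.Size([4, 2, 3])
```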


2021 ◽  
Author(s):  
Aulia Ahmad Naufal ◽  
Sabrina Metra

Abstract Production optimization at the network level has proven to be an effective method of maximizing the production potential of a field at low capital cost. As it stands, however, it is a heavy process to start, with several challenges such as data quality issues and tedious, repetitive work processes to deploy and reuse a complete network model. Leveraging technologies from a flow assurance simulator, a Python Application Programming Interface (API) toolkit, open-source machine learning packages in Python, and a commercial visualization dashboard, this paper proposes a series of workflows to simplify model deployment and set up an automatic advisory system that provides insight to support an engineer's day-to-day decisions. Three steps were prepared to achieve a field-level automated optimization system. The first is the creation of a digital twin of the well and network models: to eliminate potential data errors, reduce the time consumed, and merge the various parts of the model into one, a scalable Python script was written. The second is an automated calibration workflow, created because performance issues also arise when matching the calibration of individual branches; a combination of technologies automates the daily data acquisition and model update from the production database and runs a supervised machine learning model to continuously calibrate the network model. The last step is a customizable optimization workflow based on field KPIs, whose results are derived from a daily optimization run. The results are made available in a personalized network surveillance dashboard that lets engineers make rapid decisions. From the first and second steps, the time consumed was reduced from 30 minutes/well to 10 minutes/well in the bulk well modelling workflow and from 2 hours to 10 minutes for the network model merge, assuming 100 wells in one network. Data integrity and consistency also improve greatly, since the wearisome manual input process is eliminated. In the last step, the model was successfully updated with the latest production data, and the well IPRs' liquid PI, reservoir pressure, and holdup factor were predicted by ML with more than 90% accuracy. For result delivery, the surveillance dashboard is populated daily with the network production data, flowing parameters, and operation recommendations. It is estimated that more than 90% of the time is saved by moving from manual individual runs to digital comprehensive optimization.
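A minimal sketch of the supervised calibration step described above, assuming scikit-learn and synthetic data; the three calibration targets (liquid PI, reservoir pressure, holdup factor) mirror the abstract, but the features, model choice, and values are placeholders.

```python
# Minimal sketch: predict well-model tuning parameters from daily production
# measurements, so the network model can be recalibrated automatically.
# All data are synthetic; the real workflow pulls from the production database.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 500
# Hypothetical daily features: rates, wellhead/downhole pressures, water cut.
X = rng.uniform(size=(n, 5))
# Hypothetical targets: [liquid_PI, reservoir_pressure (psi), holdup_factor].
y = np.column_stack([
    2.0 * X[:, 0] + 0.5 * X[:, 1],
    3000 + 1500 * X[:, 2],
    0.8 + 0.4 * X[:, 3],
]) + rng.normal(scale=0.05, size=(n, 3))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R2 per target:",
      r2_score(y_test, model.predict(X_test), multioutput="raw_values").round(3))
```

The predicted parameters would then be written back into the network model (for example through the simulator's Python API mentioned in the abstract) before the daily optimization run.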


2020 ◽  
Author(s):  
Luca Cadei ◽  
Gianmarco Rossi ◽  
Marco Montini ◽  
Piero Fier ◽  
Diletta Milana ◽  
...  
