A Robust Short-Term Oil Production under a Bow-Tie Uncertainty Set for the Gas Lift Performance Curve

SPE Journal
2021
pp. 1-13
Author(s):  
André Ramos ◽  
Carlos Gamboa ◽  
Davi Valladão ◽  
Bernardo K. Pagnoncelli ◽  
Tito Homem-de-Mello ◽  
...  

Summary: The method of continuous gas lift has been commonly used in the oil industry to enhance production. Existing optimization models consider an approximate performance curve anchored by production test data, often disregarding reservoir uncertainty. We propose a robust optimization model that jointly considers the most recent data and an uncertainty set for the reservoir pressure, a critical parameter that is usually not measured precisely. As a result, we obtain what we call a “bow-tie” uncertainty set for the performance curves, in which the performance uncertainty increases when we move away from the production test’s operational point. We test our model with real data from an offshore oil platform and compare it against a fully deterministic model. We show superior out-of-sample performance for the robust model under different probability distributions of the reservoir pressure.
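
As a rough illustration of the bow-tie idea (not the authors' model), the sketch below treats the well's performance curve as a function of an uncertain reservoir pressure whose effective range widens away from the production-test point, and picks the gas rate that maximizes the worst-case oil rate. The curve shape, parameter values, and names (`Q_TEST`, `P_NOM`, `oil_rate`) are all invented for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical quadratic gas-lift performance curve: oil rate vs. injected gas.
# All names and parameter values are illustrative assumptions, not the paper's model.
Q_TEST = 2.0          # gas rate at the most recent production test (MMscf/D)
P_NOM = 200.0         # nominal reservoir pressure (bar), imprecisely known
P_HALF_WIDTH = 10.0   # assumed interval half-width for reservoir pressure

def oil_rate(q_gas, p_res):
    """Toy performance curve: concave in gas rate, shifted by reservoir pressure."""
    return 0.5 * p_res * q_gas - 25.0 * q_gas**2

def worst_case_rate(q_gas):
    """Bow-tie logic: pressure uncertainty matters more the farther the operating
    point moves from the production test, where the curve was anchored."""
    spread = P_HALF_WIDTH * abs(q_gas - Q_TEST) / (Q_TEST + 1e-9)
    candidates = [P_NOM - spread, P_NOM + spread]
    return min(oil_rate(q_gas, p) for p in candidates)

# Robust allocation: maximize the worst-case oil rate over the admissible gas range.
res = minimize_scalar(lambda q: -worst_case_rate(q), bounds=(0.0, 5.0), method="bounded")
print(f"robust gas rate: {res.x:.3f}, worst-case oil rate: {-res.fun:.1f}")
```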

2020
Vol 0 (0)
Author(s):  
Zahra Amini Farsani ◽  
Volker J. Schmid

Abstract: Co-localization analysis is a popular method for quantitative analysis in fluorescence microscopy imaging. The localization of marked proteins in the cell nucleus allows deep insight into biological processes in the nucleus. Several metrics have been developed for measuring the co-localization of two markers; however, they depend on subjective thresholding of the background and on an assumption of linearity. We propose a robust method to estimate the bivariate distribution function of two color channels, from which their co- or anti-colocalization can be quantified. The proposed method combines the Maximum Entropy Method (MEM) with a Gaussian copula, and we call it the Maximum Entropy Copula (MEC). This new method can measure the spatial and nonlinear correlation of signals to determine marker colocalization in fluorescence microscopy images. The proposed method is compared with MEM for bivariate probability distributions, and the new colocalization metric is validated on simulated and real data. The results show that MEC can detect co- and anti-colocalization even in high-background settings. MEC can therefore be used as a robust tool for colocalization analysis.
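
To make the copula step concrete, the sketch below estimates a Gaussian-copula correlation between two synthetic channels from their ranks. It omits the maximum-entropy marginal estimation that distinguishes MEC, so it illustrates the copula component only, under our own assumptions.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)

# Synthetic "red" and "green" channel intensities with nonlinear dependence.
x = rng.gamma(2.0, 1.0, size=5000)
y = np.sqrt(x) + 0.3 * rng.standard_normal(5000)

def copula_correlation(a, b):
    """Map each channel to normal scores via its empirical CDF, then correlate."""
    u = rankdata(a) / (len(a) + 1)   # pseudo-observations in (0, 1)
    v = rankdata(b) / (len(b) + 1)
    return np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]

rho = copula_correlation(x, y)
print(f"copula correlation (positive => co-localization): {rho:.3f}")
```

Because the estimator works on ranks, it is invariant to monotone intensity transformations and needs no background threshold, which is the property the abstract highlights.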


2021
Author(s):  
Mohammed Ahmed Al-Janabi ◽  
Omar F. Al-Fatlawi ◽  
Dhifaf J. Sadiq ◽  
Haider Abdulmuhsin Mahmood ◽  
Mustafa Alaulddin Al-Juboori

Abstract: Artificial lift techniques are a highly effective way to counteract declining production, especially in mature oil fields. Gas lift is one of the oldest and most widely applied artificial lift methods, particularly in large fields, but the gas required for injection is a scarce and expensive resource, so optimally allocating the injection rate to each well is an important and nontrivial task. Conventional methods face major difficulties in networks with a large number of wells, multiple constraints, multiple objectives, and a limited amount of gas. This paper applies the Genetic Algorithm (GA) to the challenging task of optimally allocating the gas lift injection rate, using numerical modeling and simulation studies to maximize the oil production of a Middle Eastern oil field with 20 production wells and a limited amount of injection gas. The key objective of this study is to assess the performance of the field's wells after applying gas lift and the GA optimizer, and to compare the results against the same network lifted by electric submersible pumps (ESPs), giving a more accurate view of the practicability of gas lift optimization. The comparison is based on several measures; reservoir pressure and water cut sensitivity analyses are applied to assess the performance of the wells in the network throughout the life of the field. To complete the picture, an economic study compares the gas lift method with GA optimization against the ESP case and the naturally flowing case. Gas lift proved able to enhance the field's production, and the optimization noticeably improved the oil production rate for the same total amount of injected gas. The sensitivity analysis showed that gas lift is comparable to the other artificial lift method and handles reservoir pressure decline better. Economically, the CAPEX of the gas lift system was calculated to estimate the time to reach profitability; comparing it against OPEX showed that gas lift yields a higher income than either naturally flowing wells or ESP-lifted wells. Additionally, the paper presents the GA optimization model in a form that can be followed as a guide for optimizing gas injection rates in networks with many wells and a limited amount of gas.
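
A minimal sketch of the allocation step follows, assuming simple concave per-well performance curves in place of the paper's network simulation; the GA settings (population size, mutation scale, generation count) are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

N_WELLS, GAS_BUDGET = 20, 40.0            # 20 wells, 40 MMscf/D total gas (assumed)
a = rng.uniform(3.0, 6.0, N_WELLS)        # per-well curve coefficients (invented)
b = rng.uniform(0.5, 1.5, N_WELLS)

def total_oil(alloc):
    """Concave per-well gas-lift performance curves, summed over the field."""
    return np.sum(a * np.sqrt(alloc) - b * alloc)

def repair(alloc):
    """Project an allocation back onto the gas budget (nonnegative, fixed sum)."""
    alloc = np.clip(alloc, 0.0, None)
    return alloc * GAS_BUDGET / max(alloc.sum(), 1e-9)

pop = np.array([repair(rng.random(N_WELLS)) for _ in range(60)])
for _ in range(200):
    fitness = np.array([total_oil(ind) for ind in pop])
    elite = pop[np.argmax(fitness)].copy()          # elitism: keep the best so far
    # Tournament selection: pairwise contests pick the fitter individual.
    i, j = rng.integers(len(pop), size=(2, len(pop)))
    parents = np.where((fitness[i] > fitness[j])[:, None], pop[i], pop[j])
    # Uniform crossover with a shifted copy, then Gaussian mutation and repair.
    mask = rng.random(pop.shape) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    children += 0.1 * rng.standard_normal(pop.shape)
    pop = np.array([repair(c) for c in children])
    pop[0] = elite

best = max(pop, key=total_oil)
print(f"best field oil rate: {total_oil(best):.1f}")
```

The repair step is what enforces the "same total amount of injected gas" constraint: every candidate is rescaled onto the budget, so the GA searches only over feasible allocations.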


Author(s):  
Chi-Hua Chen ◽  
Fangying Song ◽  
Feng-Jang Hwang ◽  
Ling Wu

To generate a probability density function (PDF) for fitting probability distributions of real data, this study proposes a deep learning method that consists of two stages: (1) a training stage for estimating the cumulative distribution function (CDF) and (2) a performing stage for predicting the corresponding PDF. The CDFs of common probability distributions can be adopted as activation functions in the hidden layers of the proposed deep learning model for learning actual cumulative probabilities, and the derivative of the trained deep learning model can be used to estimate the PDF. To evaluate the proposed method, numerical experiments with single and mixed distributions are performed. The experimental results show that the values of both the CDF and the PDF can be precisely estimated by the proposed method.
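
A toy version of the two-stage procedure, under our own simplifying assumptions about architecture and training: fit a small network with a sigmoid output to the empirical CDF, then differentiate the trained network with autograd to read off the PDF.

```python
import torch

torch.manual_seed(0)
data, _ = torch.sort(torch.randn(2000))            # "real data": standard normal sample
ecdf = torch.arange(1, 2001, dtype=torch.float32) / 2001.0

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1), torch.nn.Sigmoid(),    # output constrained to (0, 1)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
x = data.unsqueeze(1)
for _ in range(500):                               # stage 1: learn the CDF
    opt.zero_grad()
    loss = torch.mean((net(x).squeeze() - ecdf) ** 2)
    loss.backward()
    opt.step()

# Stage 2: PDF = d/dx CDF, obtained by differentiating the trained network.
xq = torch.linspace(-3, 3, 7).unsqueeze(1).requires_grad_(True)
cdf = net(xq).sum()
(pdf,) = torch.autograd.grad(cdf, xq)
print(pdf.squeeze())                               # should approximate the N(0,1) density
```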


2021
Vol 1
pp. 67-74
Author(s):  
Iwan Febrianto ◽  
Nelson Saksono

The Gas Gathering Station (GGS) in field X processes gas from 16 (sixteen) wells before it is sent as sales gas to consumers. The pressure of the sixteen wells has declined since 2011, affecting the performance of the Acid Gas Removal Unit (AGRU). The GGS consists of 4 (four) main units, namely the production/test manifold, the separation unit, the AGRU, and the Dehydration Unit (DHU). The AGRU facility in field X is designed to remove the acid gas from a feed containing 21 mol% CO2 at a feed gas capacity of 85 MMSCFD. The decrease in reservoir pressure caused an increase in the feed gas temperature and in the water content of the wells. Based on a reconstruction of the design conditions in the simulation model, an amine composition of 0.3618 MDEA and 0.088 MEA (weight fraction) is required to obtain 5 mol% CO2 in the sales gas. The increase in feed gas temperature, up to 146°F, caused foaming due to condensation of the heavy hydrocarbon fraction, so a modification was necessary: adding a chiller to cool the feed gas to 60°F. Based on the simulation, the gas flow rate entering the AGRU could reach 83.7 MMSCFD, an increase in gas production of 38.1 MMSCFD along with 1,376 BPD of condensate. Economically, the chiller modification project was feasible, with an NPV of US$132,000,000, an IRR of 348.19%, a payout time (POT) of 0.31 years, and a PV ratio of 19.06.
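
For readers unfamiliar with the economic parameters quoted above, the sketch below computes NPV, IRR, and payout time for a hypothetical cash-flow stream; the flow numbers are invented for the example, and only the metric definitions are standard.

```python
import numpy as np
from scipy.optimize import brentq

def npv(rate, cashflows):
    """Discounted sum; cashflows[0] is the upfront (negative) investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows):
    """Discount rate at which NPV crosses zero."""
    return brentq(lambda r: npv(r, cashflows), -0.99, 10.0)

def payout_time(cashflows):
    """First time the cumulative (undiscounted) cash flow turns positive.
    Assumes the first flow is negative; interpolates linearly within the year."""
    cum = np.cumsum(cashflows)
    t = np.argmax(cum > 0)
    return (t - 1) + (-cum[t - 1]) / cashflows[t]

flows = [-7.0, 25.0, 25.0, 25.0]   # hypothetical US$MM per year, not the paper's data
print(f"NPV@10%: {npv(0.10, flows):.1f} MM, IRR: {irr(flows):.0%}, POT: {payout_time(flows):.2f} yr")
```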


Author(s):  
Giuseppe Buccheri ◽  
Fulvio Corsi

Abstract: Despite their effectiveness, linear models for realized variance neglect measurement errors on integrated variance and exhibit several forms of misspecification due to the inherent nonlinear dynamics of volatility. We propose new extensions of the popular approximate long-memory heterogeneous autoregressive (HAR) model apt to disentangle these effects and quantify their separate impact on volatility forecasts. By combining the asymptotic theory of the realized variance estimator with the Kalman filter and by introducing time-varying HAR parameters, we build new models that account for: (i) measurement errors (HARK), (ii) nonlinear dependencies (SHAR) and (iii) both measurement errors and nonlinearities (SHARK). The proposed models are simply estimated through standard maximum likelihood methods and are shown, both on simulated and real data, to provide better out-of-sample forecasts compared to standard HAR specifications and other competing approaches.
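
For orientation, the sketch below fits the baseline HAR specification, the model the HARK/SHAR/SHARK extensions build on, by ordinary least squares on simulated data standing in for realized variances: next-day RV is regressed on daily, weekly (5-day), and monthly (22-day) averages.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000
rv = np.empty(T); rv[0] = 1.0
for t in range(1, T):                      # toy persistent positive process
    rv[t] = 0.1 + 0.9 * rv[t - 1] + 0.1 * rng.standard_normal()
rv = np.abs(rv)

day = rv[21:-1]
week = np.array([rv[t - 4:t + 1].mean() for t in range(21, T - 1)])
month = np.array([rv[t - 21:t + 1].mean() for t in range(21, T - 1)])
X = np.column_stack([np.ones_like(day), day, week, month])
y = rv[22:]                                # next-day realized variance

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("HAR coefficients (const, daily, weekly, monthly):", np.round(beta, 3))
forecast = X[-1] @ beta                    # one-step-ahead forecast
print(f"one-step forecast: {forecast:.3f}")
```

The paper's point is that this regression treats observed RV as error-free and the coefficients as constant; HARK, SHAR, and SHARK relax those two assumptions via the Kalman filter and time-varying parameters.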


2006
Vol 134 (10)
pp. 3006-3014
Author(s):  
James A. Hansen ◽  
Cecile Penland

Abstract: The delicate (and computationally expensive) nature of stochastic numerical modeling naturally leads one to look for efficient and/or convenient methods for integrating stochastic differential equations. Concomitantly, one may wish to sensibly add stochastic terms to an existing deterministic model without having to rewrite that model. In this note, two possibilities in the context of the fourth-order Runge–Kutta (RK4) integration scheme are examined. The first approach entails a hybrid of deterministic and stochastic integration schemes. In these examples, the hybrid RK4 generates time series with the correct climatological probability distributions. However, it is doubtful that the resulting time series are approximate solutions to the stochastic equations at every time step. The second approach uses the standard RK4 integration method modified by appropriately scaling stochastic terms. This is shown to be a special case of the general stochastic Runge–Kutta schemes considered by Ruemelin and has global convergence of order one. Thus, it gives excellent results for cases in which real noise with small but finite correlation time is approximated as white. This restriction on the type of problems to which the stochastic RK4 can be applied is strongly compensated by its computational efficiency.
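
A minimal sketch of the second approach, assuming additive noise: one standard normal draw per step is scaled by 1/sqrt(dt) and carried through all four RK4 stages, here applied to an Ornstein–Uhlenbeck process whose stationary variance is known in closed form. Step size and parameters are our choices.

```python
import numpy as np

# dx = -THETA*x dt + SIGMA dW  (Ornstein-Uhlenbeck), integrated by RK4 with the
# same scaled noise term added to every stage, in the spirit of the
# Ruemelin-type scheme the note discusses.
rng = np.random.default_rng(3)
THETA, SIGMA, DT, N = 1.0, 0.5, 0.01, 10000

def f(x):
    return -THETA * x            # deterministic drift

x = np.empty(N); x[0] = 0.0
for n in range(N - 1):
    xi = rng.standard_normal()
    noise = SIGMA * xi / np.sqrt(DT)          # same scaled noise in every stage
    k1 = f(x[n]) + noise
    k2 = f(x[n] + 0.5 * DT * k1) + noise
    k3 = f(x[n] + 0.5 * DT * k2) + noise
    k4 = f(x[n] + DT * k3) + noise
    x[n + 1] = x[n] + DT * (k1 + 2*k2 + 2*k3 + k4) / 6.0

# OU stationary variance should be SIGMA^2 / (2*THETA) = 0.125.
print(f"sample variance: {x[2000:].var():.4f} (theory 0.1250)")
```

Note that the noise contributes dt * (SIGMA * xi / sqrt(dt)) = SIGMA * xi * sqrt(dt) per step, the correct Wiener-increment scaling, which is why no rewrite of the deterministic RK4 machinery is needed.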


2017
Vol 69 (2)
pp. 150-164
Author(s):  
Benmei Liu ◽  
Partha Lahiri

Unit-level logistic regression models with mixed effects have been used for estimating small area proportions in the literature. Normality is commonly assumed for the random effects. Nonetheless, real data often show significant departures from normality assumptions of the random effects. To reduce the risk of model misspecification, we propose an adaptive hierarchical Bayes estimation approach in which the distribution of the random effect is chosen adaptively from the exponential power class of probability distributions. The richness of the exponential power class ensures the robustness of our hierarchical Bayes approach against departure from normality. We demonstrate the robustness of our proposed model using both simulated and real data. The results suggest that the proposed model works reasonably well to incorporate potential kurtosis of the random effects distribution.
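
To see why the exponential power class is attractive here, the snippet below evaluates its density (scipy's `gennorm` parameterization) for three shape parameters: the family nests the Laplace (beta = 1) and normal (beta = 2) and ranges continuously from heavy to light tails, which is what lets the adaptive prior absorb kurtosis in the random effects.

```python
import numpy as np
from scipy.stats import gennorm

# Exponential power (generalized normal) density at a few points, for three
# shapes: beta = 1 (Laplace, heavy tails), 2 (normal), 4 (light tails).
x = np.linspace(-4, 4, 9)
for beta in (1.0, 2.0, 4.0):
    pdf = gennorm.pdf(x, beta)
    print(f"beta={beta}: tail density at |x|=4 -> {pdf[0]:.5f}")
```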


2010
Vol 22 (7)
pp. 1718-1736
Author(s):  
Shun-ichi Amari

Analysis of correlated spike trains is a hot topic of research in computational neuroscience. A general model of probability distributions for spikes includes too many parameters to be of use in analyzing real data. Instead, we need a simple but powerful generative model for correlated spikes. We developed a class of conditional mixture models that includes a number of existing models and analyzed its capabilities and limitations. We apply the model to dynamical aspects of neuron pools. When Hebbian cell assemblies coexist in a pool of neurons, the condition is specified by these assemblies such that the probability distribution of spikes is a mixture of those of the component assemblies. The probabilities of activation of the Hebbian assemblies change dynamically. We used this model as a basis for a competitive model governing the states of assemblies.
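
A toy generative sketch of the mixture idea, with invented numbers: the pool is in one of two assembly states, each defining independent per-neuron firing probabilities, and the overall spike distribution is the mixture of the two component distributions.

```python
import numpy as np

rng = np.random.default_rng(4)
N_NEURONS, N_TRIALS = 10, 5000
p_state = np.array([0.6, 0.4])                      # assembly activation probabilities
p_spike = np.array([
    np.r_[np.full(5, 0.8), np.full(5, 0.1)],        # assembly A: neurons 0-4 hot
    np.r_[np.full(5, 0.1), np.full(5, 0.8)],        # assembly B: neurons 5-9 hot
])

states = rng.choice(2, size=N_TRIALS, p=p_state)    # which assembly is active
spikes = rng.random((N_TRIALS, N_NEURONS)) < p_spike[states]

# The mixture structure shows up as positive correlation within an assembly
# and negative correlation across assemblies, despite conditional independence.
corr = np.corrcoef(spikes.T.astype(float))
print(f"within-assembly corr: {corr[0, 1]:.2f}, across: {corr[0, 5]:.2f}")
```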


Author(s):  
Yuri Popkov ◽  
Yuri Dubnov ◽  
Alexey Popkov

The paper is devoted to forecasting the COVID-19 epidemic with the novel method of randomized machine learning. This method is based on the idea of estimating probability distributions of model parameters and noises from real data. Entropy-optimal distributions correspond to the state of maximum uncertainty, which allows the resulting forecasts to be used as forecasts of the most "negative" scenario of the process under study. Because the resulting estimates of parameters and noises are probability distributions, they must be sampled, yielding an ensemble of trajectories that is then analyzed by statistical methods. In this work, for the purposes of such an analysis, the mean and median trajectories over the ensemble are calculated, as well as the trajectory corresponding to the mean of the distribution of the model parameters. The proposed approach is used to predict the total number of infected people using a three-parameter logistic growth model. The experiment is based on real COVID-19 epidemic data from several countries of the European Union. Its main goal is to demonstrate the entropy-randomized approach for predicting the epidemic process from real data near the peak. The significant uncertainty contained in the available real data is modeled by additive noise within 30%, which is used at both the training and prediction stages. The hyperparameters of the model are tuned on a testing dataset, with subsequent retraining of the model. It is shown that, on the same datasets, the proposed approach predicts the development of the epidemic more accurately than the standard approach based on the least-squares method.
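
The sketch below reproduces the shape of this setup, not the method itself: it fits the three-parameter logistic growth curve to synthetic data with roughly 30% multiplicative noise by least squares, then builds an ensemble of trajectories from uniformly perturbed parameters. The paper would instead draw from entropy-optimal parameter distributions; all numbers here are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

def logistic(t, K, r, t0):
    """Three-parameter logistic growth: capacity K, rate r, inflection time t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(60, dtype=float)                         # days since outbreak
truth = logistic(t, 100_000, 0.15, 35)
observed = truth * (1 + 0.3 * (rng.random(60) - 0.5))  # ~30% multiplicative noise

(K, r, t0), _ = curve_fit(logistic, t, observed, p0=(80_000, 0.1, 30))

# Ensemble of forecast trajectories out to day 100 from randomized parameters.
t_fore = np.arange(100, dtype=float)
ens = np.array([
    logistic(t_fore, K * u[0], r * u[1], t0 * u[2])
    for u in 1 + 0.3 * (rng.random((500, 3)) - 0.5)
])
print(f"fit: K={K:.0f}, r={r:.3f}, t0={t0:.1f}")
print(f"day-100 forecast: mean={ens[:, -1].mean():.0f}, median={np.median(ens[:, -1]):.0f}")
```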

