Imaging of phase changes and fluid movement in and between reservoirs at Teal South

2015 ◽  
Vol 3 (2) ◽  
pp. SP53-SP65 ◽  
Author(s):  
Nayyer Islam ◽  
Mohamed A. Ezawi ◽  
Wayne D. Pennington

We have reexamined the legacy and time-lapse poststack seismic data sets from the Teal South field in the Gulf of Mexico for insight into regional pressure changes from production at one reservoir and its effects on neighboring unproduced reservoirs. We support previous predictions of oil and gas leakage from neighboring reservoirs by providing direct evidence for leakage through 3D mapping of the hydrocarbons themselves. The use of the squared instantaneous amplitude as an attribute allowed visualization of the large amplitude changes while minimizing the appearance of noise. The use of translucency in the 3D time-lapse difference volumes assisted in identifying features of interest that had been unrecognized in earlier studies. For example, this investigation found that hydrocarbons appeared to have escaped from one small (unproduced) reservoir through its spill point, only to be trapped in a nearby structure, from which they ultimately escaped through that trap's spill point. Such production-induced fluid migration can occur over a period of a few years, not geologic time. Time-lapse studies such as the one presented here can be very helpful in identifying such fluid movement, particularly in highly porous and unconsolidated reservoirs that are highly sensitive to pore-fluid type and stress changes.
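As a rough illustration of the attribute used here, the squared instantaneous amplitude of a trace can be computed from the Hilbert envelope. The sketch below is a minimal version under assumed 1D trace arrays, not the authors' processing flow.

```python
import numpy as np
from scipy.signal import hilbert

def squared_instantaneous_amplitude(trace):
    """Squared Hilbert envelope of a seismic trace.

    Squaring the instantaneous amplitude emphasizes large 4D
    amplitude changes while de-emphasizing low-level noise.
    """
    analytic = hilbert(trace)      # complex analytic signal
    return np.abs(analytic) ** 2   # envelope squared

# Time-lapse attribute difference (hypothetical trace arrays `base`, `monitor`):
# diff = squared_instantaneous_amplitude(monitor) - squared_instantaneous_amplitude(base)
```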

Geophysics ◽  
2003 ◽  
Vol 68 (5) ◽  
pp. 1592-1599 ◽  
Author(s):  
Martin Landrø ◽  
Helene Hafslund Veire ◽  
Kenneth Duffaut ◽  
Nazih Najjar

Explicit expressions for computing saturation- and pressure-related changes from marine multicomponent time-lapse seismic data are presented. The necessary input is stacked PP and PS data for the baseline and repeat seismic surveys. Compared to earlier methods based on PP data only, this method is expected to be more robust because two independent measurements are used in the computation. Owing to the lack of real marine multicomponent time-lapse seismic data sets, the methodology is tested on synthetic data, illustrating the strengths and weaknesses of the proposed technique. Testing ten scenarios with various changes in pore pressure and fluid saturation, we find that the proposed 4D PP/PS technique is more robust in most cases than a 4D PP amplitude variation with offset (AVO) technique. The fit between estimated and "real" changes in water saturation and pore pressure was good for most cases. On average, the deviation is 8% for estimated saturation changes and 0.3 MPa for estimated pore-pressure changes; for PP AVO, the corresponding average errors are 9% and 1.0 MPa. The present method uses only 4D PP and PS amplitude changes in the calculations. It is straightforward to also include 4D traveltime shifts in the algorithm and, if reliable time shifts can be measured, this will most likely further stabilize the presented method.
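A minimal sketch of the underlying idea, not the paper's explicit expressions: if the 4D PP and PS amplitude changes are linearized in the saturation change dSw and pressure change dP, each sample yields a 2x2 linear system. The coefficients below are hypothetical placeholders that would, in practice, come from rock-physics calibration at the field.

```python
import numpy as np

# Hypothetical linearized rock-physics coefficients:
#   dR_PP ≈ a*dSw + b*dP
#   dR_PS ≈ c*dSw + d*dP
A = np.array([[0.060, -0.010],
              [0.040, -0.004]])

def invert_saturation_pressure(dR_pp, dR_ps):
    """Solve for (dSw, dP) from paired 4D PP and PS amplitude changes."""
    return np.linalg.solve(A, np.array([dR_pp, dR_ps]))

dSw, dP = invert_saturation_pressure(0.004, 0.002)
print(f"dSw = {dSw:.3f}, dP = {dP:.2f} MPa")
```

Using two independent measurements makes the system determined rather than relying on AVO behavior of PP data alone, which is the robustness argument the abstract makes.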


2021 ◽  
Author(s):  
Alexis Shakas ◽  
Nima Gholizadeh ◽  
Marian Hertrich ◽  
Quinn Wenning ◽  
Hansruedi Maurer ◽  
...  

The Bedretto Underground Laboratory for Geosciences and GeoEnergies, located in the Swiss Alps under more than 1 km of granitic overburden, offers a unique field site to study processes in fractured rock. Currently, a total of six boreholes are available: four are permanently instrumented with monitoring equipment and two are dedicated stimulation boreholes. One of the monitoring boreholes contains permanently packed-off intervals that record pressure changes and flow rate; the remaining three are instrumented with a variety of sensors, including fiber-optic micro-strain sensors, temperature monitoring, and permanent geophones and accelerometers. All monitoring boreholes are either sealed with packers or cemented, and only the stimulation boreholes allow outflow. Over a period of several weeks, we sealed the two stimulation boreholes and allowed the reservoir to approach ambient pressure conditions (more than 3 MPa at the wellhead) while monitoring its response. The pressure buildup manifests not only in the pressure data but also as stress changes in the reservoir. During a first depressurization phase, we quickly opened one borehole and subsequently performed time-lapse single-hole Ground Penetrating Radar (GPR) measurements. During a second depressurization phase, we continued the GPR measurements while opening the second borehole in a controlled manner. The changes in strain, pressure, and GPR reflectivity illuminate the response of the reservoir as it moves from ambient to atmospheric pressure at the wellhead, revealing processes such as wellbore storage, pore-pressure variations, and ultimately permeability changes in the reservoir.
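A schematic of the time-lapse GPR comparison, under assumptions of my own (generic differencing after per-trace normalization; array names and the normalization choice are not from the abstract):

```python
import numpy as np

def timelapse_gpr_difference(baseline, repeat):
    """Difference of repeat and baseline single-hole GPR sections.

    baseline, repeat: 2D arrays (traces x samples) acquired with the
    same geometry. Per-trace RMS normalization removes bulk gain
    differences before differencing, so residual amplitudes reflect
    reflectivity changes between the two acquisitions.
    """
    def rms_normalize(section):
        rms = np.sqrt(np.mean(section ** 2, axis=1, keepdims=True))
        return section / np.where(rms > 0.0, rms, 1.0)

    return rms_normalize(repeat) - rms_normalize(baseline)
```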


2009 ◽  
pp. 18-31
Author(s):  
G. Rapoport ◽  
A. Guerts

In this article, the global crisis of 2008-2009 is considered as a superposition of several regional crises that occurred simultaneously but for different reasons. They nonetheless have something in common: developed countries tend to maintain a high level of social security without increasing real production output. On the one hand, this policy has resulted in trade deficits and the partial destruction of market mechanisms. On the other hand, it has clashed with the desire of several oil- and gas-exporting countries to receive an exclusive price for their energy resources.


2021 ◽  
Author(s):  
Jakob Raymaekers ◽  
Peter J. Rousseeuw

Abstract. Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box-Cox and Yeo-Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center while a few outliers may deviate from it. The method compares favorably to existing techniques in an extensive simulation study and on real data.
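For orientation, the standard (non-robust) maximum-likelihood Yeo-Johnson fit is available in scikit-learn. The snippet below shows how a few injected outliers pull the estimated lambda, which is exactly the sensitivity the paper addresses; the paper's robust estimator itself is not part of scikit-learn.

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(0)
x_clean = rng.lognormal(mean=0.0, sigma=0.8, size=(500, 1))  # skewed feature
x_dirty = x_clean.copy()
x_dirty[:5] = 50.0                                           # a few gross outliers

for name, data in [("clean", x_clean), ("with outliers", x_dirty)]:
    pt = PowerTransformer(method="yeo-johnson", standardize=True)
    pt.fit(data)  # maximum-likelihood estimate of the transformation parameter
    print(f"{name}: estimated lambda = {pt.lambdas_[0]:.3f}")
```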


2020 ◽  
pp. 1-17
Author(s):  
Francisco Javier Balea-Fernandez ◽  
Beatriz Martinez-Vega ◽  
Samuel Ortega ◽  
Himar Fabelo ◽  
Raquel Leon ◽  
...  

Background: Sociodemographic data indicate a progressive increase in life expectancy and in the prevalence of Alzheimer's disease (AD), which has become one of the greatest public health problems. Its etiology involves both non-modifiable and modifiable factors. Objective: This study aims to develop a processing framework based on machine learning (ML) and optimization algorithms to study sociodemographic, clinical, and analytical variables, selecting the combination among them that best discriminates between controls and subjects with major neurocognitive disorder (MNCD). Methods: This research is based on an observational-analytical design. Two research groups were established: an MNCD group (n = 46) and a control group (n = 38). ML and optimization algorithms were employed to automatically diagnose MNCD. Results: Twelve out of 37 variables were identified in the validation set as the most relevant for MNCD diagnosis. A sensitivity of 100% and a specificity of 71% were achieved using a Random Forest classifier. Conclusion: ML is a potential tool for the automatic prediction of MNCD that can be applied to relatively small preclinical and clinical data sets. These results can be interpreted as supporting the influence of the environment on the development of AD.
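A minimal sketch of the classification step with scikit-learn, on synthetic stand-in data; the study's 37 variables and its optimization-based feature selection are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the cohort: 46 MNCD subjects and 38 controls,
# described by 12 (hypothetical) selected variables.
rng = np.random.default_rng(42)
X = rng.normal(size=(84, 12))
y = np.array([1] * 46 + [0] * 38)    # 1 = MNCD, 0 = control

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```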


2012 ◽  
Vol 6 (5) ◽  
pp. 1141-1155 ◽  
Author(s):  
B. R. Pinzer ◽  
M. Schneebeli ◽  
T. U. Kaempfer

Abstract. Dry snow metamorphism under an external temperature gradient is the most common type of recrystallization of snow on the ground. The changes in snow microstructure modify the physical properties of snow, and an understanding of this process is therefore essential for many disciplines, from modeling the effects of snow on climate to assessing avalanche risk. We directly imaged the microstructural changes in snow during temperature gradient metamorphism (TGM) under a constant gradient of 50 K m⁻¹, using in situ time-lapse X-ray micro-tomography. This novel and non-destructive technique directly reveals the amount of ice that sublimates and is deposited during metamorphism, in addition to the exact locations of these phase changes. We calculated the average time that an ice volume stayed in place before it sublimated and found a characteristic residence time of 2–3 days. This means that most of the ice changes phase from solid to vapor and back many times in a seasonal snowpack where similar temperature conditions can be found. Consistent with such a short timescale, we observed a mass turnover of up to 60% of the total ice mass per day. The concept of hand-to-hand transport of the water vapor flux describes the observed changes very well. However, we did not find evidence for a macroscopic vapor diffusion enhancement. The picture of TGM produced by directly observing the changing microstructure sheds light on the micro-physical processes and could help improve models that predict the physical properties of snow.
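The residence-time estimate can be pictured as per-voxel bookkeeping on binary ice masks from the time-lapse tomography. The sketch below is an assumed minimal version, not the authors' analysis code.

```python
import numpy as np

def mean_residence_time(ice_masks, dt_days):
    """Mean time a voxel that is ice at t0 persists before sublimating.

    ice_masks: boolean array (time, z, y, x) from time-lapse micro-CT,
    True where a voxel is ice; dt_days: time step between scans.
    A voxel's residence ends at the first frame where it is no longer ice.
    """
    initial_ice = ice_masks[0]
    still_ice = initial_ice.copy()
    lifetime = np.zeros(initial_ice.shape)
    for frame in ice_masks[1:]:
        still_ice &= frame                 # voxel has remained ice so far
        lifetime[still_ice] += dt_days
    return lifetime[initial_ice].mean()
```

With a characteristic residence time of 2–3 days, a daily turnover on the order of tens of percent of the ice mass follows directly, consistent with the reported 60% per day.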


2021 ◽  
Vol 30 (5) ◽  
pp. 58-65
Author(s):  
A. Yu. Shebeko ◽  
Yu. N. Shebeko ◽  
A. V. Zuban

Introduction. The GOST R 12.3.047-2012 standard offers a methodology for determining the required fire resistance limits of engineering structures, based on comparing the fire resistance limit with the equivalent fire duration. In practice, however, cases occur in which, in the absence of regulatory fire resistance requirements, a facility owner who has relaxed the requirements prescribed by GOST R 12.3.047-2012 is prepared to accept the potential loss of the facility in a fire for economic reasons. In such cases, the probability of safe evacuation and rescue can be used to compare the distributions of fire resistance limits, on the one hand, and of evacuation and rescue times, on the other.

A methodology for identifying required fire resistance limits. This study tested the probabilistic method for identifying required fire resistance limits published in [1], which differs from the one specified in GOST R 12.3.047-2012. The method compares the distributions of two random quantities: the estimated time of evacuation or rescue in case of fire at a production facility, and the fire resistance limits of its engineering structures.

Calculations of required fire resistance limits. The article presents an application of the proposed method to the rescue of people, using the results of full-scale experiments involving a real pipe rack at a gas processing plant [2].

Conclusions. The required fire resistance limits for the pipe rack structures of a gas processing plant were identified. The calculations accounted for the time needed to evacuate and rescue personnel, as well as the prescribed reliability of the structures, with the personnel evacuation and rescue time in case of fire determined experimentally.
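In outline, the probability of safe evacuation and rescue is P(R > T), where R is the structure's fire resistance limit and T the evacuation/rescue time, both treated as random variables. A Monte Carlo estimate under purely illustrative distributions (not the paper's fitted ones) could look like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative distributions only; the paper derives them from
# full-scale experiments and structural reliability data.
t_rescue = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n)  # minutes
r_limit = rng.normal(loc=35.0, scale=5.0, size=n)               # minutes

# Probability that the structure outlasts evacuation and rescue.
p_safe = np.mean(r_limit > t_rescue)
print(f"P(R > T) = {p_safe:.3f}")
```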


2021 ◽  
Vol 48 (4) ◽  
pp. 307-328
Author(s):  
Dominic Farace ◽  
Hélène Prost ◽  
Antonella Zane ◽  
Birger Hjørland ◽  
...  

This article presents and discusses different kinds of data documents, including data sets, data studies, data papers and data journals. It provides descriptive and bibliometric data on different kinds of data documents and discusses the theoretical and philosophical problems involved in classifying documents according to the DIKW model (data documents, information documents, knowledge documents and wisdom documents). Data documents are, on the one hand, an established category today, even with their own data citation index (DCI). On the other hand, data documents have blurred boundaries in relation to other kinds of documents and sometimes seem to be understood on the problematic philosophical assumption that a datum can be understood as "a single, fixed truth, valid for everyone, everywhere, at all times".


1996 ◽  
Vol 118 (4) ◽  
pp. 284-291 ◽  
Author(s):  
C. Guedes Soares ◽  
A. C. Henriques

This work examines some aspects involved in estimating the parameters of the probability distribution of significant wave height, in particular the homogeneity of the data sets and the statistical methods of fitting a distribution to data. More homogeneous data sets are obtained by organizing the data on a monthly basis and by separating simple sea states from combined ones. A three-parameter Weibull distribution is fitted to the data, with its parameters estimated by the methods of maximum likelihood, regression, and moments. The uncertainty involved in estimating the probability distribution with the three methods is compared with the one that results from using more homogeneous data sets, and it is concluded that the uncertainty involved in the fitting procedure can be the more significant of the two, provided the method of moments is excluded.
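For reference, a three-parameter Weibull maximum-likelihood fit is a one-liner in SciPy. The data below are synthetic significant wave heights; the paper's regression and moment estimators are not shown.

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic significant wave heights (m) from a known 3-parameter Weibull.
hs = weibull_min.rvs(1.5, loc=0.5, scale=2.0, size=2000, random_state=3)

# Maximum-likelihood estimates of shape, location, and scale.
shape, loc, scale = weibull_min.fit(hs)
print(f"shape = {shape:.2f}, loc = {loc:.2f} m, scale = {scale:.2f} m")
```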

