Combined Application of Deep Boundary Detection Tool, Multilayer Data Inversion and 3D Visualization of Seismic Data in Real Time for Geosteering on the Oilfield in the Russian Federation

2021 ◽  
Author(s):  
Chingis Oshakbayev ◽  
Roman Romanov ◽  
Valentin Vlassenko ◽  
Simon Austin ◽  
Sergey Kovalev ◽  
...  

Abstract Currently, drilling horizontal wells is a common enhanced oil recovery method. Geosteering services are often used for accurate well placement, making it possible to achieve a significant increase in production at relatively low cost. This paper describes the results of using three-dimensional visualization of seismic data for high-quality geosteering with a deep boundary detection tool and multilayer inversion in real time. Crossing the top of the reservoir while drilling horizontal sections at this oilfield is unacceptable because of the presence of reactive mudstones. If the top of the reservoir is crossed, subsequent running and installation of the liner becomes impossible due to borehole instability and may lead to well collapse. Based on a pre-well analysis of the structural data, the well was not expected to approach the top of the target formation along the planned profile. However, while preparing the geosteering model and analyzing the seismic data, it became possible to reveal that risk, elaborate its mitigation and, ultimately, increase the length of the horizontal section. This integrated analysis made it possible to keep the wellbore within the target reservoirs and to update the structural bedding of the top based on the multilayer inversion results.

2014 ◽  
Author(s):  
Mohamed S El-Hateel ◽  
Parvez Ahmad ◽  
Ahmed Hesham A Ismail ◽  
Islam A M Henaish ◽  
Ahmed Ashraf

2021 ◽  
Vol 3 (5) ◽  
Author(s):  
Ruissein Mahon ◽  
Gbenga Oluyemi ◽  
Babs Oyeneyin ◽  
Yakubu Balogun

Abstract Polymer flooding is a mature chemical enhanced oil recovery method employed in oilfields at pilot-testing and field scales. Although results from these applications empirically demonstrate the higher displacement efficiency of polymer flooding over waterflooding operations, the fact remains that not all the oil will be recovered. Thus, continued research attention is needed to further understand the displacement flow mechanism of the immiscible process and the rock–fluid interaction propagated by the multiphase flow during polymer flooding operations. In this study, displacement sequence experiments were conducted to investigate the viscosifying effect of polymer solutions on oil recovery in sandpack systems. The history matching technique was employed to estimate relative permeability, fractional flow and saturation profiles through the implementation of a Corey-type function. Experimental results showed that in the case of the motor oil being the displaced fluid, the XG 2500 ppm polymer achieved a 47.0% increase in oil recovery compared with the waterflood case, while the XG 1000 ppm polymer achieved a 38.6% increase in oil recovery compared with the waterflood case. Testing with the motor oil being the displaced fluid, the viscosity ratio was 136 for the waterflood case, 18 for the polymer flood case with XG 1000 ppm polymer and 9 for the polymer flood case with XG 2500 ppm polymer. Findings also revealed that for the waterflood cases, the porous media exhibited oil-wet characteristics, while the polymer flood cases demonstrated water-wet characteristics. This paper provides theoretical support for the application of polymer to improve oil recovery by providing insights into the mechanism behind oil displacement.
Highlights: The differences in the shapes of the relative permeability curves are indicative of the mobility-control effect of each polymer concentration. The water–oil systems exhibited oil-wet characteristics, while the polymer–oil systems demonstrated water-wet characteristics. A large contrast between displacing and displaced fluid viscosities led to viscous fingering and early water breakthrough.
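The Corey-type function mentioned in the abstract can be illustrated with a minimal sketch. The endpoint saturations, endpoint permeabilities and Corey exponents below are hypothetical placeholders, not the fitted values from the study:

```python
def corey_relperm(sw, swc=0.15, sor=0.25, krw_max=0.4, kro_max=0.9,
                  nw=2.0, no=2.0):
    """Corey-type relative permeability curves.

    sw: water saturation; swc: connate water saturation;
    sor: residual oil saturation; nw, no: Corey exponents.
    Returns (krw, kro), the water and oil relative permeabilities.
    """
    # Normalized (effective) water saturation, clamped to [0, 1]
    swn = (sw - swc) / (1.0 - swc - sor)
    swn = max(0.0, min(1.0, swn))
    krw = krw_max * swn ** nw          # water curve rises with sw
    kro = kro_max * (1.0 - swn) ** no  # oil curve falls with sw
    return krw, kro


def fractional_flow(sw, mu_w, mu_o, **corey_kwargs):
    """Water fractional flow, neglecting gravity and capillary pressure:
    fw = 1 / (1 + (kro/krw) * (mu_w/mu_o))."""
    krw, kro = corey_relperm(sw, **corey_kwargs)
    if krw == 0.0:
        return 0.0  # only oil flows at or below connate water saturation
    return 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))
```

A high oil-to-water viscosity ratio pushes the fractional-flow curve toward early water breakthrough, which is the mechanism behind the viscous fingering noted in the highlights; raising the displacing-phase viscosity with polymer lowers that ratio.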


2021 ◽  
Vol 11 (11) ◽  
pp. 4874
Author(s):  
Milan Brankovic ◽  
Eduardo Gildin ◽  
Richard L. Gibson ◽  
Mark E. Everett

Seismic data provide essential information in geophysical exploration, both for locating hydrocarbon-rich areas and for fracture monitoring during well stimulation. Because of its high-frequency acquisition rate and dense spatial sampling, distributed acoustic sensing (DAS) has seen increasing application in microseismic monitoring. Given the large volumes of data to be analyzed in real time and the impractical memory and storage requirements, fast compression and accurate interpretation methods are necessary for real-time monitoring campaigns using DAS. In response to these developments in data acquisition, we have created shifted-matrix decomposition (SMD) to compress seismic data by storing it as pairs of singular vectors coupled with shift vectors. This is achieved by shifting the columns of a matrix of seismic data before applying singular value decomposition (SVD) to extract a pair of singular vectors. The purpose of SMD is denoising as well as compression, since reconstructing seismic data from its compressed form yields a denoised version of the original data. By analyzing the data in its compressed form, we can also run signal detection and velocity estimation. The developed algorithm can therefore simultaneously compress and denoise seismic data while also analyzing the compressed data to estimate signal presence and wave velocities. To show its efficiency, we compare SMD to local SVD and structure-oriented SVD, which are similar SVD-based methods used only for denoising seismic data. While the development of SMD is motivated by the increasing use of DAS, SMD can be applied to any seismic data obtained from a large number of receivers. For example, here we present initial applications of SMD to readily available marine seismic data.
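The core shift-then-SVD step described in the abstract can be sketched as follows. This is a minimal illustration, not the published algorithm: the column shifts are assumed to be known inputs here, whereas SMD estimates them from the data:

```python
import numpy as np

def shifted_rank1(data, shifts):
    """Sketch of one shifted-matrix decomposition step.

    data:   2D array (time samples x receivers).
    shifts: integer sample shift per column (assumed given).
    Columns are aligned by the shifts, the leading singular pair is
    extracted (the compressed form), and a denoised rank-1
    reconstruction is shifted back to the original moveout.
    """
    nt, nx = data.shape
    aligned = np.empty_like(data)
    for j in range(nx):
        aligned[:, j] = np.roll(data[:, j], -shifts[j])
    # Leading singular pair of the aligned matrix
    u, s, vt = np.linalg.svd(aligned, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0])
    # Undo the shifts to recover the denoised gather
    recon = np.empty_like(data)
    for j in range(nx):
        recon[:, j] = np.roll(rank1[:, j], shifts[j])
    compressed = (u[:, 0], s[0], vt[0], shifts)
    return recon, compressed
```

The compression ratio follows directly: storing one singular pair plus shifts takes roughly nt + 2*nx numbers instead of nt*nx, and for a gather dominated by one coherent event the rank-1 reconstruction suppresses incoherent noise.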


2020 ◽  
Vol 91 (4) ◽  
pp. 2127-2140 ◽  
Author(s):  
Glenn Thompson ◽  
John A. Power ◽  
Jochen Braunmiller ◽  
Andrew B. Lockhart ◽  
Lloyd Lynch ◽  
...  

Abstract An eruption of the Soufrière Hills Volcano (SHV) on the eastern Caribbean island of Montserrat began on 18 July 1995 and continued until February 2010. Within nine days of the eruption onset, an existing four-station analog seismic network (ASN) was expanded to 10 sites. Telemetered data from this network were recorded, processed, and archived locally using a system developed by scientists from the U.S. Geological Survey (USGS) Volcano Disaster Assistance Program (VDAP). In October 1996, a digital seismic network (DSN) was deployed with the ability to capture larger amplitude signals across a broader frequency range. These two networks operated in parallel until December 2004, with separate telemetry and acquisition systems (the analysis systems were merged in March 2001). Although the DSN provided better quality data for research, the ASN featured superior real-time monitoring tools and captured valuable data, including the only seismic data from the first 15 months of the eruption. These successes of the ASN have been rather overlooked. This article documents the evolution of the ASN, the VDAP system, the original data captured, and the recovery and conversion of more than 230,000 seismic events from legacy SUDS, Hypo71, and Seislog formats into a Seisan database with waveform data in miniSEED format. No digital catalog existed for these events, but students at the University of South Florida have classified two-thirds of the 40,000 events that were captured between July 1995 and October 1996. Locations and magnitudes were recovered for ∼10,000 of these events. Real-time seismic amplitude measurement, seismic spectral amplitude measurement, and tiltmeter data were also captured. The result is that the ASN seismic dataset is now more discoverable, accessible, and reusable, in accordance with FAIR data principles. These efforts could catalyze new research on the 1995–2010 SHV eruption. Furthermore, many observatories have data in these same legacy formats and might benefit from the procedures and codes documented here.


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of the expense of computing the Hessian, so an efficient approximation is introduced, achieved by computing only a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at roughly two orders of magnitude less cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural dataset, a freely available dataset from the Rocky Mountains of Canada, demonstrates application to real data. The data are highly irregularly sampled along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
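The weighted, damped least-squares step can be sketched generically via the normal equations. This is a minimal illustration with an arbitrary operator A and data b; in the paper A is the wavefield-extrapolation operator, and the efficiency gain comes from keeping only a limited number of diagonals of the Hessian, which this dense sketch does not attempt:

```python
import numpy as np

def weighted_damped_lstsq(A, b, w=None, mu=1e-3):
    """Solve min_x (Ax - b)^H W (Ax - b) + mu * ||x||^2.

    A:  forward operator (m x n), possibly complex.
    b:  observed data (length m).
    w:  optional per-sample weights (length m); identity if None.
    mu: damping parameter controlling regularization strength.
    """
    m, n = A.shape
    W = np.eye(m) if w is None else np.diag(w)
    # Damped, weighted Hessian; in the full method this dense matrix
    # is what makes the problem expensive to form and solve.
    H = A.conj().T @ W @ A + mu * np.eye(n)
    rhs = A.conj().T @ W @ b
    return np.linalg.solve(H, rhs)
```

As mu grows the solution is pulled toward zero, trading fidelity for stability; as mu shrinks toward zero the result approaches the ordinary weighted least-squares solution.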


2014 ◽  
Vol 169 ◽  
pp. 443-453 ◽  
Author(s):  
Jeremiah J. Shepherd ◽  
Lingxi Zhou ◽  
William Arndt ◽  
Yan Zhang ◽  
W. Jim Zheng ◽  
...  

More and more evidence indicates that the 3D conformation of eukaryotic genomes is a critical part of genome function. However, due to the lack of accurate and reliable 3D genome structural data, this information is largely ignored, and most of these studies have to use information systems that view the DNA as a linear structure. Visualizing genomes in real-time 3D can give researchers more insight, but doing so runs into hardware limitations, since each element carries vast amounts of information that cannot be processed on the fly. Using a game engine and sophisticated video game visualization techniques enables us to construct a multi-platform, real-time 3D genome viewer. The game engine-based viewer achieves much better rendering speed and can handle much larger amounts of data than our previous implementation using OpenGL. Combining this viewer with 3D genome models from experimental data could provide unprecedented opportunities to gain insight into the conformation–function relationships of a genome.

