Correcting source and receiver scaling for virtual source imaging and monitoring

Geophysics ◽  
2018 ◽  
Vol 83 (3) ◽  
pp. Q15-Q24 ◽  
Author(s):  
Andrey Bakulin ◽  
Dmitry Alexandrov ◽  
Christos Saragiotis ◽  
Abdullah Al Ramadan ◽  
Boris Kashtan

Virtual source redatuming is a data-driven interferometric approach that relies on constructive and destructive interference, and as a result it is quite sensitive to input seismic trace amplitudes. Land surveys are prone to amplitude changes that are unrelated to subsurface geology (source/receiver coupling, etc.). We have determined that such variations may be particularly damaging when constructing a virtual-source signal for imaging and seismic monitoring applications, and they must be compensated for before satisfactory images, repeatability, and proper relative amplitudes can be achieved. We examine two methods to correct for these variations: a redatuming approach based on multidimensional deconvolution and multisurvey surface-consistent (SC) scaling. Using synthetic data, we find that the first approach can only balance time-dependent variations between repeat surveys, e.g., compensate for variable shot scaling. In contrast, a multisurvey SC approach can compensate for shot and receiver scaling within each survey and among the surveys. As a result, it eliminates redatuming artifacts and brings repeat surveys to a common amplitude level, while preserving the relative amplitudes required for quantitative interpretation of 4D amplitude differences. Applying an SC approach to a land time-lapse field data set with buried receivers from Saudi Arabia, we additionally conclude that separate SC scaling of early arrivals and deep reflections may produce a better image and better repeatability. This is likely due to the significantly different frequency content of early arrivals and deep reflections.
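
The surface-consistent decomposition this abstract relies on can be illustrated compactly: each trace amplitude is modeled as the product of a shot term and a receiver term, which becomes a linear least-squares problem in the log domain. The sketch below is a minimal illustration of that idea, not the authors' implementation; a full SC model would also carry offset and midpoint terms and would be solved per survey and jointly across surveys.

```python
# Minimal sketch of surface-consistent (SC) amplitude decomposition:
# model log|A_ij| = s_i + r_j (shot term plus receiver term) and solve
# by least squares. Illustrative only; the inherent trade-off constant
# between s and r is resolved here by lstsq's minimum-norm solution.
import numpy as np

def sc_scaling(amplitudes, shot_idx, rcvr_idx, n_shots, n_rcvrs):
    """Return multiplicative shot and receiver scalars from trace amplitudes."""
    n_traces = len(amplitudes)
    G = np.zeros((n_traces, n_shots + n_rcvrs))
    G[np.arange(n_traces), shot_idx] = 1.0            # shot columns
    G[np.arange(n_traces), n_shots + rcvr_idx] = 1.0  # receiver columns
    d = np.log(np.abs(amplitudes))
    m, *_ = np.linalg.lstsq(G, d, rcond=None)
    return np.exp(m[:n_shots]), np.exp(m[n_shots:])
```

Dividing each trace by its shot and receiver scalars before redatuming is then the correction step the abstract argues for.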

Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. To make use of estimated pressure- and saturation-related changes in reservoir modeling and simulation, the uncertainty in these estimates must be quantified. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework, where the solution is represented by a probability density function (PDF) that provides uncertainty estimates as well as direct estimates of the properties. A stochastic model for estimating pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock-physics relationships are used to set up a prior stochastic model, and PP reflection-coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between the different model variables as well as spatial dependencies for each variable. In addition, possible bottlenecks causing large uncertainties in the estimates can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
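
The Bayesian setup described (a rock-physics prior, a likelihood from PP reflection-coefficient differences, a posterior PDF carrying the uncertainties) can be sketched on a coarse grid. The snippet below is a toy illustration only; the sensitivities `a` and `b`, the noise level, and the prior width are hypothetical stand-ins for the paper's rock-physics linearization.

```python
# Toy grid-based Bayesian update: prior over (dP, dS) times a Gaussian
# likelihood from a linearized reflection-coefficient difference
# dR ~ a*dP + b*dS. All numbers are illustrative assumptions.
import numpy as np

a, b = 0.8, -1.2        # hypothetical sensitivities of dR to dP, dS
sigma = 0.05            # assumed noise level of the AVO difference
d_obs = 0.12            # one observed reflection-coefficient change

dP = np.linspace(-1, 1, 201)
dS = np.linspace(-1, 1, 201)
P, S = np.meshgrid(dP, dS, indexing="ij")

prior = np.exp(-(P**2 + S**2) / (2 * 0.3**2))            # Gaussian prior
lik = np.exp(-((a * P + b * S - d_obs)**2) / (2 * sigma**2))
post = prior * lik
post /= post.sum()                                        # gridded posterior PDF

# Point estimates and uncertainties follow directly from the grid.
print("MAP estimate:", P.flat[post.argmax()], S.flat[post.argmax()])
```

The elongated ridge of such a posterior is exactly the pressure/saturation ambiguity the abstract says time-lapse AVO helps resolve.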


Geophysics ◽  
2010 ◽  
Vol 75 (3) ◽  
pp. SA37-SA43 ◽  
Author(s):  
Joongmoo Byun ◽  
Jeongmin Yu ◽  
Soon Jee Seol

Time-lapse crosswell seismic provides an efficient way to monitor the migration of a CO₂ plume, or its leakage, after CO₂ injection into a geologic formation. Recently, crosswell seismic combined with the virtual source concept has become a powerful tool for monitoring underground variations: virtual sources are positioned at the receivers installed in the well, so the positions of sources and receivers can remain invariant during monitoring. However, time-lapse crosswell seismic using vertical wells and virtual sources has difficulty describing the front of a CO₂ plume, which usually is parallel to the vertical wells, and obtaining sufficient ray coverage for first-arrival tomography. These problems arise because of the theoretically limited downward illumination directivity of the virtual source. We have developed an effective monitoring method that uses virtual sources and two horizontal wells: one above and one below the CO₂-sequestration reservoir. In our method, we redatum the traces that are recorded at geophones in the horizontal wells from sources on the surface. The redatumed traces then become virtual traces recorded at geophones in the lower well and emitted from virtual sources at the positions of the geophones in the upper well. Compared with either real or virtual sources in vertical wells, the geometry of our method has advantages for locating the front of the CO₂ plume, which is normal to the horizontal wells. The method is also advantageous in acquiring full ray coverage between the wells, superior to the coverage acquired using vertical crosswell seismic with virtual sources. In addition, we can avoid problems related to any potential change in the medium above the reservoir and in the source and receiver positions. The results of applying our method to synthetic data that simulate CO₂-sequestration monitoring show that the front of a CO₂ plume in the reservoir is depicted accurately in a velocity tomogram. The new method also can be used to monitor a reservoir during production of heavy oil.
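
The redatuming step described, turning surface shots into virtual sources at the upper-well geophones, follows the standard interferometric recipe: crosscorrelate the recording at the virtual-source geophone with the recording at a lower-well geophone, and sum over surface shots. A minimal sketch, with hypothetical array shapes:

```python
# Minimal sketch of virtual-source redatuming by crosscorrelation for
# one upper-well / lower-well geophone pair. Summation over surface
# shots turns the upper geophone into a virtual source for the lower one.
import numpy as np
from scipy.signal import fftconvolve

def virtual_trace(upper, lower):
    """upper, lower: (n_shots, n_samples) recordings for one geophone pair."""
    n = upper.shape[1]
    vt = np.zeros(2 * n - 1)
    for u, l in zip(upper, lower):
        vt += fftconvolve(l, u[::-1])   # crosscorrelation via convolution
    return vt[n - 1:]                   # keep causal (positive) lags
```

Because both geophones stay fixed in the wells, the virtual traces are insensitive to changes in the overburden and in the surface-source positions, which is the repeatability argument the abstract makes.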


Geophysics ◽  
2013 ◽  
Vol 78 (5) ◽  
pp. M29-M41 ◽  
Author(s):  
Mahdi H. Almutlaq ◽  
Gary F. Margrave

We evaluated the concept of surface-consistent matching filters for processing time-lapse seismic data, in which matching filters are convolutional filters that minimize the sum-squared error between two signals. Because in the Fourier domain a matching filter is the spectral ratio of the two signals, we extended the well-known surface-consistent hypothesis such that the data term is a trace-by-trace spectral ratio of two data sets instead of only one (i.e., surface-consistent deconvolution). To avoid unstable division of spectra, we computed the spectral ratios in the time domain by first designing trace-sequential, least-squares matching filters, then Fourier transforming them. A subsequent least-squares solution then factored the trace-sequential matching filters into four operators: two surface-consistent (source and receiver) and two subsurface-consistent (offset and midpoint). We evaluated a time-lapse synthetic data set with nonrepeatable acquisition parameters, complex near-surface geology, and a variable subsurface reservoir layer. We computed the four-operator surface-consistent matching filters from two surveys, baseline and monitor, then applied these matching filters to the monitor survey to match it to the baseline survey over a temporal window where changes were not expected. This algorithm significantly reduced the effect of most of the nonrepeatable parameters, such as differences in source strength, receiver coupling, wavelet bandwidth and phase, and static shifts. We computed the normalized root-mean-square (NRMS) difference on raw stacked data (baseline and monitor) and obtained a mean value of 70%. After applying the four-operator surface-consistent matching filters, this value dropped to about 13.6%, computed from the final stacks.
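
A single-trace matching filter of the kind being factored here is a classical least-squares (Wiener) problem; a minimal sketch follows, together with the NRMS repeatability metric quoted in the abstract. The filter length and prewhitening level are illustrative choices, not the paper's.

```python
# Minimal sketch of a least-squares matching filter: find f minimizing
# ||baseline - f * monitor||^2 (convolution), via Toeplitz normal
# equations built from the monitor autocorrelation.
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import fftconvolve

def matching_filter(monitor, baseline, nf=51, eps=1e-3):
    ac = fftconvolve(monitor, monitor[::-1])      # autocorrelation
    mid = len(monitor) - 1                        # zero-lag index
    r = ac[mid:mid + nf].copy()
    r[0] *= (1.0 + eps)                           # prewhitening for stability
    cc = fftconvolve(baseline, monitor[::-1])     # crosscorrelation
    g = cc[mid:mid + nf]
    return solve_toeplitz(r, g)                   # causal Wiener filter

def nrms(a, b):
    """Normalized RMS difference (%), the repeatability metric quoted."""
    return 200.0 * np.sqrt(np.mean((a - b)**2)) / (
        np.sqrt(np.mean(a**2)) + np.sqrt(np.mean(b**2)))
```

Factoring many such filters into source, receiver, offset, and midpoint operators is then a further least-squares step, as the abstract describes.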


Geophysics ◽  
2007 ◽  
Vol 72 (4) ◽  
pp. V79-V86 ◽  
Author(s):  
Kurang Mehta ◽  
Andrey Bakulin ◽  
Jonathan Sheiman ◽  
Rodney Calvert ◽  
Roel Snieder

The virtual source method has recently been proposed to image and monitor below complex and time-varying overburden. The method requires surface shooting recorded at downhole receivers placed below the distorting or changing part of the overburden. Redatuming with the measured Green's function allows the reconstruction of a complete downhole survey as if the sources were also buried at the receiver locations. There are still some challenges that need to be addressed in the virtual source method, such as limited acquisition aperture and energy coming from the overburden. We demonstrate that up-down wavefield separation can substantially improve the quality of virtual source data. First, it allows us to eliminate artifacts associated with the limited acquisition aperture typically used in practice. Second, it allows us to reconstruct a new optimized response in the absence of downgoing reflections and multiples from the overburden. These improvements are illustrated on a synthetic data set for a complex layered model based on the Fahud field in Oman, and on ocean-bottom seismic data acquired over the Mars field in the deepwater Gulf of Mexico.
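
Up-down separation at the receiver level is commonly done by combining pressure and vertical particle-velocity recordings (PZ summation); the sketch below shows that combination under assumed local density and velocity at the receiver datum. It is a generic illustration, not necessarily the exact separation scheme used in the paper.

```python
# Minimal sketch of up-down wavefield separation by PZ summation:
# pressure and scaled vertical particle velocity add for the downgoing
# field and cancel for the upgoing field. rho and c are the (assumed
# known) density and velocity at the receiver datum.
import numpy as np

def up_down_separation(p, vz, rho=1000.0, c=1500.0):
    """p, vz: (n_receivers, n_samples) pressure and vertical velocity."""
    down = 0.5 * (p + rho * c * vz)   # downgoing: P and Z in phase
    up = 0.5 * (p - rho * c * vz)     # upgoing: P and Z out of phase
    return up, down
```

Correlating the downgoing field at the virtual-source location with the upgoing field at the receivers is what suppresses the overburden reflections and multiples the abstract mentions.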


Geophysics ◽  
2009 ◽  
Vol 74 (5) ◽  
pp. V109-V121 ◽  
Author(s):  
Ehsan Zabihi Naeini ◽  
Henning Hoeber ◽  
Gordon Poole ◽  
Hamid R. Siahkoohi

Time-shift estimation is a key step in seismic time-lapse processing, as well as in many other signal-processing applications. We consider the time-shift problem in the setting of multiple repeat surveys that must be aligned consistently. We introduce an optimized least-squares method, based on a Taylor expansion, for estimating two-vintage time shifts and compare it to crosscorrelation. The superiority of the proposed algorithm is demonstrated on synthetic data and on residual time-lapse matching of a U.K. continental shelf data set. We then discuss the shortcomings of cascaded time alignment of multiple repeat monitor surveys and propose an approach for estimating simultaneous multivintage time shifts that uses a constrained least-squares technique combined with elements of network theory. The resulting time shifts are consistent across all vintages in a least-squares sense, improving overall alignment compared with the classical flow of cascaded alignment. The method surpasses the cascaded approach, as demonstrated on synthetic and three-vintage U.K. continental shelf time-lapse data sets.
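
Both ingredients of the abstract can be sketched briefly: a first-order Taylor expansion turns two-vintage shift estimation into a linear least-squares fit, and a network-style least-squares adjustment makes pairwise shifts consistent across vintages. The following is a minimal illustration under those assumptions, not the authors' exact formulation.

```python
# Two sketches: (1) single-window Taylor-expansion time shift, where
# monitor(t) ~ base(t + tau) ~ base(t) + tau * base'(t); (2) a network
# adjustment that reconciles pairwise shifts tau_ij ~ t_j - t_i into
# one consistent shift per vintage.
import numpy as np

def taylor_shift(base, monitor, dt):
    """Least-squares tau (seconds) such that monitor(t) ~ base(t + tau)."""
    db = np.gradient(base, dt)                # time derivative of baseline
    return np.dot(db, monitor - base) / np.dot(db, db)

def consistent_shifts(pairs, taus, n_vintages):
    """Solve tau_ij ~ t_j - t_i for per-vintage shifts t_k, anchoring
    the first vintage at zero (the network-theory element)."""
    A = np.zeros((len(pairs) + 1, n_vintages))
    b = np.zeros(len(pairs) + 1)
    for k, ((i, j), tau) in enumerate(zip(pairs, taus)):
        A[k, j], A[k, i], b[k] = 1.0, -1.0, tau
    A[-1, 0] = 1.0                            # reference constraint t_0 = 0
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t
```

Because every pairwise estimate enters one joint solve, a noisy pair is averaged down rather than propagated, which is the advantage over cascaded alignment.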


Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) to adjust its SPFs for use in a target jurisdiction. Critically, the quality of the calibration procedure must be assessed before the calibrated SPFs are used. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years following the publication of the HSM 1st edition. Similarly, the literature suggests multiple ways to assess the goodness-of-fit (GOF) of a calibrated SPF to a data set from a given jurisdiction. This paper uses the calibration results of multiple intersection SPFs against a large Mississippi safety database to examine the relations between multiple GOF metrics. The goal is to develop a sensible single index that leverages the joint information from multiple GOF metrics to assess the overall quality of a calibration. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. This paper also presents comparisons between the index and alternative scoring strategies, as well as an effort to verify the results using synthetic data. The developed index is recommended for comprehensively assessing the quality of calibrated intersection SPFs.
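
Two of the metrics entering the index are easy to state concretely: the HSM calibration factor is the ratio of total observed to total predicted crashes, and the CURE plot tracks cumulative residuals against a 95% confidence band. A minimal sketch of both is below; the band follows the standard Hauer-style construction, assumed here rather than taken from the paper.

```python
# Minimal sketch of the HSM calibration factor and a CURE-plot check:
# systematic bias shows up as the cumulative residual curve escaping
# its 95% confidence band.
import numpy as np

def calibration_factor(observed, predicted):
    """HSM-style calibration factor: sum(observed) / sum(predicted)."""
    return observed.sum() / predicted.sum()

def cure_exceedance(observed, predicted, order):
    """Fraction of CURE points outside the 95% band, one of the GOF
    metrics the paper combines into its index."""
    resid = observed - calibration_factor(observed, predicted) * predicted
    resid = resid[np.argsort(order)]          # sort by the fitted variable
    cure = np.cumsum(resid)
    s2 = np.cumsum(resid**2)
    band = 1.96 * np.sqrt(s2 * (1 - s2 / s2[-1]))   # +/- 95% limits
    return np.mean(np.abs(cure) > band)
```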


Water ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 107
Author(s):  
Elahe Jamalinia ◽  
Faraz S. Tehrani ◽  
Susan C. Steele-Dunne ◽  
Philip J. Vardon

Climatic conditions and vegetation cover influence water flux in a dike, and potentially its stability. A comprehensive numerical simulation is computationally too expensive for near real-time analysis of a dike network. Therefore, this study investigates a random forest (RF) regressor as a data-driven surrogate for a numerical model to forecast the temporal macro-stability of dikes. To that end, daily inputs and outputs of a ten-year coupled numerical simulation of an idealised dike (2009–2019) are used to create a synthetic data set comprising features that can be observed from a dike surface, with the calculated factor of safety (FoS) as the target variable. The data before 2018 are split into training and testing sets to build and train the RF. The predicted FoS is strongly correlated with the numerical FoS for data in the test set (before 2018). However, the trained model shows lower performance on the evaluation set (after 2018) when further surface cracking occurs. This proof of concept shows that a data-driven surrogate can determine dike stability for conditions similar to the training data, which could be used to identify vulnerable locations in a dike network for further examination.
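
A minimal sketch of such a surrogate with scikit-learn is shown below; the file name and feature columns are hypothetical placeholders, and the chronological split mirrors the paper's before/after-2018 design.

```python
# Minimal sketch of a random-forest surrogate: surface observables as
# features, the numerically computed factor of safety (FoS) as target.
# "dike_timeseries.csv" and the column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("dike_timeseries.csv", parse_dates=["date"])
features = ["precipitation", "evaporation", "leaf_area_index", "soil_moisture"]

train = df[df.date < "2018-01-01"]     # training/testing period
evaluate = df[df.date >= "2018-01-01"] # held-out evaluation period

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(train[features], train["fos"])
print("R^2 on evaluation period:", rf.score(evaluate[features], evaluate["fos"]))
```

The degradation after 2018 reported in the abstract is the expected failure mode of such a surrogate: tree ensembles extrapolate poorly once conditions (here, surface cracking) leave the training distribution.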


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Ermanno Cordelli ◽  
Paolo Soda ◽  
Giulio Iannello

Background: Biological phenomena usually evolve over time, and recent advances in high-throughput microscopy have made it possible to collect multiple 3D images over time, generating 3D+t (or 4D) data sets. Extracting useful information requires spatial and temporal data on the particles in the images, but particle tracking and feature extraction need some kind of assistance. Results: This manuscript introduces our new freely downloadable toolbox, Visual4DTracker. It is a MATLAB package implementing several useful functionalities to navigate, analyse, and proof-read the track of each particle detected in any 3D+t stack. Furthermore, it allows users to proof-read and evaluate the traces with respect to a given gold standard. The Visual4DTracker toolbox permits users to visualize and save all the generated results through a user-friendly graphical user interface. This tool has been successfully used in three applicative examples. The first processes synthetic data to show all the software functionalities. The second shows how to process a 4D image stack capturing the time-lapse growth of Drosophila cells in an embryo. The third presents the quantitative analysis of insulin granules in living beta cells, showing that such particles have two main dynamics that coexist inside the cells. Conclusions: Visual4DTracker is a software package for MATLAB to visualize, handle, and manually track 3D+t stacks of microscopy images containing objects such as cells, granules, etc. With its unique set of functions, it permits the user to analyze and proof-read 4D data in a friendly 3D fashion. The tool is freely available at https://drive.google.com/drive/folders/19AEn0TqP-2B8Z10kOavEAopTUxsKUV73?usp=sharing


Author(s):  
Lorenzo Chicchi ◽  
Gloria Cecchini ◽  
Ihusan Adam ◽  
Giuseppe de Vito ◽  
Roberto Livi ◽  
...  

An inverse procedure is developed and tested to recover functional and structural information from global signals of brain activity. The method assumes a leaky integrate-and-fire model with excitatory and inhibitory neurons coupled via a directed network. Neurons are endowed with heterogeneous current values, which set their associated dynamical regimes. By making use of a heterogeneous mean-field approximation, the method seeks to reconstruct, from global activity patterns, the distribution of incoming degrees for both excitatory and inhibitory neurons, as well as the distribution of the assigned currents. The proposed inverse scheme is first validated against synthetic data. Then, time-lapse acquisitions of a zebrafish larva recorded with a two-photon light-sheet microscope are used as input to the reconstruction algorithm. A power-law distribution of the incoming connectivity of the excitatory neurons is found. Local degree distributions are also computed by segmenting the whole brain into sub-regions traced from an annotated atlas.
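
The forward model assumed by the procedure, a leaky integrate-and-fire network with excitatory and inhibitory coupling and heterogeneous bias currents, can be sketched in a few lines. The parameters below are illustrative, not those of the paper.

```python
# Minimal sketch of leaky integrate-and-fire (LIF) network dynamics:
# each neuron integrates its own bias current plus signed network
# input and fires on reaching threshold. Parameters are illustrative.
import numpy as np

def lif_step(v, currents, W, spikes, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
    """One Euler step for N neurons. W[i, j] couples neuron j into i;
    positive entries are excitatory, negative entries inhibitory.
    `currents` holds the heterogeneous per-neuron bias values."""
    dv = (-v + currents + W @ spikes) / tau
    v = v + dt * dv
    fired = v >= v_th
    v[fired] = v_reset                 # reset neurons that spiked
    return v, fired.astype(float)
```

The inversion then runs in the opposite direction: given only the global activity this model produces, recover the degree and current distributions via the heterogeneous mean-field approximation.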


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because it requires computing the Hessian, so an efficient approximation is introduced, achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude less cost, but it is dip limited, though in a controllable way, compared with the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
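
The weighted, damped least-squares solution underlying the method has the familiar normal-equations form; the sketch below spells it out with the full (costly) Hessian, which the paper approximates by a limited number of diagonals.

```python
# Minimal sketch of weighted, damped least-squares wavefield
# regularization/datuming: given an extrapolation operator G, data d,
# data weights w, and damping eps, solve (G^H W G + eps I) m = G^H W d.
# Forming G^H W G in full is the expensive part the paper approximates.
import numpy as np

def damped_lsq(G, d, w, eps=1e-2):
    """Return the regularized/datumed wavefield m (dense, illustrative)."""
    W = np.diag(w)
    H = G.conj().T @ W @ G                    # full Hessian (costly)
    rhs = G.conj().T @ W @ d
    return np.linalg.solve(H + eps * np.eye(H.shape[0]), rhs)
```

Keeping only a few diagonals of `H` is what buys the roughly two-orders-of-magnitude cost reduction, at the price of the dip limitation the abstract notes.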

