Determination of background velocities by multiple migration fitting

Geophysics ◽  
1995 ◽  
Vol 60 (2) ◽  
pp. 476-490 ◽  
Author(s):  
Guy Chavent ◽  
Chester A. Jacewitz

We present an approach called multiple migration fitting (MMF) designed to determine 2-D background velocities automatically from prestack seismic data. In this approach, we maximize a scalar similarity index (SI) for a collection of migrated sections obtained by various illuminations of the same earth. Numerical investigation shows that this index is a rather smooth, nonoscillatory function of velocity that tends to be maximal for good velocity profiles, and hence is amenable to maximization by local gradient techniques. This maximization is practically feasible because, as we prove, the exact gradient of the SI can be computed for an additional cost of only twice that of computing the collection of migrated sections, independent of the number of velocity unknowns. Application to synthetic data shows that MMF leads to enhanced background velocities and stacked migrated sections.
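As a rough illustration of the optimization loop this abstract describes, the sketch below maximizes a semblance-like similarity index over a velocity model using a finite-difference gradient. The `migrate` routine, the exact form of the similarity index, and all parameter values are assumptions for illustration only; the paper's adjoint-computed gradient costs roughly two migration passes, whereas the numerical gradient here scales with the number of velocity unknowns.

```python
import numpy as np

def similarity_index(sections):
    """Semblance-like similarity of a collection of migrated sections:
    energy of the stack normalized by total energy. (An assumed form;
    the paper's exact SI definition may differ.)"""
    stack = np.sum(sections, axis=0)
    return np.sum(stack**2) / (len(sections) * np.sum(sections**2) + 1e-12)

def mmf_objective(v, gathers, migrate):
    """Migrate every illumination with the trial velocity v and score
    the mutual agreement of the resulting images."""
    sections = np.array([migrate(v, g) for g in gathers])
    return similarity_index(sections)

def gradient_ascent(v0, gathers, migrate, step=1.0, n_iter=20, eps=1e-3):
    """Naive finite-difference ascent on the similarity index,
    for illustration only (not the paper's adjoint gradient)."""
    v = v0.copy()
    for _ in range(n_iter):
        f0 = mmf_objective(v, gathers, migrate)
        grad = np.zeros_like(v)
        for i in range(v.size):
            vp = v.copy()
            vp.flat[i] += eps
            grad.flat[i] = (mmf_objective(vp, gathers, migrate) - f0) / eps
        v += step * grad / (np.linalg.norm(grad) + 1e-12)
    return v
```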

Geophysics ◽  
1994 ◽  
Vol 59 (6) ◽  
pp. 963-972 ◽  
Author(s):  
Bastian Blonk ◽  
Gérard C. Herman

A method is presented for eliminating near‐surface scattered noise from seismic data. Starting from an appropriately chosen background model, a surface‐consistent scattering model is determined using linearized elastodynamic inverse scattering theory. This scattering model does not necessarily equal the actual scatterer distribution, but it enables one to calculate, approximately, the near‐surface scattered part of the data. The method honors at least some of the complexity of the near‐surface scattering process and can be applied in cases where traditional methods, like wavenumber‐frequency filtering techniques and methods for static corrections, are ineffective. From a number of tests on synthetic data, we conclude that the method is rather robust; it is mainly sensitive to errors in the determination of the background Rayleigh‐wave velocity.
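A schematic of the estimate-then-subtract idea, under the assumption that the linearized elastodynamic scattering process can be represented by a matrix `F` built from the chosen background model (including its Rayleigh-wave velocity). This is a conceptual sketch, not the authors' surface-consistent implementation; real operators would be matrix-free.

```python
import numpy as np

def estimate_and_subtract(d, F, damping=1e-2):
    """Least-squares estimate of a scattering model m from data d,
    given a linearized (Born-type) forward operator F, followed by
    subtraction of the predicted scattered field. The estimated m
    need not equal the true scatterer distribution; it only has to
    predict the scattered part of the data."""
    # Damped normal equations: (F^T F + damping*I) m = F^T d
    m = np.linalg.solve(F.T @ F + damping * np.eye(F.shape[1]), F.T @ d)
    d_scattered = F @ m            # predicted near-surface scattered noise
    return d - d_scattered, m      # denoised data, scattering model
```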


Author(s):  
P.L. Nikolaev

This article deals with a method for the binary classification of images containing small text. The classification is based on the fact that the text can have two orientations: it can be positioned horizontally and read from left to right, or it can be turned 180 degrees so that the image must be rotated before the text can be read. Text of this kind is found on the covers of a wide variety of books, so when recognizing covers it is necessary first to determine the orientation of the text before recognizing the text itself. The article describes the development of a deep neural network for determining text orientation in the context of book-cover recognition. The results of training and testing a convolutional neural network on synthetic data, as well as examples of the network operating on real data, are presented.
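A minimal sketch of such a binary orientation classifier, here in Keras. The architecture, input size, and training setup are assumptions for illustration; the article's exact network is not reproduced in the abstract.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_orientation_net(input_shape=(224, 224, 1)):
    """Small CNN for the binary task: text upright vs. rotated 180 deg.
    Layer sizes are illustrative, not the article's architecture."""
    model = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # P(text rotated 180 deg)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# As in the article, training data can be synthetic: render cover-like
# images with text at 0 or 180 degrees, then call model.fit(...).
```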


Geophysics ◽  
1973 ◽  
Vol 38 (2) ◽  
pp. 310-326 ◽  
Author(s):  
R. J. Wang ◽  
S. Treitel

The normal equations for the discrete Wiener filter are conventionally solved with Levinson’s algorithm. The resultant solutions are exact except for numerical roundoff. In many instances, approximate rather than exact solutions satisfy seismologists’ requirements. The so‐called “gradient” or “steepest descent” iteration techniques can be used to produce approximate filters at computing speeds significantly higher than those achievable with Levinson’s method. Moreover, gradient schemes are well suited for implementation on a digital computer provided with a floating‐point array processor (i.e., a high‐speed peripheral device designed to carry out a specific set of multiply‐and‐add operations). Levinson’s method (1947) cannot be programmed efficiently for such special‐purpose hardware, and this consideration renders the use of gradient schemes even more attractive. It is, of course, advisable to utilize a gradient algorithm which generally provides rapid convergence to the true solution. The “conjugate‐gradient” method of Hestenes (1956) is one of a family of algorithms having this property. Experimental calculations performed with real seismic data indicate that adequate filter approximations are obtainable at a fraction of the computer cost required for use of Levinson’s algorithm.
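For concreteness, a compact conjugate-gradient solver for the Wiener normal equations R f = g, where R is the Toeplitz autocorrelation matrix and g the crosscorrelation of the desired output with the input. The dense Toeplitz matrix and the stopping rule are simplifications: a production code would apply R via FFT-based convolution, the repeated multiply-and-add structure that makes the scheme attractive on array processors.

```python
import numpy as np
from scipy.linalg import toeplitz

def wiener_cg(r, g, n_iter=None, tol=1e-8):
    """Approximate Wiener filter via conjugate gradients
    (Hestenes, 1956) instead of Levinson recursion.
    r : autocorrelation lags r[0..n-1] of the input trace
    g : crosscorrelation of desired output with input, length n"""
    R = toeplitz(r)                 # dense for clarity only
    f = np.zeros_like(g)
    res = g - R @ f                 # initial residual
    p = res.copy()
    rs_old = res @ res
    for _ in range(n_iter or len(g)):
        Rp = R @ p
        alpha = rs_old / (p @ Rp)
        f += alpha * p              # update filter estimate
        res -= alpha * Rp
        rs_new = res @ res
        if np.sqrt(rs_new) < tol:   # stop once residual is small enough
            break
        p = res + (rs_new / rs_old) * p
        rs_old = rs_new
    return f
```

Stopping early is exactly where the speedup over Levinson's exact solution comes from: a few iterations often give a filter adequate for seismological purposes.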


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because it requires computing the Hessian, so an efficient approximation is introduced. The approximation is achieved by computing only a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at roughly two orders of magnitude lower cost, but it is dip limited, in a controllable way, compared with the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance compared to conventional datuming.
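The core computation can be sketched as a damped least-squares solve, with an optional banded truncation of the Hessian standing in for the paper's limited-diagonals approximation. The dense extrapolation operator `A` and the damping value are assumptions for illustration.

```python
import numpy as np

def datum_regularize(A, d, damping=1e-2, n_diags=None):
    """Weighted, damped least-squares wavefield extrapolation:
    solve (A^H A + damping*I) m = A^H d for the regularized/datumed
    wavefield m. If n_diags is given, only that many diagonals of the
    Hessian A^H A are kept, mimicking the cheap (but dip-limited)
    approximation described in the abstract."""
    H = A.conj().T @ A                     # full Hessian: the costly part
    if n_diags is not None:
        i, j = np.indices(H.shape)
        H = np.where(np.abs(i - j) < n_diags, H, 0.0)  # banded Hessian
    rhs = A.conj().T @ d
    return np.linalg.solve(H + damping * np.eye(H.shape[0]), rhs)
```

Keeping more diagonals admits steeper dips at higher cost, which is the controllable trade-off the abstract refers to.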


Geophysics ◽  
2016 ◽  
Vol 81 (6) ◽  
pp. A17-A21 ◽  
Author(s):  
Juan I. Sabbione ◽  
Mauricio D. Sacchi

The coefficients that synthesize seismic data via the hyperbolic Radon transform (HRT) are estimated by solving a linear-inverse problem. In the classical HRT, the computational cost of the inverse problem is proportional to the size of the data and the number of Radon coefficients. We have developed a strategy that significantly speeds up the implementation of time-domain HRTs. For this purpose, we define a restricted model space of coefficients by applying hard thresholding to an initial low-resolution Radon gather. Then, an iterative solver operating on the restricted model space is used to estimate the group of coefficients that synthesize the data. The method is illustrated with synthetic data and tested with a marine data example.
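The three steps of the strategy (threshold a low-resolution gather, restrict the model space, iterate on the restricted system) can be sketched as follows. The dense synthesis operator `L`, the quantile threshold, and the CGLS solver are illustrative stand-ins for the authors' time-domain operators.

```python
import numpy as np

def restricted_radon_inversion(L, d, keep=0.05, n_iter=50):
    """Fast HRT sketch: (1) low-resolution gather via the adjoint,
    (2) hard thresholding to pick a restricted support,
    (3) conjugate-gradient least squares on that support only."""
    m0 = L.T @ d                            # low-resolution Radon gather
    thresh = np.quantile(np.abs(m0), 1.0 - keep)
    support = np.abs(m0) >= thresh          # restricted model space
    Ls = L[:, support]                      # columns of active coefficients
    ms = np.zeros(Ls.shape[1])
    r = d - Ls @ ms                         # CGLS on the small system
    s = Ls.T @ r
    p = s.copy()
    gamma = s @ s
    for _ in range(n_iter):
        q = Ls @ p
        alpha = gamma / (q @ q + 1e-12)
        ms += alpha * p
        r -= alpha * q
        s = Ls.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    m = np.zeros(L.shape[1])
    m[support] = ms                         # embed back into full space
    return m
```

The speedup comes from the solver touching only the small fraction of coefficients that survive the threshold, rather than the full Radon model space.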


1991 ◽  
Vol 113 (3) ◽  
pp. 206-210 ◽  
Author(s):  
D. Yogi Goswami

This paper analyzes velocity profiles for flow through circular tubes in the laminar, turbulent, and transition regimes, and how they affect measurement by flowmeters. Experimental measurements of velocity profiles across the cross-section of straight circular tubes were made using laser Doppler velocimetry. In addition, flow visualization was done using the hydrogen bubble technique. Velocity profiles in laminar and turbulent flow are quite predictable, which allows the determination of meter factors for accurate flow measurement. However, the profiles cannot be predicted at all in the transition region. Therefore, for flowmeter accuracy, it must be ensured that the flow is completely in the laminar regime or completely in the turbulent regime. In laminar flow, a bend, even at a large distance, affects the meter factor. The paper also discusses some strategies to restructure the flow to avoid the transition region.
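The predictability claim for the two regimes can be made concrete with the standard profile models; the power-law exponent n = 7 and the centerline-sampling meter factors below are textbook values, not the paper's measurements.

```python
import numpy as np

def laminar_profile(r, R, v_mean):
    """Fully developed laminar (Hagen-Poiseuille) profile:
    v(r) = 2*v_mean*(1 - (r/R)^2); centerline velocity is twice the mean."""
    return 2.0 * v_mean * (1.0 - (r / R) ** 2)

def turbulent_profile(r, R, v_max, n=7.0):
    """Empirical 1/n power-law profile for turbulent pipe flow
    (n ~ 7 at moderate Reynolds numbers): v(r) = v_max*(1 - r/R)^(1/n)."""
    return v_max * (1.0 - r / R) ** (1.0 / n)

# Meter-factor idea: a device sampling the centerline velocity must
# convert it to the area-averaged velocity. For laminar flow the ratio
# v_mean/v_max is exactly 0.5; for the 1/n power law it is
# 2*n**2 / ((n + 1)*(2*n + 1)), about 0.817 for n = 7. In the
# transition region no stable ratio exists, which is why the paper
# insists the flow be kept fully laminar or fully turbulent.
```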


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. To make information about pressure- and saturation-related changes usable in reservoir modeling and simulation, the uncertainty of the estimates must be quantified. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem is represented by a probability density function (PDF), providing estimates of the uncertainties as well as direct estimates of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock-physics relationships are used to set up a prior stochastic model. PP reflection-coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between different variables of the model as well as spatial dependencies for each of the variables. In addition, possible bottlenecks causing large uncertainties in the estimates can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
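The conceptual core of such a Bayesian formulation, reduced to a linear-Gaussian update, is sketched below. The paper's actual model is richer (nonlinear rock physics, spatial coupling between variables), so the operator `G` and the covariances here are placeholders, not the authors' parameterization.

```python
import numpy as np

def gaussian_posterior(G, d, m_prior, C_m, C_d):
    """Linear-Gaussian Bayesian update: for data d = G m + noise with
    prior m ~ N(m_prior, C_m) and noise ~ N(0, C_d), the posterior PDF
    is Gaussian with the mean and covariance computed below. Here m
    would collect pressure and saturation changes, d the PP
    reflection-coefficient differences."""
    K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d)  # gain matrix
    m_post = m_prior + K @ (d - G @ m_prior)            # posterior mean
    C_post = C_m - K @ G @ C_m                          # posterior covariance
    return m_post, C_post
```

The posterior covariance `C_post` is what delivers the uncertainty estimates the abstract emphasizes, alongside the property estimates themselves.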


Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. V79-V86 ◽  
Author(s):  
Hakan Karsli ◽  
Derman Dondurur ◽  
Günay Çifçi

Time-dependent amplitude and phase information of stacked seismic data are processed independently using complex trace analysis in order to facilitate interpretation by improving resolution and decreasing random noise. We represent seismic traces by their envelopes and instantaneous phases, obtained via the Hilbert transform. The proposed method reduces the amplitudes of the low-frequency components of the envelope while preserving the phase information. Several tests are performed in order to investigate the behavior of the method for resolution improvement and noise suppression. Applications on both 1D and 2D synthetic data show that the method is capable of reducing the amplitudes and temporal widths of the side lobes of the input wavelets; hence, the spectral bandwidth of the input seismic data is enhanced, resulting in an improvement in the signal-to-noise ratio. The bright-spot anomalies observed on the stacked sections become clearer because the output seismic traces have a simplified appearance, allowing easier data interpretation. We recommend applying this simple signal-processing step for signal enhancement prior to interpretation, especially for single-channel and low-fold seismic data.
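A minimal sketch of the envelope/phase decomposition this abstract describes, assuming a high-pass Butterworth filter as the means of reducing the envelope's low-frequency components (the paper's exact envelope operation may differ).

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def enhance_trace(trace, fs, cutoff_hz=10.0):
    """Complex-trace enhancement sketch: attenuate the low-frequency
    part of the envelope while keeping the instantaneous phase, then
    rebuild the trace. Filter order and cutoff are assumptions."""
    analytic = hilbert(trace)                 # analytic signal
    envelope = np.abs(analytic)
    phase = np.angle(analytic)                # instantaneous phase
    b, a = butter(4, cutoff_hz / (0.5 * fs), btype="highpass")
    env_hp = filtfilt(b, a, envelope)         # suppress slow envelope trend
    return env_hp * np.cos(phase)             # recombine amplitude and phase
```

Because only the envelope is filtered, arrival times encoded in the phase are left untouched, which is why side lobes shrink without shifting events.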

