On the seismic wavelet estimative and reflectivity recovering based on linear inversion: Well-to-seismic tie on a real data set from Viking Graben, North Sea

Geophysics, 2020, Vol. 85(5), pp. D157-D165
Authors: Isadora A. S. de Macedo, José Jadsom S. de Figueiredo

Tying seismic data to well data is critical in reservoir characterization. In general, the main factors controlling a successful seismic well tie are an accurate time-depth relationship and a coherent wavelet estimate. Wavelet estimation methods fall into two major groups: deterministic and statistical. Deterministic methods use both the seismic trace and the well data to estimate the wavelet. Statistical methods use only the seismic trace and generally require assumptions about the wavelet's phase or about the reflectivity series being a random process. We compare wavelet estimation for seismic well-tie purposes by least-squares minimization with zero-order quadratic regularization against the result obtained from homomorphic deconvolution; neither method makes assumptions about the wavelet's phase or the reflectivity. The best-estimated wavelet is then used as input to sparse-spike deconvolution to recover the reflectivity near the well location. The wavelets estimated by the two approaches are similar, which builds confidence in their accuracy. The recovered reflectivity of the seismic section is consistent with known stratigraphic markers (from gamma-ray logs) in the real data set from the Viking Graben field, Norway.
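A minimal sketch of the damped least-squares step described above, assuming the well-log reflectivity r and the seismic trace s are already aligned in time; the array names, wavelet length nw, and regularization weight mu are illustrative, not the authors' settings:

```python
# Wavelet estimation by least squares with zero-order quadratic (ridge) regularization.
import numpy as np

def estimate_wavelet_ls(r, s, nw, mu=1e-2):
    """Estimate an nw-sample wavelet w minimizing ||R w - s||^2 + mu ||w||^2,
    where R is the convolution matrix built from the reflectivity r."""
    n = len(s)
    R = np.zeros((n, nw))
    for k in range(nw):
        R[k:, k] = r[: n - k]          # (R @ w)[t] = sum_k r[t - k] w[k]
    A = R.T @ R + mu * np.eye(nw)
    b = R.T @ s
    return np.linalg.solve(A, b)

# Synthetic check: recover a known wavelet from a sparse reflectivity.
rng = np.random.default_rng(0)
r = np.zeros(500)
r[rng.choice(500, 25, replace=False)] = rng.normal(0, 1, 25)
t = np.arange(41) - 20
true_w = np.exp(-0.5 * (t / 4.0) ** 2) * np.cos(0.6 * t)
s = np.convolve(r, true_w)[:500] + 0.01 * rng.normal(size=500)
w_est = estimate_wavelet_ls(r, s, nw=41, mu=1e-2)
print(np.corrcoef(w_est, true_w)[0, 1])   # close to 1 for low noise
```

The quadratic term mu*||w||^2 stabilizes the normal equations when the reflectivity spectrum has notches; raising mu smooths the estimated wavelet at the cost of resolution.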

Geophysics, 1998, Vol. 63(6), pp. 2035-2041
Authors: Zhengping Liu, Jiaqi Liu

We present a data‐driven method of joint inversion of well‐log and seismic data, based on the power of adaptive mapping of artificial neural networks (ANNs). We use the ANN technique to find and approximate the inversion operator guided by the data set consisting of well data and seismic recordings near the wells. Then we directly map seismic recordings to well parameters, trace by trace, to extrapolate the wide‐band profiles of these parameters using the approximation operator. Compared to traditional inversions, which are based on a few prior theoretical operators, our inversion is novel because (1) it inverts for multiple parameters and (2) it is nonlinear with a high degree of complexity. We first test our algorithm with synthetic data and analyze its sensitivity and robustness. We then invert real data to obtain two extrapolation profiles of sonic log (DT) and shale content (SH), the latter a unique parameter of the inversion and significant for the detailed evaluation of stratigraphic traps. The high‐frequency components of the two profiles are significantly richer than those of the original seismic section.
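A minimal sketch of the trace-by-trace mapping idea, assuming X holds seismic trace windows near the wells and Y the corresponding well parameters (columns: DT, SH); the scikit-learn MLP, network size, and synthetic arrays are placeholders rather than the authors' architecture:

```python
# Train a small neural network on near-well traces, then map every trace of
# the section to well parameters (the "extrapolation" step).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 64))           # 200 near-well traces, 64 samples each
Y = np.column_stack([X[:, :32].mean(1),  # stand-in "DT"
                     X[:, 32:].std(1)])  # stand-in "SH"

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0)
net.fit(scaler.transform(X), Y)

section = rng.normal(size=(1000, 64))    # traces away from the wells
profiles = net.predict(scaler.transform(section))   # shape (1000, 2): DT, SH
print(profiles.shape)
```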


Author: Parisa Torkaman

The generalized inverted exponential distribution is introduced as a lifetime model with good statistical properties. In this paper, estimation of its probability density function and cumulative distribution function is considered using five estimation methods: the uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS), and percentile (PC) estimators. The performance of these estimators is compared through numerical simulations based on the mean squared error (MSE). The simulation studies show that the UMVU estimator performs better than the others, and that for large enough sample sizes the ML and UMVU estimators are almost equivalent and more efficient than the LS, WLS, and PC estimators. Finally, the results for a real data set are analyzed.
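A minimal sketch of this kind of Monte Carlo comparison, for the generalized inverted exponential distribution with CDF F(x; a, l) = 1 - (1 - exp(-l/x))^a, x > 0. Only the ML and LS estimators are sketched, and parameter MSE stands in for the pdf/cdf MSE studied in the paper; the sample size and true values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def rgie(n, a, l, rng):
    u = rng.uniform(size=n)                         # inverse-transform sampling
    return -l / np.log1p(-(1 - u) ** (1 / a))

def cdf(x, a, l):
    return 1 - (1 - np.exp(-l / x)) ** a

def fit_ml(x):
    def nll(p):
        a, l = np.exp(p)                            # enforce positivity
        z = np.exp(-l / x)
        return -np.sum(np.log(a) + np.log(l) - 2 * np.log(x)
                       - l / x + (a - 1) * np.log1p(-z))
    return np.exp(minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead").x)

def fit_ls(x):
    xs = np.sort(x)
    p_emp = np.arange(1, len(x) + 1) / (len(x) + 1)
    def sse(p):
        a, l = np.exp(p)
        return np.sum((cdf(xs, a, l) - p_emp) ** 2)
    return np.exp(minimize(sse, x0=[0.0, 0.0], method="Nelder-Mead").x)

rng = np.random.default_rng(2)
true = np.array([1.5, 2.0])
ml_err, ls_err = [], []
for _ in range(200):
    x = rgie(100, *true, rng)
    ml_err.append((fit_ml(x) - true) ** 2)
    ls_err.append((fit_ls(x) - true) ** 2)
print("ML MSE:", np.mean(ml_err, axis=0), "LS MSE:", np.mean(ls_err, axis=0))
```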


In this paper, we define a new two-parameter new Lindley half Cauchy (NLHC) distribution using the Lindley-G family of distributions, which accommodates increasing, decreasing, and a variety of monotone failure rates. The statistical properties of the proposed distribution, such as the probability density function, cumulative distribution function, quantile function, and measures of skewness and kurtosis, are presented. We briefly describe three well-known estimation methods, namely the maximum likelihood (MLE), least-squares (LSE), and Cramer-von Mises (CVM) estimators. All computations are performed in the R software. Using the maximum likelihood method, we construct asymptotic confidence intervals for the model parameters. We verify empirically the potential of the new distribution in modeling a real data set.
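A heavily hedged sketch of the MLE-based asymptotic confidence intervals mentioned above: the NLHC density is not reproduced in this abstract, so a two-parameter Weibull likelihood stands in purely to illustrate the mechanics (fit, numerical observed information, Wald intervals):

```python
# Wald confidence intervals from a numerically differentiated observed information.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def wald_ci(nll, theta_hat, h=1e-4):
    """95% Wald intervals from a central finite-difference Hessian of the NLL."""
    k = len(theta_hat)
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            ei, ej = np.eye(k)[i] * h, np.eye(k)[j] * h
            H[i, j] = (nll(theta_hat + ei + ej) - nll(theta_hat + ei - ej)
                       - nll(theta_hat - ei + ej) + nll(theta_hat - ei - ej)) / (4 * h * h)
    se = np.sqrt(np.diag(np.linalg.inv(H)))
    return np.column_stack([theta_hat - 1.96 * se, theta_hat + 1.96 * se])

x = weibull_min.rvs(c=1.8, scale=2.5, size=300, random_state=3)   # stand-in model
nll = lambda p: -np.sum(weibull_min.logpdf(x, c=p[0], scale=p[1]))
# Optimize on the log scale so the search stays in the positive orthant.
res = minimize(lambda q: nll(np.exp(q)), x0=[0.0, 0.0], method="Nelder-Mead")
theta_hat = np.exp(res.x)
print("MLE:", theta_hat)
print("95% CIs:\n", wald_ci(nll, theta_hat))
```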


Authors: Arun Kumar Chaudhary, Vijay Kumar

In this study, we introduce a three-parameter probabilistic model derived from the type I half-logistic generating family, called the half-logistic modified exponential distribution. Its mathematical and statistical properties are explored, and the behavior of the probability density, hazard rate, and quantile functions is investigated. The model parameters are estimated using three well-known estimation methods, namely maximum likelihood estimation (MLE), least-squares estimation (LSE), and Cramer-von Mises estimation (CVME). Further, using a real data set, we verify that the presented model is useful and flexible for dealing with real data. KEYWORDS: Half-logistic distribution, Estimation, CVME, LSE, MLE
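A minimal sketch of the Cramer-von Mises estimation (CVME) step for a generic one-parameter model; scipy's half-logistic CDF stands in for the half-logistic modified exponential, whose closed form is not given in this abstract:

```python
# CVME: minimize the Cramer-von Mises distance between the model CDF and the
# empirical plotting positions (2i - 1) / (2n).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import halflogistic

def cvm_objective(scale, x_sorted):
    n = len(x_sorted)
    u = halflogistic.cdf(x_sorted, scale=scale)
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + np.sum((u - (2 * i - 1) / (2 * n)) ** 2)

x = np.sort(halflogistic.rvs(scale=2.0, size=400, random_state=4))
res = minimize_scalar(lambda s: cvm_objective(s, x), bounds=(0.1, 10.0), method="bounded")
print("CVME of the scale parameter:", res.x)   # close to the true value 2.0
```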


2020, Vol. 39(5), pp. 346-352
Authors: Mohamed G. El-Behiry, Mohamed S. Al Araby, Ramy Z. Ragab

Seismic wavelets are the dynamic components that, when convolved with a reflectivity series, produce a seismic trace. A seismic wavelet is described by three components: amplitude, frequency, and phase. Amplitude and frequency are considered static in the sense that they mainly affect the appearance of a seismic event, whereas phase can strongly change how the seismic image describes the subsurface. Knowing the wavelet properties of a seismic data set facilitates interpretation by providing an understanding of how regional geologic markers and hydrocarbon-bearing formations appear in the data. The process through which the seismic wavelet is understood is the seismic well tie. The seismic well tie is the first step in calibrating seismic data in terms of polarity and phase. It ensures that the seismic data are descriptive of regional markers, well markers, and discoveries (if they exist). This step connects well data to seismic data to confirm that the seismic correctly describes well results at the well location, and it then extends that understanding of seismic behavior to the rest of the area covered by the seismic data. A good seismic well tie greatly reduces the uncertainties accompanying seismic interpretation. One important outcome of the seismic well-tie process is understanding the phase of the seismic data, which controls how the data express a known geologic marker or hydrocarbon-bearing zone. This understanding is useful for quantifying discoveries attached to seismic anomalies and for extending knowledge from the well location to the rest of the area covered by the seismic data.
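A minimal sketch of the synthetic-seismogram step at the heart of a seismic well tie: build reflectivity from the well's velocity and density logs, convolve with a wavelet, and correlate against the seismic trace at the well location. The logs, sample rate, and zero-phase Ricker wavelet below are illustrative placeholders, not data from the article:

```python
import numpy as np

def ricker(f_peak, dt, length=0.128):
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f_peak * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def synthetic_trace(vp, rho, wavelet):
    z = vp * rho                                  # acoustic impedance
    rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])      # reflection coefficients
    return np.convolve(rc, wavelet, mode="same")

rng = np.random.default_rng(5)
vp = 2500 + np.cumsum(rng.normal(0, 10, 1000))    # stand-in velocity log (m/s)
rho = 2.2 + 0.0002 * (vp - 2500)                  # stand-in density log (g/cc)
w = ricker(f_peak=30.0, dt=0.002)
syn = synthetic_trace(vp, rho, w)
seis = np.roll(syn, 7) + 0.05 * rng.normal(size=len(syn))   # "observed" trace

# Cross-correlate to estimate the residual shift between well and seismic.
lags = np.arange(-50, 51)
xc = [np.corrcoef(np.roll(syn, k), seis)[0, 1] for k in lags]
print("best lag (samples):", lags[int(np.argmax(xc))])      # ~7
```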


2006, Vol. 36(02), pp. 463-487
Author: David P.M. Scollnik

This paper investigates some models in which non-negative observations from a Poisson or generalised Poisson distribution are possibly damaged according to a binomial or quasi-binomial law. The latter case is appropriate when the observations are over-dispersed. Although the extent of the damage is not known, it is assumed that the event of whether or not damage occurred is discernible. The models are particularly suited for certain applications involving accident counts when evidence of certain accidents may be observed even though the accidents themselves may go unreported. Given the number of observed accidents and knowledge as to whether or not some additional accidents have gone unreported, these models may be used to make inferences concerning the actual number of unreported and total number of accidents in the current period, and the numbers of reported, unreported, and/or total accidents in a future period. The models are applied to a real data set giving reported and unreported patient accidents in a large hospital. Both maximum likelihood and Bayesian estimation methods are presented and discussed.
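A minimal sketch of maximum likelihood estimation for the simplest variant of this damage model: total accidents N ~ Poisson(lam), each reported independently with probability p (binomial damage), with only the reported count R and the indicator D = 1{some accident went unreported} observed. Under this thinning, R ~ Poisson(lam*p) and the unreported count U ~ Poisson(lam*(1-p)) are independent. The simulated values are illustrative, and the paper's quasi-binomial and Bayesian variants are not covered here:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(6)
lam_true, p_true = 8.0, 0.7
N = rng.poisson(lam_true, size=500)
R = rng.binomial(N, p_true)
D = (N > R).astype(int)                    # evidence that some accidents went unreported

def nll(theta):
    lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))   # positivity / (0, 1)
    mu_r, mu_u = lam * p, lam * (1 - p)
    log_pr = R * np.log(mu_r) - mu_r - gammaln(R + 1)        # Poisson(mu_r) log-pmf
    log_pd = np.where(D == 1, np.log1p(-np.exp(-mu_u)), -mu_u)
    return -np.sum(log_pr + log_pd)

theta_hat = minimize(nll, x0=[np.log(5.0), 0.0], method="Nelder-Mead").x
lam_hat, p_hat = np.exp(theta_hat[0]), 1 / (1 + np.exp(-theta_hat[1]))

# Inference on the current period's unreported accidents: given D = 1,
# U follows a zero-truncated Poisson with mean mu_u / (1 - exp(-mu_u)).
mu_u = lam_hat * (1 - p_hat)
print(lam_hat, p_hat, "E[U | D=1] =", mu_u / (1 - np.exp(-mu_u)))
```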


2020, Vol. 17(1), pp. 28-32
Authors: A. Ogbamikhumi, S.A. Salami, W.N. Uwadiae

This study presents a new technique that integrates several logs for P-wave prediction in order to minimize some of the errors and uncertainties associated with most estimation methods. The adopted method applies an artificial neural network that integrates density, resistivity, and gamma-ray logs for data training and prediction of the P-wave log. Correlation coefficients of 0.77, 0.24, and 0.42 between the acquired P-wave log and the acquired density, resistivity, and gamma-ray logs, respectively, demonstrate the relationship between the P-wave log and the logs selected for the prediction. The correlation coefficients of the P-wave estimated with the Gardner and Faust methods against the acquired P-wave log are 0.64 and 0.59, respectively, while the neural-network-derived P-wave gives a better correlation coefficient of 0.81. Cross-plot validation of P-wave-derived acoustic impedance against density, for both lithology and fluid discrimination, reveals clusters for the neural-network-derived P-wave similar to those obtained from the acquired P-wave. The presented neural network technique is therefore shown to be more effective than the two conventional techniques. Keywords: Sonic log, Gardner's method, Faust method, Neural network, Cross plot.
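For reference, a minimal sketch of the conventional Gardner-based estimate that the study benchmarks against: Gardner's empirical relation rho = a * Vp^b (commonly a ≈ 0.31, b ≈ 0.25 with Vp in m/s and rho in g/cc) inverted to predict Vp from density and converted to sonic slowness. The density values and coefficients below are illustrative defaults, not values from the study:

```python
import numpy as np

def gardner_vp_from_density(rho, a=0.31, b=0.25):
    """Invert Gardner's relation rho = a * Vp**b for Vp (m/s)."""
    return (rho / a) ** (1.0 / b)

rho = np.linspace(2.1, 2.6, 5)       # g/cc, stand-in density log
vp = gardner_vp_from_density(rho)    # m/s
dt = 1e6 / vp                        # sonic slowness in us/m
print(np.round(vp), np.round(dt, 1))
```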


2021, Vol. 2, pp. 1
Authors: Haitham M. Yousof, Mustafa C. Korkmaz, G.G. Hamedani, Mohamed Ibrahim

In this work, we derive a novel extension of the Chen distribution. Some statistical properties of the new model are derived, and numerical analysis of the mean, variance, skewness, and kurtosis is presented. Some characterizations of the proposed distribution are also given. Different classical estimation methods under uncensored schemes, such as the maximum likelihood, Anderson-Darling, weighted least-squares, and right-tail Anderson-Darling methods, are considered. Simulation studies are performed to compare and assess these estimation methods. To compare the applicability of the four classical methods, two applications to real data sets are analyzed.
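A minimal sketch of the Anderson-Darling minimum-distance estimator listed among the classical methods above; the baseline Chen CDF F(x; b, l) = 1 - exp(l * (1 - exp(x^b))) stands in for the paper's new extension, whose closed form is not given in this abstract:

```python
import numpy as np
from scipy.optimize import minimize

def chen_cdf(x, b, l):
    return 1.0 - np.exp(l * (1.0 - np.exp(x ** b)))

def chen_rvs(n, b, l, rng):
    u = rng.uniform(size=n)                   # inverse-transform sampling
    return (np.log(1.0 - np.log(1.0 - u) / l)) ** (1.0 / b)

def ad_objective(p, x_sorted):
    b, l = np.exp(p)                          # keep both parameters positive
    n = len(x_sorted)
    F = np.clip(chen_cdf(x_sorted, b, l), 1e-12, 1 - 1e-12)
    i = np.arange(1, n + 1)
    # Anderson-Darling statistic: -n - (1/n) sum (2i-1)[ln F(x_(i)) + ln(1 - F(x_(n+1-i)))]
    return -n - np.mean((2 * i - 1) * (np.log(F) + np.log(1 - F[::-1])))

rng = np.random.default_rng(7)
x = np.sort(chen_rvs(500, b=0.7, l=0.2, rng=rng))
p_hat = np.exp(minimize(ad_objective, x0=[0.0, 0.0], args=(x,), method="Nelder-Mead").x)
print("AD estimates (b, l):", p_hat)          # roughly recovers (0.7, 0.2)
```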


Geophysics, 2018, Vol. 83(3), pp. B105-B117
Authors: Julien Cotton, Hervé Chauris, Eric Forgues, Paul Hardouin

In 4D seismic, the velocity model used for imaging and reservoir characterization can change as production from the reservoir progresses. This is particularly true for heavy oil reservoirs stimulated by steam injection. In the context of sparse and low-fold seismic acquisitions, conventional migration velocity analyses can be inadequate because of a poorly and irregularly sampled offset dimension. We update the velocity model in the context of daily acquisitions with buried sources and receivers. The main objective is to demonstrate that subtle time-lapse effects can be detected over calendar time on onshore sparse acquisitions. We develop a modified version of conventional prestack time migration to detect the velocity changes obtained after crosscorrelation of the base and monitor surveys. The technique is applied to a real heavy-oil data set from the Netherlands and reveals how the steam diffuses over time within the reservoir.
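A minimal sketch of the time-lapse measurement this method builds on: crosscorrelate a base and a monitor trace in sliding windows to extract the small time shifts that steam-induced velocity changes produce. The window length, sample rate, and synthetic shift are illustrative:

```python
import numpy as np

def windowed_time_shift(base, monitor, win=128, step=64, dt=0.002, max_lag=10):
    """Return (window center time, lag in ms) pairs from local crosscorrelation."""
    shifts = []
    lags = np.arange(-max_lag, max_lag + 1)
    for start in range(0, len(base) - win, step):
        b = base[start:start + win]
        m = monitor[start:start + win]
        xc = [np.dot(np.roll(b, k), m) for k in lags]
        shifts.append((dt * (start + win / 2), dt * 1e3 * lags[int(np.argmax(xc))]))
    return shifts

rng = np.random.default_rng(8)
base = rng.normal(size=2000)
monitor = np.roll(base, 3) + 0.05 * rng.normal(size=2000)   # ~6 ms apparent delay
for t, lag_ms in windowed_time_shift(base, monitor)[:4]:
    print(f"t = {t:5.2f} s, estimated shift = {lag_ms:+.0f} ms")
```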


2020, Vol. 8(1), pp. T43-T53
Authors: Isadora A. S. de Macedo, Jose Jadsom S. de Figueiredo, Matias C. de Sousa

Reservoir characterization requires accurate elastic logs. It is necessary to guarantee that the logging tool is stable during the drilling process so that the measurements of the physical properties of the formation in the vicinity of the well are not compromised. Irregularities along the borehole may occur, especially where the drilling device passes through unconsolidated formations. These irregularities affect the signals recorded by the logging tool, and the measurements may be influenced more by the drilling mud than by the formation. The caliper log indicates the change in borehole diameter with depth and can be used as a quality indicator for other logs degraded by enlargement or shrinkage of the borehole wall. Damaged well-log data, particularly density and velocity profiles, degrade the quality and accuracy of the well-to-seismic tie. To investigate the effects of borehole enlargement on the well-to-seismic tie, an analysis of density log correction was performed. The approach uses Doll's geometric factor to correct the density log for borehole enlargement based on the caliper readings. Because the wavelet is an important factor in the well tie, we tested the methodology with both statistical and deterministic wavelet estimates. In both cases, the results for the real data set from the Viking Graben field, North Sea, indicate up to a 7% improvement in the correlation between the real and synthetic seismic traces when the density correction is applied.
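A heavily hedged sketch of the density-correction idea: treat the measured density as a mixture of mud and formation densities weighted by a geometric factor that grows with borehole enlargement, then invert for the formation density where the caliper shows washout. The linear ramp used for the geometric factor g below is an illustrative placeholder, not Doll's expression or the authors' calibration:

```python
import numpy as np

def correct_density(rho_log, caliper, bit_size, rho_mud=1.1, cal_max=6.0):
    """Assume rho_log = g*rho_mud + (1-g)*rho_formation and solve for rho_formation."""
    washout = np.clip(caliper - bit_size, 0.0, cal_max)   # inches of enlargement
    g = np.clip(washout / cal_max, 0.0, 0.9)              # illustrative geometric factor
    return (rho_log - g * rho_mud) / (1.0 - g)

rho_log = np.array([2.35, 2.10, 1.95, 2.40])   # g/cc, degraded where washed out
caliper = np.array([8.6, 11.0, 13.5, 8.5])     # inches
print(correct_density(rho_log, caliper, bit_size=8.5))
```

After the correction, the reflectivity and synthetic trace are rebuilt from the corrected density and the velocity log, and the well-tie correlation is recomputed as in the synthetic-seismogram sketch earlier in this listing.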

