Horizon-based semiautomated nonhyperbolic velocity analysis

Geophysics ◽  
2014 ◽  
Vol 79 (6) ◽  
pp. U15-U23 ◽  
Author(s):  
Bo Zhang ◽  
Tao Zhao ◽  
Jie Qi ◽  
Kurt J. Marfurt

With higher capacity recording systems, long-offset surveys are becoming common in seismic exploration plays. Long offsets provide leverage against multiples, have greater sensitivity to anisotropy, and are key to accurate inversion for shear impedance and density. There are two main issues associated with preserving the data fidelity contained in the large offsets: (1) nonhyperbolic velocity analysis and (2) mitigating the migration/NMO stretch. Current nonhyperbolic velocity analysis workflows first estimate the moveout velocity V_NMO from offset-limited gathers and then pick an effective anellipticity η using the full-offset gathers. Unfortunately, estimating V_NMO at small aperture may be inaccurate, and picking errors in V_NMO introduce errors into the subsequent analysis of effective anellipticity. We have developed an automated algorithm to simultaneously estimate the nonhyperbolic parameters. Instead of directly seeking an effective stacking model, the algorithm finds the interval model that gives the most powerful stack. The search for the best interval model was conducted using a direct, global optimization algorithm called differential evolution. Next, we applied an antistretch workflow to minimize stretch at far offsets after obtaining the optimal effective model. The automated velocity analysis and antistretch workflow were tested on a data volume acquired over the Fort Worth Basin, USA. The results provided noticeable improvement on the prestack gathers and on the stacked data volume.
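
For illustration only, the sketch below (not the authors' code) shows the core idea of stack-power-driven nonhyperbolic parameter estimation: a single-event effective model with moveout velocity V_NMO and anellipticity η is searched globally with SciPy's differential_evolution. The paper's interval-model parameterization and real-data handling are not reproduced, and all numerical values are assumed for the example.

```python
import numpy as np
from scipy.optimize import differential_evolution

dt, nt = 0.004, 751                        # 4 ms sampling, 3 s traces
t = np.arange(nt) * dt
offsets = np.linspace(100.0, 4000.0, 40)   # long-offset CMP geometry (m)

def nonhyperbolic_t(t0, x, vnmo, eta):
    """Two-way traveltime from the Alkhalifah-Tsvankin nonhyperbolic approximation."""
    x2, v2 = x**2, vnmo**2
    return np.sqrt(t0**2 + x2 / v2
                   - 2.0 * eta * x2**2 / (v2 * (t0**2 * v2 + (1.0 + 2.0 * eta) * x2)))

def ricker(tau, f=30.0):
    a = (np.pi * f * tau)**2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Synthetic gather: one reflection whose "true" parameters we try to recover.
true_t0, true_vnmo, true_eta = 1.2, 2500.0, 0.08
gather = np.array([ricker(t - nonhyperbolic_t(true_t0, x, true_vnmo, true_eta))
                   for x in offsets]).T    # shape (nt, noffsets)

def neg_stack_power(params, t0=true_t0):
    """Flatten the event with trial (vnmo, eta) and return negative stack power."""
    vnmo, eta = params
    flattened = np.empty_like(gather)
    for i, x in enumerate(offsets):
        shift = nonhyperbolic_t(t0, x, vnmo, eta) - t0
        # read each trace at the moveout-corrected times (linear interpolation)
        flattened[:, i] = np.interp(t + shift, t, gather[:, i], left=0.0, right=0.0)
    stack = flattened.sum(axis=1)
    return -np.sum(stack**2)

# Direct global search over both parameters simultaneously.
result = differential_evolution(neg_stack_power,
                                bounds=[(1500.0, 4000.0), (0.0, 0.3)],
                                seed=0, tol=1e-8)
print("estimated vnmo, eta:", result.x)    # should be close to (2500, 0.08)
```

Because the two parameters are estimated jointly against a single stack-power objective, the far-offset traces constrain η while the near offsets constrain V_NMO, which is the trade-off the abstract argues a sequential workflow handles poorly.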

Geophysics ◽  
2013 ◽  
Vol 78 (1) ◽  
pp. U9-U18 ◽  
Author(s):  
Bo Zhang ◽  
Kui Zhang ◽  
Shiguang Guo ◽  
Kurt J. Marfurt

Wide-azimuth, long-offset surveys are becoming increasingly common in unconventional exploration plays, where one of the key routine processes is maintaining data fidelity at far offsets. The conventional NMO correction, which processes the data sample by sample, results in the well-known loss of frequency content and amplitude distortion through stretch, which lowers the seismic resolution and hinders amplitude variation with offset (AVO) and amplitude variation with offset and azimuth (AVAz) analysis of the long-offset signal. To mitigate the stretch typically associated with large offsets, we use a matching-pursuit-based normal moveout correction (MPNMO) to reduce NMO-stretch effects in long-offset data. MPNMO corrects the data wavelet by wavelet rather than sample by sample, thereby avoiding stretch. We apply our technique (1) to a set of synthetic gathers and (2) as part of a residual velocity analysis workflow to a prestack time-migrated data volume acquired over the Northern Chicontepec Basin, Mexico. Test results show that MPNMO can produce relatively nonstretched events and generate higher temporal resolution prestack gathers.
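
The sketch below is a simplification, not the published MPNMO algorithm: it contrasts sample-by-sample NMO, which stretches the far-offset wavelet, with a wavelet-by-wavelet correction that picks the wavelet and shifts it as a unit. A simple envelope-maximum pick stands in for the matching-pursuit decomposition, and all parameters are illustrative.

```python
import numpy as np

dt, nt = 0.004, 501
t = np.arange(nt) * dt
t0, vnmo, x = 0.8, 2200.0, 3000.0          # zero-offset time, velocity, a far offset

def ricker(tau, f=30.0):
    a = (np.pi * f * tau)**2
    return (1.0 - 2.0 * a) * np.exp(-a)

tx = np.sqrt(t0**2 + (x / vnmo)**2)        # hyperbolic arrival time at this offset
trace = ricker(t - tx)                     # far-offset trace with a single wavelet

# (a) Conventional sample-by-sample NMO: corrected(tau) = trace(sqrt(tau^2 + x^2/v^2)).
# The time mapping is nonlinear, so the output wavelet is stretched.
nmo_samplewise = np.interp(np.sqrt(t**2 + (x / vnmo)**2), t, trace, left=0.0, right=0.0)

# (b) Wavelet-by-wavelet correction: locate the wavelet and move it as a rigid unit
# by the moveout of its center time, so its shape (and spectrum) is preserved.
t_picked = t[np.argmax(np.abs(trace))]     # wavelet-center pick (stands in for the MP atom)
nmo_waveletwise = np.interp(t + (t_picked - t0), t, trace, left=0.0, right=0.0)

def halfwidth(w):                          # crude wavelet width at half the peak amplitude
    return dt * np.count_nonzero(w > 0.5 * w.max())

print("input width        :", halfwidth(trace))
print("sample-by-sample   :", halfwidth(nmo_samplewise))   # broadened (NMO stretch)
print("wavelet-by-wavelet :", halfwidth(nmo_waveletwise))  # essentially unchanged
```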


Geophysics ◽  
2013 ◽  
Vol 78 (5) ◽  
pp. U53-U63 ◽  
Author(s):  
Andrea Tognarelli ◽  
Eusebio Stucchi ◽  
Alessia Ravasio ◽  
Alfredo Mazzotti

We tested the properties of three different coherency functionals for the velocity analysis of seismic data relevant to subbasalt exploration. We evaluated the performance of the standard semblance algorithm and of two high-resolution coherency functionals based on analytic signals and on covariance estimation along hyperbolic traveltime trajectories. Approximate knowledge of the wavelet was exploited to design appropriate filters that matched the primary reflections, thereby further improving the ability of the functionals to highlight the events of interest. The tests were carried out on two synthetic seismograms computed on models reproducing the geologic setting of basaltic intrusions and on common-midpoint gathers from a 3D survey. Synthetic and field data had a very low signal-to-noise ratio, strong multiple contamination, and weak primary subbasalt signals. The results revealed that the high-resolution coherency functionals were more suitable than the semblance algorithm for detecting primary signals and for distinguishing them from multiples and other interfering events. This early discrimination between primaries and multiples could help to target specific signal enhancement and demultiple operations.
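
As a baseline for the functionals discussed above, the following sketch implements conventional semblance scanned along hyperbolic trajectories on a synthetic CMP gather. The high-resolution analytic-signal and covariance functionals of the paper are not reproduced, and the gather, window, and scan ranges are assumed for the example.

```python
import numpy as np

dt, nt = 0.004, 751
t = np.arange(nt) * dt
offsets = np.linspace(100.0, 3000.0, 30)

def ricker(tau, f=25.0):
    a = (np.pi * f * tau)**2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Synthetic CMP gather with two hyperbolic reflections (t0 in s, stacking velocity in m/s).
events = [(0.9, 2000.0), (1.6, 2600.0)]
gather = np.zeros((nt, offsets.size))
for t0, v in events:
    for i, x in enumerate(offsets):
        gather[:, i] += ricker(t - np.sqrt(t0**2 + (x / v)**2))

def semblance(t0, v, half_win=3):
    """Semblance in a short window centered on the trajectory t(x) = sqrt(t0^2 + x^2/v^2)."""
    num = den = 0.0
    for k in range(-half_win, half_win + 1):
        amps = np.array([np.interp(np.sqrt(t0**2 + (x / v)**2) + k * dt, t, gather[:, i],
                                   left=0.0, right=0.0)
                         for i, x in enumerate(offsets)])
        num += amps.sum()**2
        den += (amps**2).sum()
    return num / (offsets.size * den + 1e-12)

# Velocity spectrum: scan (t0, v) and keep the coherency value at each node.
t0_scan = np.arange(0.4, 2.2, 0.05)
v_scan = np.arange(1500.0, 3500.0, 100.0)
panel = np.array([[semblance(t0, v) for v in v_scan] for t0 in t0_scan])

for t0, v in events:
    i, j = np.argmin(np.abs(t0_scan - t0)), np.argmin(np.abs(v_scan - v))
    print(f"semblance near t0={t0:.2f} s, v={v:.0f} m/s:", round(float(panel[i, j]), 2))
```

Replacing the semblance functional in this scan with an analytic-signal or covariance-based measure is what sharpens the velocity spectrum in the low signal-to-noise, multiple-contaminated setting the abstract describes.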


2016 ◽  
Vol 4 (2) ◽  
pp. SG1-SG9 ◽  
Author(s):  
Marcus P. Cahoj ◽  
Sumit Verma ◽  
Bryce Hutchinson ◽  
Kurt J. Marfurt

The term acquisition footprint is commonly used to describe patterns in seismic time and horizon slices that are closely correlated to the acquisition geometry. Seismic attributes often exacerbate footprint artifacts and may pose pitfalls to the less experienced interpreter. Although removal of the acquisition footprint is the focus of considerable research, the sources of such footprint artifacts are less commonly discussed or illustrated. Based on real data examples, we hypothesized possible causes of footprint occurrence and reproduced them through synthetic prestack modeling. We then processed these models using the same workflows applied to the real data. Geometric attributes computed from the migrated synthetics exhibited the same footprint artifacts as the real data. These models showed that acquisition footprint can be caused by residual ground roll, inaccurate velocities, and far-offset migration stretch. With this understanding, we examined the real seismic data volume and found that the key cause of acquisition footprint was inaccurate velocity analysis.


2018 ◽  
Vol 6 (2) ◽  
pp. T349-T365 ◽  
Author(s):  
Xuan Qi ◽  
Kurt Marfurt

One of the key tasks of a seismic interpreter is to map lateral changes in surfaces, including not only faults, folds, and flexures but also incisements, diapirism, and dissolution features. Volumetrically, coherence provides rapid visualization of faults, and curvature provides rapid visualization of folds and flexures. Aberrancy measures the lateral change (or gradient) of curvature along a picked or inferred surface. Aberrancy complements curvature and coherence. In normally faulted terrains, the aberrancy anomaly tracks the coherence anomaly and falls between the most positive curvature anomaly defining the footwall and the most negative curvature anomaly defining the hanging wall. Aberrancy can delineate faults whose throw falls below the seismic resolution or is distributed across a suite of smaller conjugate faults that do not exhibit a coherence anomaly. We extend aberrancy, previously limited to horizon computations, to uninterpreted seismic data volumes. We apply our volumetric aberrancy calculation to a data volume acquired over the Barnett Shale gas reservoir of the Fort Worth Basin, Texas. In this area, the Barnett Shale is bounded on the top by the Marble Falls Limestone and on the bottom by the Ellenburger Dolomite. Basement faulting controls karstification in the Ellenburger, resulting in the well-known “string of pearls” pattern seen on coherence images. Aberrancy delineates small karst features, which are in many places too smoothly varying to be detected by coherence. Equally important, aberrancy provides the azimuthal orientation of the fault and flexure anomalies.
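
The sketch below illustrates only the horizon-based case: curvature is computed from a gridded synthetic horizon by finite differences, and aberrancy is taken as the magnitude and azimuth of the lateral gradient of that curvature. The volumetric formulation described in the abstract and the most-positive/most-negative curvature measures are not reproduced; mean curvature and the test surface are assumptions made for the example.

```python
import numpy as np

nx, ny, d = 200, 200, 25.0                 # grid dimensions and bin spacing (m)
x, y = np.meshgrid(np.arange(nx) * d, np.arange(ny) * d, indexing="ij")

# Synthetic horizon: gentle regional dip plus a narrow flexure mimicking a subtle fault.
z = 0.02 * x + 0.005 * y + 15.0 * np.tanh((x - 2500.0) / 150.0)

# Mean curvature of the surface z(x, y) from first and second derivatives.
zx, zy = np.gradient(z, d, d)
zxx, zxy = np.gradient(zx, d, d)
_, zyy = np.gradient(zy, d, d)
k_mean = ((1.0 + zy**2) * zxx - 2.0 * zx * zy * zxy + (1.0 + zx**2) * zyy) \
         / (2.0 * (1.0 + zx**2 + zy**2) ** 1.5)

# Aberrancy: lateral gradient of curvature, reported as magnitude and azimuth.
kx, ky = np.gradient(k_mean, d, d)
aberrancy_mag = np.hypot(kx, ky)
aberrancy_az = np.degrees(np.arctan2(ky, kx))

col = ny // 2                              # inspect one line crossing the flexure
peak = np.argmax(aberrancy_mag[:, col])
print("peak aberrancy along the flexure:", aberrancy_mag[peak, col])
print("orientation at the peak (deg):", aberrancy_az[peak, col])
```

The curvature of the tanh flexure changes sign across its inflection, so the aberrancy magnitude peaks at the flexure axis and its azimuth points across strike, which is the orientation information the abstract highlights.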


Author(s):  
Gemma Aiello

The techniques of seismic surveying, especially reflection seismics, have varied considerably over the years. The main contribution to this evolution has come from the oil industry, which developed geophysical methods for oil exploration. The basic techniques of seismic exploration consist of generating seismic waves artificially in the ground (the source) and measuring the times required to travel the source-receiver path. The seismic data processing of three multichannel seismic profiles located in the Gulf of Naples, for an overall length of 150 kilometers, is presented here. The seismic processing techniques used to elaborate the data are up-to-date; some are based on complex mathematical models that allow good velocity analyses to be obtained for the production of stacked sections ready to be interpreted. This paper shows the procedures for processing multichannel seismic data starting from the field records. Sketch diagrams of the processes applied during the several phases of the whole workflow have been constructed. The software used were Promax2D (Landmark Ltd.) and Seismic Unix (Colorado School of Mines). The processing sequence included preprocessing, sorting, velocity analysis, normal moveout (NMO) correction, stacking, band-pass filtering, multiple removal, predictive deconvolution, and spiking deconvolution.
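
As an illustration of one step in this sequence, the sketch below implements textbook predictive deconvolution: a Wiener prediction filter is designed from the trace autocorrelation via Toeplitz normal equations, and its prediction-error output attenuates periodic multiples. This is a generic implementation, not the Promax2D or Seismic Unix module actually used; the trace, prediction lag, and operator length are assumed for the example.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

dt, nt = 0.004, 1000
t = np.arange(nt) * dt

def ricker(tau, f=30.0):
    a = (np.pi * f * tau)**2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Synthetic trace: a primary at 0.5 s plus water-layer style multiples every 0.3 s.
period = 0.3
trace = ricker(t - 0.5)
for k in range(1, 4):
    trace += (-0.6) ** k * ricker(t - 0.5 - k * period)

# Design the Wiener prediction filter: prediction lag = multiple period, 80 ms operator.
lag = int(round(period / dt))
nfilt = int(round(0.080 / dt))
acf = np.correlate(trace, trace, mode="full")[nt - 1:]   # one-sided autocorrelation
r = acf[:nfilt].copy()
r[0] *= 1.001                                            # small pre-whitening for stability
f = solve_toeplitz(r, acf[lag:lag + nfilt])              # normal equations R f = r(lag..)

# Prediction-error output: subtract what the filter predicts 'lag' samples ahead.
predicted = np.convolve(trace, f, mode="full")[:nt]
decon = trace.copy()
decon[lag:] -= predicted[:nt - lag]

after_primary = slice(int(0.7 / dt), nt)                 # window containing only multiples
print("multiple energy before:", round(float(np.sum(trace[after_primary] ** 2)), 4))
print("multiple energy after :", round(float(np.sum(decon[after_primary] ** 2)), 4))
```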

