Analysis and Application of Frequency-Response Characteristics Based on Multiple Near-Surface Elements

2014 ◽  
Vol 490-491 ◽  
pp. 1548-1552
Author(s):  
Zhi Xin Di

As the prospecting program has progressed, land exploration has entered the lithostratigraphic stage. Searching for medium, small, or subtle reservoirs demands seismic data with higher resolution. In explosive-source areas, surface-layer velocity, shot lithology, and the ghost interface are the three key elements influencing shot frequency. Because single-shot quality differs markedly in the Yellow River Delta, where the near-surface layer consists of alternating deposits of multiple lithologies, we study the frequency response to lithology and ghosting using microseismogram (uphole) log data, to provide a reliable basis for the scientific selection of shooting parameters.
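As a rough illustration of how the surface-layer velocity and the ghost interface shape the shot spectrum, the sketch below computes the delay and first spectral notch of the source ghost for a buried explosive charge. The relation f_notch = v / (2d) assumes a flat free-surface ghost directly above a shot at depth d in a layer of velocity v; the numerical values are assumed for illustration only and are not taken from the study.

```python
# Hedged sketch (assumed values, not from the paper): first ghost notch frequency
# for an explosive source buried below a free-surface ghost interface.
# For shot depth d and surface-layer velocity v, the ghost arrives 2*d/v later
# with opposite polarity, placing spectral notches at multiples of v / (2*d).

def ghost_notch_hz(surface_velocity_m_s: float, shot_depth_m: float) -> float:
    """First non-zero notch of the source-ghost spectrum, in Hz."""
    ghost_delay_s = 2.0 * shot_depth_m / surface_velocity_m_s
    return 1.0 / ghost_delay_s


if __name__ == "__main__":
    # Hypothetical near-surface velocity/depth pairs, for illustration only.
    for v, d in [(1600.0, 6.0), (1800.0, 9.0), (2200.0, 12.0)]:
        print(f"v = {v:6.0f} m/s, shot depth = {d:4.1f} m -> "
              f"first ghost notch ~ {ghost_notch_hz(v, d):5.1f} Hz")
```

Deeper shots or slower surface layers push the notch toward the reflection band, which is one way the ghost interface limits shot frequency content.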

Geophysics ◽  
2004 ◽  
Vol 69 (4) ◽  
pp. 1091-1101 ◽  
Author(s):  
Gabriel Perez ◽  
Ken Larner

Founded on the assumption of surface consistency, reflection‐based residual‐statics correct for the time distortions in seismic data arising from rapid near‐surface variations. The assumption is founded on a vertical‐path, ray‐theoretical model of wave propagation in the near surface. Since ray theory does not hold for models with rapid spatial variation, we resort to finite‐difference modeling to study the influence of wave‐theoretical aspects on the character of time distortions and the validity of the surface‐consistency assumption. For near‐surface models that are admittedly simple and idealized, we find that the character of time distortions is highly influenced by a wavefront‐healing phenomenon whose strength depends on the ratio of the size of the frequency‐dependent Fresnel zone to the wavelength of lateral variation in the model. As experienced in practice, the quality of the surface‐consistency assumption degrades with increasing ratio of spreadlength to reflector depth. The validity of the assumption is best for longer‐wavelength anomalies in a weathering layer that is relatively thin. Wavefront healing, however, limits significantly the time‐distortion problem where the weathering layer is relatively thick. Interestingly, wavefront healing also helps to reduce the size of the time‐distortion problem when the velocity in the surface layer is large relative to that in the layers beneath, such as in areas of permafrost.
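A minimal sketch of the ratio the abstract refers to, not the authors' code: the radius of the frequency-dependent first Fresnel zone compared with the lateral wavelength of a near-surface anomaly, using the standard approximation r ≈ sqrt(λz/2) with λ = v/f. The velocity, depth, and anomaly wavelength below are assumed values for illustration.

```python
# Minimal sketch (assumed values, not from the paper): ratio of the first
# Fresnel-zone radius to the lateral wavelength of a near-surface anomaly,
# the quantity the abstract links to the strength of wavefront healing.

import math

def fresnel_radius_m(velocity_m_s: float, depth_m: float, frequency_hz: float) -> float:
    """First Fresnel-zone radius, r ~ sqrt(lambda * z / 2) with lambda = v / f."""
    wavelength_m = velocity_m_s / frequency_hz
    return math.sqrt(wavelength_m * depth_m / 2.0)


if __name__ == "__main__":
    v, z = 2000.0, 500.0          # illustrative overburden velocity and depth
    anomaly_wavelength_m = 300.0  # illustrative lateral wavelength of the anomaly
    for f in (10.0, 30.0, 60.0):
        r = fresnel_radius_m(v, z, f)
        print(f"{f:4.0f} Hz: Fresnel radius ~ {r:5.0f} m, "
              f"ratio to anomaly wavelength ~ {r / anomaly_wavelength_m:4.2f}")
```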


2003 ◽  
Vol 22 (7) ◽  
pp. 680-683 ◽  
Author(s):  
Xiang-Yang Li ◽  
Yi-Jie Liu ◽  
Enru Liu ◽  
Feng Shen ◽  
Li Qi ◽  
...  

2002 ◽  
Author(s):  
Xiang‐Yang Li ◽  
Yi‐Jie Liu ◽  
Enru Liu ◽  
Feng Shen ◽  
Li Qi ◽  
...  

2021 ◽  
Author(s):  
Ramy Elasrag ◽  
Thuraya Al Ghafri ◽  
Faaeza Al Katheer ◽  
Yousuf Al-Aufi ◽  
Ivica Mihaljevic ◽  
...  

Acquiring surface seismic data can be challenging in areas of intense human activity because of infrastructure (roads, houses, rigs), often leaving gaps in the fold of coverage that can span several kilometers. Modern interpolation algorithms can interpolate up to a point, but the quality of the reconstructed seismic data diminishes as the acquisition gap widens. This is where a vintage seismic acquisition can aid processing and imaging, especially if the previous acquisition did not face the same surface obstacles. In this paper we present how a legacy seismic survey helped fill the data gaps of the new acquisition and produced an improved seismic image.

The new acquisition survey is part of the Mega 3D onshore effort undertaken by ADNOC, characterized by dense shot and receiver spacing with a focus on full azimuth and broadband. Because of surface infrastructure, the data could not be acquired completely, leaving a sizable gap in the target area. However, a legacy seismic acquisition undertaken in 2014 had access to these gap zones, as the infrastructure was not present at the time. The legacy seismic data had been processed and imaged previously, but a simple post-imaging merge would not be adequate because the two datasets were processed using different workflows and imaged using different velocity models.

To synchronize the two datasets, we processed them in parallel. Data matching and merging were done before regularization. The data were regularized to radial geometry using 5D Matching Pursuit with Fourier Interpolation (MPFI), providing 12 well-sampled azimuth sectors that went through surface-consistent processing, multiple attenuation, and residual noise attenuation. The near-surface model was built using data-driven image-based statics (DIBS), while reflection tomography was used to build the anisotropic velocity model. Imaging was done using pre-stack Kirchhoff depth migration.

Processing the legacy survey from the beginning improved its signal-to-noise ratio, which helped the data merge without degrading the quality of the final image. Building a single near-surface model allowed both datasets to match well in the time domain; bringing the datasets to the same level was an important condition before matching and merging. Amplitude and phase analysis showed that the two surveys align well, with minimal differences. Only the portion of the legacy survey that covers the gap was used in the regularization, allowing MPFI to reconstruct the missing data. The regularized data went through surface multiple attenuation and further noise attenuation as preconditioning for migration. The final image, created using both datasets, allowed the target to be imaged better.
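In its simplest form, the matching step described above can be thought of as estimating a bulk time shift and an amplitude scalar between co-located legacy and new traces before merging. The sketch below is a generic illustration of that idea, not ADNOC's production workflow; it uses cross-correlation for the shift and an RMS ratio for the scalar, and all values are assumed.

```python
# Generic illustration (not the production workflow described in the abstract):
# estimate a bulk time shift and amplitude scalar between a legacy trace and a
# new-acquisition trace at the same surface location before matching and merging.

import numpy as np

def match_traces(legacy: np.ndarray, new: np.ndarray, dt_s: float):
    """Return (legacy_delay_s, amplitude_scalar) relative to the new trace."""
    xcorr = np.correlate(legacy, new, mode="full")
    lag_samples = int(np.argmax(xcorr)) - (len(new) - 1)   # > 0: legacy arrives later
    rms_new = float(np.sqrt(np.mean(new ** 2)))
    rms_legacy = float(np.sqrt(np.mean(legacy ** 2))) or 1.0
    return lag_samples * dt_s, rms_new / rms_legacy


if __name__ == "__main__":
    dt = 0.004                                   # 4 ms sampling, assumed
    t = np.arange(0.0, 2.0, dt)
    new_trace = np.sin(2 * np.pi * 25 * t) * np.exp(-t)
    legacy_trace = 0.5 * np.roll(new_trace, 5)   # weaker and 5 samples late
    delay, scale = match_traces(legacy_trace, new_trace, dt)
    print(f"estimated legacy delay = {delay * 1000:.1f} ms, amplitude scalar = {scale:.2f}")
```

In practice such matching is done surface-consistently and with matching filters rather than a single scalar, but the bulk shift and gain estimate conveys why the two surveys must be brought to the same level before merging.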


Geophysics ◽  
1960 ◽  
Vol 25 (1) ◽  
pp. 283-311 ◽  
Author(s):  
R. J. Graebner

The theory relating to many methods—for example, multiple seismometer techniques—which the geophysicist may control to improve record quality is well known. However, its application has not been fully exploited. An example of the reduction of theory to practice in one area characterized by poor records is presented. It comprises a series of analytical tests designed to discover the cause of poor records, to examine the effect of each variable on the signal‐to‐noise ratio, and to evaluate the solutions predicted by theory. The tests showed that the poor record quality was attributable chiefly to relatively strong surface and near‐surface waves propagating outward from the shot. Wave length filtering by means of suitable shot and seismometer patterns, and compositing through data processing methods, greatly improved record quality and permitted magnetic recording of reflected signals over a broad frequency range. The tests established, in the allotted time, that the quality of the data would meet clearly specified standards of performance. Experience has shown that better seismic data can generally be obtained when the design of techniques is based on the special character of the signal and noise determined from simple tests rather than when the design is based on general assumptions.
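As a hedged illustration of the wavelength filtering the tests relied on (geometry assumed, not taken from the paper), the sketch below evaluates the classical amplitude response of a uniform in-line array of N seismometers at spacing d to a horizontally travelling wave of wavelength L, |sin(Nπd/L) / (N sin(πd/L))|, showing how short surface-wave wavelengths are suppressed while long reflection wavelengths pass.

```python
# Illustrative sketch (assumed geometry, not from the paper): wavelength filtering
# by a uniform in-line seismometer array. The amplitude response of N elements at
# spacing d to a horizontally travelling wavelength L is
#     R(L) = | sin(N*pi*d/L) / (N * sin(pi*d/L)) |.

import math

def array_response(n_elements: int, spacing_m: float, wavelength_m: float) -> float:
    x = math.pi * spacing_m / wavelength_m
    if abs(math.sin(x)) < 1e-12:          # in-phase (grating) condition: full response
        return 1.0
    return abs(math.sin(n_elements * x) / (n_elements * math.sin(x)))


if __name__ == "__main__":
    n, d = 12, 10.0                        # 12 geophones at 10 m spacing, assumed
    for wavelength in (25.0, 50.0, 100.0, 600.0):
        r = array_response(n, d, wavelength)
        print(f"wavelength {wavelength:6.1f} m -> relative response {r:5.3f}")
```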


2013 ◽  
Vol 726-731 ◽  
pp. 4738-4741
Author(s):  
Hai Bo Yang ◽  
Jing Wei Chen ◽  
Yun Fei Li

Several eco-environmental factors are analyzed with remote sensing (RS), and the spatial distributions of population and GDP are obtained from statistical data. With weights determined by the analytic hierarchy process (AHP), a comprehensive index method is used to evaluate the eco-environmental quality of the Yellow River Delta (YRD) at three scales: raster, administrative region, and delta. Results show that the eco-environmental quality of the YRD is at a moderate level. Across the three epochs examined (1996, 2000, 2004), it was best in 1996, decreased to its worst in 2000, and then improved in 2004. Within the delta, the estuary area and Dongying area were poorer than Guangrao county and Liji county.
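A minimal sketch of the comprehensive-index step described above: normalized factor scores on each raster cell are combined as a weighted sum, with the weights taken from an AHP judgement. The factor names and weights below are hypothetical placeholders, not the paper's values.

```python
# Minimal sketch (hypothetical factors and weights, not the paper's values):
# a comprehensive eco-environmental index as the AHP-weighted sum of normalized
# factor scores on each raster cell.

import numpy as np

# Assumed AHP-derived weights for three illustrative factors; they sum to 1.
WEIGHTS = {"vegetation_cover": 0.5, "soil_salinity": 0.3, "population_density": 0.2}

def comprehensive_index(factors: dict[str, np.ndarray]) -> np.ndarray:
    """Weighted sum of factor rasters already normalized to the range [0, 1]."""
    index = np.zeros_like(next(iter(factors.values())), dtype=float)
    for name, weight in WEIGHTS.items():
        index += weight * factors[name]
    return index


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cells = {name: rng.random((4, 4)) for name in WEIGHTS}   # toy 4x4 raster
    print(comprehensive_index(cells).round(2))
```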

