PC‐based acquisition and processing of high‐resolution marine seismic data

Geophysics ◽  
1996 ◽  
Vol 61 (6) ◽  
pp. 1804-1812 ◽  
Author(s):  
Ho‐Young Lee ◽  
Byung‐Koo Hyun ◽  
Young‐Sae Kong

We have improved the quality of high-resolution marine seismic data using a simple PC-based acquisition and processing system. The system consists of a PC, an A/D converter, and a magneto-optical disk drive. It has been designed to acquire single-channel data at up to 60,000 samples per second and to process the seismic data through a simple procedure. Test surveys have been carried out off Pohang, in the southern East Sea of Korea. The seismic systems used for the tests were an air gun and a 3.5 kHz sub-bottom profiling system. Spectral characteristics of the sources were analyzed. Simple digital signal processing steps, including gain recovery, deconvolution, band-pass filtering, and swell filtering, were performed. The quality of the seismic sections produced by the system is greatly enhanced in comparison to the analog sections. The PC-based system for acquisition and processing of high-resolution marine seismic data is economical and versatile.
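A minimal sketch, in Python with NumPy/SciPy, of the kind of single-channel processing chain described above (gain recovery followed by a band-pass filter that also suppresses low-frequency swell noise); the sampling rate, gain exponent, and corner frequencies are illustrative assumptions, not values reported in the paper.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def process_trace(trace, fs=60000.0, gain_power=2.0, f_lo=500.0, f_hi=8000.0):
        """Apply simple gain recovery and a band-pass filter to one trace.

        fs          -- sampling rate in Hz (the system's quoted maximum is
                       60,000 samples per second)
        gain_power  -- exponent of the t**n spreading correction (assumed)
        f_lo, f_hi  -- band-pass corner frequencies in Hz (assumed)
        """
        trace = np.asarray(trace, dtype=float)
        t = np.arange(len(trace)) / fs
        # Gain recovery: boost late arrivals to compensate for geometrical spreading.
        gained = trace * (t + 1.0 / fs) ** gain_power
        # Zero-phase Butterworth band-pass: removes low-frequency swell noise and
        # energy above the source band.
        sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, gained)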

Geophysics ◽  
1985 ◽  
Vol 50 (2) ◽  
pp. 257-261 ◽  
Author(s):  
M. H. Safar

An important recent development in marine seismic data acquisition is the introduction of the Gemini technique (Newman, 1983; Haskey et al., 1983). The technique involves the use of a single Sodera water gun as a reference source together with the conventional air gun or water gun array, which is fired a second or two after the reference source. The near-field pressure signature radiated by the reference source is monitored continuously. The main advantage of the Gemini technique is that a shallow high-resolution section is recorded simultaneously with that obtained from the main array.


1971 ◽  
Vol 11 (1) ◽  
pp. 95 ◽  
Author(s):  
Al Sabitay

The offshore search for oil and gas is progressively moving further out to sea as near-shore structures are delineated and drilled. Prospects that overlap the edge of the continental shelf and slope will more than likely present problems in the processing of marine seismic data because of large and rapid variations in water depth. Magellan Petroleum encountered such difficulties in the digital computer processing of its East Gippsland Basin Prospect, which is located some 50 miles southeast of the Victoria coastline. A series of problems developed when an integrated computer program sequence or "package" was applied to the data. It was found that first break suppression schedules, deconvolution design gates, band-pass filter application gates, and velocity functions could not be changed often enough due to program restrictions. Where the water bottom topography was rough, the restriction to submitting only three or four water depths for varying the velocity function, and the subsequent calculation of normal move-out corrections, resulted in questionable accuracy of the corrected results. Sometimes, water bottom variations required individual trace static corrections, which were not available in this particular "package" processing. Water bottom multiple periods vary as rapidly as the surface that generates them. A meticulous selection of the parameters of deconvolution programs is necessary to attenuate multiples under such conditions. Also, close examination of the purposes, and consequently the methods, of deconvolution computer programs is necessary to maximize the effectiveness of this powerful processing tool. Diffractions are frequently generated at points on an irregular sea bottom surface. Such diffractions mask true water bottom reflections in deeper water and thus decrease the geophysicist's ability to process data accurately where computer programs require true water bottom depth. Record sections illustrating these problems and their probable solutions comprise a major part of this paper.
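For context, the normal move-out correction mentioned here follows the standard hyperbolic relation (a textbook expression, not one taken from this paper):

    t(x) = \sqrt{ t_0^2 + x^2 / v_{\mathrm{rms}}^2 }, \qquad \Delta t_{\mathrm{NMO}} = t(x) - t_0,

where x is the source-receiver offset, t_0 the zero-offset two-way time, and v_rms the root-mean-square velocity down to the reflector. Because the water column contributes to v_rms, a velocity function sampled at only three or four water depths under-samples the correction wherever the water bottom changes rapidly.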


Geophysics ◽  
2009 ◽  
Vol 74 (1) ◽  
pp. V17-V24 ◽  
Author(s):  
Yang Liu ◽  
Cai Liu ◽  
Dian Wang

Random noise in seismic data lowers the signal-to-noise ratio, obscures details, and complicates identification of useful information. We have developed a new method for reducing random, spike-like noise in seismic data: the 1D time-varying median filter (TVMF), which is based on the 1D stationary median filter (MF). We design a threshold value that controls the filter window according to the characteristics of the signal and of the random, spike-like noise. In view of the relationship between the seismic data and the threshold value, we choose median filters with different time-varying filter windows to eliminate the random, spike-like noise. When comparing our method with other common methods, e.g., the band-pass filter and the stationary MF, we found that the TVMF strikes a balance between eliminating random noise and protecting useful information. We tested the feasibility of our method for reducing random, spike-like noise on a synthetic dataset, and results of applying it to seismic land data from Texas demonstrated that the TVMF is effective in practice.
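A minimal sketch, assuming a simple thresholding rule, of a 1D median filter with a time-varying window in the spirit of the TVMF described above; the window lengths and the deviation-based threshold test are illustrative choices, not the authors' published algorithm.

    import numpy as np
    from scipy.signal import medfilt

    def time_varying_median_filter(trace, detect_win=3, spike_win=7, k=3.0):
        """Despike a 1D trace with a median filter whose window varies in time.

        A sample whose deviation from a short running median exceeds k times the
        median absolute deviation of those deviations is treated as spike-like
        noise and replaced by a longer-window median; all other samples are left
        untouched, which protects useful signal.  detect_win and spike_win must
        be odd; all three parameters are assumed values, not the paper's.
        """
        trace = np.asarray(trace, dtype=float)
        detect_med = medfilt(trace, kernel_size=detect_win)   # short-window reference
        spike_med = medfilt(trace, kernel_size=spike_win)     # long-window replacement
        resid = trace - detect_med
        mad = np.median(np.abs(resid)) + 1e-12                # robust noise scale
        is_spike = np.abs(resid) > k * mad                    # threshold picks the window
        return np.where(is_spike, spike_med, trace)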


Geophysics ◽  
2013 ◽  
Vol 78 (5) ◽  
pp. W31-W44 ◽  
Author(s):  
Anton Ziolkowski

I consider the problem of finding the impulse response, or Green’s function, from a measured response including noise, given an estimate of the source time function. This process is usually known as signature deconvolution. Classical signature deconvolution provides no measure of the quality of the result and does not separate signal from noise. Recovery of the earth impulse response is here formulated as the calculation of a Wiener filter in which the estimated source signature is the input and the measured response is the desired output. Convolution of this filter with the estimated source signature is the part of the measured response that is correlated with the estimated signature. Subtraction of the correlated part from the measured response yields the estimated noise, or the uncorrelated part. The fraction of energy not contained in this uncorrelated component is defined as the quality of the filter. If the estimated source signature contains errors, the estimated earth impulse response is incomplete, and the estimated noise contains signal, recognizable as trace-to-trace correlation. The method can be applied to many types of geophysical data, including earthquake seismic data, exploration seismic data, and controlled source electromagnetic data; it is illustrated here with examples of marine seismic and marine transient electromagnetic data.
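A minimal frequency-domain sketch in Python of this style of signature deconvolution with a quality measure; the damped spectral division below is a common stabilisation and an assumption here, not necessarily the formulation used in the paper.

    import numpy as np

    def wiener_signature_decon(measured, signature, eps=1e-3):
        """Signature deconvolution as a damped Wiener (least-squares) filter.

        The estimated source signature is the filter input and the measured
        trace is the desired output.  The filter convolved with the signature
        is the correlated part of the data; the remainder is the estimated
        noise, and 'quality' is the fraction of data energy explained by the
        correlated part.  eps is an assumed stabilisation constant.
        """
        measured = np.asarray(measured, dtype=float)
        signature = np.asarray(signature, dtype=float)
        n = len(measured) + len(signature) - 1        # full convolution length
        D = np.fft.rfft(measured, n)
        S = np.fft.rfft(signature, n)
        damp = eps * np.max(np.abs(S)) ** 2           # damping for the spectral division
        F = np.conj(S) * D / (np.abs(S) ** 2 + damp)  # Wiener filter spectrum
        impulse = np.fft.irfft(F, n)[: len(measured)]         # estimated earth response
        correlated = np.fft.irfft(F * S, n)[: len(measured)]  # part explained by signature
        noise = measured - correlated                         # uncorrelated part
        quality = 1.0 - np.sum(noise**2) / np.sum(measured**2)
        return impulse, noise, quality

Trace-to-trace correlation left in the returned noise estimate would, as described above, indicate errors in the assumed source signature.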


Geophysics ◽  
1982 ◽  
Vol 47 (9) ◽  
pp. 1273-1284 ◽  
Author(s):  
Ken Larner ◽  
Dave Hale ◽  
Sharon Misener Zinkham ◽  
Charles Hewlitt

Marine seismic data are generally contaminated with both “bubble pulses” and “tow noise.” Air gun sources are deployed in arrays designed to reduce the effective level of the bubble pulses. Because the signal from a source array is profoundly altered by the filter characteristics of the earth and because the received signal is subjected to noise‐generating computer processes such as deconvolution, array designs should be optimized to obtain the minimum aggregate noise, and hence the greatest reflection stand‐out, in output traces. For a fixed air‐compressor capacity, a trade‐off in array design exists between maximizing source strength and the fine tuning required to maximize the first‐pulse‐to‐bubble ratio. Except for shallow, high‐resolution surveys where the deconvolution step can be bypassed, optimum suppression of total noise in the output can often be obtained using the available air capacity to increase the source strength of a moderately tuned array, rather than to achieve fine tuning of the array. Processing noise produced by deconvolution will prevent detection of a weak reflection closely following a strong one if the ratio of the two is more than about 21 dB, no matter how finely tuned the source array may be.
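For reference, with the usual amplitude convention of 20 log_{10}, a 21 dB ratio corresponds to an amplitude factor of about

    10^{21/20} \approx 11,

so a weak reflection closely following a strong one becomes undetectable once its amplitude falls below roughly one-eleventh of the stronger reflection's.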


Geophysics ◽  
1998 ◽  
Vol 63 (3) ◽  
pp. 1036-1040 ◽  
Author(s):  
Eddy C. Luhurbudi ◽  
Jay Pulliam ◽  
James A. Austin ◽  
Steffen Saustrup ◽  
Paul L. Stoffa

An ultra-high-resolution 3-D, single-channel seismic survey was performed off the coast of New Jersey in 1993 to study the late Quaternary history of sedimentation on the northwest Atlantic continental margin (see Davies et al., 1992) as a part of the Office of Naval Research STRATAFORM initiative (Nittrouer and Kravitz, 1995). Three different sets of profiles were acquired (Figure 1), but only the set with the highest spatial density is discussed here. A single ten-element receiver recorded 300 ms of data for every shot during the survey, which covered a total area of 0.6 km (north-south) × 7.75 km (east-west) (see Table 1). The deep-towed Huntec™ source (deployed at ∼30 m depth) produced frequencies of 500 to 3500 Hz; a band-pass filter with corner frequencies at 1000 and 3500 Hz was applied during preprocessing.


2020 ◽  
Author(s):  
Wei-Chung Hsiao ◽  
Yi-Ching Yeh ◽  
Yen-Yu Cho ◽  
Shu-Kun Hsu

The Kaoping submarine canyon (KPSC) originates from the Kaoping River in southwestern Taiwan and extends about 250 kilometers from the Kaoping River mouth down to the Manila Trench. It can be divided into three major sections: an upper reach (meandering), a middle reach (NW-SE trending, V-shaped canyon), and a lower reach (meandering). Based on recent swath bathymetric data from the uppermost KPSC, an obvious seafloor depression can be observed on the eastern bank of the canyon. The eastern bank is on average about 30-50 meters lower than the western bank, and the mechanism remains unclear. In this study, to investigate the fine sedimentary structures from a 3D point of view, we used the marine sparker seismic method. The seismic source frequency varies from 100 to 1200 Hz, which provides about 0.6 meters of vertical resolution (i.e., a central frequency of 600 Hz and a Vp of 1,600 m/s). We collected 75 in-lines across the canyon and 3 cross-lines perpendicular to the in-lines. The data went through conventional marine seismic data processing procedures such as bad trace kill, band-pass filtering, 2D geometry settings, NMO stacking, swell correction, match filtering, and predictive deconvolution. The 2D dataset was then reformatted by applying 3D geometry settings to create a 3D seismic cube. The result shows that a wide incision channel was first found to the north of Xiaoliuchiu islet. With depth, this channel splits into two narrower channels divided by a mud diapir. This down-cutting can be traced down to a transgressive sequence prior to the LGM (Last Glacial Maximum). In addition, a deep-towed sub-bottom profiler shows obvious down-lapping structures heading off the canyon, which indicates that overbank flow may have played a key role in causing this erosional event.
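The quoted vertical resolution is consistent with the common quarter-wavelength criterion:

    \Delta z \approx \lambda / 4 = V_p / (4 f_c) = 1600 / (4 \times 600) \approx 0.67\ \mathrm{m},

using the stated central frequency of 600 Hz and Vp of 1,600 m/s.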

