coherent noise
Recently Published Documents


TOTAL DOCUMENTS: 375 (five years: 74)

H-INDEX: 30 (five years: 4)

2021 ◽  
pp. 4439-4452
Author(s):  
Noor H. Resham ◽  
Heba Kh. Abbas ◽  
Haidar J. Mohamad ◽  
Anwar H. Al-Saleh

Ultrasound imaging suffers from problems with output image quality that can affect the specialist's decision. The characteristic noise in ultrasound is speckle noise, which has a grainy pattern that depends on the signal. This study has two parts. The first part enhances images with adaptive Wiener, Lee, Gamma, and Frost filters using 3x3, 5x5, and 7x7 sliding windows. The evaluation was performed using signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), mean square error (MSE), and maximum difference (MD) criteria. The second part simulates noise in a standard image (the Lena image) by adding speckle noise at different percentages, from 0.01 to 0.06. Supervised classification based on the minimum distance method is used to evaluate the results, using four blocks selected at different places in the image. Speckle noise was added at percentages from 0.01 to 0.06 to calculate the coherent noise within the image, which was obtained from the slope of the standard deviation versus the mean for each noise level. The results showed that the additive noise increased with the sliding window size, while the multiplicative noise changed neither with the sliding window nor with increasing noise ratio. The Wiener filter gave the best noise-suppression results.
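A minimal sketch of the first part of this workflow, assuming SciPy and scikit-image are available and using the standard "camera" test image as a stand-in for the images in the study: speckle noise is added at a few variances in the 0.01 to 0.06 range, an adaptive Wiener filter is applied with the three window sizes, and PSNR and MSE are computed against the clean reference.

```python
# Sketch only: speckle noise + adaptive Wiener filtering + PSNR/MSE evaluation.
import numpy as np
from scipy.signal import wiener
from skimage import data, img_as_float
from skimage.metrics import peak_signal_noise_ratio, mean_squared_error

image = img_as_float(data.camera())          # stand-in for the test images in the study
rng = np.random.default_rng(0)

for variance in (0.01, 0.02, 0.04, 0.06):    # speckle-noise levels in the study's range
    speckle = rng.normal(0.0, np.sqrt(variance), image.shape)
    noisy = image + image * speckle          # multiplicative (speckle) noise model
    for window in (3, 5, 7):                 # 3x3, 5x5, 7x7 sliding windows
        filtered = wiener(noisy, (window, window))
        psnr = peak_signal_noise_ratio(image, filtered, data_range=1.0)
        mse = mean_squared_error(image, filtered)
        print(f"var={variance:.2f} window={window}x{window} "
              f"PSNR={psnr:.2f} dB MSE={mse:.5f}")
```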


2021 ◽  
Vol 40 (12) ◽  
pp. 905-913
Author(s):  
Riaz Alai ◽  
Faqi Liu ◽  
Eric Verschuur ◽  
Jan Thorbecke ◽  
Gundogan Coskun ◽  
...  

In our case studies, the success of subsalt exploration and development wells depended heavily on the characterization of highly heterogeneous lacustrine microbial carbonates. Acoustic and elastic inversions have proved to be a good proxy for identifying reservoir quality variation for exploration and development well placement. However, qualitative and quantitative use of subsalt seismic amplitudes requires proper illumination and a good signal-to-noise ratio. If properly imaged, mode-converted reflections and interbed multiples can complement the P-wave image, but conventional P-wave-oriented imaging cannot image either type of event correctly. They appear as coherent noise and negatively impact overall exploration and development project outcomes, especially in areas with poor illumination. This paper consists of two parts. First, we investigate the problems caused by converted waves and interbed multiples in data from two different basins (the Gulf of Mexico and the Campos Basin offshore Brazil) and show our approach to attenuating them to reveal the true structures. Second, we focus on advanced identification of interbed multiples in modeling and migration methods. To facilitate the various strategies for attenuating interbed multiples, "interpretation" of the various events plays a significant role. Vertical seismic profile (VSP) data are excellent for this purpose; however, they are available only at well locations, and only if they were recorded. As a result of many years of technology advancement, pseudo VSP data can be constructed effectively from standard streamer survey data. Two methods for building pseudo VSP data in a full two-way sense are highlighted in this paper, based on a typical Brazil-type salt model: Marchenko-based processing and full-wavefield migration. Major subsalt plays in the Gulf of Mexico and emerging plays in Brazil should benefit significantly from elimination of these kinds of coherent noise.
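As a rough illustration of why interbed multiples behave as coherent noise, the 1D convolutional-model sketch below (not from the paper; all times and reflection coefficients are hypothetical) places a first-order interbed multiple at the two-way time implied by its extra travel leg, where it interferes with a deeper primary.

```python
# Sketch only: a first-order interbed multiple added to a 1D reflectivity trace.
import numpy as np

dt = 0.004                                   # sample interval (s), hypothetical
nt = 500
trace = np.zeros(nt)

# Primaries: (two-way time in s, reflection coefficient), hypothetical values.
primaries = [(0.40, 0.30), (0.80, 0.25), (1.20, 0.20)]
for t, r in primaries:
    trace[int(t / dt)] += r

# Interbed multiple bouncing between the first two interfaces:
# extra delay = interval between them; amplitude = r2 * (-r1) * r2 (transmission losses ignored).
t1, r1 = primaries[0]
t2, r2 = primaries[1]
t_mult = t2 + (t2 - t1)                      # 1.20 s: lands on the deeper primary
a_mult = r2 * (-r1) * r2                     # small, but coherent from trace to trace
trace[int(t_mult / dt)] += a_mult
print(f"interbed multiple at {t_mult:.2f} s, amplitude {a_mult:+.4f}")
```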


SPIN ◽  
2021 ◽  
Author(s):  
Mingyu Chen ◽  
Yu Zhang ◽  
Yongshang Li

In the NISQ era, quantum computers do not have enough qubits to support quantum error correction and can only run shallow quantum algorithms under noisy conditions. To improve the fidelity of quantum circuits, it is necessary to reduce the circuit depth as much as possible to mitigate coherent noise. To address this issue, we propose PaF, a Pattern matching-based quantum circuit rewriting algorithm Framework for optimizing quantum circuits. The framework finds all sub-circuits in the input quantum circuit that match a given external pattern description and replaces them with better circuit implementations. To extend the capabilities of PaF, a general pattern description format is proposed that makes the rewriting patterns from existing work machine-readable. To evaluate the effectiveness of PaF, we employ the BIGD benchmarks in the QUEKO benchmark suite; the results show that PaF provides a maximal speedup of [Formula: see text] using only a few patterns.
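A minimal sketch of the kind of pattern-based rewriting PaF automates, not the PaF implementation itself: here a circuit is a hypothetical list of (gate, qubits) pairs, and the pattern is the textbook identity that two adjacent, identical self-inverse gates on the same qubits cancel, reducing gate count and depth.

```python
# Sketch only: rule-based cancellation of adjacent self-inverse gate pairs.
from typing import List, Tuple

Gate = Tuple[str, Tuple[int, ...]]           # e.g. ("H", (0,)) or ("CX", (0, 1))

def cancel_adjacent_involutions(circuit: List[Gate],
                                involutions=("H", "X", "Z", "CX")) -> List[Gate]:
    """Remove pairs of identical self-inverse gates acting on the same qubits."""
    out: List[Gate] = []
    for gate in circuit:
        if out and gate[0] in involutions and out[-1] == gate:
            out.pop()                        # the pair multiplies to the identity: drop both
        else:
            out.append(gate)
    return out

if __name__ == "__main__":
    circuit = [("H", (0,)), ("H", (0,)), ("CX", (0, 1)),
               ("X", (1,)), ("X", (1,)), ("CX", (0, 1))]
    print(cancel_adjacent_involutions(circuit))   # -> [] (everything cancels)
```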


2021 ◽  
Vol 40 (10) ◽  
pp. 768-777
Author(s):  
Vemund S. Thorkildsen ◽  
Leiv-J. Gelius ◽  
Enders A. Robinson

If an optical hologram is broken into pieces, a virtual object can still be reconstructed from each of the fragments. This reconstruction is possible because each diffraction point emits waves that reach every point of the hologram, so the entire object is encoded into each subset of the hologram. Analogous to the broken hologram, undersampled seismic data that violate the Nyquist-Shannon sampling theorem may still give a well-resolved image of the subsurface. A theoretical framework for this idea, termed holistic migration, has already been introduced in the literature. However, the general lack of seismic field-data demonstrations has inspired the study presented here. Since the optical hologram is diffraction-driven, we propose to employ diffraction-separated data rather than conventional reflection data as input for holistic migration. We follow the original idea and regularly undersample the data spatially. Such a sampling strategy results in coherent noise in the image domain, and we therefore introduce a novel signal processing technique to remove it. The feasibility of the proposed approach is demonstrated using the Sigsbee2a controlled data set and field data from the Barents Sea.
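A small illustration (not from the paper) of why regular spatial undersampling produces coherent rather than random artifacts: zeroing three out of every four traces replicates each wavenumber peak at fixed offsets, and such periodic replicas map to coherent noise after migration. All array sizes and the test wavenumber are arbitrary.

```python
# Sketch only: regular decimation creates periodic (coherent) aliases in the spectrum.
import numpy as np

n = 256
x = np.arange(n)
trace = np.cos(2 * np.pi * 80 * x / n)        # single spatial oscillation, wavenumber bin 80

full_spectrum = np.abs(np.fft.rfft(trace))

decimated = trace.copy()
decimated[x % 4 != 0] = 0.0                   # keep every 4th trace (regular undersampling)
alias_spectrum = np.abs(np.fft.rfft(decimated))

print("peak bins, full sampling:     ", np.flatnonzero(full_spectrum > 1.0))    # [80]
print("peak bins, regular decimation:", np.flatnonzero(alias_spectrum > 1.0))   # [16 48 80 112]
```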


Author(s):  
Chao An ◽  
Chen Cai ◽  
Lei Zhou ◽  
Ting Yang

Horizontal records of ocean-bottom seismographs are usually noisy at low frequencies (<0.1 Hz). The noise source is believed to be associated with ocean-bottom currents that may tilt the instrument. Currently, horizontal records are mainly used to remove the coherent noise in vertical records, and little published work quantitatively discusses the mechanism and characteristics of low-frequency horizontal noise. In this article, we analyze in situ ocean-bottom measurements by rotating the data horizontally and evaluating the coherency between different channels. Results suggest that the horizontal noise consists of two components: random noise and principal noise, whose direction barely changes with time. The amplitude and direction of the latter are possibly related to the intensity and direction of ocean-bottom currents. Rotating the horizontal records to the direction of the principal noise largely suppresses it in the orthogonal horizontal channel. In addition, the horizontal noise is incoherent with pressure, indicating that the noise source is not ocean surface water waves (infragravity waves). At some stations in shallow water (<300 m), horizontal noise around 0.07 Hz is found to be linearly proportional to the temporal derivative of pressure, which is explained by added-mass forces due to infragravity waves.
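A minimal sketch of the rotation idea, not the authors' code: the dominant noise direction is estimated from the covariance of the two horizontal channels, and rotating into that direction leaves the orthogonal channel much quieter. Synthetic data with a hypothetical 35-degree noise azimuth stand in for real OBS records.

```python
# Sketch only: estimate the principal-noise azimuth by PCA and rotate it out.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
azimuth = np.deg2rad(35.0)                                   # hypothetical principal-noise direction
drift = 5.0 * rng.standard_normal(n).cumsum() / np.sqrt(n)   # slowly varying noise (stand-in for tilt)
h1 = drift * np.cos(azimuth) + 0.2 * rng.standard_normal(n)  # horizontal channel 1
h2 = drift * np.sin(azimuth) + 0.2 * rng.standard_normal(n)  # horizontal channel 2

# Principal direction = leading eigenvector of the 2x2 covariance matrix.
cov = np.cov(np.vstack([h1, h2]))
eigvals, eigvecs = np.linalg.eigh(cov)
theta = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])

# Rotate: the "parallel" channel absorbs the principal noise, the orthogonal one is cleaned.
parallel = h1 * np.cos(theta) + h2 * np.sin(theta)
orthogonal = -h1 * np.sin(theta) + h2 * np.cos(theta)
print(f"estimated azimuth: {np.rad2deg(theta):.1f} deg")
print(f"std of channel 2 before rotation: {h2.std():.2f}, orthogonal after: {orthogonal.std():.2f}")
```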


2021 ◽  
Author(s):  
Stephen K Chiu ◽  
Joel Latchman ◽  
Frank Gomez ◽  
Edward Wiarda ◽  
David Richards ◽  
...  
Keyword(s):  

Geophysics ◽  
2021 ◽  
pp. 1-129
Author(s):  
Omar M. Saad ◽  
Min Bai ◽  
Yangkang Chen

Localizing microseismic events plays a key role in microseismic monitoring. However, microseismic data usually suffer from a low signal-to-noise ratio (S/N), which can degrade the resolution of the microseismic source location. We have developed an unsupervised deep learning approach based on variational autoencoder (VAE) and squeeze-and-excitation (SE) networks for enhancing microseismic signals and suppressing noise. First, the microseismic data are divided into several overlapping patches. Second, the VAE encodes the data, extracting the significant features related to the useful signals. Finally, the extracted latent features are decoded to recover the useful signals and discard the others. The SE network guides the VAE to preserve the useful information related to the clean signal by scaling the features extracted by the encoder and concatenating them with the features of the decoder. Our algorithm is evaluated using several synthetic and field examples and shows robust denoising performance despite high levels of random and coherent noise, for example, at an S/N as low as −32.45 dB. The denoised signal can then be used as input to image the source location with a reverse time migration method, leading to better location accuracy. Our algorithm performs best compared with benchmark methods such as f-x deconvolution and damped multichannel singular spectrum analysis.
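A hedged sketch of the squeeze-and-excitation mechanism the network relies on, not the authors' architecture: an SE block computes per-channel gates from globally pooled features and rescales the feature channels, which is how signal-related features can be emphasized over noise. Channel counts and the reduction ratio below are illustrative (PyTorch).

```python
# Sketch only: a 1D squeeze-and-excitation block that rescales feature channels.
import torch
import torch.nn as nn

class SqueezeExcite1d(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)        # "squeeze": global average per channel
        self.fc = nn.Sequential(                   # "excitation": per-channel gates in (0, 1)
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (batch, channels, samples)
        gates = self.fc(self.pool(x).squeeze(-1))          # (batch, channels)
        return x * gates.unsqueeze(-1)                     # rescale each feature channel

features = torch.randn(8, 32, 128)                 # e.g. encoder features for 8 patches
print(SqueezeExcite1d(32)(features).shape)         # torch.Size([8, 32, 128])
```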


2021 ◽  
Vol 11 (16) ◽  
pp. 7218
Author(s):  
Qazi Sohail Imran ◽  
Numair A. Siddiqui ◽  
Abdul Halim Abdul Latiff ◽  
Yasir Bashir ◽  
Muhammad Khan ◽  
...  

3D seismic data have increasingly shifted seismic interpretation work from a horizon-based to a volume-based focus over the past decade. The scale of the identification and mapping work has therefore grown, requiring faster and better tools. Faults, for instance, are among the most significant features of subsurface geology interpreted from seismic data, and detailed fault interpretation is very important in reservoir characterization and modeling. Conventional manual fault picking is a time-consuming and inefficient process, and it becomes more challenging and error-prone when dealing with poor quality seismic data under gas chimneys. Several seismic attributes are available for fault and discontinuity detection and are applied with varying degrees of success. We present a hybrid workflow that combines a semblance-based fault likelihood attribute with a conventional ant-tracking attribute. This workflow generates optimized discontinuity volumes for fault detection and automatic extraction. Data optimization and conditioning are applied first to suppress random and coherent noise; a combination of seismic attributes is then generated and co-rendered to enhance the discontinuities. The result is a volume with sharp discontinuities that are tracked and extracted automatically. In contrast to several available fault tracking techniques that use local seismic continuity, such as coherency attributes, our hybrid method is based on directed semblance, which incorporates aspects of Dave Hale's fault-oriented semblance algorithm. The methodology was applied to a complex faulted reservoir interval under gas chimneys in a Malaysian basin. Despite the poor data quality, it yielded detailed discontinuity information, with several major and minor faults extracted automatically. This hybrid approach not only improved the fault tracking accuracy but also significantly reduced the fault interpretation time and associated uncertainty. It is equally helpful in detecting other seismic objects such as fractures, chimneys, and stratigraphic features.
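An illustrative sketch of the semblance measure that underlies fault-likelihood attributes, not the authors' workflow: semblance over a small window of neighbouring traces is near 1 where traces are locally coherent and drops across discontinuities. A random array stands in for a real seismic cube, and window sizes are arbitrary.

```python
# Sketch only: basic semblance attribute over a sliding window of traces.
import numpy as np

def semblance(cube: np.ndarray, half: int = 1) -> np.ndarray:
    """Semblance of the (2*half+1)^2 traces around each position; edges left at 1."""
    ni, nx, nt = cube.shape
    out = np.ones_like(cube)
    for i in range(half, ni - half):
        for x in range(half, nx - half):
            traces = cube[i-half:i+half+1, x-half:x+half+1, :].reshape(-1, nt)
            num = traces.sum(axis=0) ** 2
            den = traces.shape[0] * (traces ** 2).sum(axis=0) + 1e-12
            out[i, x, :] = num / den
    return out

cube = np.random.default_rng(0).standard_normal((20, 20, 50))   # stand-in seismic cube
print(semblance(cube)[1:-1, 1:-1].mean())    # ~1/9: random (incoherent) data score low
```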

