Local primary-and-multiple orthogonalization for leaked internal multiple crosstalk estimation and attenuation on full-wavefield migrated images

Geophysics ◽  
2020 ◽  
Vol 86 (1) ◽  
pp. A7-A13
Author(s):  
Dong Zhang ◽  
D. J. (Eric) Verschuur ◽  
Mikhail Davydenko ◽  
Yangkang Chen ◽  
Ali M. Alfaraj ◽  
...  

An important imaging challenge is creating reliable seismic images without internal multiple crosstalk, especially in cases with strong overburden reflectivity. Several data-driven methods have been proposed to attenuate internal multiple crosstalk, but they usually require fully sampled data on both the source and receiver sides. To overcome this acquisition constraint, model-driven full-wavefield migration (FWM) can automatically include internal multiples while needing dense sampling on only the source or receiver side. In addition, FWM can correct for transmission effects at the reflecting interfaces. Although FWM has been shown to compensate effectively for transmission effects and to suppress internal multiple crosstalk better than conventional least-squares primary wavefield migration (PWM), it tends to generate relatively weak internal multiples during modeling. Therefore, some leaked internal multiple crosstalk can still be observed in the FWM image, where it blends into the background and can be misinterpreted as real geology. Thus, we adopted a novel framework using local primary-and-multiple orthogonalization (LPMO) on the FWM image as a postprocessing step for estimating and attenuating the leaked internal multiple crosstalk. Because the two effects correlate with the FWM image with opposite signs, a positive-only LPMO weight can be used to estimate the leaked internal multiple crosstalk, whereas a negative-only LPMO weight indicates the transmission effects that need to be retained. Application to North Sea field data validates the performance of the proposed framework in removing the weak but misleading leaked internal multiple crosstalk from the FWM image. With this new framework, FWM can provide a reliable solution to the long-standing issue of automatically imaging primaries and internal multiples, with proper primary restoration.
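The sign-based splitting of the LPMO weight can be illustrated with a minimal 1D sketch. This is not the authors' implementation: the moving-average smoothing below stands in for the shaping-regularized local division typically used in local orthogonalization, and all array names and window sizes are illustrative assumptions.

```python
import numpy as np

def lpmo_weight(image, multiples, win=25, eps=1e-8):
    # Local weight relating the migrated image to the predicted leaked
    # multiples: a smoothed division of their local crosscorrelation
    # by the local energy of the multiple prediction.
    kernel = np.ones(win) / win
    num = np.convolve(image * multiples, kernel, mode="same")
    den = np.convolve(multiples * multiples, kernel, mode="same")
    return num / (den + eps)

def split_weight(w):
    # Positive-only part: leaked internal multiple crosstalk to subtract.
    # Negative-only part: transmission effects to retain.
    return np.clip(w, 0.0, None), np.clip(w, None, 0.0)
```

Where the image locally resembles the predicted multiples with the same polarity, the weight is positive (crosstalk to remove); an anticorrelated response yields a negative weight (transmission effect to keep).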

Geophysics ◽  
2005 ◽  
Vol 70 (3) ◽  
pp. V45-V60 ◽  
Author(s):  
A. J. Berkhout ◽  
D. J. Verschuur

Removal of surface and internal multiples can be formulated as removing the influence of downward-scattering boundaries and downward-scattering layers, respectively. The algorithms involved can be applied in a model-driven or a data-driven way. A unified, wave-theory-based description is proposed that relates the two types of algorithms. In the surface-multiple removal algorithm, muted shot records play the role of multichannel prediction filters; in the internal-multiple removal algorithm, muted CFP gathers play that role. The internal multiple removal algorithm is illustrated with numerical examples. The conclusion is that the layer-related version of the algorithm has significant practical advantages.
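The prediction-filter idea can be made concrete with a toy 1D analogue, under strong simplifying assumptions: the real algorithm convolves muted shot records multidimensionally over all surface positions, whereas this sketch only autoconvolves a single trace in time, showing that a primary at traveltime t predicts a first-order surface multiple near 2t.

```python
import numpy as np

def predict_first_order_multiples(trace, dt):
    # Toy 1D analogue of data-driven multiple prediction: convolving the
    # (muted) data with itself maps a primary at time t to a predicted
    # first-order surface multiple at time 2t.
    n = len(trace)
    return np.convolve(trace, trace)[:n] * dt

# Example: a single primary spike at sample 10 predicts a multiple at 20.
trace = np.zeros(64)
trace[10] = 1.0
pred = predict_first_order_multiples(trace, dt=0.004)
```

In practice the predicted multiples are then adaptively subtracted from the data, since the prediction carries the source wavelet an extra time.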


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2085
Author(s):  
Xue-Bo Jin ◽  
Ruben Jonhson RobertJeremiah ◽  
Ting-Li Su ◽  
Yu-Ting Bai ◽  
Jian-Lei Kong

State estimation is widely used in various automated systems, including IoT systems, unmanned systems, robots, etc. In traditional state estimation, measurement data are instantaneous and processed in real time. As modern systems develop, sensors can obtain and store more and more signals, so how to use these measurement big data to improve the performance of state estimation has become a hot research issue in this field. This paper reviews the development of state estimation and its future trends. First, we review model-based state estimation methods, including the Kalman filter and its variants, such as the extended Kalman filter (EKF), unscented Kalman filter (UKF), and cubature Kalman filter (CKF). Particle filters and Gaussian mixture filters that can handle mixed Gaussian noise are discussed as well. These methods place high demands on the system model, yet accurate system models are difficult to obtain in practice; the emergence of robust filters, the interacting multiple model (IMM), and adaptive filters, which address this limitation, is also covered. Second, the current research status of data-driven state estimation methods based on network learning is introduced. Finally, the main research results for hybrid filters obtained in recent years, which combine model-based and data-driven methods, are summarized and discussed. Building on these research results, the paper provides a detailed overview of model-driven, data-driven, and hybrid-driven approaches, presents the main algorithm of each method so that beginners can gain a clear understanding, and discusses future development trends for researchers in state estimation.
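As background for the model-based family the review surveys, one predict-update cycle of the linear Kalman filter can be sketched as follows. This is the standard textbook formulation, not code from the paper; the matrix names (F, H, Q, R) follow common convention.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # Predict: propagate the state estimate and covariance
    # through the linear dynamics model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend the prediction with measurement z via the Kalman gain.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The EKF, UKF, and CKF variants discussed in the review replace the linear propagation steps with linearization or sigma-point/cubature-point propagation for nonlinear models.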


Geophysics ◽  
2014 ◽  
Vol 79 (3) ◽  
pp. WA107-WA115 ◽  
Author(s):  
Filippo Broggini ◽  
Roel Snieder ◽  
Kees Wapenaar

Standard imaging techniques rely on the single-scattering assumption. This requires that the recorded data do not include internal multiples, i.e., waves that have bounced multiple times between reflectors before reaching the receivers at the acquisition surface. When multiple reflections are present in the data, standard imaging algorithms incorrectly image them as ghost reflectors. These artifacts can mislead interpreters in locating potential hydrocarbon reservoirs. Recently, we introduced a new approach for retrieving the Green’s function recorded at the acquisition surface due to a virtual source located at depth. We refer to this approach as data-driven wavefield focusing. Additionally, after applying source-receiver reciprocity, this approach allowed us to decompose the Green’s function at a virtual receiver at depth into its downgoing and upgoing components. These wavefields were then used to create a ghost-free image of the medium with either crosscorrelation or multidimensional deconvolution, presenting an advantage over standard prestack migration. We tested the robustness of our approach when an erroneous background velocity model is used to estimate the first-arriving waves, which are a required input for the data-driven wavefield focusing process. We demonstrated the new method on a numerical example based on a modification of the Amoco model.
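The crosscorrelation imaging condition applied to the decomposed wavefields can be sketched minimally as a zero-lag temporal crosscorrelation at each image point. The array layout (depth points by time samples) and variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def crosscorrelation_image(down, up):
    # Zero-lag crosscorrelation over time of the downgoing and upgoing
    # wavefields at each image point: a reflector appears where the two
    # decomposed fields coincide in time.
    return np.sum(down * up, axis=-1)
```

Multidimensional deconvolution replaces this correlation with an inversion that also removes the imprint of the overburden, at higher computational cost.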


Geophysics ◽  
2021 ◽  
pp. 1-47
Author(s):  
Xueyi Jia ◽  
Anatoly Baumstein ◽  
Charlie Jing ◽  
Erik Neumann ◽  
Roel Snieder

Sub-basalt imaging for hydrocarbon exploration faces challenges from multiple scattering, attenuation, and mode conversion as seismic waves encounter highly heterogeneous and rugose basalt layers. A combination of modern seismic acquisition, which can record densely sampled data, and advanced imaging techniques makes imaging through basalt feasible. Yet internal multiples, if not properly handled during seismic processing, can be mapped into reservoir layers by conventional imaging methods, misguiding geological interpretation. Traditional internal multiple elimination methods suffer from the requirement of picking the horizons of multiple generators and/or a top-down adaptive subtraction process. Marchenko imaging provides an alternative solution that directly removes the artifacts due to internal multiples, without the need for horizon picking or subtraction. In this paper, we present a successful application of direct Marchenko imaging for sub-basalt demultiple and imaging with an offshore Brazil field dataset. The internal multiples in this example are generated by the seabed and basalt layers, causing severe artifacts in conventional seismic images. We demonstrate that these artifacts are largely suppressed with Marchenko imaging and propose a general workflow for preprocessing and regularizing marine streamer datasets. We also show that horizontally propagating waves can be reconstructed by the Marchenko method at far offsets.


Author(s):  
Nawfal El Moukhi ◽  
Ikram El Azami ◽  
Abdelaaziz Mouloudi ◽  
Abdelali Elmounadi

Data warehouse design is currently recognized as the most important and complicated phase in any decision support system implementation project. Its complexity is primarily due to the proliferation of data source types and the lack of a standardized, well-structured method, hence the increasing interest from researchers who have tried to develop new methods for automating and standardizing this critical stage of the project. In this paper, the authors survey the set of existing methods that follow the data-driven paradigm, and they propose a new data-driven method called X-ETL. This method aims to automate data warehouse design by generating star models from relational data. It is mainly based on a set of rules derived from the related works, the Model-Driven Architecture (MDA), and the XML language.
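The flavor of a data-driven derivation rule can be shown with a deliberately simplified sketch. This is not one of X-ETL's actual rules (which are MDA- and XML-based and considerably richer); the metadata layout and the heuristic below are purely illustrative assumptions.

```python
def derive_star_schema(tables):
    # Toy data-driven rule: the table referencing the most other tables
    # becomes the fact table; each referenced table becomes a dimension;
    # the fact table's numeric columns become measures.
    # tables: {name: {"columns": {col: type}, "refs": [referenced tables]}}
    fact = max(tables, key=lambda name: len(tables[name]["refs"]))
    return {
        "fact": fact,
        "dimensions": sorted(tables[fact]["refs"]),
        "measures": sorted(c for c, t in tables[fact]["columns"].items()
                           if t == "numeric"),
    }

# Hypothetical relational metadata for illustration:
schema = {
    "sales":    {"columns": {"amount": "numeric", "qty": "numeric"},
                 "refs": ["customer", "product"]},
    "customer": {"columns": {"name": "text"}, "refs": []},
    "product":  {"columns": {"label": "text"}, "refs": []},
}
star = derive_star_schema(schema)
```

Real methods of this family additionally handle composite keys, snowflaked hierarchies, and designer validation of the generated model.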

