Application of Machine Learning for Quality Control of Noise Attenuation Processes in Seismic Data Imaging

Author(s):  
Mohamed Mejri ◽  
Aymen Mejri ◽  
Maiza Bekara

Seismic imaging is the main technology used for subsurface hydrocarbon prospection. It provides an image of the subsurface using the same principles as ultrasound medical imaging. It is based on emitting a sound (pressure) wave through the subsurface and recording the reflected echoes using hydrophones (pressure sensors) and/or geophones (velocity/acceleration sensors). Contrary to medical imaging, which is done in real time, subsurface seismic imaging is an offline process that involves a huge volume of data and needs considerable computing power. The raw seismic data are heavily contaminated with noise and unwanted reflections that need to be removed before further processing. Therefore, noise attenuation is done at an early stage, often while the data are being acquired. Quality control (QC) is mandatory to give confidence in the denoising process and to ensure that a costly data re-acquisition is not needed. QC is done manually by humans and accounts for a major portion of the cost of a typical seismic processing project. It is therefore advantageous to automate this process to improve cost and efficiency. Here, we propose a supervised learning approach to build an automatic QC system. The QC system is an attribute-based classifier trained to distinguish three types of filtering (mild = underfiltering, noise remains in the data; optimal = good filtering; harsh = overfiltering, the signal is distorted). The attributes are computed from the data and represent geophysical and statistical measures of the quality of the filtering. The system is tested on a full-scale survey (9000 km²) to QC the results of the swell noise attenuation process in marine seismic data. The results are encouraging and helped identify localized issues that were difficult for a human to spot.
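
As a rough illustration of how such an attribute-based QC classifier could be assembled, the sketch below computes a few simple statistical attributes from the raw and filtered data and trains a three-class model. The specific attributes, the random-forest choice, and all names are assumptions made for illustration; the paper's actual attribute set and classifier are not reproduced here.

```python
# A minimal sketch of an attribute-based QC classifier for denoising results.
# Attributes and model choice are illustrative assumptions, not the paper's.
import numpy as np
from scipy.stats import kurtosis
from sklearn.ensemble import RandomForestClassifier

def qc_attributes(raw, filtered):
    """Simple statistical QC attributes for one data window.

    raw, filtered: 2D arrays (traces x samples) before/after noise attenuation.
    """
    removed = raw - filtered                            # estimate of the removed noise
    eps = 1e-12
    energy_ratio = removed.var() / (raw.var() + eps)    # fraction of energy removed
    # High |correlation| between output and removed noise suggests signal leakage.
    leakage = abs(np.corrcoef(filtered.ravel(), removed.ravel())[0, 1])
    return np.array([
        energy_ratio,
        leakage,
        kurtosis(removed.ravel()),                      # spikiness of removed noise
        kurtosis(filtered.ravel()),
    ])

def train_qc(windows_raw, windows_filt, labels):
    """Fit a three-class QC model; labels: 0 = mild, 1 = optimal, 2 = harsh."""
    X = np.stack([qc_attributes(r, f) for r, f in zip(windows_raw, windows_filt)])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```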

Geosciences ◽  
2020 ◽  
Vol 10 (12) ◽  
pp. 475
Author(s):  
Mohamed Mejri ◽  
Maiza Bekara

Seismic imaging is the main technology used for subsurface hydrocarbon prospection. It provides an image of the subsurface using the same principles as ultrasound medical imaging. As for any data acquired through hydrophones (pressure sensors) and/or geophones (velocity/acceleration sensors), the raw seismic data are heavily contaminated with noise and unwanted reflections that need to be removed before further processing. Therefore, noise attenuation is done at an early stage, often while the data are being acquired. Quality control (QC) is mandatory to give confidence in the denoising process and to ensure that a costly data re-acquisition is not needed. QC is done manually by humans and accounts for a major portion of the cost of a typical seismic processing project. It is therefore advantageous to automate this process to improve cost and efficiency. Here, we propose a supervised learning approach to build an automatic QC system. The QC system is an attribute-based classifier trained to distinguish three types of filtering (mild = underfiltering, noise remains in the data; optimal = good filtering; harsh = overfiltering, the signal is distorted). The attributes are computed from the data and represent geophysical and statistical measures of the quality of the filtering. The system is tested on a full-scale survey (9000 km²) to QC the results of the swell noise attenuation process in marine seismic data.


2019 ◽  
pp. 1297-1303
Author(s):  
Kamal K. Ali ◽  
Reem K. Ibrahim ◽  
Hassan A. Thabit

The frequency-dependent noise attenuation (FDNAT) filter was applied to 2D seismic line DE21 in east Diwaniya, southeastern Iraq, to improve the signal-to-noise ratio. Applying FDNAT to the seismic data gave good results and removed a large amount of random noise. This processing helps in picking the reflector signals, which makes subsequent interpretation of the data easier. Quality control using spectrum analysis was used to verify the effectiveness of the FDNAT filter in removing random noise.
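
The spectrum-analysis QC used here can be illustrated with a short sketch: compare average amplitude spectra before and after filtering to see at which frequencies energy was removed. This is NumPy only; the FDNAT filter itself is not implemented, and the sample interval and function names are assumptions.

```python
# A minimal sketch of spectrum-analysis QC for a noise-attenuation step,
# assuming trace arrays of shape (traces x samples) and a known sample interval.
import numpy as np

def average_amplitude_spectrum(traces, dt):
    """Mean amplitude spectrum over all traces; dt is the sample interval in s."""
    spec = np.abs(np.fft.rfft(traces, axis=1)).mean(axis=0)
    freqs = np.fft.rfftfreq(traces.shape[1], d=dt)
    return freqs, spec

def spectral_qc(raw, filtered, dt=0.002):
    """Per-frequency attenuation in dB; large dips show where noise was removed."""
    f, s_raw = average_amplitude_spectrum(raw, dt)
    _, s_filt = average_amplitude_spectrum(filtered, dt)
    atten_db = 20 * np.log10((s_filt + 1e-12) / (s_raw + 1e-12))
    return f, atten_db
```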


2017 ◽  
Vol 6 (2) ◽  
pp. 505-521 ◽  
Author(s):  
Luděk Vecsey ◽  
Jaroslava Plomerová ◽  
Petr Jedlička ◽  
Helena Munzarová ◽  
Vladislav Babuška ◽  
...  

Abstract. This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues such as the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a large set of reliable, high-quality data from each group participating in field experiments. The presented tools can be applied, manually or automatically, to data from any seismic network.
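
One of the listed checks, detection of sensor misorientation, can be sketched as a grid search over rotation angles that maximizes the correlation of a station's horizontal components with those of a correctly oriented reference station. The reference input and all names are assumptions for illustration; the authors' actual tools are not reproduced here.

```python
# A minimal sketch of horizontal-sensor misorientation estimation, assuming
# time-aligned 1D component records and a correctly oriented reference station.
import numpy as np

def misorientation_deg(n, e, ref_n, ref_e, step=1.0):
    """Grid-search the rotation angle (degrees) that best aligns the (n, e)
    horizontals with the reference station's horizontals."""
    best_angle, best_corr = 0.0, -np.inf
    for ang in np.arange(-180.0, 180.0, step):
        a = np.deg2rad(ang)
        rot_n = np.cos(a) * n - np.sin(a) * e       # rotate station horizontals
        rot_e = np.sin(a) * n + np.cos(a) * e
        corr = (np.corrcoef(rot_n, ref_n)[0, 1] +
                np.corrcoef(rot_e, ref_e)[0, 1]) / 2.0
        if corr > best_corr:
            best_angle, best_corr = ang, corr
    # A strongly negative best correlation would instead suggest polarity reversal.
    return best_angle, best_corr
```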


2019 ◽  
Vol 2 (3) ◽  
pp. 147-153
Author(s):  
Georgy Loginov ◽  
Anton Duchkov ◽  
Dmitry Litvichenko ◽  
Sergey Alyamkin

The paper considers the use of a convolutional neural network for detecting first arrivals in a real 3D seismic dataset of more than 4.5 million traces. Detection of the first breaks is carried out independently for each trace. The error between the original and the predicted first breaks is no more than 3 samples for 95% of the data. Quality control is performed by calculating static corrections and seismic stacks, which demonstrates the effectiveness of the proposed approach.
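
A minimal sketch of per-trace first-break picking with a 1D convolutional network is given below, assuming PyTorch. The architecture and names are illustrative assumptions, not the network described in the paper.

```python
# A minimal sketch of per-trace first-break picking with a 1D CNN (PyTorch).
import torch
import torch.nn as nn

class FirstBreakNet(nn.Module):
    """Scores every sample of a trace; the argmax over samples is the pick."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 1, kernel_size=7, padding=3),
        )

    def forward(self, x):                    # x: (batch, 1, n_samples)
        return self.features(x).squeeze(1)   # (batch, n_samples) per-sample scores

def pick_error_in_samples(scores, true_picks):
    """Absolute picking error in samples (the QC criterion quoted above)."""
    return (scores.argmax(dim=1) - true_picks).abs()
```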


2017 ◽  
Author(s):  
Luděk Vecsey ◽  
Jaroslava Plomerová ◽  
Petr Jedlička ◽  
Helena Munzarová ◽  
Vladislav Babuška ◽  
...  

Abstract. This paper focuses on major issues related to data reliability and MOBNET network performance in the AlpArray seismic experiments, in which twenty temporary broad-band stations of the Czech MOBNET pool of mobile stations have been involved. Currently used high-resolution scientific methods require high-quality data recorded for a sufficiently long time interval at observatories and during the entire time of operation of the temporary stations. In this paper we present both new hardware and software tools that help to assure the high-quality standard of broad-band seismic data. Special attention is paid to issues such as the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, e.g., imperfect gain. Thorough data-quality control should represent an integral constituent of seismic data recording, pre-processing and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a sufficiently large set of high-quality, reliable data from each group participating in field experiments.


2021 ◽  
Vol 6 (3) ◽  
pp. 83-96
Author(s):  
Gleb S. Chernyshov ◽  
Anton A. Duchkov ◽  
Dmitriy A. Litvichenko ◽  
Mihail V. Salishchev ◽  
Daniil G. Semin ◽  
...  

Introduction. The aim of seismic exploration is to build a depth-velocity geological model based on the joint interpretation of seismic and well data. Seismic exploration provides uniform coverage of the studied area, while borehole data provide more complete and accurate information about the studied section at a discrete set of points (well locations). The results of the main stages of seismic data processing undergo quality control during interpretation support of processing (ISO). The supporting task is to carry out quality control (QC) quickly at the different stages of processing, starting from the earliest. Early detection of possible errors and selection of the optimal parameters of the procedures ensure high-quality materials at the end of the processing stage. Seismic interpretation relies on the use of well data in conjunction with the seismic data and various attributes within a single interpretation package. At the same time, seismic processing and interpretation have historically been separated into different software packages. Methods. The aim of the work was to create software tools that facilitate the interaction between processing and interpretation. The developed tools should: 1) include the functionality of interpretation packages necessary for interpretation support at various stages of processing, and 2) be able to access seismic data directly from the processing software. Results and discussion. Successful testing of the created software tools showed that the necessary analysis can be performed without specialized interpretation packages. The implemented software packages generate a report with QC metrics and figures, which the interpreter can review to draw conclusions about the current stage of processing.
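
The report-generation idea can be sketched as follows: compute a simple QC metric per processing stage and write out a figure the interpreter can review. The RMS metric, file names, and plotting choices are assumptions made for illustration, not the implemented packages described in the paper.

```python
# A minimal sketch of a stage-by-stage QC report: one panel per processing
# stage with an RMS-amplitude summary, saved as a figure for review.
import numpy as np
import matplotlib.pyplot as plt

def stack_qc_report(stacks, stage_names, out_png="qc_report.png"):
    """stacks: list of 2D arrays (traces x samples), one per processing stage."""
    fig, axes = plt.subplots(1, len(stacks), figsize=(4 * len(stacks), 4))
    for ax, stk, name in zip(np.atleast_1d(axes), stacks, stage_names):
        rms = np.sqrt((stk ** 2).mean())          # simple per-stage QC metric
        ax.imshow(stk.T, aspect="auto", cmap="gray")
        ax.set_title(f"{name}  (RMS = {rms:.3g})")
        ax.set_xlabel("trace")
        ax.set_ylabel("sample")
    fig.tight_layout()
    fig.savefig(out_png, dpi=150)
    return out_png
```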

