Sensitivity analysis of the backprojection imaging method for seismic event location

2021 ◽  
Vol 11 (1) ◽  
pp. 21-32
Author(s):  
Cristian Alexis Murillo Martínez ◽  
William Mauricio Agudelo

Accuracy of earthquake location methods depends on the quality of the input data. In the real world, several sources of uncertainty, such as incorrect velocity models, low signal-to-noise ratio (SNR), and poor station coverage, affect the solution. Furthermore, some complex seismic signals have no distinguishable phases, so conventional location methods are not applicable to them. In this work, we conducted a sensitivity analysis of Back-Projection Imaging (BPI), a technique suitable for locating conventional seismicity, induced seismicity, and tremor-like signals. We performed a study in which synthetic data are modelled as fixed-spectrum explosive sources. The purpose of using such simplified signals is to fully understand the mechanics of the location method in controlled scenarios, where each parameter can be perturbed freely so that its individual effect on the outcome is seen separately. The results suggest the need for data conditioning, such as noise removal, to improve image resolution and minimize artifacts. Processing lower-frequency signals increases stability, while higher frequencies improve accuracy. In addition, good azimuthal coverage reduces the spatial location error of seismic events; according to our findings, depth is the spatial coordinate most sensitive to velocity and geometry changes.
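The mechanics of BPI described above can be sketched as a shift-and-stack over a grid of candidate source points. The following is a minimal 1-D toy with impulsive, noise-free arrivals (the geometry, velocity, and sampling values are hypothetical, not the paper's setup):

```python
import numpy as np

def backproject(traces, dt, travel_times):
    """Shift-and-stack back-projection.

    traces: (n_stations, n_samples) waveforms.
    travel_times: (n_points, n_stations) predicted travel times in seconds.
    Returns a (n_points,) stack-energy image; its argmax locates the event.
    """
    image = np.zeros(travel_times.shape[0])
    for p in range(travel_times.shape[0]):
        stack = np.zeros(traces.shape[1])
        for s in range(traces.shape[0]):
            shift = int(round(travel_times[p, s] / dt))
            stack += np.roll(traces[s], -shift)   # align each trace to origin time
        image[p] = np.max(stack ** 2)             # stack energy as imaging condition
    return image

# toy 1-D example: explosive impulse at x = 50 km recorded by four stations
v, dt, n = 6.0, 0.01, 1600                        # km/s, s, samples
stations = np.array([0.0, 30.0, 60.0, 90.0])      # station positions (km)
src = 50.0
traces = np.zeros((stations.size, n))
for s, xs in enumerate(stations):
    traces[s, int(round(abs(src - xs) / v / dt))] = 1.0   # impulsive arrival
grid = np.linspace(0.0, 100.0, 101)               # candidate source positions
tt = np.abs(grid[:, None] - stations[None, :]) / v
best = grid[np.argmax(backproject(traces, dt, tt))]
```

With noise added to the traces, or an incorrect `v` in the predicted travel times, the stack maximum degrades and shifts, which is exactly the kind of sensitivity the study quantifies.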

2020 ◽  
Vol 223 (2) ◽  
pp. 1313-1326
Author(s):  
S J Gibbons ◽  
T Kværna ◽  
T Tiira ◽  
E Kozlovskaya

Summary ‘Precision seismology’ encompasses a set of methods which use differential measurements of time-delays to estimate the relative locations of earthquakes and explosions. Delay-times estimated from signal correlations often allow far more accurate estimates of one event location relative to another than is possible using classical hypocentre determination techniques. Many different algorithms and software implementations have been developed and different assumptions and procedures can often result in significant variability between different relative event location estimates. We present a Ground Truth (GT) dataset of 55 military surface explosions in northern Finland in 2007 that all took place within 300 m of each other. The explosions were recorded with a high signal-to-noise ratio to distances of about 2°, and the exceptional waveform similarity between the signals from the different explosions allows for accurate correlation-based time-delay measurements. With exact coordinates for the explosions, we are able to assess the fidelity of relative location estimates made using any location algorithm or implementation. Applying double-difference calculations using two different 1-D velocity models for the region results in hypocentre-to-hypocentre distances which are too short and it is clear that the wavefield leaving the source region is more complicated than predicted by the models. Using the GT event coordinates, we are able to measure the slowness vectors associated with each outgoing ray from the source region. We demonstrate that, had such corrections been available, a significant improvement in the relative location estimates would have resulted. In practice we would of course need to solve for event hypocentres and slowness corrections simultaneously, and significant work will be needed to upgrade relative location algorithms to accommodate uncertainty in the form of the outgoing wavefield.
We present this data set, together with GT coordinates, raw waveforms for all events on six regional stations, and tables of time-delay measurements, as a reference benchmark by which relative location algorithms and software can be evaluated.
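The correlation-based time-delay measurement that the benchmark is built to evaluate can be sketched as follows (a minimal illustration with a synthetic wavelet, not the authors' processing code):

```python
import numpy as np

def correlation_delay(a, b, dt):
    """Delay of trace b relative to trace a, from the peak of their
    cross-correlation. Positive means b arrives later than a."""
    c = np.correlate(b, a, mode="full")
    lag = int(np.argmax(c)) - (len(a) - 1)
    return lag * dt

# two nearly identical signals, the second delayed by 5 samples
dt = 0.01
t = np.arange(256) * dt
wavelet = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 20 * t)
delayed = np.roll(wavelet, 5)
delay = correlation_delay(wavelet, delayed, dt)   # 5 samples * 0.01 s
```

Sub-sample precision, as needed for GT-quality relative locations, would additionally require interpolating around the correlation peak.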


Geophysics ◽  
1979 ◽  
Vol 44 (10) ◽  
pp. 1637-1660 ◽  
Author(s):  
Robert A. Phinney ◽  
Donna M. Jurdy

We introduce here an integral two‐dimensional (2-D) scheme for the processing of deep crustal reflection profiles. This approach, in which migration occurs before stacking, is tailored to the unique character of the data in which nonvertically propagating energy is as important as vertically propagating energy. Since reflector depths range beyond 30 km, the horizontal displacement of reflections which occurs in migration can be as large as reflector depths; under these circumstances, the common‐midpoint (CMP) stack is inadequate. In our scheme, each common‐source trace gather is transformed into a set of traces (beams) corresponding to a set of different incidence angles. A correction for wavefront curvature similar to the normal moveout (NMO) correction yields traces (focused beams) which are focused at image points along the direction of arrival. While the method is equivalent to the Kirchhoff integral migration method, and therefore to any complete continuation method, it gives rise to an intermediate data set which is characterized by the direction of arrival of the upward propagating energy. By a geometrical transformation of the beams and summation, we may synthesize images composed of a specified range of Fourier spatial components. Geologic examples suggest that complex structures in the basement may be most easily characterized by their local direction of layering, a quantity we may determine by this approach. Noise‐free synthetic data examples illustrate the limits of horizontal and vertical resolving power at mid‐crustal depths for any imaging method. Velocity determination is difficult at these depths due to the small NMO and may be possible only by evaluating the effects of velocity models on the imaged data. Examples of the imaged section from the COCORP test profile in Hardeman County, Texas, show a combination of horizontally continuous reflectors and an irregular pattern of scatterers with locally horizontal layering.
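The observation that velocity determination is difficult at depth because of the small NMO follows directly from the hyperbolic moveout relation; a quick numeric check (offset, velocity, and times are illustrative, not from the paper):

```python
import numpy as np

def nmo_time(t0, offset, v):
    """Two-way traveltime for a flat reflector: t(x) = sqrt(t0**2 + (x/v)**2)."""
    return np.sqrt(t0 ** 2 + (offset / v) ** 2)

x, v = 3.0, 6.0                     # km offset, km/s velocity
t0_shallow, t0_deep = 1.0, 10.0     # s two-way times (~3 km and ~30 km deep)
moveout_shallow = nmo_time(t0_shallow, x, v) - t0_shallow   # ~0.118 s
moveout_deep = nmo_time(t0_deep, x, v) - t0_deep            # ~0.012 s
```

A tenfold increase in reflector time shrinks the moveout by roughly the same factor, leaving little leverage on velocity from moveout alone.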


2021 ◽  
Vol 11 (10) ◽  
pp. 4433
Author(s):  
Wenjin Xu ◽  
Maodan Yuan ◽  
Weiming Xuan ◽  
Xuanrong Ji ◽  
Yan Chen

Ultrasonic methods have been extensively developed in nondestructive testing for various materials and components. However, accurately extracting quantitative information about defects still remains challenging, especially for complex structures. Although the immersion technique is commonly used for complex-shaped parts, the large mismatch of acoustic impedance between water and metal prevents effective ultrasonic transmission and leads to a low signal-to-noise ratio (SNR). In this paper, a quantitative imaging method is proposed for complex-shaped parts based on an ice-coupled full waveform inversion (FWI) method. Numerical experiments were carried out to quantitatively inspect various defects in a turbine blade. Firstly, the k-space pseudospectral method was applied to simulate ice-coupled ultrasonic testing for the turbine blade. The recorded full wavefields were then used for a frequency-domain FWI based on the Limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) method. With carefully selected iteration numbers and frequencies, successive-frequency FWI can detect defects as small as half a wavelength. Extended studies on an open notch with different orientations and multiple adjacent defects proved its capability to detect different types of defects. Finally, an uncertainty analysis was conducted with inaccurate initial velocity models with a relative error of ±2%, demonstrating its robustness even with a certain inaccuracy. This study demonstrates that the proposed method has high potential for inspecting complex-shaped structures with excellent resolution.
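The role of L-BFGS in such an inversion can be illustrated on a toy linear problem standing in for the frequency-domain FWI misfit (the kernel `G`, models, and sizes below are hypothetical, not the paper's):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
G = rng.normal(size=(40, 10))            # hypothetical sensitivity kernel
m_true = np.linspace(0.2, 0.5, 10)       # "true" model to recover
d = G @ m_true                           # noise-free synthetic data

def misfit(m):
    r = G @ m - d
    return 0.5 * r @ r                   # least-squares data misfit

def gradient(m):
    return G.T @ (G @ m - d)             # analytic gradient of the misfit

m0 = np.full(10, 0.35)                   # inaccurate starting model
res = minimize(misfit, m0, jac=gradient, method="L-BFGS-B")
```

In the paper's setting, the forward operator is the k-space pseudospectral wave solver and the gradient comes from adjoint wavefields; the quasi-Newton model update is the same idea.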


Author(s):  
S. Chef ◽  
C. T. Chua ◽  
C. L. Gan

Abstract Limited spatial resolution and low signal-to-noise ratio are some of the main challenges in optical signal observation, especially for photon emission microscopy. As dynamic emission signals are generated in a 3D space, using the time dimension in addition to space enables better localization of switching events. It can be used to infer information with a precision beyond the resolution limits of the acquired signals. Taking advantage of this property, we report on a post-acquisition processing scheme that generates emission images with better resolution than the initial acquisition.


2021 ◽  
Vol 11 (2) ◽  
pp. 790
Author(s):  
Pablo Venegas ◽  
Rubén Usamentiaga ◽  
Juan Perán ◽  
Idurre Sáez de Ocáriz

Infrared thermography is a widely used technology that has been successfully applied in many and varied fields. These applications include its use as a non-destructive testing tool to assess the integrity state of materials. The current level of development of this application is high and its effectiveness has been widely verified. There are application protocols and methodologies that have demonstrated a high capacity to extract relevant information from the captured thermal signals and guarantee the detection of anomalies in the inspected materials. However, there is still room for improvement in certain aspects, such as increasing the detection capacity and defining a detailed procedure for characterizing indications, which must be investigated further to reduce uncertainties and optimize this technology. In this work, an innovative thermographic data analysis methodology is proposed that extracts a greater amount of information from the recorded sequences by applying advanced processing techniques to the results. The extracted information is synthesized into three channels that may be represented as real-color images and processed by quaternion algebra techniques to improve the detection level and facilitate the classification of defects. To validate the proposed methodology, synthetic data and actual experimental sequences have been analyzed. Seven different definitions of signal-to-noise ratio (SNR) have been used to assess the increase in detection capacity, and a generalized application procedure has been proposed to extend their use to color images. The results verify the capacity of this methodology, showing significant increases in SNR compared to conventional processing techniques in thermographic NDT.
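As one example of the kind of SNR definition compared in the paper (the seven specific definitions are not reproduced here), a contrast-over-noise measure in decibels might look like:

```python
import numpy as np

def snr_db(defect_pixels, sound_pixels):
    """SNR as defect/background contrast over background noise, in dB.
    One plausible definition among several variants."""
    contrast = abs(defect_pixels.mean() - sound_pixels.mean())
    return 20.0 * np.log10(contrast / sound_pixels.std())

rng = np.random.default_rng(1)
sound = rng.normal(20.0, 0.5, 1000)      # background thermogram pixels (°C)
defect = rng.normal(23.0, 0.5, 200)      # warmer pixels over a defect
value = snr_db(defect, sound)            # roughly 15-16 dB for these values
```

Comparing such a measure before and after processing is how an increase in detection capacity can be quantified.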


Geophysics ◽  
1988 ◽  
Vol 53 (3) ◽  
pp. 346-358 ◽  
Author(s):  
Greg Beresford‐Smith ◽  
Rolf N. Rango

Strongly dispersive noise from surface waves can be attenuated on seismic records by Flexfil, a new prestack process which uses wavelet spreading rather than velocity as the criterion for noise discrimination. The process comprises three steps: trace-by-trace compression to collapse the noise to a narrow fan in time-offset (t-x) space; muting of the noise in this narrow fan; and inverse compression to recompress the reflection signals. The process will work on spatially undersampled data. The compression is accomplished by a frequency-domain, linear operator which is independent of trace offset. This operator is the basis of a robust method of dispersion estimation. A flexural ice wave occurs on data recorded on floating ice in the near offshore of the North Slope of Alaska. It is both highly dispersed and of broad frequency bandwidth. Application of Flexfil to these data can increase the signal-to-noise ratio by up to 20 dB. A noise analysis obtained from a microspread record is ideal to use for dispersion estimation. Production seismic records can also be used for dispersion estimation, with less accurate results. The method applied to field data examples from Alaska demonstrates significant improvement in data quality, especially in the shallow section.
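The compression step, an offset-independent frequency-domain linear operator, amounts to a phase shift that is exactly undone by its inverse; a sketch with a hypothetical dispersion phase (not the Flexfil operator itself):

```python
import numpy as np

def apply_phase(trace, dt, phase):
    """Multiply the spectrum by exp(i*phase(f)): an offset-independent,
    frequency-domain linear operator of the kind used for compression."""
    spec = np.fft.rfft(trace)
    f = np.fft.rfftfreq(len(trace), dt)
    return np.fft.irfft(spec * np.exp(1j * phase(f)), len(trace))

dt = 0.004
t = np.arange(512) * dt
trace = np.exp(-((t - 1.0) ** 2) / 0.01)                 # compact wavelet
phase = lambda f: 0.5 * f ** 1.5                         # hypothetical dispersion
spread = apply_phase(trace, dt, phase)                   # "dispersed" wavelet
restored = apply_phase(spread, dt, lambda f: -phase(f))  # inverse compression
```

In the Flexfil scheme, the muting happens between the forward and inverse steps, removing the noise while it is collapsed into a narrow fan.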


Geophysics ◽  
1997 ◽  
Vol 62 (4) ◽  
pp. 1226-1237 ◽  
Author(s):  
Irina Apostoiu‐Marin ◽  
Andreas Ehinger

Prestack depth migration can be used in the velocity model estimation process if one succeeds in interpreting depth events obtained with erroneous velocity models. The interpretational difficulty arises from the fact that migration with erroneous velocity does not yield the geologically correct reflector geometries and that individual migrated images suffer from poor signal-to-noise ratio. Moreover, migrated events may be of considerable complexity and thus hard to identify. In this paper, we examine the influence of wrong velocity models on the output of prestack depth migration in the case of straight reflector and point diffractor data in homogeneous media. To avoid obscuring migration results by artifacts (“smiles”), we use a geometrical technique for modeling and migration yielding a point-to-point map from time-domain data to depth-domain data. We discover that strong deformation of migrated events may occur even in situations of simple structures and small velocity errors. From a kinematical point of view, we compare the results of common-shot and common-offset migration, and we find that common-offset migration with erroneous velocity models yields less severe image distortion than common-shot migration. However, for any kind of migration, it is important to use the entire cube of migrated data to consistently interpret in the prestack depth-migrated domain.
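The sensitivity of migrated depth to velocity error is easiest to see in the zero-offset limit, where a reflector at two-way time t0 images at depth z = v*t0/2 (the numbers below are illustrative, not from the paper):

```python
# zero-offset depth conversion: z = v * t0 / 2
v_true, v_wrong = 3.0, 3.15      # km/s; a hypothetical 5 % velocity error
t0 = 2.0                         # s two-way time
z_true = v_true * t0 / 2.0
z_migrated = v_wrong * t0 / 2.0
depth_error = z_migrated - z_true   # 0.15 km of mispositioning
```

At nonzero offset the distortion is no longer a uniform stretch, which is why common-shot and common-offset images deform differently under the same velocity error.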


2015 ◽  
Vol 2015 ◽  
pp. 1-9 ◽  
Author(s):  
Jae Heon Kim ◽  
Hong J. Lee ◽  
Yun Seob Song

A reliable in vivo imaging method to localize transplanted cells and monitor their viability would enable a systematic investigation of cell therapy. Most stem cell transplantation studies have used immunohistological staining, which does not provide information about the migration of transplanted cells in vivo in the same host. Molecular imaging visualizes targeted cells in a living host, which enables determining the biological processes occurring in transplanted stem cells. Molecular imaging with labeled nanoparticles provides the opportunity to monitor transplanted cells noninvasively without sacrifice and to repeatedly evaluate them. Among several molecular imaging techniques, magnetic resonance imaging (MRI) provides high resolution and sensitivity of transplanted cells. MRI is a powerful noninvasive imaging modality with excellent image resolution for studying cellular dynamics. Several types of nanoparticles including superparamagnetic iron oxide nanoparticles and magnetic nanoparticles have been used to magnetically label stem cells and monitor viability by MRI in the urologic field. This review focuses on the current role and limitations of MRI with labeled nanoparticles for tracking transplanted stem cells in urology.


2021 ◽  
Author(s):  
Tianhua Zhang ◽  
Shiduo Yang ◽  
Chandramani Shrivastava ◽  
Adrian A ◽  
Nadege Bize-Forest

Abstract With the advancement of LWD (Logging While Drilling) hardware and acquisition, imaging technology has become an indispensable part of the drilling tool string, and image resolution has increased to map layers and heterogeneity features down to less than 5 mm scale. This shortens the geological interpretation turnaround time from wireline logging time (hours to days after drilling) to semi-real time (drilling time or hours after drilling). At the same time, drilling motion is complex. Depth tracking is referenced at the surface to the movement of the surface block, while the imaging sensor downhole can be thousands of feet away. Mechanical torque and drag, wellbore friction, wellbore temperature, and weight on bit can desynchronize the downhole sensor motion from the surface pipe depth. This causes the time-depth conversion step to generate image artifacts that either prevent real-time interpretation of geological features or lead to misinterpretation of features on high-resolution images. In this paper, we present several LWD images featuring distortion mechanisms during the drilling process using synthetic data. We investigated how heave, depth resets, and downhole sensor stick/slip cause image distortions. We provide solutions based on downhole sensor pseudo-velocity computation to minimize the image distortion. Best practices for using the Savitzky-Golay filter are presented in the discussion sections. Finally, some high-resolution LWD images distorted by drilling-related artifacts, together with their processed versions, are shown to demonstrate the importance of image post-processing. With properly processed images, we can minimize interpretation risks and make drilling decisions with more confidence.
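A pseudo-velocity computation with a Savitzky-Golay filter might be sketched as below (the window length, rate of penetration, and noise level are assumptions, not the authors' parameters):

```python
import numpy as np
from scipy.signal import savgol_filter

dt = 0.1                                     # s between depth samples
t = np.arange(0.0, 60.0, dt)
true_depth = 0.05 * t                        # steady 0.05 m/s rate of penetration
rng = np.random.default_rng(2)
measured = true_depth + rng.normal(0.0, 0.01, t.size)   # stick/slip-like jitter

# smooth the depth record, and differentiate it in the same polynomial fit
smooth = savgol_filter(measured, window_length=51, polyorder=3)
velocity = savgol_filter(measured, window_length=51, polyorder=3,
                         deriv=1, delta=dt)  # pseudo-velocity in m/s
```

Resampling the image with a pseudo-velocity of this kind, rather than with the raw surface-referenced depth, is what suppresses the stick/slip artifacts.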

