Seismic Signal Processing
Recently Published Documents


TOTAL DOCUMENTS: 65 (five years: 5)

H-INDEX: 10 (five years: 0)

2019, Vol 91 (1), pp. 356-369
Author(s): Joshua Dickey, Brett Borghetti, William Junek, Richard Martin

Abstract: Similarity search is a popular technique in seismic signal processing, with template matching, matched filters, and subspace detectors used for a wide variety of tasks, including both signal detection and source discrimination. Traditionally, these techniques rely on the cross-correlation function as the measure of similarity. Unfortunately, seismogram correlation is dominated by path effects, essentially requiring a distinct waveform template for each path of interest. To address this limitation, we propose a novel measure of seismogram similarity that is explicitly invariant to path. Using EarthScope's USArray experiment, we construct a path-rich dataset of 207,291 regional seismograms spanning 8452 unique events. We then use the batch-hard triplet loss function to train a deep convolutional neural network that maps raw seismograms to a low-dimensional embedding space, where nearness in the space corresponds to nearness of source function, regardless of path or recording instrumentation. This path-agnostic embedding space forms a new representation for seismograms, characterized by robust, source-specific features, which we show to be useful both for pairwise event association and for template-based source discrimination with a single template.
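The batch-hard triplet loss mentioned in the abstract can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation (which trains a deep CNN); the function name and margin value are our own. For each anchor in a batch, the loss pulls in the hardest (farthest) same-event embedding and pushes away the hardest (closest) different-event embedding:

```python
import numpy as np

def batch_hard_triplet_loss(embeddings, labels, margin=1.0):
    """Batch-hard triplet loss over a batch of embedding vectors.

    embeddings: (n, d) array of embedded seismograms
    labels:     (n,) array of event identifiers
    """
    # Pairwise Euclidean distances between all embeddings in the batch.
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1) + 1e-12)

    same = labels[:, None] == labels[None, :]

    # Hardest positive: the farthest embedding sharing the anchor's label
    # (the self-distance of ~0 never wins the max, so no masking is needed).
    hardest_pos = np.where(same, dist, -np.inf).max(axis=1)
    # Hardest negative: the closest embedding with a different label.
    hardest_neg = np.where(~same, dist, np.inf).min(axis=1)

    # Hinge: penalize anchors whose hardest positive is not at least
    # `margin` closer than their hardest negative.
    return np.maximum(hardest_pos - hardest_neg + margin, 0.0).mean()
```

With well-separated clusters per event the loss goes to zero; if all embeddings collapse to one point, the loss equals the margin, which is what drives the network to spread distinct sources apart.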


Entropy, 2018, Vol 20 (10), pp. 750
Author(s): Jerry Gibson

Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the entropy rate power, a quantity defined by Shannon in 1948, and show that the log ratio of entropy powers equals the difference in the differential entropies of the two processes. We then use the log ratio of entropy powers to analyze how the mutual information changes as the model order of an autoregressive process is increased. We also examine when the minimum mean squared prediction error can be substituted for the entropy power in the log ratio, which greatly simplifies the calculation of the differential entropy and the change in mutual information, and therefore increases the utility of the approach. Applications to speech processing and coding are given, and potential applications to seismic signal processing, EEG classification, and ECG classification are described.

