Deconvolution of seismograms by the autoregressive method

Geophysics ◽  
1983 ◽  
Vol 48 (2) ◽  
pp. 229-233 ◽  
Author(s):  
G. Jayachandran Nair

The purpose of deconvolution in seismology is to estimate the basic seismic wavelet and the transfer function of the transmission medium. In reflection seismology, this transfer function is the reflectivity function, while in seismograms of earthquakes and explosions it represents the combined effects of the source-crust and receiver-crust responses along with the attenuation function. Techniques used for deconvolution of discrete time series data include Wiener inverse filtering (Robinson and Treitel, 1967), homomorphic deconvolution (Ulrych, 1971), and Kalman filtering (Crump, 1974). In the present paper, a method for deconvolution of single-channel seismic data based on an autoregressive (AR) model of the time series is discussed; it allows the primary pulse and the deconvolution function to be estimated simultaneously in an objective manner. Examples using synthetic data simulating single and multiple explosions substantiate the applicability of the method, which is also applied to actual data for a presumed underground explosion from Eastern Kazakh.
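
The abstract does not spell out the algorithm, but the core of AR-based (predictive) deconvolution can be sketched: fit an AR model to the trace and apply the resulting prediction-error filter to whiten it, yielding a reflectivity estimate. The least-squares fit and the model order below are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of AR-based predictive deconvolution (illustrative only).
import numpy as np

def ar_deconvolve(trace, order=20):
    """Fit an AR(order) model by least squares; return the prediction-error
    (deconvolved) series and the AR coefficients (which model the wavelet)."""
    trace = np.asarray(trace, dtype=float)
    n = len(trace)
    # Row t of the design matrix holds trace[t-1], ..., trace[t-order].
    X = np.column_stack([trace[order - k - 1:n - k - 1] for k in range(order)])
    y = trace[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Prediction error e[t] = y[t] - sum_k a_k * y[t-k] is the whitened output.
    residual = y - X @ coeffs
    return residual, coeffs
```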

2017 ◽  
Vol 1 ◽  
pp. 41-54 ◽  
Author(s):  
Amrit Subedi

Background: There are various approaches to modeling time series data. Most studies of time series data are based on annual trends, whereas very few are concerned with data having monthly fluctuation. Tourist arrivals are an example of a time series with monthly fluctuation: some months/seasons see a higher number of arrivals, others fewer. Starting from January, the series completes a cycle every 12 months with 3 bends, indicating that it can be captured by a biquadratic function.
Objective: To provide an alternative modeling approach, i.e., a combination of an autoregressive model with a polynomial (biquadratic) function for time series data with monthly/seasonal fluctuation, and to compare its adequacy with the widely used cyclic autoregressive model AR(12).
Materials and Methods: This study is based on monthly data of tourist arrivals in Nepal. First, the usual time series model AR(12) is fitted; then an alternative model combining AR and biquadratic terms is attempted. The AR part of the model represents the annual trend, whereas the biquadratic part captures the monthly fluctuation.
Results: The fitted cyclic autoregressive model on the monthly tourist-arrival data is Est. Y(m) = 3614.33 + 0.9509·Y(m−12), R² = 0.80, where Est. Y(m) is the predicted number of tourist arrivals for month m and Y(m−12) is the observed number in month m−12. The combined AR and biquadratic model is Est. Y_t(m) = −46464.6 + 1.000·Y(t−1) + 52911.56·m − 17177·m² + 2043.95·m³ − 79.43·m⁴, R² = 0.78, where Est. Y_t(m) is the predicted number of arrivals for month m of year t and Y(t−1) is the average number of arrivals in year t−1. The AR model combined with the polynomial function yields residuals that are closer to normal and homoscedastic than those of the first model.
Conclusion: A polynomial function combined with an autoregressive model can be useful for time series data having seasonal fluctuation, and offers an alternative approach for selecting a good model for such data.
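
A minimal sketch of how the combined model above could be fit by ordinary least squares. The data layout (`arrivals`, a hypothetical years × 12 array of monthly counts) and the numpy `lstsq` fit are assumptions for illustration, not the authors' code.

```python
# Hedged sketch: fit Est. Y_t(m) = b0 + b1*Y(t-1) + c1*m + c2*m^2 + c3*m^3 + c4*m^4.
import numpy as np

def fit_ar_biquadratic(arrivals):
    years, months = arrivals.shape            # months == 12
    rows, y = [], []
    for t in range(1, years):
        prev_year_mean = arrivals[t - 1].mean()   # Y(t-1): annual AR term
        for m in range(1, months + 1):
            rows.append([1.0, prev_year_mean, m, m**2, m**3, m**4])
            y.append(arrivals[t, m - 1])
    X = np.array(rows)
    beta, *_ = np.linalg.lstsq(X, np.array(y), rcond=None)
    return beta   # [intercept, AR coefficient, m, m^2, m^3, m^4 coefficients]
```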


2017 ◽  
Vol 20 (2) ◽  
pp. 190-202 ◽  
Author(s):  
Kannan S. ◽  
Somasundaram K.

Purpose: Due to the large number of non-uniform transactions per day, money laundering detection (MLD) is a time-consuming and difficult process. The major purpose of the proposed auto-regressive (AR) outlier-based MLD (AROMLD) is to reduce the time consumed in handling large volumes of non-uniform transactions.
Design/methodology/approach: The AR-based outlier design produces consistent, asymptotically distributed results that enhance demand-forecasting abilities. In addition, the inter-quartile range (IQR) formulations proposed in this paper support detailed analysis of time-series data pairs.
Findings: The high dimensionality of the prediction task and the difficulty of characterizing the relationships/differences between data pairs make time-series mining a complex task. The presence of domain invariance in time-series mining motivates the regressive formulation for outlier detection. Deep analysis of the time-varying process and the demands of forecasting combine the AR and IQR formulations into an effective outlier detector.
Research limitations/implications: The present research focuses on detecting outliers in past financial transactions using the AR model. Predicting the possibility of an outlier in future transactions remains an open issue.
Originality/value: The lack of prior segmentation in ML detection suffers from dimensionality problems, and the absence of a boundary isolating normal from suspicious transactions imposes further limitations. The regression formulation overcomes the lack of deep analysis and the high time consumption.
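
The abstract gives no implementation detail; the following is a hedged sketch of the general AR-plus-IQR idea it describes: fit an AR model to the transaction series and flag observations whose residuals fall outside the Tukey fences. The model order and fence factor are illustrative assumptions.

```python
# Sketch of AR residuals screened by IQR fences (not the authors' AROMLD pipeline).
import numpy as np

def ar_iqr_outliers(x, order=5, k=1.5):
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.column_stack([x[order - j - 1:n - j - 1] for j in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ a
    q1, q3 = np.percentile(resid, [25, 75])
    iqr = q3 - q1
    # Flag points whose residuals fall outside [Q1 - k*IQR, Q3 + k*IQR].
    mask = (resid < q1 - k * iqr) | (resid > q3 + k * iqr)
    return np.where(mask)[0] + order          # indices into the original series
```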


2018 ◽  
Author(s):  
Elijah Bogart ◽  
Richard Creswell ◽  
Georg K. Gerber

Abstract Longitudinal studies are crucial for discovering causal relationships between the microbiome and human disease. We present the Microbiome Interpretable Temporal Rule Engine (MITRE), the first machine learning method specifically designed for predicting host status from microbiome time-series data. Our method maintains interpretability by learning predictive rules over automatically inferred time periods and phylogenetically related microbes. We validate MITRE's performance on semi-synthetic data and on five real datasets measuring microbiome composition over time in infant and adult cohorts. Our results demonstrate that MITRE performs on par with or outperforms “black box” machine learning approaches, providing a powerful new tool enabling discovery of biologically interpretable relationships between the microbiome and the human host.
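
Purely to illustrate the flavor of an interpretable rule over a learned time window and a phylogenetic clade, here is a hypothetical rule evaluator. All names, parameters, and the data layout are assumptions for illustration; this is not MITRE's API or its learning procedure.

```python
# Hypothetical evaluator for one MITRE-style rule: "the aggregate abundance of
# a clade within a time window exceeds a threshold".
import numpy as np

def rule_applies(abundance, times, clade_idx, t_start, t_end, threshold):
    """abundance: (taxa x timepoints) array for one subject;
    clade_idx: list of row indices forming a phylogenetic clade."""
    window = (times >= t_start) & (times <= t_end)
    # Sum the clade's abundance at each timepoint, then average over the window.
    mean_abund = abundance[clade_idx][:, window].sum(axis=0).mean()
    return mean_abund > threshold
```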


Author(s):  
Kazuhiro Ozawa ◽  
Takahide Niimura ◽  
Tomoaki Nakashima

In this paper, the authors present a data analysis and estimation procedure for electrical power consumption under uncertain conditions. Traditional methods are based on statistical and probabilistic approaches, but purely stochastic models may not be well suited to data generated by human activities such as power consumption. The authors introduce a new approach based on possibility theory and fuzzy autoregression and apply it to the analysis of time-series data of electric power consumption. Two models of differing complexity are presented, and their performance is evaluated by vagueness and α-cuts. The proposed fuzzy autoregression model represents the rich uncertainty information that the original data contain, and it can be a powerful tool for flexible decision-making under uncertainty. The fuzzy AR model can also be constructed by a relatively simple procedure compared with conventional approaches.
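
The paper's fitting procedure (e.g., a vagueness-minimizing formulation) is not reproduced here; the sketch below only shows how, once symmetric triangular fuzzy coefficients are in hand, an α-cut turns a fuzzy AR model into an interval forecast. The coefficient representation is an assumption.

```python
# Minimal sketch of an alpha-cut forecast from a fuzzy AR model whose
# coefficients are symmetric triangular fuzzy numbers (center c_k, spread s_k).
import numpy as np

def fuzzy_ar_forecast(history, centers, spreads, alpha=0.5):
    p = len(centers)
    lags = np.asarray(history, dtype=float)[-p:][::-1]   # y_{t-1}, ..., y_{t-p}
    mid = float(np.dot(centers, lags))                   # crisp center forecast
    # At level alpha, each coefficient ranges over c_k +/- (1 - alpha) * s_k.
    half_width = (1.0 - alpha) * float(np.dot(spreads, np.abs(lags)))
    return mid - half_width, mid + half_width            # interval forecast
```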


2020 ◽  
Vol 35 (5) ◽  
pp. 439-451 ◽  
Author(s):  
Elan Ness-Cohn ◽  
Marta Iwanaszko ◽  
William L. Kath ◽  
Ravi Allada ◽  
Rosemary Braun

The circadian rhythm drives the oscillatory expression of thousands of genes across all tissues, coordinating physiological processes. The effect of this rhythm on health has generated increasing interest in discovering genes under circadian control by searching for periodic patterns in transcriptomic time-series experiments. While algorithms for detecting cycling transcripts have advanced, there remains little guidance quantifying the effect of experimental design and analysis choices on cycling detection accuracy. We present TimeTrial, a user-friendly benchmarking framework using both real and synthetic data to investigate cycle detection algorithms’ performance and improve circadian experimental design. Results show that the optimal choice of analysis method depends on the sampling scheme, noise level, and shape of the waveform of interest, and TimeTrial provides guidance on the impact of sampling frequency and duration on cycling detection accuracy. The TimeTrial software is freely available for download and may also be accessed through a web interface. By supplying a tool to vary and optimize experimental design considerations, TimeTrial will enhance circadian transcriptomics studies.
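
TimeTrial benchmarks existing detectors rather than defining a new one; as a toy example of what such detectors compute, here is a basic cosinor fit (regressing expression on cosine/sine terms of an assumed 24-hour period). This is a standard baseline, not TimeTrial's own method.

```python
# Cosinor sketch: fit y = b0 + b1*cos(wt) + b2*sin(wt) and report amplitude/phase.
import numpy as np

def cosinor_fit(times_h, expr, period=24.0):
    times_h = np.asarray(times_h, dtype=float)
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(times_h),
                         np.cos(w * times_h), np.sin(w * times_h)])
    beta, *_ = np.linalg.lstsq(X, expr, rcond=None)
    amplitude = np.hypot(beta[1], beta[2])               # cycling strength
    phase_h = (np.arctan2(beta[2], beta[1]) / w) % period  # peak time in hours
    return amplitude, phase_h
```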


BMC Neurology ◽  
2019 ◽  
Vol 19 (1) ◽  
Author(s):  
Helia Mahzoun Alzakerin ◽  
Yannis Halkiadakis ◽  
Kristin D. Morgan

Abstract
Background: Huntington’s disease (HD) is a progressive neurological disorder that results in both cognitive and physical impairments. These impairments affect an individual’s gait and, as the disease progresses, significantly alter stability. Previous research found that changes in stride time patterns can help delineate between healthy and pathological gait. Autoregressive (AR) modeling is a statistical technique that models the underlying temporal patterns in data. Here, AR models were used to assess differences in gait stride time pattern stability between controls and individuals with HD. Differences in stride time pattern stability were determined from the AR model coefficients and their placement on a stationarity triangle, which provides a visual representation of how a pattern’s mean, variance, and autocorrelation change with time. Thus, individuals who exhibit similar stride time pattern stability will reside in the same region of the stationarity triangle. It was hypothesized that individuals with HD would exhibit more altered stride time pattern stability than controls, based on the AR model coefficients and their location in the stationarity triangle.
Methods: Sixteen controls and twenty individuals with HD performed a five-minute walking protocol. Time series were constructed from consecutive stride times extracted during the protocol, and a second-order AR model was fit to the stride time series data. A two-sample t-test was performed on the stride time pattern data to identify differences between the control and HD groups.
Results: The individuals with HD exhibited significantly altered stride time pattern stability compared to the controls, based on their AR model coefficients (AR1 p < 0.001; AR2 p < 0.001).
Conclusions: The AR coefficients successfully delineated between controls and individuals with HD. Individuals with HD resided closer to and within the oscillatory region of the stationarity triangle, which could reflect the oscillatory neuronal activity commonly observed in this population. The ability to quantitatively and visually detect differences in stride time behavior highlights the potential of this approach for identifying gait impairment in individuals with HD.
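
A minimal sketch of the core computation described in the abstract: fit a second-order AR model to a stride time series and locate its coefficients relative to the stationarity triangle. The least-squares fit is an implementation assumption.

```python
# AR(2) fit plus stationarity-triangle checks for y[t] = a1*y[t-1] + a2*y[t-2] + e.
import numpy as np

def ar2_stationarity(stride_times):
    x = np.asarray(stride_times, dtype=float)
    x = x - x.mean()
    X = np.column_stack([x[1:-1], x[:-2]])     # lag-1 and lag-2 regressors
    y = x[2:]
    (a1, a2), *_ = np.linalg.lstsq(X, y, rcond=None)
    # Stationarity triangle: a1 + a2 < 1, a2 - a1 < 1, |a2| < 1.
    stationary = (abs(a2) < 1) and (a2 + a1 < 1) and (a2 - a1 < 1)
    # Oscillatory region: complex characteristic roots (a1^2 + 4*a2 < 0).
    oscillatory = a1**2 + 4 * a2 < 0
    return a1, a2, stationary, oscillatory
```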


2021 ◽  
pp. 1-20
Author(s):  
Fabian Kai-Dietrich Noering ◽  
Yannik Schroeder ◽  
Konstantin Jonas ◽  
Frank Klawonn

In technical systems, the analysis of similar situations is a promising technique for gaining information about a system’s state, health, or wear. Very often, situations cannot be defined a priori but must be discovered as recurrent patterns within time series data of the system under consideration. This paper addresses the assessment of different approaches to discovering frequent variable-length patterns in time series. Because of the success of artificial neural networks (NNs) in various research fields, a particular focus of this work is the applicability of NNs to the problem of pattern discovery in time series. We therefore applied and adapted a Convolutional Autoencoder and compared it to classical non-learning approaches based on Dynamic Time Warping, on time series discretization, and on the Matrix Profile. These non-learning approaches were also adapted to fulfill our requirements, such as the discovery of potentially time-scaled patterns in noisy time series. We evaluated the performance (quality, computing time, effort of parametrization) of these approaches in an extensive test with synthetic data sets, and additionally tested transferability to other data sets using real-life vehicle data. We demonstrate the ability of Convolutional Autoencoders to discover patterns in an unsupervised way; furthermore, the tests showed that the Autoencoder discovers patterns with a quality similar to that of the classical non-learning approaches.
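
One of the compared non-learning baselines is the Matrix Profile; a naive O(n²·m) z-normalized version is sketched below for illustration only (the paper uses adapted and far more efficient variants).

```python
# Naive matrix profile: for each window, the distance to its nearest
# non-trivial match; low values mark recurring patterns (motifs).
import numpy as np

def matrix_profile_naive(ts, m):
    ts = np.asarray(ts, dtype=float)

    def znorm(w):
        s = w.std()
        return (w - w.mean()) / s if s > 0 else w - w.mean()

    windows = [znorm(ts[i:i + m]) for i in range(len(ts) - m + 1)]
    n = len(windows)
    profile = np.full(n, np.inf)
    for i in range(n):
        for j in range(n):
            if abs(i - j) >= m:                # exclusion zone: skip trivial matches
                d = np.linalg.norm(windows[i] - windows[j])
                profile[i] = min(profile[i], d)
    return profile
```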


Author(s):  
Yoshiyuki Yabuuchi

The fuzzy autocorrelation model is a fuzzified autoregressive (AR) model whose aim is to describe the possible states of a system with high accuracy. The model uses autocorrelation in a manner similar to the Box–Jenkins model, but it occasionally increases vagueness. Although this problem can be mitigated by using fuzzy confidence intervals instead of fuzzy time-series data, the unnatural estimations do not improve. Subsequently, an alternative method was used to fuzzify the time-series data and mitigate the unnatural estimation problem; this method also improved the model’s prediction accuracy. This paper focuses on the fuzzification method and discusses the prediction accuracy of the model and the fuzzification of the time-series data. An analysis of the Nikkei stock average shows the high prediction accuracy and manageability of a fuzzy autocorrelation model. In this paper, quartiles are employed as an alternative fuzzification method, and the model’s prediction accuracy and estimation behavior are verified through the analysis. The proposed method was found to be successful in mitigating these problems.
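
A hedged sketch of quartile-based fuzzification: each window of observations is summarized as a triangular fuzzy number (Q1, median, Q3), which a fuzzy autocorrelation model can then operate on. The windowing scheme is an assumption; the paper's exact construction may differ.

```python
# Summarize each sliding window as a triangular fuzzy number via quartiles.
import numpy as np

def fuzzify_by_quartiles(series, window=5):
    fuzzy = []
    for i in range(len(series) - window + 1):
        seg = series[i:i + window]
        q1, med, q3 = np.percentile(seg, [25, 50, 75])
        fuzzy.append((q1, med, q3))            # (left, center, right)
    return fuzzy
```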


2019 ◽  
Vol 29 (2) ◽  
pp. 375-392 ◽  
Author(s):  
Pierre-Francois Marteau

Abstract In the light of regularized dynamic time warping kernels, this paper re-considers the concept of a time elastic centroid for a set of time series. We derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices. This algorithm expresses the averaging process in terms of stochastic alignment automata. It uses an iterative agglomerative heuristic method for averaging the aligned samples, while also averaging the times of their occurrence. By comparing classification accuracies for 45 heterogeneous time series data sets obtained by first nearest centroid/medoid classifiers, we show that (i) centroid-based approaches significantly outperform medoid-based ones, (ii) for the data sets considered, our algorithm, which combines averaging in the sample space and along the time axes, emerges as the most significantly robust model for time-elastic averaging with a promising noise reduction capability. We also demonstrate its benefit in an isolated gesture recognition experiment and its ability to significantly reduce the size of training instance sets. Finally, we highlight its denoising capability using demonstrative synthetic data. Specifically, we show that it is possible to retrieve, from few noisy instances, a signal whose components are scattered in a wide spectral band.
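
The paper's averaging is performed with stochastic alignment automata, which are not reproduced here; a simpler relative, DTW barycenter averaging (DBA), is sketched to convey the idea of a time-elastic centroid. Iterating `dba_update` a few times from an initial medoid typically converges.

```python
# DBA sketch: align every series to the current centroid via DTW, then
# average the samples aligned to each centroid position.
import numpy as np

def dtw_path(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = (a[i - 1] - b[j - 1])**2 + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    path, i, j = [], n, m                      # backtrack the optimal warping path
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0: i, j = i - 1, j - 1
        elif step == 1: i -= 1
        else: j -= 1
    return path

def dba_update(centroid, series_set):
    sums = np.zeros(len(centroid))
    counts = np.zeros(len(centroid))
    for s in series_set:
        for ci, si in dtw_path(centroid, s):
            sums[ci] += s[si]
            counts[ci] += 1
    return sums / np.maximum(counts, 1)        # refined centroid
```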


2020 ◽  
Vol 9 (1) ◽  
pp. 1-16
Author(s):  
Ginga Yoshizawa

In time series data analysis, detecting change points on a real-time basis (online) is of great interest in many areas, such as finance, environmental monitoring, and medicine. One promising means to achieve this is the Bayesian online change point detection (BOCPD) algorithm, which has been successfully adopted in particular cases in which the time series of interest has a fixed baseline. However, we have found that the algorithm struggles when the baseline irreversibly shifts from its initial state. This is because with the original BOCPD algorithm, the sensitivity with which a change point can be detected is degraded if the data points are fluctuating at locations relatively far from the original baseline. In this paper, we not only extend the original BOCPD algorithm to be applicable to a time series whose baseline is constantly shifting toward unknown values but also visualize why the proposed extension works. To demonstrate the efficacy of the proposed algorithm compared to the original one, we examine these algorithms on two real-world data sets and six synthetic data sets.
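
For reference, a compact sketch of the original BOCPD run-length recursion with a Gaussian known-variance, conjugate normal-mean predictive model; the paper's extension for drifting baselines is not reproduced here, and the hyperparameters are illustrative.

```python
# BOCPD sketch: maintain a posterior over run lengths; a change point is
# signaled when the mass at run length 0 spikes.
import numpy as np
from scipy.stats import norm

def bocpd(data, hazard=1 / 100, mu0=0.0, kappa0=1.0, sigma2=1.0):
    T = len(data)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0
    mu, kappa = np.array([mu0]), np.array([kappa0])
    for t, x in enumerate(data):
        # Predictive probability of x under each run-length hypothesis.
        pred = norm.pdf(x, mu, np.sqrt(sigma2 * (1 + 1 / kappa)))
        growth = R[t, :t + 1] * pred * (1 - hazard)      # run continues
        cp = (R[t, :t + 1] * pred * hazard).sum()        # run resets to 0
        R[t + 1, 1:t + 2] = growth
        R[t + 1, 0] = cp
        R[t + 1] /= R[t + 1].sum()
        # Update sufficient statistics for every run length (prepend the prior).
        mu = np.concatenate([[mu0], (kappa * mu + x) / (kappa + 1)])
        kappa = np.concatenate([[kappa0], kappa + 1])
    return R   # R[t, r]: posterior probability of run length r after t points
```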

