Optimal reorientation of geophysical sensors: A quaternion-based analytical solution

Geophysics ◽  
2015 ◽  
Vol 80 (2) ◽  
pp. F19-F30 ◽  
Author(s):  
Lars Krieger ◽  
Francesco Grigoli

One of the most critical problems affecting geophysical data acquisition procedures is related to the misorientation of multicomponent sensors with respect to a common reference system (e.g., geographic north). In many applications, misoriented sensors affect data analysis procedures, leading to errors in results and interpretations. These problems generally occur in applications where the orientation of the sensor cannot be actively controlled and is not known a priori, e.g., geophysical sensors deployed in borehole installations or on the seafloor. We have developed a quaternion-based method for the optimal reorientation of multicomponent geophysical sensors. In contrast to other approaches, we take into account the full time-series record from all sensor components. Therefore, our method can be applied to any time-series data and is not restricted to a particular type of geophysical sensor. Our method allows the robust calculation of relative reorientations between two-component or three-component sensors. By using a reference sensor in an iterative process, this result can be extended to the estimation of absolute sensor orientations. In addition to finding an optimal solution for a full 3D sensor rotation, we have established a rigorous scheme for the estimation of uncertainties of the resulting orientation parameters. We tested the feasibility and applicability of our method using synthetic data examples for a vertical seismic profile and an ocean bottom seismometer array. We find that the quaternion-based reorientation method is superior to the standard approach of single-parameter estimation of rotation angles.
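
As background for the rotation-estimation step, the sketch below solves the core sub-problem numerically: finding the rotation that best aligns a misoriented three-component record with a reference record, using Davenport's q-method for Wahba's problem. The function name, synthetic signals, and quaternion convention are illustrative; the published method additionally covers relative versus absolute orientation and uncertainty estimation.

```python
import numpy as np

def optimal_rotation_quaternion(ref, obs):
    """Estimate the unit quaternion relating obs (3 x N) to ref (3 x N).

    Illustrative solution of Wahba's problem via Davenport's q-method;
    not the authors' implementation.
    """
    B = ref @ obs.T                      # 3x3 attitude profile matrix
    S = B + B.T
    sigma = np.trace(B)
    z = np.array([B[1, 2] - B[2, 1],
                  B[2, 0] - B[0, 2],
                  B[0, 1] - B[1, 0]])
    K = np.zeros((4, 4))
    K[0, 0] = sigma
    K[0, 1:] = z
    K[1:, 0] = z
    K[1:, 1:] = S - sigma * np.eye(3)
    w, v = np.linalg.eigh(K)
    q = v[:, np.argmax(w)]               # eigenvector of the largest eigenvalue
    return q / np.linalg.norm(q)         # unit quaternion (scalar-first)

# toy example: rotate a synthetic 3-component record by 30 degrees about z
rng = np.random.default_rng(0)
ref = rng.standard_normal((3, 1000))
a = np.deg2rad(30.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
obs = Rz @ ref
print(optimal_rotation_quaternion(ref, obs))
```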

2001 ◽  
Vol 17 (2) ◽  
pp. 424-450 ◽  
Author(s):  
Duo Qin ◽  
Christopher L. Gilbert

We argue that many methodological confusions in time-series econometrics may be seen as arising out of ambivalence or confusion about the error terms. Relationships between macroeconomic time series are inexact, and, inevitably, the early econometricians found that any estimated relationship would only fit with errors. Slutsky interpreted these errors as shocks that constitute the motive force behind business cycles. Frisch tried to dissect the errors further into two parts: stimuli, which are analogous to shocks, and nuisance aberrations. However, he failed to provide a statistical framework to make this distinction operational. Haavelmo, and subsequent researchers at the Cowles Commission, saw errors in equations as providing the statistical foundations for econometric models and required that they conform to a priori distributional assumptions specified in structural models of the general equilibrium type, later known as simultaneous-equations models. Because theoretical models were at that time mostly static, the structural modeling strategy frequently relegated the dynamics in time-series data to the status of nuisance, atheoretical complications. Revival of the shock interpretation in theoretical models came about through the rational expectations movement and the development of the vector autoregression modeling approach. The so-called London School of Economics dynamic specification approach decomposes the dynamics of the modeled variable into three parts: short-run shocks, disequilibrium shocks, and innovative residuals, with only the first two of these sustaining an economic interpretation.


Author(s):  
Qianguang Lin ◽  
Ni Li ◽  
Qi Qi ◽  
Jiabin Hu

Internet of Things (IoT) devices built on different processor architectures have increasingly become targets of adversarial attacks. In this paper, we propose an algorithm for malware classification in the IoT domain to address these increasingly severe security threats. Application executions are represented as sequences of consecutive API calls. The resulting time-series data are analyzed and filtered using an improved information-gain criterion which, according to our experiments, reduces the length of the input sequences more effectively than chi-square statistics while retaining the important information. We use a multi-layer convolutional neural network, well suited to processing time-series data, to classify the various types of malware. As the convolution window slides along the time sequence, it aggregates features from different positions, capturing the characteristics of the corresponding parts of the sequence. By comparing the convergence behavior of different optimization algorithms, we select one that approaches the optimal solution in a small number of iterations, speeding up model training. Experimental results on real-world IoT malware samples show that the classification accuracy of this approach exceeds 98%. Overall, a comprehensive evaluation demonstrates that our method is practically suitable for IoT malware classification, achieving high accuracy with low computational overhead.
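
As a point of reference for the feature-filtering step, the sketch below computes the standard information gain of a binary API-call feature with respect to malware family labels on toy data. It shows only the textbook definition, not the improved information-gain criterion or the CNN architecture described in the paper; all names and data are illustrative.

```python
import numpy as np

def information_gain(feature, labels):
    """Information gain of a discrete feature with respect to class labels.

    Standard definition only; the paper uses an improved criterion to
    shorten API-call sequences before classification.
    """
    def entropy(y):
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    h_before = entropy(labels)
    h_after = 0.0
    for value in np.unique(feature):
        mask = feature == value
        h_after += mask.mean() * entropy(labels[mask])
    return h_before - h_after

# toy data: does a given API call occur in the trace (1) or not (0)?
rng = np.random.default_rng(1)
labels = rng.integers(0, 4, size=500)                    # four malware families
api_present = (labels == 2).astype(int) ^ (rng.random(500) < 0.1)
print(information_gain(api_present, labels))
```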


Author(s):  
Hiroyuki Moriguchi ◽  
Ichiro Takeuchi ◽  
Masayuki Karasuyama ◽  
Shin-ichi Horikawa ◽  
...  

In this paper, we study the problem of anomaly detection from time-series data. We use kernel quantile regression (KQR) to predict the extreme (such as 0.01 or 0.99) quantiles of the future time-series data distribution, which enables us to tell whether the probability of observing a certain time-series sequence is larger than, say, 1 percent. We develop an efficient update algorithm for KQR so that it can be adapted in an online manner: the new algorithm allows us to compute the optimal KQR solution whenever a training pattern is inserted or deleted. We demonstrate the effectiveness of our methodology through numerical experiments using real-world time-series data.
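
To illustrate the quantile-based anomaly test, the sketch below uses an off-the-shelf quantile regressor (gradient boosting with pinball loss) as a stand-in for KQR: it predicts the 0.01 and 0.99 quantiles of the next value from lagged values and flags observations falling outside that band. The synthetic data and the model choice are assumptions for illustration; the paper's contribution, the efficient online update of the KQR solution, is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# synthetic series with one injected anomaly
rng = np.random.default_rng(2)
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(t.size)
series[1500] += 2.5

# predict the next value's extreme quantiles from the previous `lags` values
lags = 5
X = np.column_stack([series[i:-(lags - i)] for i in range(lags)])
y = series[lags:]

upper = GradientBoostingRegressor(loss="quantile", alpha=0.99).fit(X, y)
lower = GradientBoostingRegressor(loss="quantile", alpha=0.01).fit(X, y)

# flag observations outside the predicted 1%-99% band (in-sample, for illustration)
outside = (y > upper.predict(X)) | (y < lower.predict(X))
print("flagged indices:", np.flatnonzero(outside) + lags)
```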


2018 ◽  
Author(s):  
Elijah Bogart ◽  
Richard Creswell ◽  
Georg K. Gerber

Longitudinal studies are crucial for discovering causal relationships between the microbiome and human disease. We present Microbiome Interpretable Temporal Rule Engine (MITRE), the first machine learning method specifically designed for predicting host status from microbiome time-series data. Our method maintains interpretability by learning predictive rules over automatically inferred time periods and phylogenetically related microbes. We validate MITRE's performance on semi-synthetic data and on five real datasets measuring microbiome composition over time in infant and adult cohorts. Our results demonstrate that MITRE performs on par with or outperforms "black box" machine learning approaches, providing a powerful new tool that enables discovery of biologically interpretable relationships between the microbiome and the human host.
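
To make the notion of a predictive rule concrete, the toy sketch below evaluates a single MITRE-style rule of the form "the mean abundance of a clade within a time window exceeds a threshold." The window, threshold, and data are invented for illustration; MITRE itself learns these elements, together with the phylogenetic groupings, from the data.

```python
import numpy as np

def temporal_rule(abundance, times, window, threshold):
    """Evaluate one rule: is the mean abundance of a clade within the
    given time window above the threshold?  Purely illustrative."""
    in_window = (times >= window[0]) & (times <= window[1])
    return abundance[in_window].mean() > threshold

# toy subject: relative abundance of one clade sampled over 12 weeks
times = np.arange(12)
abundance = np.array([.02, .03, .02, .05, .12, .15, .14, .10, .04, .03, .02, .02])
print(temporal_rule(abundance, times, window=(4, 7), threshold=0.08))  # True
```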


2020 ◽  
Vol 35 (5) ◽  
pp. 439-451 ◽  
Author(s):  
Elan Ness-Cohn ◽  
Marta Iwanaszko ◽  
William L. Kath ◽  
Ravi Allada ◽  
Rosemary Braun

The circadian rhythm drives the oscillatory expression of thousands of genes across all tissues, coordinating physiological processes. The effect of this rhythm on health has generated increasing interest in discovering genes under circadian control by searching for periodic patterns in transcriptomic time-series experiments. While algorithms for detecting cycling transcripts have advanced, there remains little guidance quantifying the effect of experimental design and analysis choices on cycling detection accuracy. We present TimeTrial, a user-friendly benchmarking framework using both real and synthetic data to investigate the performance of cycle detection algorithms and improve circadian experimental design. Our results show that the optimal choice of analysis method depends on the sampling scheme, noise level, and shape of the waveform of interest, and they provide guidance on how sampling frequency and duration affect cycling detection accuracy. The TimeTrial software is freely available for download and may also be accessed through a web interface. By supplying a tool to vary and optimize experimental design considerations, TimeTrial will enhance circadian transcriptomics studies.
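
As a minimal illustration of why sampling choices matter for cycling detection, the sketch below fits a simple cosinor (harmonic regression) model to a noisy 24-hour rhythm sampled at two different frequencies. The cosinor fit is only a stand-in for the many detection algorithms TimeTrial benchmarks, and all values are synthetic.

```python
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Fit y ~ mesor + a*cos(wt) + b*sin(wt); return amplitude and phase (radians).

    A minimal cycling detector for illustration only."""
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a)

# dense vs. sparse sampling of the same noisy 24 h transcript rhythm
rng = np.random.default_rng(3)
for step in (2, 6):                       # sample every 2 h vs. every 6 h
    t = np.arange(0, 48, step, dtype=float)
    y = 1.0 + 0.8 * np.cos(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(t.size)
    amp, phase = cosinor_fit(t, y)
    print(f"sampling every {step} h -> estimated amplitude {amp:.2f}")
```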


2021 ◽  
pp. 1-20
Author(s):  
Fabian Kai-Dietrich Noering ◽  
Yannik Schroeder ◽  
Konstantin Jonas ◽  
Frank Klawonn

In technical systems the analysis of similar situations is a promising technique to gain information about the system's state, its health, or wear. Very often, situations cannot be defined in advance but need to be discovered as recurrent patterns within time-series data of the system under consideration. This paper addresses the assessment of different approaches to discover frequent variable-length patterns in time series. Because of the success of artificial neural networks (NN) in various research fields, a particular focus of this work is the applicability of NNs to the problem of pattern discovery in time series. We therefore applied and adapted a Convolutional Autoencoder and compared it to classical nonlearning approaches based on Dynamic Time Warping, on time-series discretization, and on the Matrix Profile. These nonlearning approaches were also adapted to fulfill our requirements, such as the discovery of potentially time-scaled patterns in noisy time series. We evaluated the performance (quality, computing time, effort of parametrization) of these approaches in an extensive test with synthetic data sets. Additionally, transferability to other data sets was tested using real-life vehicle data. We demonstrate the ability of Convolutional Autoencoders to discover patterns in an unsupervised way. Furthermore, the tests showed that the Autoencoder is able to discover patterns with a quality similar to that of the classical nonlearning approaches.
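
For readers unfamiliar with the Matrix Profile baseline mentioned above, the following brute-force sketch computes it for a short synthetic series with a planted recurring pattern. It is a naive O(n^2) illustration under invented data, not one of the adapted, scalable variants evaluated in the paper.

```python
import numpy as np

def znorm(x):
    return (x - x.mean()) / (x.std() + 1e-12)

def naive_matrix_profile(series, m):
    """Brute-force matrix profile: for every length-m subsequence, the
    z-normalized Euclidean distance to its nearest non-trivial match."""
    n = len(series) - m + 1
    subs = np.array([znorm(series[i:i + m]) for i in range(n)])
    profile = np.full(n, np.inf)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - m // 2): i + m // 2 + 1] = np.inf   # exclusion zone
        profile[i] = d.min()
    return profile

rng = np.random.default_rng(4)
series = rng.standard_normal(600)
motif = np.sin(np.linspace(0, 2 * np.pi, 50))
series[100:150] += 3 * motif                     # plant the same pattern twice
series[400:450] += 3 * motif
mp = naive_matrix_profile(series, m=50)
print("top motif starts near index:", int(np.argmin(mp)))   # near 100 or 400
```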


2019 ◽  
Vol 11 (24) ◽  
pp. 2956
Author(s):  
Marcos C. Hott ◽  
Luis M. T. Carvalho ◽  
Mauro A. H. Antunes ◽  
João C. Resende ◽  
Wadson S. D. Rocha

There is currently considerable interest in determining the state of Brazilian grasslands. Governmental actions and programs have recently been implemented for grassland recovery in Brazilian states, with the aim of improving production systems and socioeconomic indicators. The aim of this study is to evaluate the vegetative growth, temporal vigor, and long-term scenarios for the grasslands in Zona da Mata, Minas Gerais State, Brazil, by integrating phenological metrics. We used metrics derived from the normalized difference vegetation index (NDVI) time series from moderate resolution imaging spectroradiometer (MODIS) data, which were analyzed in a geographic information system (GIS) using multicriteria analysis, the analytical hierarchy process, and a simplified expert system (ESS). These temporal metrics, i.e., the growth index (GI) for 16-day periods during the growing season; the slope; and the maximum, minimum, and mean for the time series, were integrated to investigate the grassland vegetation conditions and degradation level. The temporal vegetative vigor was successfully described using the rescaled range (R/S statistic) and the Hurst exponent, which, together with the metrics estimated for the full time series, imagery, and field observations, indicated areas undergoing degradation or areas that were inadequately managed (approximately 61.5%). Time series analysis revealed that most grasslands showed low or moderate vegetative vigor over time with long-term persistence due to farming practices associated with burning and overgrazing. A small part of the grasslands showed high and sustainable plant densities (approximately 8.5%). A map legend for grassland management guidelines was developed using the proposed method with remote sensing data, which were applied using GIS software and a field campaign.
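
The long-term persistence analysis relies on the rescaled-range (R/S) statistic and the Hurst exponent; the sketch below shows a simplified textbook R/S estimator on synthetic series (white noise versus a strongly persistent random walk). It is illustrative only and not the exact procedure applied to the NDVI metrics in the study.

```python
import numpy as np

def hurst_rescaled_range(x, min_chunk=8):
    """Estimate the Hurst exponent by the classical R/S method:
    slope of log(R/S) versus log(window size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()          # range of cumulative deviations
            s = chunk.std()
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(5)
white_noise = rng.standard_normal(1024)
trending = np.cumsum(white_noise)                 # strongly persistent series
print(hurst_rescaled_range(white_noise))          # roughly 0.5
print(hurst_rescaled_range(trending))             # closer to 1
```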


2020 ◽  
Vol 4 ◽  
pp. 27
Author(s):  
Daniel M. Weinberger ◽  
Joshua L. Warren

When evaluating the effects of vaccination programs, it is common to estimate changes in rates of disease before and after vaccine introduction. There are a number of related approaches that attempt to adjust for trends unrelated to the vaccine and to detect changes that coincide with introduction. However, characteristics of the data can influence the ability to estimate such a change. These include, but are not limited to, the number of years of available data prior to vaccine introduction, the expected strength of the effect of the intervention, the strength of underlying secular trends, and the amount of unexplained variability in the data. Sources of unexplained variability include model misspecification, epidemics due to unidentified pathogens, and changes in ascertainment or coding practice, among others. In this study, we present a simple simulation framework for estimating the power to detect a decline and the precision of these estimates. We use real-world data from a pre-vaccine period to generate simulated time series where the vaccine effect is specified a priori. We present an interactive web-based tool to implement this approach. We also demonstrate the use of this approach using observed data on pneumonia hospitalization from the states in Brazil from a period prior to the introduction of pneumococcal vaccines to generate the simulated time series. We relate the power of the hypothesis tests to the number of cases per year and the amount of unexplained variability in the data, and we demonstrate how having fewer years of data influences the results.
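
A stripped-down version of such a power simulation is sketched below: counts are drawn from a Poisson model with a pre-specified post-introduction rate ratio plus optional unexplained variability, a Poisson regression with an intervention indicator is fit to each simulated series, and power is the fraction of simulations in which the decline is detected. The constant baseline, parameter values, and function name are assumptions for illustration; the published tool seeds its simulations with real pre-vaccine data.

```python
import numpy as np
import statsmodels.api as sm

def simulated_power(baseline_mean, effect, n_pre=6, n_post=4, n_sim=500,
                    overdispersion=0.0, alpha=0.05, seed=0):
    """Fraction of simulated time series in which a pre-specified
    post-introduction rate ratio is detected (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    years = n_pre + n_post
    post = (np.arange(years) >= n_pre).astype(float)   # intervention indicator
    X = sm.add_constant(post)
    detected = 0
    for _ in range(n_sim):
        mu = baseline_mean * effect ** post             # true rate per year
        noise = np.exp(overdispersion * rng.standard_normal(years))
        y = rng.poisson(mu * noise)                     # simulated counts
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        if fit.pvalues[1] < alpha and fit.params[1] < 0:
            detected += 1
    return detected / n_sim

# power to detect a 20% decline, with and without extra unexplained variability
print(simulated_power(baseline_mean=200, effect=0.8, overdispersion=0.0))
print(simulated_power(baseline_mean=200, effect=0.8, overdispersion=0.3))
```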


Geophysics ◽  
1983 ◽  
Vol 48 (2) ◽  
pp. 229-233 ◽  
Author(s):  
G. Jayachandran Nair

The purpose of deconvolution in seismology is to estimate the basic seismic wavelet and the transfer function of the transmission medium. In reflection seismology, this transfer function refers to the reflectivity function, while in seismograms of earthquakes and explosions it represents the combined effects of the source crust and the receiver crust responses along with the attenuation function. Some of the techniques used for deconvolution of discrete time series data are Wiener inverse filtering (Robinson and Treitel, 1967), homomorphic deconvolution (Ulrych, 1971), and Kalman filtering (Crump, 1974). In the present paper, a method of deconvolution of single‐channel seismic data based on an autoregressive (AR) model of the time series is discussed. With it one can estimate the primary pulse and the deconvolution function simultaneously in an objective manner. Examples are provided to substantiate the applicability of the method using synthetic data simulating single and multiple explosions. The method is also applied to actual data for a presumed underground explosion from Eastern Kazakh.
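
As a generic illustration of AR-based deconvolution (not the paper's exact formulation), the sketch below fits an AR model to a synthetic trace via the Yule-Walker equations and applies the resulting prediction-error filter to whiten the trace and compress the wavelet. The toy wavelet, reflectivity, and function name are invented for the example.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def prediction_error_filter(trace, order):
    """Fit an AR(order) model via the Yule-Walker equations and return the
    prediction-error (spiking deconvolution) filter [1, -a1, ..., -ap]."""
    trace = trace - trace.mean()
    n = len(trace)
    acf = np.array([np.dot(trace[:n - k], trace[k:]) / n for k in range(order + 1)])
    ar = solve_toeplitz(acf[:order], acf[1:order + 1])   # AR coefficients
    return np.concatenate(([1.0], -ar))

# toy seismogram: sparse reflectivity convolved with a smooth wavelet plus noise
rng = np.random.default_rng(6)
reflectivity = np.zeros(500)
reflectivity[[60, 180, 320]] = [1.0, -0.7, 0.5]
wavelet = np.exp(-0.5 * (np.arange(-20, 21) / 5.0) ** 2)
trace = np.convolve(reflectivity, wavelet, mode="same")
trace += 0.01 * rng.standard_normal(trace.size)

pef = prediction_error_filter(trace, order=30)
deconvolved = np.convolve(trace, pef, mode="same")       # whitened output
print("largest deconvolved amplitudes near:", np.argsort(np.abs(deconvolved))[-3:])
```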


2019 ◽  
Vol 29 (2) ◽  
pp. 375-392 ◽  
Author(s):  
Pierre-Francois Marteau

In the light of regularized dynamic time warping kernels, this paper reconsiders the concept of a time elastic centroid for a set of time series. We derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices. This algorithm expresses the averaging process in terms of stochastic alignment automata. It uses an iterative agglomerative heuristic method for averaging the aligned samples, while also averaging the times of their occurrence. By comparing classification accuracies for 45 heterogeneous time series data sets obtained by first-nearest centroid/medoid classifiers, we show that (i) centroid-based approaches significantly outperform medoid-based ones, and (ii) for the data sets considered, our algorithm, which combines averaging in the sample space and along the time axis, emerges as the most robust model for time-elastic averaging, with a promising noise-reduction capability. We also demonstrate its benefit in an isolated gesture recognition experiment and its ability to significantly reduce the size of training instance sets. Finally, we highlight its denoising capability using demonstrative synthetic data. Specifically, we show that it is possible to retrieve, from a few noisy instances, a signal whose components are scattered across a wide spectral band.
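
For context, the sketch below implements the classical DTW distance and uses it to pick a medoid from a small set of time-shifted noisy series, i.e., the baseline against which the centroid approaches are compared. The paper's centroid itself is built on regularized DTW kernels and stochastic alignment automata, which are not reproduced here; the toy data are invented.

```python
import numpy as np

def dtw_distance(a, b):
    """Classical dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

def medoid(series_set):
    """Index of the series minimizing the summed DTW distance to all others."""
    dists = np.array([[dtw_distance(a, b) for b in series_set] for a in series_set])
    return int(np.argmin(dists.sum(axis=1)))

# three time-shifted noisy copies of the same pattern
rng = np.random.default_rng(7)
base = np.sin(np.linspace(0, 2 * np.pi, 60))
series_set = [np.roll(base, s) + 0.1 * rng.standard_normal(60) for s in (0, 3, 6)]
print("medoid index:", medoid(series_set))
```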

