Detection of Teleseismic Events in Seismic Sensor Data Using Nonlinear Dynamic Forecasting

1997 ◽  
Vol 07 (08) ◽  
pp. 1833-1845 ◽  
Author(s):  
Kevin M. Short

In this paper we consider the use of nonlinear dynamic (NLD) forecasting as a signal processing tool for seismic applications. The specific problem considered here arises in monitoring nuclear tests and nuclear treaty compliance, where ubiquitous background noise obscures the seismic signals associated with the tests. The signal from a distant teleseismic event can be attenuated until it is lost in the background noise, and since the noise overlaps the frequency band occupied by the teleseisms, frequency-based techniques provide only marginal improvements in detection capability. For the work in this paper, we studied a test set of actual seismic sensor data prepared by the Air Force Technical Applications Center (AFTAC). The data set was composed of background seismic noise which contained, or had added to it, a number of hidden teleseismic signals. These data were analyzed to determine whether NLD forecasting techniques could detect the hidden signals. For this test case, it was possible to predict the behavior of the seismic background well enough that, when the predicted background behavior was removed, the hidden signals became evident. However, some of the weaker signals were very close to the residual noise level, which compromised the ability to detect those events.
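The predict-and-remove idea can be sketched with a generic delay-embedding nearest-neighbour forecaster on synthetic data; this is an illustrative stand-in under assumed parameters (embedding dimension, neighbour count, a sinusoidal "background"), not AFTAC data or the paper's exact algorithm.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Stack delay-coordinate vectors (x[i], x[i+tau], ..., x[i+(dim-1)tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nld_forecast(x, dim=3, tau=1, k=5):
    """One-step-ahead forecast of each sample by averaging the observed
    successors of the k past delay vectors nearest to the current one."""
    emb = delay_embed(x, dim, tau)
    horizon = (dim - 1) * tau
    preds = np.full(len(x), np.nan)
    for t in range(k + 1, len(emb) - 1):
        library = emb[:t]                      # only the past is searchable
        dists = np.linalg.norm(library - emb[t], axis=1)
        nn = np.argsort(dists)[:k]
        preds[t + horizon + 1] = x[nn + horizon + 1].mean()
    return preds

# synthetic test: predictable background plus a weak hidden transient
rng = np.random.default_rng(0)
t = np.arange(2000)
x = np.sin(0.2 * t) + 0.01 * rng.standard_normal(2000)
x[1500:1520] += 0.5                            # hidden "teleseism"
pred = nld_forecast(x)
residual = x - pred                            # transient stands out here
```

Because the forecaster is trained only on past samples, it keeps predicting the learned background during the transient, so the transient survives in the residual.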

Author(s):  
Kyungkoo Jun

Background & Objective: This paper proposes a Fourier-transform-inspired method to classify human activities from time series sensor data. Methods: Our method begins by decomposing a 1D input signal into 2D patterns, a step motivated by the Fourier transform. The decomposition is performed by a Long Short-Term Memory (LSTM) network, which captures the temporal dependencies in the signal and produces encoded sequences. These sequences, once arranged into a 2D array, can represent the fingerprints of the signals. The benefit of this transformation is that we can exploit recent advances in deep learning models for image classification, such as the Convolutional Neural Network (CNN). Results: The proposed model is therefore a combination of LSTM and CNN. We evaluate the model over two data sets. On the first data set, which is more standardized than the other, our model outperforms or at least matches previous work. For the second data set, we devise schemes to generate training and testing data by varying the window size, the sliding size, and the labeling scheme. Conclusion: The evaluation results show that the accuracy exceeds 95% in some cases. We also analyze the effect of these parameters on performance.
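The 1D-to-2D "fingerprint" arrangement can be illustrated with a toy sketch. Here the LSTM encoder is replaced by simple per-window summary statistics purely to show the shape transformation; the window count, feature dimension, and input signal are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

def signal_to_fingerprint(x, seq_len, emb_dim):
    """Slice a 1D signal into seq_len windows, 'encode' each window into
    emb_dim features (summary statistics stand in for LSTM hidden states),
    and stack the encoded sequence into a 2D array that a CNN could
    classify like an image."""
    windows = np.array_split(x, seq_len)
    feats = []
    for w in windows:
        # placeholder features; the paper derives these with an LSTM
        feats.append([w.mean(), w.std(), w.min(), w.max()][:emb_dim])
    return np.asarray(feats)                 # shape: (seq_len, emb_dim)

x = np.sin(np.linspace(0, 20, 1000))         # one synthetic sensor channel
fp = signal_to_fingerprint(x, seq_len=25, emb_dim=4)
```

The resulting (25, 4) array is the "image" a downstream CNN would consume.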


2021 ◽  
pp. 158-166
Author(s):  
Noah Balestra ◽  
Gaurav Sharma ◽  
Linda M. Riek ◽  
Ania Busza

Background: Prior studies suggest that participation in rehabilitation exercises improves motor function poststroke; however, studies on optimal exercise dose and timing have been limited by the technical challenge of quantifying exercise activities over multiple days. Objectives: The objectives of this study were to assess the feasibility of using body-worn sensors to track rehabilitation exercises in the inpatient setting and investigate which recording parameters and data analysis strategies are sufficient for accurately identifying and counting exercise repetitions. Methods: MC10 BioStampRC® sensors were used to measure accelerometer and gyroscope data from upper extremities of healthy controls (n = 13) and individuals with upper extremity weakness due to recent stroke (n = 13) while the subjects performed 3 preselected arm exercises. Sensor data were then labeled by exercise type and this labeled data set was used to train a machine learning classification algorithm for identifying exercise type. The machine learning algorithm and a peak-finding algorithm were used to count exercise repetitions in non-labeled data sets. Results: We achieved a repetition counting accuracy of 95.6% overall, and 95.0% in patients with upper extremity weakness due to stroke, when using both accelerometer and gyroscope data. Accuracy decreased when using fewer sensors or accelerometer data alone. Conclusions: Our exploratory study suggests that body-worn sensor systems are technically feasible, well tolerated in subjects with recent stroke, and may ultimately be useful for developing a system to measure total exercise “dose” in poststroke patients during clinical rehabilitation or clinical trials.
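A minimal stand-in for the repetition-counting step might look like the following; the height and spacing thresholds and the synthetic gyroscope trace are assumptions for illustration, not the study's actual algorithm or data.

```python
import numpy as np

def count_repetitions(signal, min_height, min_gap):
    """Count repetitions as local maxima above min_height that are at
    least min_gap samples apart (a simple peak-finding sketch)."""
    count = 0
    last = -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > min_height
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]
                and i - last >= min_gap):
            count += 1
            last = i
    return count

# synthetic gyroscope trace: 8 slow arm raises plus sensor noise
t = np.linspace(0, 8 * 2 * np.pi, 1600)
trace = np.sin(t) + 0.05 * np.random.default_rng(1).standard_normal(1600)
reps = count_repetitions(trace, min_height=0.5, min_gap=100)  # 8
```

The min_gap spacing constraint is what keeps noise-induced local maxima near a true peak from being double-counted.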


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2532
Author(s):  
Encarna Quesada ◽  
Juan J. Cuadrado-Gallego ◽  
Miguel Ángel Patricio ◽  
Luis Usero

Anomaly detection research focuses on developing and applying methods that identify data that differ sufficiently from the rest of the data set under analysis to be considered anomalies (or, as they are more commonly called, outliers). Such values mainly originate from two sources: they may be errors introduced during the collection or handling of the data, or they may be correct but very different from the rest of the values. It is essential to distinguish the two cases: in the first, the values must be removed from the data set; in the second, they must be carefully analyzed and taken into account. The correct selection and use of the model applied to a specific problem is fundamental to the success of an anomaly detection study, and in many cases a single model cannot provide sufficient results, which can only be reached with a mixture model built by integrating existing and/or ad hoc-developed models. A model of this kind is developed and applied to the problem presented in this paper. This study defines and applies an anomaly detection model that combines statistical models with a new method defined by the authors, the Local Transilience Outlier Identification Method, in order to improve the identification of outliers in sensor-obtained values of variables that affect the operation of wind tunnels. Correctly detecting outliers in the variables involved in wind tunnel operations is very important for the industrial ventilation systems industry, especially for vertical wind tunnels, which are used as training facilities for indoor skydiving, where incorrect performance of such devices may put human lives at risk. Consequently, the use of the presented model for outlier detection may have a high impact in this industrial sector.
In this research work, a proof-of-concept is carried out using data from a real installation, in order to test the proposed anomaly analysis method and its application to control the correct performance of wind tunnels.
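The idea of mixing a global statistical test with a local criterion can be sketched as below. The z-score and MAD-based local tests are generic stand-ins chosen for illustration; they are not the authors' Local Transilience Outlier Identification Method, and the simulated pressure trace is invented.

```python
import numpy as np

def detect_outliers(x, z_thresh=3.0, window=25, local_thresh=4.0):
    """Flag a sample if EITHER a global z-score test or a local
    sliding-window MAD test fires -- a toy 'mixture' of two detectors."""
    x = np.asarray(x, dtype=float)
    global_flags = np.abs(x - x.mean()) > z_thresh * x.std()
    local_flags = np.zeros(len(x), dtype=bool)
    half = window // 2
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        neigh = np.delete(x[lo:hi], i - lo)   # window without the point
        mad = np.median(np.abs(neigh - np.median(neigh))) + 1e-9
        local_flags[i] = abs(x[i] - np.median(neigh)) > local_thresh * mad
    return global_flags | local_flags

rng = np.random.default_rng(2)
pressure = rng.normal(100.0, 1.0, 500)        # simulated tunnel sensor
pressure[123] = 120.0                         # injected sensor error
flags = detect_outliers(pressure)
```

Combining the two tests with OR lets the local criterion catch deviations that a global statistic, inflated by the outliers themselves, would miss.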


AI ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 48-70
Author(s):  
Wei Ming Tan ◽  
T. Hui Teo

Prognostic techniques attempt to predict the Remaining Useful Life (RUL) of a subsystem or component. Such techniques often use sensor data that are periodically measured and recorded into a time series data set. These multivariate data sets form complex, non-linear inter-dependencies across recorded time steps and between sensors. Many existing prognostic algorithms have started to explore Deep Neural Networks (DNNs) and their effectiveness in the field. Although Deep Learning (DL) techniques outperform traditional prognostic algorithms, the networks are generally complex to deploy and train. This paper proposes a Multi-variable Time Series (MTS) focused approach to prognostics that implements a lightweight Convolutional Neural Network (CNN) with an attention mechanism. The convolution filters extract abstract temporal patterns from the multiple time series, while the attention mechanism reviews the information across the time axis and selects the relevant information. The results suggest that the proposed method not only produces superior RUL estimation accuracy but also trains many times faster than the reported works. The advantage of deploying the network is also demonstrated on a lightweight hardware platform: the model is not only more compact but also more efficient in resource-restricted environments.
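A shape-level sketch of the convolution-plus-temporal-attention idea, with random untrained weights, is shown below; the layer sizes and the single-head attention form are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def rul_forward(x, conv_w, attn_w, out_w):
    """Forward pass: 1D convolution over time extracts features, an
    attention vector weights the time axis, and a linear head emits RUL.
    x: (T, S) multivariate series (T time steps, S sensors)."""
    T, S = x.shape
    K, F = conv_w.shape[0], conv_w.shape[2]   # kernel size, filter count
    # valid 1D convolution over time -> (T-K+1, F) temporal features
    feats = np.stack([np.tensordot(x[t:t + K], conv_w, axes=([0, 1], [0, 1]))
                      for t in range(T - K + 1)])
    feats = np.maximum(feats, 0.0)            # ReLU
    scores = softmax(feats @ attn_w)          # attention over time axis
    context = scores @ feats                  # attention-weighted pooling
    return float(context @ out_w)             # scalar RUL estimate

T, S, K, F = 50, 14, 5, 8
x = rng.standard_normal((T, S))
conv_w = rng.standard_normal((K, S, F)) * 0.1
attn_w = rng.standard_normal(F) * 0.1
out_w = rng.standard_normal(F) * 0.1
rul = rul_forward(x, conv_w, attn_w, out_w)
```

The attention-weighted pooling replaces a flatten-and-dense stage, which is one way such a network stays lightweight.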


2020 ◽  
Vol 111 (9-10) ◽  
pp. 2891-2909
Author(s):  
Mahyar Khorasani ◽  
AmirHossein Ghasemi ◽  
Umar Shafique Awan ◽  
Elahe Hadavi ◽  
Martin Leary ◽  
...  

When reporting surface quality, the roughest surface is the reference for measurement. In laser powder bed fusion (LPBF), recoil pressure and scan movement shape an asymmetric surface, so surface roughness takes different values in different measurement orientations. In this research, the influence of the LPBF process parameters on the surface tension and roughness of Ti-6Al-4V parts is investigated in three orientations. To improve the mechanical properties, heat treatment was carried out and added to the designed matrix to generate a comprehensive data set. A Taguchi design of experiment was employed to print 25 samples spanning five process parameters and post-processing. The effect and interaction of the parameters on the formation of the surface profile, comprising tension, morphology, and roughness in various directions, have been analysed. The main contribution of this paper is a model that approximates the melting pool temperature and surface tension from the process parameters. Further contributions are an analysis of the process parameters to determine the formation and variation of surface tension and roughness, and an explanation of the governing mechanisms through rheological phenomena. Results showed that the main driving factors in the variation of surface tension and the formation of the surface profile are the thermophysical properties of the feedstock, the rheology, and the temperature of the melting pool. The results also showed that while the value of surface tension is the same for each test case, the morphology and the value of roughness differ when the surface is analysed in directions perpendicular, parallel, and angled to the laser movement.
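The kind of parameter-to-response model described can be illustrated with an ordinary least-squares fit on synthetic data; the chosen inputs (laser power, scan speed, hatch spacing) and the response structure are assumptions for illustration, not the paper's design matrix or measurements.

```python
import numpy as np

# Fit a linear model mapping hypothetical LPBF process parameters to a
# measured response such as surface tension; synthetic data stands in
# for the 25 Taguchi samples.
rng = np.random.default_rng(3)
n = 25
power = rng.uniform(150, 350, n)              # laser power, W
speed = rng.uniform(600, 1400, n)             # scan speed, mm/s
hatch = rng.uniform(0.08, 0.16, n)            # hatch spacing, mm
# synthetic response with a known linear structure plus noise
y = (1.2 + 0.004 * power - 0.0005 * speed + 2.0 * hatch
     + 0.01 * rng.standard_normal(n))

X = np.column_stack([np.ones(n), power, speed, hatch])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # intercept + 3 slopes
pred = X @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

A Taguchi design would replace the random sampling above with an orthogonal array, but the fitting step is the same.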


2021 ◽  
Author(s):  
Dat Q. Duong ◽  
Quang M. Le ◽  
Tan-Loc Nguyen-Tai ◽  
Hien D. Nguyen ◽  
Minh-Son Dao ◽  
...  

Accurately assessing air quality index (AQI) values and levels has become an attractive research topic over the last few decades. It is a crucial aspect of studying the possible adverse health effects associated with current air quality conditions. This paper aims to apply machine learning, with an appropriate selection of attributes, to the air quality estimation problem using various features, including sensor data (humidity, temperature), timestamp features, location features, and public weather data. We evaluated the performance of different learning models and feature sets on the data set “MNR-HCM II”. The experimental results show that adopting TLPW features with stacking generalization yields higher overall performance than other techniques and features in terms of RMSE, accuracy, and F1-score.
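Stacking generalization itself can be sketched in a few lines; the base models, features, and synthetic AQI data below are invented for illustration, and a production version would train the meta-learner on out-of-fold predictions rather than in-sample ones.

```python
import numpy as np

def fit_linear(X, y):
    Xb = np.column_stack([np.ones(len(X)), X])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_linear(w, X):
    return np.column_stack([np.ones(len(X)), X]) @ w

# synthetic AQI regression problem (hypothetical features)
rng = np.random.default_rng(4)
n = 300
humidity = rng.uniform(40, 95, n)
temperature = rng.uniform(20, 38, n)
hour = rng.integers(0, 24, n).astype(float)
aqi = (30 + 0.5 * humidity + 1.2 * temperature
       + 5 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n))

train, test = slice(0, 200), slice(200, None)
# level-0 learners: each base model sees only part of the features
base_feats = [np.column_stack([humidity]),
              np.column_stack([temperature, hour])]
base_models = [fit_linear(f[train], aqi[train]) for f in base_feats]
# level-1 meta-learner: trained on the base models' predictions
meta_X = np.column_stack([predict_linear(w, f[train])
                          for w, f in zip(base_models, base_feats)])
meta_w = fit_linear(meta_X, aqi[train])
# evaluate the stacked model on held-out data
test_X = np.column_stack([predict_linear(w, f[test])
                          for w, f in zip(base_models, base_feats)])
pred = predict_linear(meta_w, test_X)
rmse = np.sqrt(np.mean((pred - aqi[test]) ** 2))
```

The meta-learner learns how much to trust each base model, which is where stacking gains over any single learner.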


2017 ◽  
Vol 13 (S338) ◽  
pp. 84-89
Author(s):  
Francesco Verrecchia ◽  
Marco Tavani ◽  
Immacolata Donnarumma

AGILE is a space mission of the Italian Space Agency dedicated to γ-ray astrophysics, launched in 2007. AGILE performed dedicated real-time searches for possible γ-ray counterparts of gravitational wave (GW) events detected by the LIGO-Virgo Scientific Collaboration (LVC) during the O2 observing run. We present a review of AGILE observations of GW events, starting with the first, GW150914, which was a test case for future searches. We focus here on the main characteristics of the observations of the most important GW events detected in 2017, i.e. GW170104 and GW170817. In particular, for the former event we published γ-ray upper limits (ULs) in the 50 MeV – 10 GeV energy band together with a detailed analysis of a candidate precursor event in the Mini-Calorimeter data. As for GW170817, we published a set of constraining γ-ray ULs obtained for integrations preceding and following the event time. These results allow us to establish important constraints on the γ-ray emission from a possible magnetar-like remnant in the first ~1000 s following T0. AGILE is a major player in the search for electromagnetic counterparts of GW events, and its enhanced detection capabilities in the hard X-ray/MeV/GeV ranges will play a crucial role in the future O3 observing run.


2016 ◽  
Vol 2016 (4) ◽  
pp. 21-36 ◽  
Author(s):  
Tao Wang ◽  
Ian Goldberg

Abstract Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.
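As a toy illustration of the page-splitting problem, a purely time-based heuristic is shown below; the paper's approach is classification-based, so this fixed gap threshold is only a simplified stand-in, and the packet timestamps are synthetic.

```python
import numpy as np

def split_on_gaps(timestamps, gap_threshold=2.0):
    """Split a packet-timestamp sequence into per-page segments wherever
    the inter-packet gap exceeds gap_threshold seconds."""
    timestamps = np.asarray(timestamps)
    cut_points = np.where(np.diff(timestamps) > gap_threshold)[0] + 1
    return np.split(timestamps, cut_points)

# two simulated page loads separated by a 5 s browsing pause
ts = np.concatenate([
    np.cumsum(np.full(50, 0.05)),              # page 1: 50 packets
    2.5 + 5.0 + np.cumsum(np.full(80, 0.04)),  # page 2: 80 packets
])
pages = split_on_gaps(ts)
```

A fixed threshold fails when users click through pages quickly, which is why the paper replaces it with learned classifiers.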


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 879 ◽  
Author(s):  
Uwe Köckemann ◽  
Marjan Alirezaie ◽  
Jennifer Renoux ◽  
Nicolas Tsiftes ◽  
Mobyen Uddin Ahmed ◽  
...  

As research in smart homes and activity recognition increases, it is increasingly important to have benchmark systems and data upon which researchers can compare methods. While synthetic data can be useful for certain method development, real data sets that are open and shared are equally important. This paper presents the E-care@home system, its installation in a real home setting, and a series of data sets that were collected using it. Our first contribution, the E-care@home system, is a collection of software modules for data collection, labeling, and various reasoning tasks such as activity recognition, person counting, and configuration planning. It supports a heterogeneous set of sensors that can be extended easily, and it connects the collected sensor data to higher-level Artificial Intelligence (AI) reasoning modules. Our second contribution is a series of open data sets which can be used to recognize activities of daily living. In addition to these data sets, we describe the technical infrastructure we developed to collect the data, as well as the physical environment. Each data set is annotated with ground-truth information, making it relevant for researchers interested in benchmarking different activity recognition algorithms.


Sensors ◽  
2020 ◽  
Vol 20 (23) ◽  
pp. 6894
Author(s):  
Nicola-Ann Stevens ◽  
Myra Lydon ◽  
Adele H. Marshall ◽  
Su Taylor

Machine learning and statistical approaches have transformed the management of infrastructure systems such as water, energy and modern transport networks. Artificial Intelligence-based solutions allow asset owners to predict future performance and optimize maintenance routines through the use of historic performance and real-time sensor data. The industrial adoption of such methods has been limited in the management of bridges within aging transport networks. Predictive maintenance at the bridge network level is particularly complex due to the considerable heterogeneity encompassed across various bridge types and functions. This paper reviews some of the main approaches in bridge predictive maintenance modeling and outlines the challenges in their adaptation to the future network-wide management of bridges. Survival analysis techniques have been successfully applied to predict outcomes from a homogeneous data set, such as bridge deck condition. This paper considers the complexities of European road networks in terms of bridge type, function and age to present a novel application of survival analysis based on sparse data obtained from visual inspections. This research is focused on analyzing existing inspection information to establish data foundations, which will pave the way for big data utilization, and to inform key performance indicators for future network-wide structural health monitoring.
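A survival curve of the kind used in such analyses can be computed with a hand-rolled Kaplan-Meier estimator; the inspection durations below are invented for illustration, not the paper's data, and censoring here stands in for bridges still above the condition threshold at their last inspection.

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate. durations: years until a bridge
    drops below a condition threshold (or until last inspection);
    observed: 1 if the drop was seen, 0 if the record is censored."""
    durations = np.asarray(durations, dtype=float)
    observed = np.asarray(observed, dtype=int)
    times = np.unique(durations[observed == 1])  # event times, sorted
    curve, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)
        events = np.sum((durations == t) & (observed == 1))
        s *= 1.0 - events / at_risk              # product-limit update
        curve.append((t, s))
    return curve

# illustrative inspection records (hypothetical)
durations = [5, 8, 8, 12, 15, 15, 20, 22, 25, 30]
observed  = [1, 1, 0, 1,  1,  0,  1,  1,  0,  1]
curve = kaplan_meier(durations, observed)
```

Censored records still count toward the at-risk set until their last inspection, which is what lets sparse visual-inspection data be used without discarding it.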

