Pre-seismic Thermal Anomalies from Satellite Observations: A Review

2017 ◽  
Author(s):  
Zhong-Hu Jiao ◽  
Jing Zhao ◽  
Xinjian Shan

Abstract. Detecting thermal anomalies prior to strong earthquakes is key to understanding and forecasting earthquake activity because it identifies thermal-radiation-related phenomena during seismic preparation phases. Data from satellite observations serve as a powerful tool for monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic thermal anomaly detection technology over the past decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth’s surface, are discussed. Second, different anomaly detection methods, which are used to extract thermal anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which should play a large and growing role in pre-seismic thermal anomaly research.

2018 ◽  
Vol 18 (4) ◽  
pp. 1013-1036 ◽  
Author(s):  
Zhong-Hu Jiao ◽  
Jing Zhao ◽  
Xinjian Shan

Abstract. Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because it identifies thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology over the past decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.


Author(s):  
Manish Marwah ◽  
Ratnesh K. Sharma ◽  
Wilfredo Lugo

In recent years, there has been significant growth in the number, size, and power densities of data centers. A significant part of data center power consumption is attributed to the cooling infrastructure, consisting of computer room air conditioning (CRAC) units, chillers, and cooling towers. For energy-efficient operation and management of the cooling resources, data centers are beginning to be extensively instrumented with temperature sensors. While this allows cooling actuators, such as the CRAC set point temperature, to be dynamically controlled and data centers to be operated at higher temperatures to save energy, it also increases the chances of thermal anomalies. Furthermore, considering that large data centers can contain thousands to tens of thousands of such sensors, it is virtually impossible to manually inspect and analyze the large volumes of dynamic data they generate, which necessitates autonomous mechanisms for thermal anomaly detection. In addition to threshold-based detection, other anomaly detection mechanisms are also necessary, since several of these anomalies normally go undetected because no temperature thresholds are violated. In this paper, we describe the commonly occurring thermal anomalies in a data center and, with examples from a production data center, techniques to autonomously detect them. In particular, we show the usefulness of a principal component analysis (PCA) based methodology applied to a large temperature sensor network. Specifically, we examine thermal anomalies such as those related to misconfiguration of equipment, blocked vent tiles, faulty sensors, and CRAC-related anomalies. We present examples of these thermal anomalies and their detection from a real data center.
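A minimal sketch of the PCA idea described above, assuming a synthetic matrix of temperature readings (not the authors' production data): sensor readings are reconstructed from a few principal components fitted on a presumed-normal period, and samples with unusually large reconstruction error are flagged even when no absolute temperature threshold is violated.

```python
# Sketch: PCA reconstruction error as a thermal-anomaly score for a sensor network.
# All data and thresholds are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
readings = 22 + rng.normal(0, 0.5, size=(1000, 50))   # hypothetical: 1000 samples x 50 sensors
readings[500:520, 7] += 6.0                            # injected hot spot on sensor 7

pca = PCA(n_components=5).fit(readings[:400])          # fit on a presumed-normal period
recon = pca.inverse_transform(pca.transform(readings))
score = np.abs(readings - recon)                       # per-sensor reconstruction residual

threshold = score[:400].mean() + 4 * score[:400].std() # residual threshold from normal data
anomalous = np.argwhere(score > threshold)             # (sample, sensor) pairs flagged
print(anomalous[:5])
```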


2020 ◽  
Author(s):  
Arash Karimi Zarchi ◽  
Mohammad Reza Saradjian Maralan

Abstract. Recent scientific studies of earthquake precursors reveal several processes connected to seismic activity, including thermal anomalies before earthquakes, which can support better decisions about this disastrous phenomenon and help reduce its casualties. This paper presents a method for grouping the input data for different thermal anomaly detection methods using the land surface temperature (LST) mean at multiple distances from the corresponding fault over a 40-day investigation window (30 days before and 10 days after the impending earthquake). Six strong earthquakes with Ms > 6 that occurred in Iran are investigated in this study. We used two approaches for detecting thermal anomalies: the mean-standard deviation method, also known as the standard method, and the interquartile method, which is similar to the first but uses different parameters as input. Most studies have considered thermal anomalies around known epicentre locations, where the investigation can only be performed after the earthquake. This study uses a fault-distance-based approach, treating the locations of faults as the potential area, which can be considered an important step towards actual prediction of an earthquake's time and intensity. Results show that the proposed input data produce fewer false alarms in each of the thermal anomaly detection methods compared to ordinary input data, making the method more accurate and stable, especially given the easy accessibility of thermal data and the relatively simple algorithms required to process them. In the final step, the detected anomalies are used to estimate earthquake intensity with an artificial neural network (ANN); the estimated intensities of most earthquakes are very close to the actual intensities. Since the locations of active faults are known a priori, the fault-distance-based approach may be regarded as a superior method for predicting impending earthquakes on vulnerable faults. Unlike previous investigations, which were only possible after the event, the fault-distance-based approach can be used as a tool for predicting future, as-yet-unknown earthquakes. However, it is recommended to use thermal anomaly detection as an initial process to be jointly used with other precursors, reducing the number of investigations that require more complicated algorithms and data processing.
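The two detection rules named above can be illustrated with a short sketch; the 40-day series below is synthetic, and the parameter choices (k = 2, the 1.5 x IQR fences) are illustrative assumptions rather than the paper's exact settings.

```python
# Sketch: the mean-standard deviation ("standard") rule and the interquartile rule
# applied to a hypothetical daily LST-mean series.
import numpy as np

rng = np.random.default_rng(1)
lst = 15 + rng.normal(0, 1.5, 40)     # hypothetical 40-day LST mean series (deg C)
lst[25] += 6.0                        # injected pre-seismic-like spike

# Mean-standard deviation method: flag days outside mean +/- k*std.
k = 2.0
std_flags = np.abs(lst - lst.mean()) > k * lst.std()

# Interquartile method: flag days outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, q3 = np.percentile(lst, [25, 75])
iqr = q3 - q1
iqr_flags = (lst < q1 - 1.5 * iqr) | (lst > q3 + 1.5 * iqr)

print(np.flatnonzero(std_flags), np.flatnonzero(iqr_flags))
```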


Author(s):  
Yang Yuan ◽  
Eun Kyung Lee ◽  
Dario Pompili ◽  
Junbi Liao

The high density of servers in datacenters generates a large amount of heat, resulting in a high likelihood of thermally anomalous events, e.g. computer room air conditioner fan failure, server fan failure, and workload misconfiguration. As such anomalous events increase the cost of maintaining computing and cooling components, they need to be detected, localized, and classified so that appropriate remedial actions can be taken. In this article, a hierarchical neural network framework is proposed to detect small-scale (server level) and large-scale (datacenter level) thermal anomalies. This novel framework, which is organized into two tiers, analyzes the data sensed by heterogeneous sensors such as sensors built into the servers and external (TelosB) sensors. The proposed solution employs a neural network to learn (a) the relationship among sensing values (i.e. internal, external, and fan speed) and (b) the relationship between the sensing values and workload information. The bottom tier of our framework then detects thermal anomalies, whereas the top tier localizes and classifies them. Our solution outperforms other anomaly-detection methods based on a regression model, a support vector machine, and a self-organizing map, as shown by the experimental results.
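As a rough sketch of the residual idea underlying such a framework (not the authors' two-tier architecture), one can learn the relationship between workload, fan speed, and internal temperature on presumed-normal data and then flag samples whose prediction error is unusually large; all data, features, and model settings below are hypothetical.

```python
# Sketch: model-based thermal anomaly detection via prediction residuals.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
workload = rng.uniform(0, 1, 2000)
fan_rpm = 3000 + 2000 * workload + rng.normal(0, 100, 2000)
temp = 25 + 15 * workload - 0.002 * (fan_rpm - 3000) + rng.normal(0, 0.3, 2000)

X = np.column_stack([workload, fan_rpm])
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(X[:1500], temp[:1500])   # train on normal period

residual = np.abs(temp[1500:] - model.predict(X[1500:]))
thresh = residual.mean() + 3 * residual.std()
print("anomalous samples:", np.flatnonzero(residual > thresh))
```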


2016 ◽  
Author(s):  
Milan Flach ◽  
Fabian Gans ◽  
Alexander Brenning ◽  
Joachim Denzler ◽  
Markus Reichstein ◽  
...  

Abstract. Today, many processes at the Earth's surface are constantly monitored by multiple data streams. These observations have become central to advancing our understanding of, e.g., vegetation dynamics in response to climate or land use change. Another set of important applications is monitoring the effects of climatic extreme events, other disturbances such as fires, or abrupt land transitions. One important methodological question is how to reliably detect anomalies in an automated and generic way within multivariate data streams, which typically vary seasonally and are interconnected across variables. Although many algorithms have been proposed for detecting anomalies in multivariate data, only a few have been investigated in the context of Earth system science applications. In this study, we systematically combine and compare feature extraction and anomaly detection algorithms for detecting anomalous events. Our aim is to identify suitable workflows for automatically detecting anomalous patterns in multivariate Earth system data streams. We rely on artificial data that mimic typical properties and anomalies in multivariate spatiotemporal Earth observations. This artificial experiment is needed because there is no 'gold standard' for the identification of anomalies in real Earth observations. Our results show that a well-chosen feature extraction step (e.g. subtracting seasonal cycles, or dimensionality reduction) is more important than the choice of a particular anomaly detection algorithm. Nevertheless, we identify three detection algorithms (k-nearest neighbours mean distance, kernel density estimation, and a recurrence approach) and their combinations (ensembles) that outperform other multivariate approaches as well as univariate extreme event detection methods. Our results therefore provide an effective workflow to automatically detect anomalies in Earth system science data.
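A minimal sketch of one such workflow on a single synthetic variable: the seasonal cycle is removed as the feature extraction step, and the k-nearest-neighbours mean distance (one of the detectors named above) scores the residuals. The data and parameters are illustrative assumptions only.

```python
# Sketch: deseasonalise, then score by mean distance to k nearest neighbours.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 365.0) + rng.normal(0, 0.1, t.size)   # seasonal signal
x[1200:1210] += 1.5                                               # injected extreme event

# Feature extraction: subtract the mean seasonal cycle (day-of-year climatology).
doy = t % 365
season = np.array([x[doy == d].mean() for d in range(365)])
anom = x - season[doy]

# Anomaly detection: k-nearest-neighbour mean distance on the deseasonalised series.
nn = NearestNeighbors(n_neighbors=10).fit(anom.reshape(-1, 1))
dist, _ = nn.kneighbors(anom.reshape(-1, 1))
score = dist.mean(axis=1)
print(np.argsort(score)[-10:])   # indices with the largest anomaly scores
```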


Anomaly detection plays a vital role in data preprocessing and in mining noteworthy points for marketing, network sensors, fraud detection, intrusion detection, and stock market analysis. Recent studies concentrate more on outlier detection for real-time datasets. Current anomaly detection research focuses on developing innovative machine learning methods and on improving computation time. Sentiment mining is the process of discovering how people feel about a particular topic. Although many anomaly detection techniques have been proposed, the field still lacks a comparative performance evaluation on sentiment mining datasets. In this study, three popular unsupervised anomaly detection algorithms, namely density-based, statistical, and cluster-based methods, are evaluated on a movie review sentiment mining dataset. This paper sets a baseline for anomaly detection methods in sentiment mining research. The results show that the density-based (LOF) anomaly detection method suits the movie review sentiment dataset best.
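A minimal sketch of the density-based (LOF) detector applied to text features; the tiny review list and the TF-IDF representation below are purely illustrative, not the study's dataset or pipeline.

```python
# Sketch: Local Outlier Factor on vectorised review texts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import LocalOutlierFactor

reviews = [
    "great movie with a strong cast",
    "good movie, enjoyable plot",
    "great acting and a strong story",
    "enjoyable film with good pacing",
    "the quarterly earnings report exceeded forecasts",  # off-topic outlier
]
X = TfidfVectorizer().fit_transform(reviews).toarray()

lof = LocalOutlierFactor(n_neighbors=3)
labels = lof.fit_predict(X)          # -1 marks points judged to be outliers
print(labels)
```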


2014 ◽  
Vol 631-632 ◽  
pp. 631-635
Author(s):  
Yi Ting Wang ◽  
Shi Qi Huang ◽  
Hong Xia Wang ◽  
Dai Zhi Liu

Hyperspectral remote sensing technology can be used to make a correct spectral diagnosis of substances, so it is widely used in the field of target detection and recognition. However, it is very difficult to gather accurate prior information for target detection because the spectral uncertainty of objects is pervasive. An anomaly detector enables one to detect targets whose signatures are spectrally distinct from their surroundings with no prior knowledge, and it has therefore become a focus in the field of target detection. We study four anomaly detection algorithms and conclude with empirical results that use hyperspectral imaging data to illustrate the operation and performance of the various detectors.
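The abstract does not name the four algorithms, so as an illustrative stand-in the sketch below implements a global RX-style detector, a common baseline in this literature, on a synthetic hyperspectral cube.

```python
# Sketch: global RX-style (Mahalanobis distance) anomaly detector on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
H, W, B = 64, 64, 20                      # hypothetical image: 64x64 pixels, 20 bands
cube = rng.normal(0, 1, (H, W, B))
cube[30, 30] += 5.0                       # injected spectrally distinct target

pixels = cube.reshape(-1, B)
mu = pixels.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))

diff = pixels - mu
rx = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)    # per-pixel Mahalanobis distance
print(np.unravel_index(rx.argmax(), (H, W)))          # location of the strongest anomaly
```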


2019 ◽  
Vol 9 (19) ◽  
pp. 4018 ◽  
Author(s):  
Kim ◽  
Park ◽  
Kim ◽  
Cho ◽  
Kang

Insider threats are malicious activities by authorized users, such as theft of intellectual property or security information, fraud, and sabotage. Although the number of insider threats is much lower than that of external network attacks, insider threats can cause extensive damage. As insiders are very familiar with an organization's systems, it is very difficult to detect their malicious behavior. Traditional insider-threat detection methods focus on rule-based approaches built by domain experts, but they are neither flexible nor robust. In this paper, we propose insider-threat detection methods based on user behavior modeling and anomaly detection algorithms. Based on user log data, we constructed three types of datasets: a user's daily activity summary, e-mail content topic distributions, and a user's weekly e-mail communication history. Then, we applied four anomaly detection algorithms and their combinations to detect malicious activities. Experimental results indicate that the proposed framework works well for imbalanced datasets in which there are only a few insider threats and where no domain experts' knowledge is provided.
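As a hedged illustration of applying an off-the-shelf anomaly detector to a daily-activity-summary style table: the feature names, data, and the choice of Isolation Forest below are all assumptions for the sketch, not the paper's exact setup.

```python
# Sketch: unsupervised detection of an unusual user-day in activity-summary features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(5)
# hypothetical columns: logons, emails sent, files copied to removable media, after-hours logons
normal = np.column_stack([
    rng.poisson(8, 500), rng.poisson(20, 500), rng.poisson(1, 500), rng.poisson(0.2, 500)
])
insider = np.array([[9, 18, 40, 6]])       # unusual file copying and after-hours activity

X = np.vstack([normal, insider])
clf = IsolationForest(contamination=0.01, random_state=0).fit(X)
print(clf.predict(X)[-1])                  # -1 here would mean the injected day is flagged
```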


2021 ◽  
Vol 72 ◽  
pp. 849-899
Author(s):  
Cynthia Freeman ◽  
Jonathan Merriman ◽  
Ian Beaver ◽  
Abdullah Mueen

The existence of an anomaly detection method that is optimal for all domains is a myth. Thus, there exists a plethora of anomaly detection methods, and the number increases every year across a wide variety of domains. But a strength can also be a weakness; given this massive library of methods, how can one select the best method for a given application? Current literature is focused on creating new anomaly detection methods or on large frameworks for experimenting with multiple methods at the same time. However, and especially as the literature continues to expand, an extensive evaluation of every anomaly detection method is simply not feasible. To reduce this evaluation burden, we present guidelines for intelligently choosing the optimal anomaly detection methods based on the characteristics the time series displays, such as seasonality, trend, level change, concept drift, and missing time steps. We provide a comprehensive experimental validation and survey of twelve anomaly detection methods over different time series characteristics to form guidelines based on several metrics: the AUC (area under the curve), windowed F-score, and the Numenta Anomaly Benchmark (NAB) scoring model. Applying our methodologies can save time and effort by surfacing the most promising anomaly detection methods instead of experimenting extensively with a rapidly expanding library of anomaly detection methods, especially in an online setting.
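A minimal sketch of probing two of the series characteristics such guidelines key on, trend and missing time steps; the checks and thresholds below are illustrative assumptions, not the paper's definitions.

```python
# Sketch: simple probes for trend and missing time steps in a time series.
import numpy as np

def characteristics(timestamps, values):
    ts = np.asarray(timestamps, dtype=float)
    x = np.asarray(values, dtype=float)
    step = np.median(np.diff(ts))
    has_missing = bool((np.diff(ts) > 1.5 * step).any())     # gaps larger than the usual step
    slope = np.polyfit(ts, x, 1)[0]
    has_trend = abs(slope) * (ts[-1] - ts[0]) > 2 * x.std()   # drift large relative to spread
    return {"missing_steps": has_missing, "trend": has_trend}

t = np.arange(0, 200.0)
t = np.delete(t, [50, 51])                                    # two missing steps
x = 0.05 * t + np.random.default_rng(6).normal(0, 1, t.size)  # upward trend plus noise
print(characteristics(t, x))
```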


Author(s):  
A. Sledz ◽  
C. Heipke

Abstract. Thermal anomaly detection plays an important role in remote sensing. One of the most widely used instruments for this task is a Thermal InfraRed (TIR) camera. In this work, thermal anomaly detection is formulated as salient region detection, motivated by the assumption that a hot region often attracts the attention of the human eye in thermal infrared images. Using TIR and optical images together, our working hypothesis is defined as follows: a hot region that appears as a salient region only in the TIR image and not in the optical image is a thermal anomaly. This work presents a two-step classification method for thermal anomaly detection based on information fusion of saliency maps derived from both TIR and optical images. Information fusion based on the Dempster-Shafer evidence theory is used in the first phase to find the location of regions suspected to be thermal anomalies; this classification problem is formulated as a multi-class problem and is carried out in an unsupervised manner at the pixel level. In the second phase, classification is formulated as a binary region-based problem in order to differentiate between normal temperature variations and thermal anomalies, with a Random Forest (RF) as the classifier; the classification results from the first phase are used as features, along with temperature information and height details obtained from a Digital Surface Model (DSM). We tested the approach using a dataset collected from a UAV carrying TIR and optical cameras for monitoring District Heating Systems (DHS). Despite some limitations outlined in the paper, the presented method for identifying thermal anomalies achieved up to 98.7 percent overall accuracy.
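A minimal sketch of Dempster's rule of combination for fusing per-pixel evidence from a TIR saliency map and an optical saliency map; the mass assignments below are illustrative assumptions, not the authors' basic belief assignments.

```python
# Sketch: Dempster's rule combining two mass functions at a single pixel.
def dempster_combine(m1, m2):
    """Combine two mass functions defined over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = frozenset(a) & frozenset(b)
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence at one pixel: high TIR saliency supports "anomaly",
# low optical saliency gives weak support to "normal"; residual mass stays on the frame.
frame = frozenset({"anomaly", "normal"})
m_tir = {frozenset({"anomaly"}): 0.7, frame: 0.3}
m_opt = {frozenset({"normal"}): 0.2, frame: 0.8}
print(dempster_combine(m_tir, m_opt))
```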

