Extreme Events Characterization on Time Series

2020
Author(s): Marcos Wander Rodrigues, Luis Enrique Zárate

The use of sensors in environments that require constant monitoring has increased in recent years, with the main goal of guaranteeing the effectiveness, safety, and smooth functioning of the system. To identify the occurrence of abnormal events, we propose a methodology that detects patterns that can lead to abrupt changes in the behavior of the sensor signals. To achieve this objective, we provide a strategy to characterize the time series and use a clustering technique to analyze the temporal evolution of the sensor system. To validate the methodology, we propose a windowed cluster-stability index. We have also developed a parameterizable time series generator that can represent different operational scenarios for a sensor system in which extreme anomalies may arise.
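
A minimal sketch of the general idea, not the authors' implementation: characterize each sensor signal by simple features over sliding windows, cluster the sensors within each window, and score how stable the clustering is between consecutive windows. The window size, number of clusters, features, and the use of the adjusted Rand index as the stability score are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's method): windowed features,
# k-means clustering per window, and a stability score between windows.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
n_sensors, n_samples = 20, 2000
signals = rng.normal(size=(n_sensors, n_samples)).cumsum(axis=1)  # synthetic sensor signals

def window_features(x, width=200, step=200):
    """Mean, standard deviation, and range of each sliding window of x."""
    starts = range(0, x.shape[-1] - width + 1, step)
    return np.array([[x[s:s + width].mean(),
                      x[s:s + width].std(),
                      np.ptp(x[s:s + width])] for s in starts])

# Feature matrix per sensor and per window: shape (n_sensors, n_windows, 3).
feats_all = np.array([window_features(sig) for sig in signals])
n_windows = feats_all.shape[1]

labels = [KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats_all[:, w, :])
          for w in range(n_windows)]

# Stability between consecutive windows; a sharp drop flags a window where the
# sensor system's behavior changed abruptly (a candidate extreme event).
stability = [adjusted_rand_score(labels[i], labels[i + 1]) for i in range(n_windows - 1)]
print("windowed cluster-stability index:", np.round(stability, 2))
```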

2020, Vol. 143 (1-2), pp. 447-460
Author(s): Leopoldo Carro-Calvo, Fernando Jaume-Santero, Ricardo García-Herrera, Sancho Salcedo-Sanz

Abstract. In this paper, we present a new clustering technique (k-gaps) aimed at generating a robust regionalization from sparse climate datasets with incomplete information in space and time. The method provides a new way to cluster time series of different temporal lengths, using most of the information contained in heterogeneous sets of climate records that would otherwise be eliminated during data homogenization procedures. The robustness of the method has been validated with different synthetic datasets, demonstrating that k-gaps performs well with sample-starved datasets and with climate information missing for at least 55% of the study period. We show that the algorithm generates a climatically consistent regionalization based on temperature observations, similar to the one obtained with complete time series, and outperforms other clustering methodologies developed to work with fragmentary information. k-gaps clusters can therefore provide a useful framework for the study of long-term climate trends and the detection of past extreme events at regional scales.
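
An illustration of the underlying problem rather than the k-gaps algorithm itself: station records with gaps can be compared using only the time steps each pair of records shares, and then grouped hierarchically. All sizes, the 55% missing-data fraction, and the correlation-based distance are assumptions chosen for the sketch.

```python
# Sketch of clustering fragmentary records (not the k-gaps algorithm):
# pairwise distances computed only over jointly observed years.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n_stations, n_years = 12, 120
t = np.linspace(0, 12, n_years)
regional = [np.sin(t), np.cos(t)]                      # two synthetic regional signals
base = np.vstack([regional[0 if i < 6 else 1] + 0.3 * rng.normal(size=n_years)
                  for i in range(n_stations)])
records = np.where(rng.random(base.shape) < 0.55, np.nan, base)  # ~55% missing

def gap_distance(a, b):
    """Correlation distance using only the samples both records observe."""
    common = ~np.isnan(a) & ~np.isnan(b)
    if common.sum() < 10:                              # too little overlap to compare
        return 1.0
    return 1.0 - np.corrcoef(a[common], b[common])[0, 1]

d = np.array([[gap_distance(records[i], records[j]) for j in range(n_stations)]
              for i in range(n_stations)])
np.fill_diagonal(d, 0.0)
clusters = fcluster(linkage(squareform(d, checks=False), method="average"),
                    t=2, criterion="maxclust")
print("cluster per station:", clusters)               # stations 1-6 vs 7-12 should separate
```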


2020, Vol. 72 (1)
Author(s): Masayuki Kano, Shin’ichi Miyazaki, Yoichi Ishikawa, Kazuro Hirahara

Abstract Postseismic Global Navigation Satellite System (GNSS) time series following megathrust earthquakes can be interpreted, especially in their early phase, as the result of afterslip on the plate interface. Afterslip releases stress accumulated by the adjacent coseismic slip and can be considered a recovery process for future events during earthquake cycles. The spatio-temporal evolution of afterslip often triggers subsequent earthquakes through stress perturbation. Therefore, it is important to quantitatively capture the spatio-temporal evolution of afterslip and the related postseismic crustal deformation, and to predict their future evolution with a physics-based simulation. We developed an adjoint data assimilation method that directly assimilates GNSS time series into a physics-based model to optimize the frictional parameters controlling slip behavior on the fault. The method was validated with synthetic data. Through the optimization of frictional parameters, the spatial distribution of afterslip could be reproduced roughly, though not in detail, when observation noise was included. The optimization not only reproduced the postseismic displacements used for the assimilation but also improved the prediction skill for the subsequent time series. We then applied the method to the observed GNSS time series for the first 15 days following the 2003 Tokachi-oki earthquake. The frictional parameters in the afterslip regions were optimized to A–B ~ O(10 kPa), A ~ O(100 kPa), and L ~ O(10 mm). A large afterslip is inferred on the shallower side of the coseismic slip area. The optimized frictional parameters quantitatively predicted the postseismic GNSS time series for the following 15 days. These characteristics can also be detected when the simulation variables are optimized simultaneously. The developed data assimilation method, which can be applied directly to GNSS time series following megathrust earthquakes, is an effective quantitative tool for assessing the risk of subsequent earthquakes and for monitoring the recovery process after megathrust earthquakes.
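
The adjoint assimilation itself is beyond a short sketch; as a stand-in for the "fit the early postseismic record, then predict the following days" workflow, here is a fit of a simplified logarithmic afterslip curve u(t) = a ln(1 + t/τ) to synthetic displacements. The functional form, parameter values, and noise level are assumptions, not the paper's model or data.

```python
# Hedged stand-in, not the paper's adjoint data assimilation: fit a simple
# logarithmic afterslip model to the first 15 days of synthetic postseismic
# displacement, then predict the next 15 days.
import numpy as np
from scipy.optimize import curve_fit

def afterslip(t, a, tau):
    return a * np.log1p(t / tau)

rng = np.random.default_rng(2)
t = np.arange(0.5, 30.5, 0.5)                  # days after the mainshock
u_obs = afterslip(t, a=0.12, tau=2.0) + 0.005 * rng.normal(size=t.size)  # meters, assumed

fit_mask = t <= 15.0                           # "assimilate" only the first 15 days
(a_hat, tau_hat), _ = curve_fit(afterslip, t[fit_mask], u_obs[fit_mask], p0=(0.1, 1.0))

pred = afterslip(t[~fit_mask], a_hat, tau_hat)
rmse = np.sqrt(np.mean((pred - u_obs[~fit_mask]) ** 2))
print(f"a = {a_hat:.3f} m, tau = {tau_hat:.2f} d, day 15-30 prediction RMSE = {rmse:.4f} m")
```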


Water, 2020, Vol. 12 (7), pp. 2058
Author(s): Larissa Rolim, Francisco de Souza Filho

Improved water resource management relies on accurate analyses of the past dynamics of hydrological variables. The presence of low-frequency structures in hydrologic time series is an important feature: it can modify the probability of extreme events at different time scales, making the risk associated with extreme events dynamic, changing from one decade to another. This article proposes a methodology capable of dynamically detecting and predicting the low-frequency streamflow component (16–32 years) that showed significance in the wavelet power spectrum. The Standardized Runoff Index (SRI), the Pruned Exact Linear Time (PELT) algorithm, the Breaks For Additive Season and Trend (BFAST) method, and a hidden Markov model (HMM) were used to identify shifts in the low-frequency component; the HMM was also used to forecast it. The regime shifts detected by the BFAST approach are not entirely consistent with the results from the other methods. A common shift occurs in the mid-1980s and can be attributed to the construction of the reservoir. Climate variability modulates the low-frequency variability of streamflow, and anthropogenic activities and climate change can modify this modulation. The identification of shifts reveals the impact of the low-frequency component on the streamflow time series, showing that low-frequency variability conditions the flows of a given year.
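
A minimal sketch of one step in such a pipeline, not the authors' code: detecting mean shifts in a smoothed (low-frequency) streamflow index with the PELT algorithm, here via the third-party `ruptures` package. The synthetic series, the crude moving-average low-pass filter, and the penalty value are assumptions.

```python
# Sketch only: PELT changepoint detection on a synthetic low-frequency index.
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(3)
years = np.arange(1950, 2020)
# Synthetic annual streamflow index with a regime shift in the mid-1980s.
index = np.where(years < 1985, 0.5, -0.4) + 0.3 * rng.normal(size=years.size)
low_freq = np.convolve(index, np.ones(9) / 9, mode="same")   # crude low-pass

algo = rpt.Pelt(model="l2", min_size=5).fit(low_freq)
breakpoints = algo.predict(pen=1.0)            # segment end indices (last = series length)
print("detected shift years:", [int(years[b - 1]) for b in breakpoints[:-1]])
```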


2021, Vol. 24, pp. 100618
Author(s): Philipe Riskalla Leal, Ricardo José de Paula Souza e Guimarães, Fábio Dall Cortivo, Rayana Santos Araújo Palharini, Milton Kampel

2018, Vol. 31 (23), pp. 9519-9543
Author(s): Claudie Beaulieu, Rebecca Killick

The detection of climate change and its attribution to the underlying processes is challenging because signals such as trends and shifts are superposed on variability arising from the memory within the climate system. Statistical methods used to characterize change in time series must be flexible enough to distinguish these components. Here we propose an approach tailored to distinguishing these different modes of change by fitting a series of models and selecting the most suitable one according to an information criterion. The models combine a constant mean or a trend with a background of white noise, with or without autocorrelation to characterize the memory, and can detect multiple changepoints in each model configuration. Through a simulation study on synthetic time series, the approach is shown to be effective in distinguishing abrupt changes from trends and memory, identifying the true number and timing of abrupt changes when they are present. Furthermore, the proposed method performs better than two commonly used approaches for detecting abrupt changes in climate time series. Using this approach, the so-called hiatus in recent global mean surface warming is not detected as a shift in the rate of temperature rise; the record is instead consistent with a steady increase since the 1960s/1970s. Our method also supports the hypothesis that the Pacific decadal oscillation behaves as a short-memory process rather than undergoing forced mean shifts, as previously suggested. These examples demonstrate the usefulness of the proposed approach for change detection and for avoiding the most pervasive types of mistakes in the detection of climate change.
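
A toy version of the model-comparison idea, not the authors' implementation: fit a constant-mean model, a linear-trend model, and a single mean-shift model by least squares and pick the one with the lowest BIC. The real method additionally models autocorrelated (short-memory) noise and multiple changepoints; the synthetic series and parameter counts here are assumptions.

```python
# Toy model selection by BIC: constant mean vs. linear trend vs. one mean shift.
import numpy as np

rng = np.random.default_rng(4)
n = 150
t = np.arange(n)
y = 0.01 * t + 0.3 * rng.normal(size=n)        # synthetic series: trend + white noise

def bic(y, yhat, k):
    """Gaussian BIC with k estimated parameters plus the noise variance."""
    rss = np.sum((y - yhat) ** 2)
    return y.size * np.log(rss / y.size) + (k + 1) * np.log(y.size)

candidates = {"constant mean": (np.full(n, y.mean()), 1),
              "linear trend": (np.polyval(np.polyfit(t, y, 1), t), 2)}

# Single mean shift: try every admissible split point and keep the best fit.
best_rss, best_fit = np.inf, None
for c in range(10, n - 10):
    fit = np.concatenate([np.full(c, y[:c].mean()), np.full(n - c, y[c:].mean())])
    rss = np.sum((y - fit) ** 2)
    if rss < best_rss:
        best_rss, best_fit = rss, fit
candidates["mean shift"] = (best_fit, 3)       # two means + the changepoint location

scores = {name: bic(y, yhat, k) for name, (yhat, k) in candidates.items()}
print("selected model:", min(scores, key=scores.get), scores)
```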


2017, Vol. 2 (3), pp. 63-69
Author(s): Ryan D. Batt, Stephen R. Carpenter, Anthony R. Ives

2016, Vol. 16 (4), pp. 2299-2308
Author(s): Chris M. Hall, Silje E. Holmen, Chris E. Meek, Alan H. Manson, Satonori Nozawa

Abstract. The turbopause is the demarcation between atmospheric mixing by turbulence (below) and by molecular diffusion (above). When studying concentrations of trace species in the atmosphere, and particularly their long-term change, it may be important to understand the processes responsible for redistributing atmospheric constituents, together with their temporal evolution. The general region of transition between turbulent and molecular mixing coincides with the base of the ionosphere, the lowest region in which molecular oxygen is dissociated and, at high latitude in summer, the coldest part of the whole atmosphere. This study updates previous reports of turbopause altitude, extending the time series by half a decade and thus shedding new light on the nature of change over solar-cycle timescales. Assuming there is no trend in temperature, at 70° N there is evidence for a summer trend of ∼1.6 km decade⁻¹, but for winter, and at 52° N, there is no significant evidence of change at all. If the temperature at 90 km is estimated using meteor trail data, a cooling rate can be estimated; applying this rate to the turbopause altitude estimation does not alter the trend significantly, irrespective of season. The observed increase in turbopause height supports a hypothesis of corresponding negative trends in atomic oxygen density, [O]. This agrees with independent studies of [O] based on mid-latitude time series dating from 1975, which show negative trends since 2002.
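
A sketch of the kind of trend estimate quoted above: an ordinary least-squares slope, expressed in km per decade with its standard error, computed for a single season. The turbopause altitudes below are made up for illustration; they are not the paper's data.

```python
# Illustrative seasonal trend estimate (km per decade) on synthetic altitudes.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(5)
years = np.arange(2001, 2015, dtype=float)     # roughly one solar cycle of summers
altitude_km = 100.0 + 0.16 * (years - years[0]) + 0.5 * rng.normal(size=years.size)

res = linregress(years, altitude_km)
print(f"summer turbopause trend: {10.0 * res.slope:.2f} "
      f"+/- {10.0 * res.stderr:.2f} km per decade")
```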

