climate time series
Recently Published Documents


TOTAL DOCUMENTS

72
(FIVE YEARS 25)

H-INDEX

17
(FIVE YEARS 2)

Author(s):  
Jennifer L. Castle ◽  
David F. Hendry

Shared features of economic and climate time series imply that tools for empirically modeling nonstationary economic outcomes are also appropriate for studying many aspects of observational climate-change data. Greenhouse gas emissions, such as carbon dioxide, nitrous oxide, and methane, are a major cause of climate change as they accumulate in the atmosphere and reradiate the sun’s energy. As these emissions are currently mainly due to economic activity, economic and climate time series have commonalities, including considerable inertia, stochastic trends, and distributional shifts, so the same econometric modeling approaches can be applied to analyze both phenomena. Moreover, both disciplines lack complete knowledge of their respective data-generating processes (DGPs), so model search that retains viable theory while allowing for shifting distributions is important. Reliable modeling of both climate and economic time series requires finding an unknown DGP (or a close approximation thereto) to represent multivariate evolving processes subject to abrupt shifts. Consequently, to ensure that the DGP is nested within a much larger set of candidate determinants, the model formulations to search over should comprise all potentially relevant variables, their dynamics, indicators for perturbing outliers, shifts, and trend breaks, and nonlinear functions, while retaining well-established theoretical insights. Econometric modeling of climate-change data therefore requires a model selection approach general enough to handle all these aspects. Machine learning with multipath block searches, commencing from very general specifications that usually have more candidate explanatory variables than observations, offers a rigorous route to discovering well-specified and undominated models of the nonstationary processes under analysis.
To do so requires applying appropriate indicator saturation estimators (ISEs), a class that includes impulse indicators for outliers, step indicators for location shifts, multiplicative indicators for parameter changes, and trend indicators for trend breaks. All ISEs entail more candidate variables than observations, often by a large margin when implementing combinations, yet can detect the impacts of shifts and policy interventions to avoid nonconstant parameters in models, as well as improve forecasts. To characterize nonstationary observational data, one must handle all substantively relevant features jointly: A failure to do so leads to nonconstant and mis-specified models and hence incorrect theory evaluation and policy analyses.
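The indicator-saturation idea can be made concrete with a toy sketch. The split-half search below is a deliberately simplified stand-in for the multipath block searches of tools such as Autometrics: one step indicator per observation gives more candidates than observations, so the candidates are screened in halves. Function name, threshold, and data are illustrative, not the authors' implementation.

```python
# Step-indicator saturation (SIS), simplified: screen half the step
# indicators at a time and retain those with large t-statistics.
import numpy as np

def sis_split_half(y, crit=2.58):
    """Select step indicators for location shifts via a split-half search."""
    n = len(y)
    steps = np.tril(np.ones((n, n)))   # column j: step starting at obs j
    retained = []
    for half in (range(1, n // 2), range(n // 2, n)):
        X = np.column_stack([np.ones(n)] + [steps[:, j] for j in half])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = resid @ resid / max(n - X.shape[1], 1)
        cov = sigma2 * np.linalg.pinv(X.T @ X)
        tstats = beta[1:] / np.sqrt(np.diag(cov)[1:])
        retained += [j for j, t in zip(half, tstats) if abs(t) > crit]
    return sorted(retained)

rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, 100)
y[60:] += 8.0                          # location shift at t = 60
shifts = sis_split_half(y)
```

In each half-sample regression the candidate count stays below the number of observations, which is what makes estimation feasible despite the saturated indicator set overall.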


2021 ◽  
Author(s):  
James Ricketts ◽  
Roger Jones

This paper applies misspecification (M-S) testing to the detection of abrupt changes in climate regimes, as part of severe testing of climate shifts versus trends. Severe testing, proposed by Mayo and Spanos, provides severity criteria for evaluating statistical inference using probative criteria, requiring tests that would find any flaws present. Applying M-S testing increases the severity of hypothesis testing. We utilize a systematic approach, based on well-founded principles, that combines the development of probative criteria with error-statistical testing. Given the widespread acceptance of trend-like change in climate, especially temperature, tests that produce counter-examples need proper specification. Reasoning about abrupt shifts embedded within a complex time series requires detection methods sensitive to level changes, accurate in timing, and tolerant of simultaneous changes of trend, variance, autocorrelation, and red-drift, given that many of these measures may shift together. Our preference is to analyse the raw data, to avoid pre-emptive assumptions, and to test the results for robustness. We use a simple detection method, based on the Maronna-Yohai (MY) test, then re-assess nominated shift-points using tests with varied null hypotheses guided by M-S testing. Doing so sharpens conclusions while avoiding an over-reliance on data manipulation, which carries its own assumptions.
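The first stage of such a workflow, nominating a candidate shift-point, can be sketched with a maximum two-sample t-statistic scan. This is a crude stand-in for the bivariate MY test, shown only to make the "detect, then re-assess under varied nulls" structure concrete; the function and data are illustrative.

```python
# Scan every admissible breakpoint and keep the one with the largest
# two-sample t-statistic between the segments before and after it.
import numpy as np

def max_t_shift(x, min_seg=5):
    """Return (breakpoint, |t|) maximising the two-sample t-statistic."""
    n = len(x)
    best_k, best_t = None, 0.0
    for k in range(min_seg, n - min_seg):
        a, b = x[:k], x[k:]
        pooled = (a.var(ddof=1) * (len(a) - 1)
                  + b.var(ddof=1) * (len(b) - 1)) / (n - 2)
        t = abs(a.mean() - b.mean()) / np.sqrt(pooled * (1/len(a) + 1/len(b)))
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 40), rng.normal(2.0, 1.0, 40)])
k, t = max_t_shift(x)
```

Because the breakpoint is chosen to maximise the statistic, the nominal t-distribution no longer applies; that is precisely why the nominated point must then face misspecification tests with different null hypotheses.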


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Gen Li ◽  
Jason J. Jung

Abstract: An abnormal climate event is one in which some meteorological conditions are extreme within a certain time interval. Existing methods for detecting abnormal climate events use supervised learning models to learn the abnormal patterns, but they cannot detect untrained patterns. To overcome this problem, we construct a dynamic graph by discovering the correlations among climate time series and propose a novel dynamic graph embedding model based on graph entropy, called EDynGE, to discriminate anomalies. The graph entropy measurement quantifies the information content of the graphs and constructs the embedding space. We conducted experiments on synthetic datasets and real-world meteorological datasets. The results showed that the EDynGE model achieved a better F1-score than the baselines by 43.2%, and that the number of days of abnormal climate events has increased by 304.5 days over the past 30 years.
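The core ingredients, a correlation graph per time window and an entropy score on it, can be sketched as follows. The degree-distribution Shannon entropy used here is one simple graph-entropy variant; the paper's exact measure and the EDynGE embedding are not reproduced, and all names and thresholds are illustrative.

```python
# Score a window of climate series: build a correlation graph over the
# variables, then compute the Shannon entropy of its degree distribution.
import numpy as np

def degree_entropy(series_window, thresh=0.7):
    """Entropy of the degree distribution of a thresholded correlation graph."""
    corr = np.corrcoef(series_window)          # one variable per row
    adj = (np.abs(corr) > thresh).astype(int)
    np.fill_diagonal(adj, 0)
    deg = adj.sum(axis=1)
    total = deg.sum()
    if total == 0:
        return 0.0
    p = deg[deg > 0] / total
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
base = rng.normal(size=50)
normal_win = np.stack([base + 0.1 * rng.normal(size=50) for _ in range(6)])
anom_win = rng.normal(size=(6, 50))            # decorrelated (anomalous) window
h_normal = degree_entropy(normal_win)
h_anom = degree_entropy(anom_win)
```

A sharp change in the entropy between consecutive windows then flags a candidate abnormal event, without any supervised training on labelled anomalies.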


Atmosphere ◽  
2021 ◽  
Vol 12 (7) ◽  
pp. 820
Author(s):  
Jonathan Woody ◽  
Yang Xu ◽  
Jamie Dyer ◽  
Robert Lund ◽  
Anuradha P. Hewaarachchi

Several attempts to assess regional snow depth trends have previously been made. These studies estimate trends by applying various statistical methods to snow depths, new snowfalls, or climatological proxies such as snow water equivalents. In most of these studies, inhomogeneities (changepoints) were not accounted for in the analysis, yet changepoint features can dramatically influence trend inferences from climate time series. The purpose of this paper is to present a detailed statistical methodology for estimating trends in a time series of daily snow depths that accounts for changepoint features. The methods are illustrated in the analysis of a daily snow depth data set from North America.
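Why ignoring a changepoint distorts a trend can be shown in a few lines. The sketch below adds a step regressor at a known changepoint time so the mean shift is not absorbed into the slope; it is a minimal illustration, not the paper's methodology, and the changepoint is taken as given rather than detected.

```python
# Compare a naive linear trend with a trend fitted jointly with a step
# term at a documented changepoint.
import numpy as np

def trend_with_step(t, y, cp):
    """OLS of y on [1, t, 1{t >= cp}]; returns (intercept, slope, step)."""
    X = np.column_stack([np.ones_like(t), t, (t >= cp).astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

t = np.arange(100, dtype=float)
rng = np.random.default_rng(3)
y = 0.05 * t + 4.0 * (t >= 50) + rng.normal(0.0, 0.5, 100)  # true slope 0.05

b_naive = np.polyfit(t, y, 1)[0]            # slope ignoring the changepoint
_, b_slope, b_step = trend_with_step(t, y, 50)
```

The naive slope roughly doubles because the untreated mean shift masquerades as trend, while the joint fit recovers both the slope and the shift magnitude.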


2021 ◽  
Vol 9 ◽  
Author(s):  
Julia Eis ◽  
Larissa van der Laan ◽  
Fabien Maussion ◽  
Ben Marzeion

Estimates of global glacier mass changes over the course of the 20th century require automated initialization methods, allowing the reconstruction of past glacier states from limited information. In a previous study, we developed a method to initialize the Open Global Glacier Model (OGGM) from past climate information and present-day geometry alone. Tested in an idealized framework, this method aimed to quantify how much information present-day glacier geometry carries about past glacier states. The method was not applied to real-world cases, and therefore, the results were not comparable with observations. This study closes the gap to real-world cases by introducing a glacier-specific calibration of the mass balance model. This procedure ensures that the modeled present-day geometry matches the observed area and that the past glacier evolution is consistent with bias-corrected past climate time series. We apply the method to 517 glaciers, spread globally, for which either mass balance observations or length records are available, and compare the observations to the modeled reconstructed glacier changes. For the validation of the initialization method, we use multiple measures of reconstruction skill (e.g., mean bias error (MBE), root-mean-square error (RMSE), and correlation). We find that the modeled mass balances and glacier lengths are in good agreement with the observations, especially for glaciers with many observation years. These results open the door to a future global application.
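The three skill measures named above are standard and easy to state precisely. The sketch below computes them on small illustrative arrays (not OGGM output); the variable names and values are made up for the example.

```python
# Reconstruction-skill measures for modelled vs observed mass balances.
import numpy as np

def skill_scores(model, obs):
    """Return (MBE, RMSE, Pearson r) for modelled vs observed values."""
    err = model - obs
    mbe = float(err.mean())                    # mean bias error
    rmse = float(np.sqrt((err ** 2).mean()))   # root-mean-square error
    r = float(np.corrcoef(model, obs)[0, 1])   # Pearson correlation
    return mbe, rmse, r

obs = np.array([-0.4, -0.1, 0.2, -0.6, -0.3])   # illustrative, m w.e. / yr
model = np.array([-0.5, 0.0, 0.1, -0.5, -0.2])
mbe, rmse, r = skill_scores(model, obs)
```

MBE captures systematic over- or under-estimation, RMSE the typical error magnitude, and the correlation whether year-to-year variability is reproduced; a good reconstruction needs all three, not just one.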


2021 ◽  
Author(s):  
Christian Jaedicke ◽  
Dieter Issler ◽  
Kjersti Gleditsch Gisnås ◽  
Sean Salazar ◽  
Kate Robinson ◽  
...  

<div> <p>Snow avalanches are a significant natural hazard and a common phenomenon in Norway. Applied research on avalanches and their societal impact has been conducted at the Norwegian Geotechnical Institute (NGI) for nearly half a century.</p> <p>Recent activities within the applied avalanche research group at NGI have focused on four areas: (1) Improved understanding of avalanche release is sought through the application of simple probabilistic release models and local wind modelling. Encouraging results are obtained by analysing and refining publicly available climate time series for temperature, snow depth and precipitation on a 1 km² grid. A major remaining challenge in elaborating realistic large-area avalanche hazard indication maps is the a priori determination of the size of release areas as a function of return period. (2) Different aspects of avalanche dynamics are investigated by means of a wide array of experimental technologies at the Ryggfonn full-scale test site, application of aerial survey methods to derive snow distribution, and investigation of the scaling behaviour of avalanches with extreme runouts in many different paths. The results of all these analyses point towards the need for a departure from modelling avalanches with Voellmy-type models in favour of models encompassing multiple flow regimes, a more realistic rheology, and entrainment as well as deposition. (3) To improve risk assessment and mitigation measures, avalanche interactions with structures are studied by documenting destructive avalanche events, constructing vulnerability curves for persons inside buildings based on historic avalanche events, improving methods for the evaluation of individual risk, and developing criteria for physical mitigation measures against powder-snow avalanches. 
(4) Current efforts in avalanche flow modelling focus on the one hand on simple block models for studying scaling behaviour on idealised and natural slopes, and on the other hand on an advanced multi-flow-regime model that also incorporates different effects of the snow cover. Ongoing work aims, among other goals, at an entrainment and deposition model that is dynamically consistent and depends only on measurable snow properties. This contribution will present an overview of recent activities and advancements in applied avalanche research in Norway. It is hoped that it will serve to facilitate future international collaborative efforts to address challenges in applied avalanche research.</p> </div>
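The Voellmy-type block model that the abstract argues is too simple can be stated in a few lines, which also shows why: a single friction pair (mu, xi) must stand in for all flow regimes. The sketch below integrates a sliding block with Coulomb friction plus turbulent drag; the flow depth is folded into xi, and all parameter values are illustrative only.

```python
# Sliding-block avalanche model with Voellmy friction:
#   dv/dt = g (sin a - mu cos a) - g v^2 / xi
# integrated with forward Euler on a gentle slope, where the block
# decelerates and eventually stops.
import math

def voellmy_runout(slope_deg, mu=0.3, xi=2000.0, v0=30.0, dt=0.01, g=9.81):
    """Runout distance (m) along a uniform slope until the block stops."""
    a = math.radians(slope_deg)
    v, s = v0, 0.0
    while v > 0.0:
        acc = g * (math.sin(a) - mu * math.cos(a)) - g * v * v / xi
        v += acc * dt
        s += max(v, 0.0) * dt
    return s

r10 = voellmy_runout(10.0)
r15 = voellmy_runout(15.0)
```

The runout grows rapidly as the slope angle approaches arctan(mu), but the model has no notion of flow-regime transitions, entrainment, or deposition, which is the limitation the NGI work addresses.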


2021 ◽  
Author(s):  
Jakob Runge ◽  
Andreas Gerhardus

<p>Discovering causal dependencies from observational time series datasets is a major problem in better understanding the complex dynamical system Earth. Recent methodological advances have addressed major challenges such as high dimensionality and nonlinearity (PCMCI; Runge et al., Sci. Adv., 2019), instantaneous causal links (PCMCI+; Runge, UAI, 2020), and hidden variables (LPCMCI; Gerhardus and Runge, 2020), but many more remain. In this presentation I will give an overview of challenges and methods and present a recent approach, Ensemble-PCMCI, to analyze ensembles of climate time series. Examples of this are initialized ensemble forecasts. Since the individual samples can then be created from several time series instead of different time steps of a single time series, such cases allow us to relax the assumption of stationarity and hence to analyze whether and how the underlying causal relationships change over time. We compare Ensemble-PCMCI to other methods and discuss preliminary applications.</p><p>Runge et al., Detecting and quantifying causal associations in large nonlinear time series datasets, Science Advances, eaau4996 (2019).</p><p>Runge, J., Discovering contemporaneous and lagged causal relations in autocorrelated nonlinear time series datasets, Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI 2020), AUAI Press, 2020.</p><p>Gerhardus, A. & Runge, J., High-recall causal discovery for autocorrelated time series with latent confounders, Advances in Neural Information Processing Systems 33 (2020).</p>
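At the heart of the PCMCI family are lagged conditional-independence tests. The sketch below shows a single such test, the partial correlation of X at lag 1 with Y given Y's own past, on a toy system with a known link; the full algorithms (available in the tigramite package) iterate such tests over many links and conditioning sets, which this sketch does not attempt.

```python
# One lagged conditional-independence test: partial correlation of
# X_{t-1} and Y_t after removing Y_{t-1} from both.
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after linearly removing z from both."""
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)                  # driver series
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

rho_link = partial_corr(x[:-1], y[1:], y[:-1])   # true lagged link X -> Y
rho_none = partial_corr(y[:-1], x[1:], x[:-1])   # absent link Y -> X
```

Conditioning on the target's own past is what separates genuine lagged links from associations induced by autocorrelation, the failure mode that motivates PCMCI over naive pairwise correlation.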


2021 ◽  
Author(s):  
Luca Margaritella ◽  
Marina Friedrich ◽  
Stephan Smeekes

<div> <div> <div> <p>We use the framework of Granger-causality testing in high-dimensional vector autoregressive models (VARs) to disentangle and interpret the complex causal chains linking radiative forcings and global as well as hemispheric temperatures. By allowing for high dimensionality in the model, we can enrich the information set with all relevant natural and anthropogenic forcing variables to obtain reliable causal relations. These variables have mostly been investigated in aggregated form or in separate models in the previous literature. An additional advantage of our framework is that it allows us to ignore the order of integration of the variables and to estimate the VAR directly in levels, thereby avoiding the accumulation of biases from unit-root and cointegration pre-tests. This is of particular appeal for climate time series, which are often argued to contain stochastic trends as well as to exhibit long memory. We are thus able to display the causal networks linking radiative forcings to global and hemispheric temperatures, but also to causally connect radiative forcings among themselves, allowing for a careful reconstruction of a timeline of causal effects among the forcings. The robustness of our proposed procedure makes it an important tool for policy evaluation in tackling global climate change.</p> </div> </div> </div>
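The core comparison behind Granger-causality testing, a restricted versus an unrestricted autoregression in levels, can be shown in a bivariate toy version. The paper's setting is high-dimensional with post-selection inference, which this sketch does not reproduce; data, lag order, and names are illustrative.

```python
# Bivariate Granger-causality F-test: does adding lags of x improve an
# AR(p) model of y, both estimated in levels?
import numpy as np

def lagmat(v, p):
    """Columns v_{t-1} ... v_{t-p}, aligned with targets v[p:]."""
    return np.column_stack([v[p - k:len(v) - k] for k in range(1, p + 1)])

def granger_f(y, x, p=2):
    """F statistic for H0: lags of x add nothing to an AR(p) model of y."""
    Y = y[p:]
    Xr = np.column_stack([np.ones_like(Y), lagmat(y, p)])   # restricted
    Xu = np.column_stack([Xr, lagmat(x, p)])                # unrestricted
    rss = lambda X: float(
        ((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2).sum())
    rss_r, rss_u = rss(Xr), rss(Xu)
    df = len(Y) - Xu.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df)

rng = np.random.default_rng(5)
n = 400
x = 0.1 * np.cumsum(rng.normal(size=n))   # persistent "forcing"-like series
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.5 * x[t - 1] + 0.5 * rng.normal()

f_xy = granger_f(y, x)    # x should help predict y
f_yx = granger_f(x, y)    # y should not help predict x
```

Note that both regressions run in levels even though x carries a stochastic trend, mirroring the paper's strategy of skipping unit-root and cointegration pre-tests.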


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Josué M. Polanco-Martínez ◽  
Javier Fernández-Macho ◽  
Martín Medina-Elizalde

Abstract: The wavelet local multiple correlation (WLMC) is introduced for the first time in the study of climate dynamics inferred from multivariate climate time series. To exemplify the use of WLMC with real climate data, we analyse Last Millennium (LM) relationships among several large-scale reconstructed climate variables characterizing the North Atlantic: sea surface temperatures (SST) from the tropical cyclone main developmental region (MDR), the El Niño-Southern Oscillation (ENSO), the Atlantic Multidecadal Oscillation (AMO), and tropical cyclone counts (TC). We examine the former three large-scale variables because they are known to influence North Atlantic tropical cyclone activity and because their underlying drivers are still under investigation. WLMC results obtained for these multivariate climate time series suggest that: (1) MDRSST and AMO show the highest correlation with each other and with respect to the TC record over the last millennium; and (2) MDRSST is the dominant climate variable explaining TC temporal variability. These results confirm that the method captures the most fundamental information contained in multivariate climate time series and is suitable for investigating correlations among climate time series in a multivariate context.
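A time-localised multiple correlation, the non-wavelet core of the WLMC idea, can be sketched as the square root of the R² from regressing one series on the others within a sliding window. WLMC additionally does this per wavelet scale, which this sketch omits; the window length, series, and names are illustrative.

```python
# Moving-window multiple correlation: sqrt(R^2) of target ~ others
# within each sliding window, leaving NaN at the edges.
import numpy as np

def rolling_multiple_corr(target, others, win=40):
    """sqrt(R^2) of target regressed on the other series in each window."""
    n = len(target)
    out = np.full(n, np.nan)
    for c in range(win // 2, n - win // 2):
        sl = slice(c - win // 2, c + win // 2)
        X = np.column_stack([np.ones(win)] + [o[sl] for o in others])
        y = target[sl]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[c] = np.sqrt(max(r2, 0.0))
    return out

rng = np.random.default_rng(6)
n = 200
driver = rng.normal(size=n)                    # "MDRSST"-like driver
tc = 0.9 * driver + 0.4 * rng.normal(size=n)   # "TC"-like response
noise = rng.normal(size=n)                     # unrelated series
rho = rolling_multiple_corr(tc, [driver, noise], win=40)
```

Tracking this quantity through time shows when the joint explanatory power of a set of variables strengthens or weakens, which is the kind of information WLMC extracts scale by scale.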

