Accounting for Autocorrelation in Detecting Mean Shifts in Climate Data Series Using the Penalized Maximal t or F Test

2008 ◽  
Vol 47 (9) ◽  
pp. 2423-2444 ◽  
Author(s):  
Xiaolan L. Wang

Abstract This study proposes an empirical approach to account for lag-1 autocorrelation in detecting mean shifts in time series of white or red (first-order autoregressive) Gaussian noise using the penalized maximal t test or the penalized maximal F test. The empirical approach is embedded in a stepwise testing algorithm, so that the new algorithms can be used to detect single or multiple changepoints in a time series. The detection power of the new algorithms is analyzed through Monte Carlo simulations, which show that they detect single or multiple changepoints both accurately and quickly. Examples of their application to real climate data series (surface pressure and wind speed) are presented. An open-source software package (in R and FORTRAN) implementing the algorithms, along with a user manual, has been developed and made available online free of charge.
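For readers who want to experiment with the basic mechanics, the Python sketch below (not the paper's R/FORTRAN package) scans all candidate changepoints with a two-sample t statistic and applies a simple effective-sample-size adjustment for lag-1 autocorrelation. The paper's empirical penalty function and its stepwise multiple-changepoint algorithm are not reproduced, and the adjustment formula is an assumption for illustration only.

```python
import numpy as np

def maximal_t_scan(x):
    """Scan all candidate changepoints of a series and return the point
    that maximizes the two-sample t statistic.

    A minimal sketch only: the published empirical penalty and lag-1
    autocorrelation treatment are replaced here by a simple
    effective-sample-size adjustment (an assumption, not the paper's formula)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # lag-1 autocorrelation of the demeaned series
    xm = x - x.mean()
    r1 = np.dot(xm[:-1], xm[1:]) / np.dot(xm, xm)
    # effective-sample-size factor for red noise
    neff_factor = (1 - r1) / (1 + r1) if abs(r1) < 1 else 1.0

    best_k, best_t = None, -np.inf
    for k in range(2, n - 1):          # candidate shift between index k-1 and k
        x1, x2 = x[:k], x[k:]
        s_pooled = np.sqrt(((k - 1) * x1.var(ddof=1) +
                            (n - k - 1) * x2.var(ddof=1)) / (n - 2))
        t = abs(x1.mean() - x2.mean()) / (s_pooled * np.sqrt(1.0 / k + 1.0 / (n - k)))
        t *= np.sqrt(neff_factor)      # shrink t when the series is positively autocorrelated
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t, r1
```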

2018 ◽  
Vol 10 (1) ◽  
pp. 181-196 ◽  
Author(s):  
Mehdi Bahrami ◽  
Samira Bazrkar ◽  
Abdol Rassoul Zarei

Abstract Drought is an exigent natural phenomenon with high frequency in arid and semi-arid regions, and it causes enormous damage to agriculture, the economy, and the environment. In this study, the seasonal Standardized Precipitation Index (SPI) drought index and time series models were employed to model and predict seasonal drought using climate data from 38 Iranian synoptic stations during 1967–2014. The ITSM (Interactive Time Series Modeling) statistical software was used for the modeling and prediction. According to the calculated seasonal SPI, drought severity classes 4 and 3 had the greatest occurrence frequency within the study area, while classes 6 and 7 had the least. The best-fitting models were the MA(5) Innovations and MA(5) Hannan-Rissanen moving-average models, in 60.53% and 15.79% of cases, respectively. The prediction results likewise indicated that drought class 4 was the most abundant class over the study area and drought class 7 the least frequent. According to the trend analysis, without regard to statistical significance, the observed seasonal SPI series (1967–2014) showed a negative trend at 84.21% of the synoptic stations; this share rises to 86.84% when the observed and predicted series (1967–2019) are considered together.
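As a rough illustration of the workflow, the Python sketch below computes a gamma-based SPI for one season across years and fits an MA(5) model. The placeholder data, the omission of zero-precipitation handling, and the substitution of statsmodels for the ITSM software and its Innovations/Hannan-Rissanen estimators are all assumptions for illustration.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

def spi(precip):
    """Standardized Precipitation Index for one season across years:
    fit a gamma distribution and map cumulative probabilities to
    standard-normal quantiles. A minimal sketch; zero-precipitation
    handling and the ITSM model-selection workflow are omitted."""
    precip = np.asarray(precip, dtype=float)
    a, loc, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

# Fit an MA(5) model to a seasonal SPI series (one of the two model
# classes reported as best-fitting in the study), using placeholder data.
spi_series = spi(np.random.gamma(2.0, 50.0, size=4 * 48))
ma5 = sm.tsa.ARIMA(spi_series, order=(0, 0, 5)).fit()
forecast = ma5.forecast(steps=20)   # roughly five years of seasonal values
```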


2008 ◽  
Vol 25 (3) ◽  
pp. 368-384 ◽  
Author(s):  
Xiaolan L. Wang

Abstract In this study, a penalized maximal F test (PMFT) is proposed for detecting undocumented mean shifts that are not accompanied by any sudden change in the linear trend of a time series. PMFT aims to even out the uneven distribution of false alarm rate and detection power of the corresponding unpenalized maximal F test, which is based on a common-trend two-phase regression model (TPR3). The performance of PMFT is compared with that of TPR3 using Monte Carlo simulations and real climate data series. It is shown that, because of the effect of unequal sample sizes, the false alarm rate of TPR3 has a W-shaped distribution, with much higher-than-specified values for points near the ends of the series and lower values for points between either end and the middle of the series. Consequently, for a mean shift of a given magnitude, TPR3 detects it with a lower-than-specified level of confidence, and hence more easily, when it occurs near the ends of the series than when it occurs between either end and the middle; it also mistakenly declares many more changepoints near the ends of a homogeneous series. These undesirable features of TPR3 are diminished in PMFT by using an empirical penalty function that takes into account the relative position of each point being tested. As a result, PMFT has notably higher detection power, and its false alarm rate and effective level of confidence are very close to the nominal level and essentially evenly distributed across all candidate changepoints. The improvement in hit rate can be more than 10% for detecting small shifts (Δ ≤ σ, where σ is the noise standard deviation).
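The sketch below illustrates the unpenalized common-trend two-phase regression (TPR3) F scan that PMFT builds on: the null model is a single linear trend, and the alternative allows a mean shift at each candidate point while keeping a common slope. The empirical penalty function of PMFT is not reproduced; this is an illustration of the baseline statistic only.

```python
import numpy as np

def common_trend_f_scan(x):
    """F statistic of the common-trend two-phase regression (TPR3) at
    every candidate changepoint. Null model: one intercept plus one
    slope. Alternative: an additional step (mean shift) at point k
    with the same slope. A minimal sketch without PMFT's penalty."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    t = np.arange(n, dtype=float)

    # null model: single linear trend
    X0 = np.column_stack([np.ones(n), t])
    sse0 = np.sum((x - X0 @ np.linalg.lstsq(X0, x, rcond=None)[0]) ** 2)

    f_stats = np.full(n, np.nan)
    for k in range(2, n - 2):
        step = (t >= k).astype(float)          # 0 before the shift, 1 after
        X1 = np.column_stack([np.ones(n), step, t])
        sse1 = np.sum((x - X1 @ np.linalg.lstsq(X1, x, rcond=None)[0]) ** 2)
        f_stats[k] = (sse0 - sse1) / (sse1 / (n - 3))
    return int(np.nanargmax(f_stats)), f_stats
```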


2007 ◽  
Vol 46 (6) ◽  
pp. 916-931 ◽  
Author(s):  
Xiaolan L. Wang ◽  
Qiuzi H. Wen ◽  
Yuehua Wu

Abstract In this paper, a penalized maximal t test (PMT) is proposed for detecting undocumented mean shifts in climate data series. PMT takes the relative position of each candidate changepoint into account to diminish the effect of unequal sample sizes on the power of detection. Monte Carlo simulation studies are conducted to evaluate the performance of PMT in comparison with the most widely used method, the standard normal homogeneity test (SNHT). An application of the two methods to an atmospheric pressure series recorded at a Canadian site is also presented. It is shown that the false alarm rate of PMT is very close to the specified level of significance and is evenly distributed across all candidate changepoints, whereas that of SNHT can be up to 10 times the specified level for points near the ends of a series and much lower for points in the middle. In comparison with SNHT, therefore, PMT has higher power for detecting all changepoints that are not too close to the ends of a series and lower power for detecting changepoints that are near the ends. On average, however, PMT has significantly higher power of detection. The smaller the shift magnitude Δ is relative to the noise standard deviation σ, the greater the improvement of PMT over SNHT. The improvement in hit rate can be as much as 14%–25% for detecting small shifts (Δ < σ) regardless of time series length, and up to 5% for detecting medium shifts (Δ = σ–1.5σ) in time series of length N < 100. For all detectable shift sizes, the largest improvement is always obtained when N < 100, which is of great practical importance because most annual climate data series are shorter than 100 values.
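For comparison, a minimal sketch of the SNHT statistic that PMT is benchmarked against is given below: the series is standardized and T(k) is evaluated at every candidate changepoint. Critical values and any penalty term are deliberately omitted; this is an illustration of the reference method, not the paper's implementation.

```python
import numpy as np

def snht_scan(x):
    """Standard normal homogeneity test (SNHT) statistic at every
    candidate changepoint: standardize the series, then
    T(k) = k * z1_bar**2 + (n - k) * z2_bar**2, where z1_bar and
    z2_bar are the mean standardized anomalies before and after k."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    T = np.full(n, np.nan)
    for k in range(1, n):
        T[k] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return int(np.nanargmax(T)), T
```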


2020 ◽  
Author(s):  
Jonathan Sanching Tsay ◽  
Alan S. Lee ◽  
Guy Avraham ◽  
Darius E. Parvin ◽  
Jeremy Ho ◽  
...  

Motor learning experiments are typically run in person, exploiting finely calibrated setups (digitizing tablets, robotic manipulanda, full VR displays) that provide high temporal and spatial resolution. However, these experiments come at a cost: not only the one-time expense of purchasing equipment, but also the substantial time devoted to recruiting participants and administering the experiment. Moreover, exceptional circumstances that limit in-person testing, such as a global pandemic, may halt research progress. These limitations of in-person motor learning research motivated the design of OnPoint, an open-source software package for motor control and motor learning researchers. As with all online studies, OnPoint offers an opportunity to conduct large-N motor learning studies, with potential applications including faster pilot testing, replication of previous findings, and longitudinal studies (GitHub repository: https://github.com/alan-s-lee/OnPoint).


2021 ◽  
Vol 10 (4) ◽  
pp. 208
Author(s):  
Christoph Traun ◽  
Manuela Larissa Schreyer ◽  
Gudrun Wallentin

Time series animation of choropleth maps easily exceeds our perceptual limits. In this empirical study, we investigate the effect of local-outlier-preserving value generalization of animated choropleth maps on the ability to detect general trends and local deviations from them. Comparing generalization in space, in time, and in a combination of both dimensions, value smoothing based on a first-order spatial neighborhood best facilitated the detection of local outliers, followed by the spatiotemporal and temporal generalization variants. We did not find any evidence that value generalization helps in detecting global trends.
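As an illustration of the spatial generalization variant, the sketch below averages each region's value with those of its first-order (contiguity) neighbors for one animation frame. The data structures and toy values are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def neighborhood_smooth(values, neighbors):
    """Smooth one time step of a choropleth series by averaging each
    region with its first-order (contiguity) neighbors.  `values`
    maps region id to value; `neighbors` maps region id to a list of
    adjacent region ids."""
    smoothed = {}
    for region, value in values.items():
        pool = [value] + [values[nb] for nb in neighbors.get(region, []) if nb in values]
        smoothed[region] = float(np.mean(pool))
    return smoothed

# toy example: three regions in a row; B is pulled toward its neighbors
values = {"A": 1.0, "B": 10.0, "C": 2.0}
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(neighborhood_smooth(values, neighbors))
```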


Water ◽  
2021 ◽  
Vol 13 (13) ◽  
pp. 1723
Author(s):  
Ana Gonzalez-Nicolas ◽  
Marc Schwientek ◽  
Michael Sinsbeck ◽  
Wolfgang Nowak

Currently, the export regime of a catchment is often characterized by the relationship between compound concentration and discharge at the catchment outlet or, more specifically, by the regression slope in plots of log-concentration versus log-discharge. However, the scattered points in these plots usually do not follow a plain linear regression because of additional processes (e.g., hysteresis effects). This work proposes a simple stochastic time-series model for simulating compound concentrations in a river from river discharge. Our model has an explicit transition parameter that can morph the model between chemostatic and chemodynamic behavior and, unlike the typically used linear regression approach, an additional parameter that accounts for hysteresis by including correlation over time. We demonstrate the advantages of our model using a high-frequency series of nitrate concentrations collected with in situ analyzers in a catchment in Germany. Furthermore, we identify event-based optimal scheduling rules for sampling strategies. Overall, our results show that (i) our model is much more robust for estimating the export regime than the commonly used regression approach, and (ii) sampling strategies based on extreme events (including both high and low discharge) are key to reducing the prediction uncertainty of the catchment behavior. Thus, the results of this study can help characterize the export regime of a catchment and manage water pollution in rivers at lower monitoring costs.
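A minimal sketch of the underlying idea is given below: a power-law concentration-discharge relation whose exponent shifts the behavior between chemostatic and chemodynamic, plus an AR(1) error term that adds correlation over time. Parameter names and values are illustrative and do not reproduce the published model or its calibration.

```python
import numpy as np

def simulate_concentration(q, a=0.5, b=0.3, rho=0.8, sigma=0.1, seed=0):
    """Simulate concentrations from discharge with a power-law relation
    plus AR(1) noise: log C_t = a + b * log Q_t + e_t, where
    e_t = rho * e_{t-1} + eps_t.  b ~ 0 gives chemostatic behavior,
    larger |b| gives chemodynamic behavior, and rho adds the temporal
    correlation that a plain regression cannot represent.
    Illustrative parameters only."""
    rng = np.random.default_rng(seed)
    log_q = np.log(np.asarray(q, dtype=float))
    e = np.zeros(len(log_q))
    for t in range(1, len(e)):
        e[t] = rho * e[t - 1] + rng.normal(0.0, sigma)
    return np.exp(a + b * log_q + e)

# toy discharge series and simulated nitrate-like concentrations
q = np.exp(np.sin(np.linspace(0, 12, 500)) + 2.0)
c = simulate_concentration(q)
```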


2021 ◽  
Vol 13 (2) ◽  
pp. 205
Author(s):  
Philipp Hochreuther ◽  
Niklas Neckel ◽  
Nathalie Reimann ◽  
Angelika Humbert ◽  
Matthias Braun

The usability of multispectral satellite data for detecting and monitoring supraglacial meltwater ponds has been demonstrated for western Greenland. For a multitemporal analysis of large regions or the whole of Greenland, largely automated processing routines are required. Here, we present a sequence of algorithms that allow for automated Sentinel-2 data search, download, and processing and the generation of a consistent and dense melt pond area time series based on open-source software. We test our approach for a ~82,000 km² area at the 79°N Glacier (Nioghalvfjerdsbrae) in northeast Greenland, covering the years 2016 to 2019. Our lake detection is based on the ratio of the blue and red visible bands with a minimum threshold. To remove false classifications caused by the similar spectra of shadow and water on ice, we implement a shadow model to mask out topographically induced artifacts. We identified 880 individual lakes, traceable over 479 time steps throughout 2016–2019, with an average size of 64,212 m². Of the four years, 2019 had the most extensive lake area coverage, with a maximum of 333 km² and a maximum individual lake size of 30 km². With an average observation interval of 1.5 days, our time series allows for comparison with climate data of daily resolution, enabling a better understanding of short-term climate-glacier feedbacks.
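A minimal sketch of the core classification step is shown below: a blue/red band-ratio threshold with an optional shadow mask. The threshold value and the mask itself are assumptions for illustration, not the published configuration or shadow model.

```python
import numpy as np

def detect_melt_ponds(blue, red, ratio_threshold=1.6, shadow_mask=None):
    """Classify supraglacial melt ponds from Sentinel-2 reflectances
    using the blue/red band ratio with a minimum threshold, optionally
    removing pixels flagged by a boolean shadow mask.  The threshold
    and mask are illustrative assumptions."""
    blue = np.asarray(blue, dtype=float)
    red = np.asarray(red, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(red > 0, blue / red, 0.0)
    ponds = ratio >= ratio_threshold
    if shadow_mask is not None:
        ponds &= ~shadow_mask        # drop topographically shaded pixels
    return ponds

# usage idea: pond area in m² from a boolean mask of 10 m Sentinel-2 pixels
# area_m2 = detect_melt_ponds(blue, red, shadow_mask=shadows).sum() * 10 * 10
```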


2019 ◽  
Vol 11 (7) ◽  
pp. 866 ◽  
Author(s):  
Imke Hans ◽  
Martin Burgdorf ◽  
Stefan A. Buehler

Understanding the causes of inter-satellite biases in climate data records from observations of the Earth is crucial for constructing a consistent time series of the essential climate variables. In this article, we analyse the strong scan- and time-dependent biases observed for the microwave humidity sounders on board the NOAA-16 and NOAA-19 satellites. We find compelling evidence that radio frequency interference (RFI) is the cause of the biases. We also devise a correction scheme for the raw count signals for the instruments to mitigate the effect of RFI. Our results show that the RFI-corrected, recalibrated data exhibit distinctly reduced biases and provide consistent time series.


Water ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 95
Author(s):  
Yilinuer Alifujiang ◽  
Jilili Abuduwaili ◽  
Yongxiao Ge

This study investigated the temporal patterns of annual and seasonal river runoff at 13 hydrological stations in the Lake Issyk-Kul basin, Central Asia. Temporal trends were analyzed using the innovative trend analysis (ITA) method with significance testing, and the results were compared with the Mann-Kendall (MK) trend test at the 95% confidence level. The comparison revealed that the ITA method could effectively identify the trends detected by the MK trend test. Specifically, the MK test found that the percentage of time series with trends decreased from 46.15% in the north to 25.64% in the south, while the ITA method revealed a similar decrease, from 39.2% to 29.4%. According to the temporal distribution of the MK test, significantly increasing (decreasing) trends were observed in 5 (0), 6 (2), 4 (3), 8 (0), and 8 (1) time series for annual, spring, summer, autumn, and winter river runoff, respectively, while the ITA method detected significant trends in 7 (1), 9 (3), 6 (3), 9 (3), and 8 (2) time series in the study area. For the ITA method, the “peak” values of 24 time series (26.97%) exhibited increasing patterns, the “low” values of 25 time series (28.09%) displayed increasing patterns, and the “medium” values of 40 time series (44.94%) showed increasing patterns. With respect to the “low”, “medium”, and “peak” values, five time series (33.33%), seven time series (46.67%), and three time series (20%) showed decreasing trends, respectively. These results detail the patterns of annual and seasonal river runoff by evaluating “low”, “medium”, and “peak” values.
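For reference, the sketch below implements the two standard building blocks compared in the study: the Mann-Kendall Z statistic (without tie correction) and the ITA construction, which pairs the sorted first half of a series against its sorted second half so that points above the 1:1 line indicate an increasing trend in that part of the value range. It is a generic illustration, not the authors' code.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic, its variance under the
    null hypothesis (no tie correction), and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - stats.norm.cdf(abs(z)))

def ita_points(x):
    """Innovative trend analysis: split the series into two halves,
    sort each, and return the paired values.  Points above the 1:1
    line indicate an increasing trend for 'low', 'medium', or 'peak'
    values depending on where they fall in the range."""
    x = np.asarray(x, dtype=float)
    half = len(x) // 2
    return np.sort(x[:half]), np.sort(x[half:2 * half])
```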

