Time series homogenization with the ACMANT software

2021
Author(s):  
Peter Domonkos

The development of the ACMANT homogenization software started during the European COST HOME project, around 2010. Owing to its excellent results in method comparison tests, the development of ACMANT has continued ever since. While the first version was applicable only to the homogenization of monthly temperature series, the later versions can be applied to a wide range of climatic variables and to both monthly and daily time series.

The operation of ACMANT is fast and automatic, and it is easy to use even for large datasets. The method can homogenize time series of varied lengths together, tolerates data gaps well, and includes outlier filtering and optional infilling of data gaps. ACMANT includes modern and effective statistical tools for the detection and removal of inhomogeneities, such as step function fitting, bivariate detection for breaks in annual means and seasonal amplitudes (where applicable), the ANOVA correction method, and ensemble homogenization with varied pre-homogenization of neighbour series. Owing to these properties, ACMANTv4 was the most accurate homogenization method in most method comparison tests of the Spanish MULTITEST project (https://doi.org/10.1175/JCLI-D-20-0611.1). One important exception occurred in these tests: network mean trend errors were removed with significantly higher certainty by the Pairwise Homogenization Algorithm when approximately half of the time series were affected by quasi-synchronous breaks imitating concerted technical changes in the performance of climate observations. The most recent developments, aimed at the release of ACMANTv5, eliminate this drawback of ACMANT.

For ACMANTv5, a new break detection method has been developed in which two time series comparison approaches are combined. The new method uses both composite reference series and pairwise comparisons, and in the detection with composite reference series the step function fitting is forced to include the breaks detected by the pairwise comparisons. Another novelty of ACMANTv5 is that it offers options to use metadata in the homogenization procedure. The default operation mode of ACMANTv5 is still fully automatic, with or without the automatic use of a prepared metadata table. ACMANTv5 uses every date in the metadata list as a break indicator, and these are evaluated together with the other indicators obtained by pairwise comparisons. Optionally, ACMANTv5 allows users to edit the list of detected breaks based on the pairwise detections of the first homogenization round. In the later steps of ACMANTv5 user intervention is not possible, but metadata may also be considered by the automatic procedure in the final estimation of break positions.
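
To make the forced-break idea concrete, here is a minimal Python sketch (not ACMANT's own code) of least-squares step-function fitting on a candidate-minus-reference difference series, in which a set of break positions supplied in advance, for instance breaks found by pairwise comparisons or dates taken from metadata, is always retained while any further breaks are searched for. The exhaustive search and the BIC-style penalty are illustrative assumptions rather than ACMANT's actual detection criterion.

```python
# Minimal sketch: penalised least-squares step-function fitting with forced breaks.
import numpy as np
from itertools import combinations

def sse_of_segment(cs, cs2, i, j):
    """Sum of squared deviations of x[i:j] (j exclusive) from its own mean."""
    n = j - i
    s = cs[j] - cs[i]
    return (cs2[j] - cs2[i]) - s * s / n

def fit_step_function(x, k_max, forced_breaks=()):
    """Return break positions (segment start indices, 1..n-1) minimising a
    penalised SSE; indices listed in forced_breaks are always retained."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    cs = np.concatenate(([0.0], np.cumsum(x)))
    cs2 = np.concatenate(([0.0], np.cumsum(x * x)))
    forced = sorted(set(forced_breaks))
    candidates = [i for i in range(1, n) if i not in forced]
    best_score, best_breaks = np.inf, forced
    # exhaustive search over the extra (free) breaks; adequate for short
    # monthly/annual series, a dynamic programme would replace this in practice
    for k_free in range(0, max(0, k_max - len(forced)) + 1):
        for combo in combinations(candidates, k_free):
            breaks = sorted(forced + list(combo))
            edges = [0] + breaks + [n]
            sse = sum(sse_of_segment(cs, cs2, a, b)
                      for a, b in zip(edges[:-1], edges[1:]))
            score = n * np.log(sse / n + 1e-12) + len(breaks) * np.log(n)  # crude BIC
            if score < best_score:
                best_score, best_breaks = score, breaks
    return best_breaks
```

Called as, e.g., fit_step_function(diff_series, k_max=3, forced_breaks=[120]), the function returns the penalised-least-squares break positions, which necessarily include index 120.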

2021
Vol 13 (9)
pp. 1743
Author(s):
Daniel Paluba,
Josef Laštovička,
Antonios Mouratidis,
Přemysl Štych

This study deals with a local incidence angle correction method, i.e., the land cover-specific local incidence angle correction (LC-SLIAC), based on the linear relationship between backscatter values and the local incidence angle (LIA) for a given land cover type in the monitored area. Using the combination of the CORINE Land Cover and Hansen et al.'s Global Forest Change databases, a wide range of different LIAs for a specific forest type can be generated for each scene. The algorithm was developed and tested in the cloud-based platform Google Earth Engine (GEE) using Sentinel-1 open access data, the Shuttle Radar Topography Mission (SRTM) digital elevation model, and the CORINE Land Cover and Hansen et al.'s Global Forest Change databases. The developed method was created primarily for time-series analyses of forests in mountainous areas. LC-SLIAC was tested in 16 study areas over several protected areas in Central Europe. The results after correction by LC-SLIAC showed a reduction of the variance and range of backscatter values. A statistically significant reduction in variance (of more than 40%) was achieved in areas with an LIA range >50° and an LIA interquartile range (IQR) >12°, while in areas with a low LIA range and LIA IQR, the decrease in variance was very low and not statistically significant. Six case studies with different LIA ranges were further analyzed in pre- and post-correction time series. Time series after the correction showed reduced fluctuation of backscatter values caused by different LIAs in each acquisition path. This reduction was statistically significant (with up to 95% reduction of variance) in areas with a difference in LIA greater than or equal to 27°. LC-SLIAC is freely available on GitHub and GEE, making the method accessible to the wide remote sensing community.
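
As an illustration of the underlying idea, the following minimal NumPy sketch fits the linear backscatter-LIA relationship over the pixels of one land-cover class and normalizes the scene to a common reference angle. The formula sigma0_corr = sigma0 - slope * (LIA - LIA_ref) and the reference angle of 40° are common choices assumed here; the sketch does not reproduce the actual Google Earth Engine implementation of LC-SLIAC.

```python
# Hedged sketch of a land-cover-specific linear LIA correction (not the GEE code).
import numpy as np

def lia_correction(sigma0_db, lia_deg, forest_mask, lia_ref=40.0):
    """Normalize backscatter (dB) to a reference LIA using a linear fit
    estimated only over pixels of the target land-cover class."""
    x = lia_deg[forest_mask].ravel()
    y = sigma0_db[forest_mask].ravel()
    slope, intercept = np.polyfit(x, y, 1)            # dB per degree of LIA
    corrected = sigma0_db - slope * (lia_deg - lia_ref)
    return corrected, slope
```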


Author(s):  
Peter Domonkos

The removal of non-climatic biases, so-called inhomogeneities, from long climatic records requires sophisticated statistical methods. One principle is that the differences between a candidate series and its neighbour series are usually analysed instead of the candidate series directly, in order to neutralize the possible impacts of regionally common natural climate variation on the detection of inhomogeneities. In most homogenization methods, two main kinds of time series comparisons are applied, i.e. composite reference series or pairwise comparisons. In composite reference series the inhomogeneities of neighbour series are attenuated by averaging the individual series, and the accuracy of homogenization can be improved by the iterative improvement of composite reference series. By contrast, pairwise comparisons have the advantage that coincidental inhomogeneities affecting several station series in a similar way can be identified with higher certainty than with composite reference series. In addition, homogenization with pairwise comparisons tends to facilitate the most accurate regional trend estimations. A new time series comparison method is presented here, which combines the use of pairwise comparisons and composite reference series in a way that their advantages are unified. This time series comparison method is embedded into the ACMANT homogenization method, and tested in large, commonly available monthly temperature test datasets.


Atmosphere
2021
Vol 12 (9)
pp. 1134
Author(s):  
Peter Domonkos

The removal of non-climatic biases, so-called inhomogeneities, from long climatic records requires sophisticated statistical methods. One principle is that the differences between a candidate series and its neighbor series are usually analyzed instead of the candidate series directly, in order to neutralize the possible impacts of regionally common natural climate variation on the detection of inhomogeneities. In most homogenization methods, two main kinds of time series comparisons are applied, i.e., composite reference series or pairwise comparisons. In composite reference series, the inhomogeneities of neighbor series are attenuated by averaging the individual series, and the accuracy of homogenization can be improved by the iterative improvement of composite reference series. By contrast, pairwise comparisons have the advantage that coincidental inhomogeneities affecting several station series in a similar way can be identified with higher certainty than with composite reference series. In addition, homogenization with pairwise comparisons tends to facilitate the most accurate regional trend estimations. A new time series comparison method is presented here, which combines the use of pairwise comparisons and composite reference series in a way that their advantages are unified. This time series comparison method is embedded into the Applied Caussinus-Mestre Algorithm for homogenizing Networks of climatic Time series (ACMANT) homogenization method, and tested in large, commonly available monthly temperature test datasets. Further favorable characteristics of ACMANT are also discussed.
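
The composite-reference ingredient of such methods can be illustrated with a short sketch: the candidate series is compared with a weighted mean of its neighbors, so that inhomogeneities of individual neighbor series are attenuated by the averaging. Weighting by the squared correlation of first-difference series is a common convention assumed here and is not necessarily the weighting used inside ACMANT.

```python
# Illustrative composite-reference sketch, assuming correlation-squared weights.
import numpy as np

def difference_series(candidate, neighbours):
    """candidate: 1-D array; neighbours: 2-D array (n_neighbours, n_time)."""
    d_cand = np.diff(candidate)
    weights = np.array([np.corrcoef(d_cand, np.diff(nb))[0, 1] ** 2
                        for nb in neighbours])
    weights /= weights.sum()
    composite = weights @ neighbours        # weighted mean reference series
    return candidate - composite            # difference series analysed for breaks
```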


The linear electrical properties of muscle fibres have been examined using intracellular electrodes for a.c. measurements and analyzing observations on the basis of cable theory. The measurements have covered the frequency range 1 c/s to 10 kc/s. Comparison of the theory for the circular cylindrical fibre with that for the ideal, one-dimensional cable indicates that, under the conditions of the experiments, no serious error would be introduced in the analysis by the geometrical idealization. The impedance locus for frog sartorius and crayfish limb muscle fibres deviates over a wide range of frequencies from that expected for a simple model in which the current path between the inside and the outside of the fibre consists only of a resistance and a capacitance in parallel. A good fit of the experimental results on frog fibres is obtained if the inside-outside admittance is considered to contain, in addition to the parallel elements R_m = 3100 Ω cm² and C_m = 2.6 μF/cm², another path composed of a resistance R_e = 330 Ω cm² in series with a capacitance C_e = 4.1 μF/cm², all referred to unit area of fibre surface. The impedance behaviour of crayfish fibres can be described by a similar model, the corresponding values being R_m = 680 Ω cm², C_m = 3.9 μF/cm², R_e = 35 Ω cm², C_e = 17 μF/cm². The response of frog fibres to a step-function current (with the points of voltage recording and current application close together) has been analyzed in terms of the above two-time-constant model, and it is shown that neglecting the series resistance would have an appreciable effect on the agreement between theory and experiment only at times less than the half-time of rise of the response. The elements R_m and C_m are presumed to represent properties of the surface membrane of the fibre. R_e and C_e are thought to arise not at the surface, but to be indicative of a separate current path from the myoplasm through an intracellular system of channels to the exterior. In the case of crayfish fibres, it is possible that R_e (when referred to unit volume) would be a measure of the resistivity of the interior of the channels, and C_e the capacitance across the walls of the channels. In the case of frog fibres, it is suggested that the elements R_e, C_e arise from the properties of adjacent membranes of the triads in the sarcoplasmic reticulum. The possibility is considered that the potential difference across the capacitance C_e may control the initiation of contraction.
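
For readers who want to reproduce the shape of the impedance locus, the following short script evaluates the two-time-constant model described above (a parallel R_m-C_m path plus a series R_e-C_e path, per unit area of fibre surface) with the frog sartorius values quoted in the abstract; the frequency grid and the printed summary are arbitrary choices covering the measured range, not part of the original analysis.

```python
# Worked evaluation of the two-time-constant membrane model (frog sartorius values).
import numpy as np

R_m, C_m = 3100.0, 2.6e-6    # ohm*cm^2, F/cm^2  (parallel surface-membrane path)
R_e, C_e = 330.0, 4.1e-6     # ohm*cm^2, F/cm^2  (series path via internal channels)

f = np.logspace(0, 4, 200)   # 1 Hz to 10 kHz, the measured range
w = 2 * np.pi * f
# inside-outside admittance per unit area: parallel R_m, C_m plus series R_e-C_e branch
Y = 1.0 / R_m + 1j * w * C_m + 1j * w * C_e / (1.0 + 1j * w * R_e * C_e)
Z = 1.0 / Y                  # complex impedance locus (real vs. -imaginary part)
print(f"|Z| at 1 Hz ~ {abs(Z[0]):.0f} ohm*cm^2, at 10 kHz ~ {abs(Z[-1]):.1f} ohm*cm^2")
```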


2021
Vol 13 (16)
pp. 3069
Author(s):
Yadong Liu,
Junhwan Kim,
David H. Fleisher,
Kwang Soo Kim

Seasonal forecasts of crop yield are important components of agricultural policy decisions and farmer planning. A wide range of input data is often needed to forecast crop yield in a region where sophisticated approaches such as machine learning and process-based models are used. This requires considerable effort for data preparation in addition to identifying data sources. Here, we propose a simpler approach called the Analogy Based Crop-yield (ABC) forecast scheme to make timely and accurate predictions of regional crop yield using a minimal set of inputs. In the ABC method, a growing season from a prior long-term period, e.g., 10 years, is first identified as analogous to the current season by the use of a similarity index based on time series of leaf area index (LAI) patterns. Crop yield in the given growing season is then forecast using the weighted average of the yields reported in the analogous seasons for the area of interest. The ABC approach was used to predict corn and soybean yields in the Midwestern U.S. at the county level for the period 2017–2019. MOD15A2H, a satellite data product for LAI, was used to compile the inputs. The mean absolute percentage error (MAPE) of crop yield forecasts was <10% for corn and soybean in each growing season when the time series of LAI from day of year 89 to 209 was used as input to the ABC approach. The prediction error of the ABC approach was comparable to results from a deep neural network model that relied on soil and weather data as well as satellite data in a previous study. These results indicate that the ABC approach allows for crop yield forecasts with a lead time of at least two months before harvest. In particular, the ABC scheme would be useful for regions where crop yield forecasts are limited by the availability of reliable environmental data.
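
A minimal sketch of the analogy step might look as follows: past seasons whose LAI trajectories are most similar to the current one are selected, and their reported yields are averaged with similarity-based weights. The inverse-RMSE similarity index and the use of three analogue seasons are illustrative assumptions; the paper defines its own index and weighting.

```python
# Hedged sketch of an analogy-based yield forecast from LAI time series.
import numpy as np

def abc_forecast(current_lai, past_lai, past_yield, n_analogues=3):
    """current_lai: (n_days,); past_lai: (n_years, n_days); past_yield: (n_years,)."""
    rmse = np.sqrt(np.mean((past_lai - current_lai) ** 2, axis=1))
    similarity = 1.0 / (rmse + 1e-9)                 # illustrative similarity index
    top = np.argsort(similarity)[-n_analogues:]      # most analogous past seasons
    weights = similarity[top] / similarity[top].sum()
    return float(weights @ past_yield[top])          # similarity-weighted yield average
```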


2019
Vol 12 (11)
pp. 4661-4679
Author(s):
Bin Cao,
Xiaojing Quan,
Nicholas Brown,
Emilie Stewart-Jones,
Stephan Gruber

Abstract. Simulations of land-surface processes and phenomena often require driving time series of meteorological variables. Corresponding observations, however, are unavailable in most locations, even more so, when considering the duration, continuity and data quality required. Atmospheric reanalyses provide global coverage of relevant meteorological variables, but their use is largely restricted to grid-based studies. This is because technical challenges limit the ease with which reanalysis data can be applied to models at the site scale. We present the software toolkit GlobSim, which automates the downloading, interpolation and scaling of different reanalyses – currently ERA5, ERA-Interim, JRA-55 and MERRA-2 – to produce meteorological time series for user-defined point locations. The resulting data have consistent structure and units to efficiently support ensemble simulation. The utility of GlobSim is demonstrated using an application in permafrost research. We perform ensemble simulations of ground-surface temperature for 10 terrain types in a remote tundra area in northern Canada and compare the results with observations. Simulation results reproduced seasonal cycles and variation between terrain types well, demonstrating that GlobSim can support efficient land-surface simulations. Ensemble means often yielded better accuracy than individual simulations and ensemble ranges additionally provide indications of uncertainty arising from uncertain input. By improving the usability of reanalyses for research requiring time series of climate variables for point locations, GlobSim can enable a wide range of simulation studies and model evaluations that previously were impeded by technical hurdles in obtaining suitable data.
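
As a rough illustration of the grid-to-point step that such a toolkit automates (alongside downloading, unit scaling and temporal handling), the sketch below bilinearly interpolates a gridded reanalysis field to a single site. It is not GlobSim's API; the grid, the stand-in temperature field and the site coordinates are placeholders.

```python
# Illustrative grid-to-point interpolation of a reanalysis field (not GlobSim's API).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

lats = np.arange(60.0, 70.1, 0.25)                    # reanalysis grid latitudes
lons = np.arange(-115.0, -104.9, 0.25)                # reanalysis grid longitudes
t2m = 263.0 + np.random.rand(lats.size, lons.size)    # stand-in 2 m temperature field [K]

interp = RegularGridInterpolator((lats, lons), t2m, method="linear")
site_lat, site_lon = 64.87, -111.57                   # hypothetical tundra site
print("interpolated T2m [K]:", float(interp([[site_lat, site_lon]])[0]))
```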


2017
Vol 54 (1)
pp. 203-236
Author(s):
Yan Zhu,
Zachary Zimmerman,
Nader Shakibay Senobari,
Chin-Chia Michael Yeh,
Gareth Funning,
...  

2018
Vol 22 (2)
pp. 1175-1192
Author(s):
Qian Zhang,
Ciaran J. Harman,
James W. Kirchner

Abstract. River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling – in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) – are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb–Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5% of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
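
The Lomb–Scargle route to β can be sketched in a few lines: compute the periodogram of the irregularly sampled series and fit the slope of log power versus log frequency. The synthetic Brown-noise construction and the plain least-squares slope fit below are simplifications of the procedures evaluated in the paper.

```python
# Sketch: spectral-slope (beta) estimation on an irregularly sampled series.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 400))                     # irregular sample times
dt = np.diff(t, prepend=0.0)
x = np.cumsum(rng.normal(size=t.size) * np.sqrt(dt))           # Brown-noise-like series (beta ~ 2)
x = x - x.mean()

freqs = np.logspace(-2.5, -0.5, 60)                            # cycles per time unit
pgram = lombscargle(t, x, 2 * np.pi * freqs)                   # lombscargle expects angular freqs
beta = -np.polyfit(np.log10(freqs), np.log10(pgram), 1)[0]     # slope of log-log periodogram
print(f"estimated spectral slope beta ~ {beta:.2f}")
```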


2021
Vol 7
pp. e744
Author(s):
Si Thu Aung,
Yodchanan Wongsawat

Epilepsy is a common neurological disease that affects a wide range of the world population and is not limited by age. Moreover, seizures can occur anytime and anywhere because of the sudden abnormal discharge of brain neurons, leading to malfunction. The seizures of approximately 30% of epilepsy patients cannot be treated with medicines or surgery; hence, these patients would benefit from a seizure prediction system to live normal lives. Thus, a system that can predict a seizure before its onset could improve not only these patients' social lives but also their safety. Numerous seizure prediction methods have already been proposed, but the performance measures of these methods are still inadequate for a complete prediction system. Here, a seizure prediction system is proposed by exploring the advantages of multivariate entropy, which can reflect the complexity of multivariate time series over multiple scales (frequencies), called multivariate multiscale modified-distribution entropy (MM-mDistEn), combined with an artificial neural network (ANN). The phase-space reconstruction and the estimation of the probability density between vectors provide hidden complex information. The multivariate time series property of MM-mDistEn provides more understandable information within the multichannel data and makes the prediction of epileptic seizures possible. Moreover, the proposed method was tested with two different analyses: a simulation data analysis showed that the proposed method has strong consistency over different parameter selections, and the experimental data analysis showed that the proposed entropy combined with an ANN achieves performance measures of 98.66% accuracy, 91.82% sensitivity, 99.11% specificity, and an area under the curve (AUC) value of 0.84. In addition, a seizure alarm system was applied as a postprocessing step for prediction purposes, and a false alarm rate of 0.014 per hour and an average prediction time of 26.73 min before seizure onset were achieved by the proposed method. Thus, the proposed entropy as a feature extraction method combined with an ANN can predict the ictal state of epilepsy, and the results show great potential for all epilepsy patients.
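
For orientation, the following is a simplified, single-channel sketch of distribution entropy (DistEn), the family that MM-mDistEn extends to multivariate, multiscale data: embed the series in phase space, form the distribution of pairwise distances between the embedded vectors, and take the normalized Shannon entropy of that distribution. The embedding dimension, delay and bin count are illustrative defaults, not the parameters of the proposed method.

```python
# Simplified single-channel distribution entropy (DistEn) sketch.
import numpy as np

def dist_entropy(x, m=3, delay=1, bins=64):
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * delay
    vectors = np.array([x[i:i + m * delay:delay] for i in range(n)])  # phase-space embedding
    # Chebyshev distances between all distinct pairs of embedded vectors
    d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=-1)
    d = d[np.triu_indices(n, k=1)]
    p, _ = np.histogram(d, bins=bins)
    p = p[p > 0] / p.sum()                                            # empirical distance distribution
    return float(-(p * np.log2(p)).sum() / np.log2(bins))             # normalized to [0, 1]
```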


2019
Vol 5 (01)
pp. 47-54
Author(s):  
Wigid Hariadi

Abstract. Intervention analysis is used to evaluate the effect of external events on time series data. The Sea-highway program was one of the flagship programs of the Joko Widodo-Jusuf Kalla pair in the 2014 presidential election. The author therefore models the effect of the Sea-highway program on stock price movements in the shipping sector, namely TMAS.JK (Pelayaran Tempuran Emas Tbk). The analysis shows that interventions occurred in the movement of the daily stock price of TMAS.JK as a result of the Sea-highway program. Intervention I, on 11 August 2014, is attributed to the election of the Joko Widodo-Jusuf Kalla pair as President and Vice President of the Republic of Indonesia on 22 July 2014. Intervention II, on 10 November 2014, corresponds to President Joko Widodo's speech at the APEC forum on the Sea-highway program, in which investment in port construction was offered to foreign countries. The appropriate time series model is therefore a multi-input step-function intervention model, namely ARIMA (2,1,0) with Step I (b=0, s=2, r=1) and Step II (b=3, s=0, r=1). Keywords: Intervention Analysis, Multi Input, Step Function, Sea-highway.
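
A hedged sketch of a step-intervention ARIMA fit of this kind, using statsmodels, is shown below. Simple 0/1 step dummies stand in for the full (b, s, r) transfer-function terms, and the simulated series and dates are placeholders rather than the TMAS.JK data.

```python
# Simplified step-intervention ARIMA(2,1,0) fit with placeholder data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

idx = pd.bdate_range("2014-01-01", periods=300)
price = pd.Series(np.cumsum(np.random.normal(size=len(idx))) + 100.0, index=idx)

steps = pd.DataFrame({
    "step1": (idx >= "2014-08-11").astype(float),   # intervention I
    "step2": (idx >= "2014-11-10").astype(float),   # intervention II
}, index=idx)

model = SARIMAX(price, exog=steps, order=(2, 1, 0))
result = model.fit(disp=False)
print(result.params[["step1", "step2"]])            # estimated intervention effects
```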

