Insights from regional and short-term biodiversity monitoring datasets are valuable: A Reply to Daskalova et al. 2020. EcoEvoRxiv. doi:10.32942/osf.io/cg3zs

2020 ◽  
Author(s):  
Sebastian Seibold ◽  
Torsten Hothorn ◽  
Martin M. Gossner ◽  
Nadja K. Simons ◽  
Nico Blüthgen ◽  
...  

Reports of major losses in biodiversity have stimulated an increasing interest in temporal population changes, particularly in insects, which had received little attention in the past. Existing long-term datasets are often limited to a small number of study sites, few points in time, a narrow range of land-use intensities and only some taxonomic groups, or they lack standardized sampling. While new multi-site monitoring programs have been initiated, most of them still cover rather short time periods. Daskalova et al. (2020) argue that temporal trends of insect populations derived from short time series are biased towards extreme trends, while their own analysis of an assembly of shorter- and longer-term time series does not support an overall insect decline. With respect to the results of Seibold et al., based on a 10-year multi-site time series, they claim that the analysis suffers from not accounting for temporal pseudoreplication. In this note, we explain why the criticism that the analysis of Seibold et al. lacks statistical rigour is not warranted. Models that include ‘year’ as a random effect, as suggested by Daskalova et al. (2020), fail to detect non-linear trends and assume that consecutive years are independent samples, which is questionable for insect time-series data. We agree with Daskalova et al. (2020) that the assembly and analysis of larger datasets is urgently needed, but it will take time until such datasets are available. Thus, short-term datasets like ours are highly valuable; they should be extended and analysed continually to provide a more detailed understanding of how insect populations are changing under the influence of global change, and to trigger immediate conservation actions.
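The independence assumption at issue can be made concrete with a small simulation. The sketch below uses hypothetical data, not the series analysed by either paper: yearly effects generated by an AR(1) process, a common model for autocorrelated ecological time series, show a clearly non-zero lag-1 autocorrelation, which a model treating ‘year’ as an i.i.d. random intercept would wrongly assume away. A series longer than 10 years is used here only so the autocorrelation estimate is stable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate autocorrelated yearly effects with an AR(1) process
# (illustrative values, not data from either paper).
n_years = 200
phi = 0.7                                # AR(1) coefficient: consecutive years depend on each other
eps = rng.normal(0, 1, n_years)
year_effect = np.zeros(n_years)
for t in range(1, n_years):
    year_effect[t] = phi * year_effect[t - 1] + eps[t]

# Lag-1 autocorrelation of the yearly effects
r1 = np.corrcoef(year_effect[:-1], year_effect[1:])[0, 1]
print(f"lag-1 autocorrelation: {r1:.2f}")

# A random-intercept-per-year model assumes r1 is approximately 0;
# a clearly positive r1 shows that assumption is violated for such data.
```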

Author(s):  
Pantelis Samartsidis ◽  
Natasha N. Martin ◽  
Victor De Gruttola ◽  
Frank De Vocht ◽  
Sharon Hutchinson ◽  
...  

Abstract Objectives: The causal impact method (CIM) was recently introduced for the evaluation of binary interventions using observational time-series data. The CIM is appealing for practical use as it can adjust for temporal trends and account for potential unobserved confounding. However, the method was initially developed for applications involving large datasets, and hence its potential in small epidemiological studies is still unclear. Further, the effects that measurement error can have on the performance of the CIM have not yet been studied. The objective of this work is to investigate both of these open problems. Methods: Motivated by an existing dataset of HCV surveillance in the UK, we perform simulation experiments to investigate the effect of several characteristics of the data on the performance of the CIM. Further, we quantify the effects of measurement error on the performance of the CIM and extend the method to deal with this problem. Results: We identify multiple characteristics of the data that affect the ability of the CIM to detect an intervention effect, including the length of the time series, the variability of the outcome and the degree of correlation between the outcome of the treated unit and the outcomes of controls. We show that measurement error can introduce biases in the estimated intervention effects and heavily reduce the power of the CIM. Using an extended CIM, some of these adverse effects can be mitigated. Conclusions: The CIM can provide satisfactory power in evaluations of public health interventions, but it may provide misleading results in the presence of measurement error.
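The core idea behind the CIM can be sketched with a simplified regression analogue: fit the relationship between the treated unit and control units on the pre-intervention period, predict a counterfactual for the post-period, and take the difference as the effect estimate. This is a hedged toy illustration with invented data; the actual CIM is based on a Bayesian structural time-series model, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 40 time points, intervention at t = 25
# (hypothetical series, not the HCV surveillance data from the paper).
T, t0 = 40, 25
controls = rng.normal(size=(T, 3)).cumsum(axis=0)         # control-unit outcomes (random walks)
true_weights = np.array([0.5, 0.3, 0.2])
treated = controls @ true_weights + rng.normal(0, 0.1, T)
treated[t0:] += 2.0                                       # true intervention effect

# Fit the treated~controls relationship on the pre-intervention period only
X_pre, y_pre = controls[:t0], treated[:t0]
w, *_ = np.linalg.lstsq(X_pre, y_pre, rcond=None)

# Counterfactual: what the treated unit would have looked like without the
# intervention; the estimated effect is the mean post-period difference.
counterfactual = controls @ w
effect = (treated[t0:] - counterfactual[t0:]).mean()
print(f"estimated effect: {effect:.2f}")                  # should be near the true 2.0
```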


2021 ◽  
Vol 10 (3) ◽  
pp. 134-143
Author(s):  
Annisa Yulianti ◽  
Hadi Sasana

This study aims to analyze the short-term and long-term relationship between increases in the minimum wage in Central Java and employment. The research method used is the error correction model (ECM). The variables of this study include labor, the minimum wage, domestic investment (PMDN), and economic growth. The data used are time-series data from 1990-2020. The results show that the minimum wage has a positive and significant relationship with employment in the long term, but not a significant one in the short term. PMDN has a negative but significant correlation in both the short and long term, while economic growth has a positive but insignificant relationship with employment absorption in both the long and short term.
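An ECM of the kind used here is commonly estimated in two steps (the Engle-Granger procedure): a long-run cointegrating regression, then a regression of first differences that includes the lagged equilibrium error. The following is a minimal sketch on simulated data, not the 1990-2020 Indonesian series, and uses a single regressor rather than the study's full variable set.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated cointegrated pair (illustrative): x_t is a random walk
# (e.g. a log wage index), y_t tracks the long-run relation
# y = 2 + 0.8 x with stationary AR(1) deviations.
n = 200
x = rng.normal(0, 1, n).cumsum()
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal(0, 0.2)           # equilibrium error
y = 2.0 + 0.8 * x + u

# Step 1: long-run (cointegrating) regression
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta                                     # deviation from long-run equilibrium

# Step 2: error correction model on first differences,
# with the lagged residual as the error-correction term
dy, dx, ect = np.diff(y), np.diff(x), resid[:-1]
Z = np.column_stack([np.ones(n - 1), dx, ect])
gamma, *_ = np.linalg.lstsq(Z, dy, rcond=None)
print(f"long-run slope: {beta[1]:.2f}, speed of adjustment: {gamma[2]:.2f}")
```

A negative coefficient on the error-correction term indicates that short-run deviations from the long-run relation are corrected over time, which is the feature the ECM is designed to capture.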


2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Jia Chaolong ◽  
Xu Weixiang ◽  
Wang Futian ◽  
Wang Hanning

The combination of linear and nonlinear methods is widely used in the prediction of time-series data. This paper analyzes track irregularity time-series data using gray incidence degree models and methods of data transformation, seeking the underlying relationships within the time-series data. In this paper, GM(1,1) is based on a first-order, single-variable linear differential equation; after an adaptive improvement and error correction, it is used to predict the long-term changing trend of track irregularity at a fixed measuring point. The stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity over a unit section. Results for both long-term and short-term changes show that the models are effective and achieve the expected accuracy.
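The basic GM(1,1) procedure named above can be sketched in a few lines: accumulate the series (AGO), estimate the development coefficient and gray input by least squares, then invert the fitted exponential response. The values below are made up for illustration and are not the paper's track-irregularity measurements, and this sketch omits the paper's adaptive improvement and error correction.

```python
import numpy as np

# GM(1,1) gray model: fits a first-order, single-variable differential
# equation to a short positive series (hypothetical values).
x0 = np.array([2.87, 3.28, 3.34, 3.77, 3.83, 3.87])

x1 = np.cumsum(x0)                          # accumulated generating operation (AGO)
z1 = 0.5 * (x1[1:] + x1[:-1])               # consecutive-neighbour means

# Least-squares estimate of development coefficient a and gray input b,
# from the gray differential equation x0[k] = -a*z1[k] + b
B = np.column_stack([-z1, np.ones(len(z1))])
(a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)

# Time-response function, then inverse AGO gives fitted values and forecasts
k = np.arange(len(x0) + 3)                  # 3-step-ahead forecast
x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
print(np.round(x0_hat, 2))
```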


2020 ◽  
Author(s):  
Hiroki Ogawa ◽  
Yuki Hama ◽  
Koichi Asamori ◽  
Takumi Ueda

Abstract In the magnetotelluric (MT) method, the responses of the natural electromagnetic fields are evaluated by transforming time-series data into spectral data and calculating the apparent resistivity and phase. The continuous wavelet transform (CWT) can be an alternative to the short-time Fourier transform, and the applicability of CWT to MT data has been reported. There are, however, few cases that consider the effect of numerical errors arising from the spectral transform on MT data processing. In general, it is desirable to adopt a window function narrow in the time domain for higher-frequency components and one narrow in the frequency domain for lower-frequency components. In conducting the short-time Fourier transform, because the size of the window function is fixed unless the time-series data are decimated, there may be differences between the calculated MT responses and the true ones due to numerical errors. Meanwhile, CWT can strike a balance between the resolution of the time and frequency domains by magnifying or reducing the wavelet according to the value of frequency. Although the type of wavelet function and its parameters influence the resolution in time and frequency, these calculation settings of CWT are often determined empirically. In this study, focusing on the frequency band between 0.001 Hz and 10 Hz, we demonstrate the superiority of utilizing CWT in MT data processing and determine its proper calculation settings in terms of restraining the numerical errors caused by the spectral transform of time-series data. The results obtained with the short-time Fourier transform accompanied by gradual decimation of the time-series data, called cascade decimation, are compared with those of CWT. The shape of the wavelet is changed by using different types of wavelet functions or different parameters, and the respective results of data processing are compared. Through these experiments, this study indicates that CWT with the complex Morlet function, with its wavelet parameter k set to 6 ≤ k < 10, is effective in restraining the numerical errors caused by the spectral transform.
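The scale-frequency trade-off controlled by the Morlet parameter k can be illustrated with a minimal CWT built by direct convolution. This is a hedged sketch, not the paper's MT processing chain: the wavelet normalization and the test signal are chosen only to show that the power ridge lands at the signal's frequency, with the scale s = k / (2πf) tied to each target frequency.

```python
import numpy as np

# Minimal CWT with a complex Morlet wavelet. The parameter k (often
# written omega0) trades time resolution against frequency resolution.
def cwt_morlet(x, dt, freqs, k=6.0):
    """Complex Morlet CWT of signal x sampled at interval dt."""
    n = len(x)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = k / (2 * np.pi * f)                  # scale for centre frequency f
        wavelet = np.exp(1j * k * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s)                    # keep amplitudes comparable across scales
        out[i] = np.convolve(x, wavelet, mode="same") * dt
    return out

# A 0.5 Hz sine should produce the strongest power at the 0.5 Hz scale
dt = 0.05                                        # 20 Hz sampling
t = np.arange(0, 60, dt)
x = np.sin(2 * np.pi * 0.5 * t)
freqs = np.array([0.1, 0.25, 0.5, 1.0, 2.0])
power = np.abs(cwt_morlet(x, dt, freqs)) ** 2
print(freqs[np.argmax(power.mean(axis=1))])      # -> 0.5
```

Increasing k widens the wavelet in time (sharper frequency resolution, coarser time resolution), which is the balance the study tunes in the 6 ≤ k < 10 range.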


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling under uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but they fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem: LSTM is designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation analysis validates that the LSTM model provides better forecasts of the Standard & Poor’s 500 Index (S&P 500) in terms of error metrics than other forecasting models.
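The vanishing gradient problem mentioned above can be demonstrated numerically: backpropagation through T steps of a plain recurrent network multiplies T Jacobians, and when the recurrent weights have spectral radius below one the gradient norm shrinks geometrically. The sketch below is a simplified linear illustration (no trained model, not the paper's S&P 500 experiment).

```python
import numpy as np

rng = np.random.default_rng(3)

# Why plain RNNs fail at long horizons: the gradient through T steps is a
# product of T recurrent Jacobians, whose norm shrinks geometrically when
# the weight matrix has spectral radius < 1 (linearized, nonlinearity omitted).
n, T = 32, 100
W = rng.normal(0, 0.3 / np.sqrt(n), (n, n))   # spectral radius roughly 0.3
grad = np.eye(n)
norms = []
for _ in range(T):
    grad = W.T @ grad                          # backprop through one recurrent step
    norms.append(np.linalg.norm(grad))

print(f"gradient norm after 1 step: {norms[0]:.3f}, after {T} steps: {norms[-1]:.2e}")
# An LSTM's additive cell state, c_t = f_t * c_{t-1} + i_t * g_t, keeps a
# near-identity gradient path, so signals survive far longer horizons.
```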

