TIME SERIES OUTLIER ANALYSIS FOR MODEL, DATA AND HUMAN-INDUCED RISKS IN COVID-19 SYMPTOMS DETECTION

2021 · Vol 7 (2) · pp. 123-136
Author(s): Ahmet KAYA, Rojan GÜMÜŞ, Ömer AYDIN
1998 · Vol 4 (3) · pp. 637-652
Author(s): W.S. Chan, S. Wang

Abstract A first-order autoregressive model was proposed in Wilkie (1995) for the retail price inflation series as part of his stochastic investment model. In this paper we apply time series outlier analysis to the data set and derive a revised model. It significantly alleviates the problem of the leptokurtic and positively skewed residual distribution found in the original model. Finally, ARCH models for the original series and for the outlier-adjusted data are also considered.
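As a rough illustration of the workflow this abstract describes (not the authors' code), the sketch below fits an AR(1) model to a synthetic inflation-like series, flags large residuals as candidate outliers, refits on the adjusted series, and checks both fits for ARCH effects with Engle's LM test. The series, the 3-sigma threshold, and the use of statsmodels are all assumptions.

```python
# Hypothetical sketch: AR(1) fit, residual-based outlier adjustment, ARCH check.
# The data, threshold, and library choices are illustrative, not the paper's.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(0)
# Stand-in for an inflation series (not the actual retail price data).
inflation = pd.Series(0.04 + rng.normal(0.0, 0.01, 120))
inflation.iloc[[30, 75]] += 0.08          # inject two artificial "outliers"

# Step 1: fit a first-order autoregressive model, AR(1).
ar1 = ARIMA(inflation, order=(1, 0, 0)).fit()
resid = ar1.resid

# Step 2: flag observations whose residuals exceed 3 standard deviations.
outliers = resid.index[np.abs(resid) > 3 * resid.std()]

# Step 3: replace flagged points with their fitted values and refit the AR(1).
adjusted = inflation.copy()
adjusted.loc[outliers] = ar1.fittedvalues.loc[outliers]
ar1_adj = ARIMA(adjusted, order=(1, 0, 0)).fit()

# Step 4: Engle's LM test for ARCH effects, before and after adjustment.
print("ARCH LM p-value, original series:", het_arch(ar1.resid, nlags=4)[1])
print("ARCH LM p-value, adjusted series:", het_arch(ar1_adj.resid, nlags=4)[1])
```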


2015 · Vol 120 (9) · pp. 4057-4071
Author(s): Dong Wang, Hao Ding, Vijay P. Singh, Xiaosan Shang, Dengfeng Liu, ...

2004 · Vol 43 (5) · pp. 727-738
Author(s): Ralf Kretzschmar, Pierre Eckert, Daniel Cattani, Fritz Eggimann

Abstract This paper evaluates the quality of neural network classifiers for wind speed and wind gust prediction with prediction lead times between +1 and +24 h. The predictions were based on local time series and model data. The selection of appropriate input features was initiated by time series analysis and completed by empirical comparison of neural network classifiers trained on several choices of input features. The selected input features involved day time, yearday, features from a single wind observation device at the site of interest, and features derived from model data. The quality of the resulting classifiers was benchmarked against persistence for two different sites in Switzerland. The neural network classifiers exhibited superior quality compared with persistence when judged on a specific performance measure based on hit and false-alarm rates.
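A hypothetical sketch of this kind of setup (not the authors' configuration): a small neural network classifier trained on time of day, day of year, a local wind observation, and a stand-in for model guidance, then evaluated by hit and false-alarm rates. All features, thresholds, and the scikit-learn library choice are assumptions made for illustration.

```python
# Hypothetical gust/no-gust classifier evaluated by hit and false-alarm rates.
# Features, labels, and thresholds are synthetic and illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
n = 2000
hour = rng.integers(0, 24, n)                 # time of day
yearday = rng.integers(1, 366, n)             # day of year
wind_obs = rng.gamma(2.0, 3.0, n)             # local wind observation (m/s)
model_wind = wind_obs + rng.normal(0, 2, n)   # stand-in for NWP model guidance

X = np.column_stack([hour, yearday, wind_obs, model_wind])
y = (wind_obs + rng.normal(0, 1.5, n) > 10).astype(int)  # "gust event" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
hit_rate = tp / (tp + fn)             # probability of detection
false_alarm_rate = fp / (fp + tn)     # probability of false detection
print(f"hit rate = {hit_rate:.2f}, false-alarm rate = {false_alarm_rate:.2f}")
```

A persistence benchmark, as used in the paper, would simply carry the most recent observation forward and be scored with the same two rates.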


Author(s): Rainer Metz

Summary Following the influential work of Nelson and Plosser (1982), stochastic trends in macroeconomic time series are considered a stylized fact. However, since the stochastic trend hypothesis can be rejected for many economic series if a segmented trend model is considered as an alternative, there is at present no agreement on the proper modeling of national product. In this paper, "big" shocks in the series of German Gross Domestic Product from 1850 to 1990 are modeled by means of an outlier analysis within the ARIMA approach. Besides the identification and modeling of such outliers, their impact on the trend and cycle components, and especially on long-term growth variations, is investigated.
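A minimal, hypothetical sketch of the general idea, modeling a large shock as a level-shift intervention within an ARIMA fit; the break year, ARIMA order, and synthetic series below are illustrative assumptions, not the paper's specification.

```python
# Hypothetical level-shift (intervention) term in an ARIMA fit to a GDP-like series.
# Break year, order, and data are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
years = np.arange(1850, 1991)
# Stand-in for log real GDP: a random walk with drift plus one built-in shock.
log_gdp = pd.Series(np.cumsum(rng.normal(0.02, 0.03, len(years))), index=years)
log_gdp.loc[1945:] -= 0.35                  # artificial post-war level shift

# Level-shift dummy: 0 before the candidate break year, 1 from it onwards.
step = pd.Series((years >= 1945).astype(float), index=years, name="level_shift_1945")

# ARIMA(1,1,0) with the level shift as an intervention regressor.
fit = ARIMA(log_gdp, order=(1, 1, 0), exog=step).fit()
print(fit.params)    # a large, significant dummy coefficient marks the "big" shock
print(fit.pvalues)
```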


2004 · Vol 21 (12) · pp. 1876-1893
Author(s): Charlie N. Barron, A. Birol Kara, Harley E. Hurlburt, C. Rowley, Lucy F. Smedstad

Abstract A ⅛° global version of the Navy Coastal Ocean Model (NCOM), operational at the Naval Oceanographic Office (NAVOCEANO), is used for prediction of sea surface height (SSH) on daily and monthly time scales during 1998–2001. Model simulations that use 3-hourly wind and thermal forcing obtained from the Navy Operational Global Atmospheric Prediction System (NOGAPS) are performed with/without data assimilation to examine indirect/direct effects of atmospheric forcing in predicting SSH. Model–data evaluations are performed using the extensive database of daily averaged SSH values from tide gauges in the Atlantic, Pacific, and Indian Oceans obtained from the Joint Archive for Sea Level (JASL) center during 1998–2001. Model–data comparisons are based on observations from 282 tide gauge locations. An inverse barometer correction was applied to SSH time series from tide gauges for model–data comparisons, and a sensitivity study is undertaken to assess the impact of the inverse barometer correction on the SSH validation. A set of statistical metrics that includes conditional bias (Bcond), root-mean-square (rms) difference, correlation coefficient (R), and nondimensional skill score (SS) is used to evaluate the model performance. It is shown that global NCOM has skill in representing SSH even in a free-running simulation, with general improvement when SSH from satellite altimetry and sea surface temperature (SST) from satellite IR are assimilated via synthetic temperature and salinity profiles derived from climatological correlations. When the model was run from 1998 to 2001 with NOGAPS forcing, daily model SSH comparisons from 612 yearlong daily tide gauge time series gave a median rms difference of 5.98 cm (5.77 cm), an R value of 0.72 (0.76), and an SS value of 0.45 (0.51) for the ⅛° free-running (assimilative) NCOM. Similarly, error statistics based on the 30-day running averages of SSH time series for 591 yearlong daily tide gauge time series over the time frame 1998–2001 give a median rms difference of 3.63 cm (3.36 cm), an R value of 0.83 (0.85), and an SS value of 0.60 (0.64) for the ⅛° free-running (assimilated) NCOM. Model–data comparisons show that skill in 30-day running average SSH time series is as much as 30% higher than skill for daily SSH. Finally, SSH predictions from the free-running and assimilative ⅛° NCOM simulations are validated against sea level data from the tide gauges in two different ways: 1) using original detided sea level time series from tide gauges and 2) using the detided data with an inverse barometer correction derived using daily mean sea level pressure extracted from NOGAPS at each location. Based on comparisons with 612 yearlong daily tide gauge time series during 1998–2001, the inverse barometer correction lowered the median rms difference by about 1 cm (15%–20%). Results presented in this paper reveal that NCOM is able to predict SSH with reasonable accuracies, as evidenced by model simulations performed during 1998–2001. In an extension of the validation over broader ocean regions, the authors find good agreement in amplitude and distribution of SSH variability between NCOM and other operational model products.
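The metrics named above can be illustrated with a small, hypothetical helper (not the authors' code). The skill-score convention used here, SS = 1 − MSE / variance of the observations together with Murphy's decomposition into R², conditional bias, and unconditional bias, is a common choice in ocean-model validation and is an assumption in this sketch.

```python
# Hypothetical validation metrics for one model/tide-gauge SSH pair:
# bias, rms difference, correlation R, and nondimensional skill score SS.
import numpy as np

def ssh_skill(model: np.ndarray, obs: np.ndarray) -> dict:
    """Compute bias, rms difference, R, SS, and the bias terms of SS."""
    bias = np.mean(model - obs)
    rms = np.sqrt(np.mean((model - obs) ** 2))
    r = np.corrcoef(model, obs)[0, 1]
    ss = 1.0 - np.mean((model - obs) ** 2) / np.var(obs)
    # Murphy-style decomposition: SS = R^2 - B_cond - B_uncond
    b_cond = (r - np.std(model) / np.std(obs)) ** 2
    b_uncond = (bias / np.std(obs)) ** 2
    return {"bias": bias, "rms": rms, "R": r, "SS": ss,
            "B_cond": b_cond, "B_uncond": b_uncond}

# Illustrative daily SSH series (cm) for one tide-gauge location.
rng = np.random.default_rng(3)
obs = 10 * np.sin(np.linspace(0, 8 * np.pi, 365)) + rng.normal(0, 3, 365)
model = 0.9 * obs + rng.normal(0, 4, 365)
print(ssh_skill(model, obs))
```

The 30-day running-average comparison in the abstract would apply the same function after smoothing both series with a 30-day moving mean.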


2019 · Vol 27 (16) · pp. A1225
Author(s): Hao Yin, Youwen Sun, Cheng Liu, Lin Zhang, Xiao Lu, ...

2021
Author(s): Felix Kleinert, Lukas H. Leufen, Aurelia Lupascu, Tim Butler, Martin G. Schultz

Machine learning techniques such as deep learning have gained enormous momentum in recent years, driven mainly by successes in image and speech recognition, video prediction, and autonomous driving, to name just a few.

Air pollutant forecasting is one area where Earth system scientists have started adopting deep learning models to improve the forecast quality of time series. Almost all previous air pollution forecasts with machine learning rely solely on analysing temporal features in the observed time series of the target compound(s) and additional variables describing precursor concentrations and meteorological conditions. These studies therefore neglect the "chemical history" of air masses, i.e. the fact that air pollutant concentrations at a given observation site result from emission and sink processes, mixing, and chemical transformations along the transport pathways of air.

This study develops a concept for representing such factors in the recently published deep learning model IntelliO3. The concept is demonstrated with numerical model data from the WRF-Chem model, because the gridded model data provide an internally consistent dataset with complete spatial coverage and no missing values.

Furthermore, using model data allows changes in forecasting performance to be attributed to specific conceptual aspects. For example, we use the 8 wind sectors (N, NE, E, SE, etc.) and circles with predefined radii around our target locations to aggregate meteorological and chemical data from the intersections. Afterwards, we feed this aggregated data into a deep neural network, using the ozone concentration at the central point's next timesteps as targets. By analysing the change in forecast quality when moving from 4-dimensional (x, y, z, t) to 3-dimensional (x, y, t or r, φ, t) sectors and thinning out the underlying model data, we can deliver first estimates of the performance gains or losses to be expected when applying our concept to station-based surface observations in future studies.
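As a rough, hypothetical sketch of the sector aggregation described above (not the IntelliO3 implementation), the snippet below bins a gridded field around a target site into 8 compass sectors and concentric radius rings and averages each bin into one feature. The grid spacing, radii, variable, and orientation convention are illustrative assumptions.

```python
# Hypothetical sector/ring aggregation of a gridded field around a target site.
# Grid, radii, and variable are illustrative; assumes increasing row index = north.
import numpy as np

def sector_ring_features(field, dx_km, target_ij, radii_km=(25, 50, 100)):
    """Average `field` over 8 compass sectors x radius rings centred on target_ij."""
    ny, nx = field.shape
    j0, i0 = target_ij
    jj, ii = np.mgrid[0:ny, 0:nx]
    dy, dx = (jj - j0) * dx_km, (ii - i0) * dx_km
    dist = np.hypot(dx, dy)
    # Bearing clockwise from north, folded into 8 sectors (N, NE, E, SE, ...).
    bearing = (np.degrees(np.arctan2(dx, dy)) + 360.0) % 360.0
    sector = ((bearing + 22.5) // 45).astype(int) % 8

    feats, inner = [], 0.0
    for outer in radii_km:
        ring = (dist > inner) & (dist <= outer)
        for s in range(8):
            mask = ring & (sector == s)
            feats.append(field[mask].mean() if mask.any() else 0.0)
        inner = outer
    return np.array(feats)          # shape: (len(radii_km) * 8,)

# Example: aggregate a synthetic ozone field on a 4 km grid around a central site.
rng = np.random.default_rng(4)
ozone = 40 + 10 * rng.random((60, 60))
features = sector_ring_features(ozone, dx_km=4.0, target_ij=(30, 30))
print(features.shape)               # (24,) -> one input vector for the neural network
```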

