Multisite stochastic weather generation using cluster analysis and k-nearest neighbor time series resampling

2014 ◽  
Vol 508 ◽  
pp. 197-213 ◽  
Author(s):  
Nina Marie Caraway ◽  
James Lucian McCreight ◽  
Balaji Rajagopalan
2015 ◽  
pp. 125-138 ◽  
Author(s):  
I. V. Goncharenko

In this article we propose a new method of non-hierarchical cluster analysis based on the k-nearest-neighbor graph and discuss it with respect to vegetation classification. The k-nearest neighbor (k-NN) classification method was originally developed in 1951 (Fix, Hodges, 1951); the term "k-NN graph" and several k-NN clustering algorithms appeared later (Cover, Hart, 1967; Brito et al., 1997). In biology, k-NN is used in the analysis of protein structures and genome sequences. Most k-NN clustering algorithms first build an "excessive" graph, a so-called hypergraph, and then truncate it into subgraphs by partitioning and coarsening the hypergraph. We developed a different, "upward" strategy that forms one cluster after another by assembling them sequentially. To date, graph-based cluster analysis has not been applied to the classification of vegetation datasets.
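As a rough illustration of the graph-based idea, the sketch below builds a symmetric k-NN graph from a hypothetical plot-by-species table and reads clusters off its connected components. This is a generic pattern only, not the authors' "upward" assembly algorithm; the data, k value, and mutual-link rule are assumptions.

```python
# Generic illustration of graph-based clustering with a k-nearest-neighbor graph.
# NOT the authors' "upward" assembly method; it only shows the common pattern of
# building a k-NN graph and extracting clusters as connected components.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)
# Hypothetical plot-by-species abundance table: 30 vegetation plots, 8 species.
X = rng.random((30, 8))

# Symmetric k-NN graph (mutual neighbors only); k chosen ad hoc for the sketch.
G = kneighbors_graph(X, n_neighbors=4, mode="connectivity", include_self=False)
G = G.minimum(G.T)  # keep only mutual (symmetric) neighbor links

n_clusters, labels = connected_components(G, directed=False)
print(f"{n_clusters} clusters:", labels)
```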


Mathematics ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 413 ◽  
Author(s):  
Chris Lytridis ◽  
Anna Lekova ◽  
Christos Bazinas ◽  
Michail Manios ◽  
Vassilis G. Kaburlasos

Our interest is in time series classification for cyber-physical systems (CPSs), with emphasis on human-robot interaction. We propose an extension of the k nearest neighbor (kNN) classifier to time series classification using intervals' numbers (INs). More specifically, we partition a time series into windows of equal length and, from the data in each window, induce a distribution represented by an IN. This preserves the time dimension in the representation. All-order data statistics, represented by an IN, are employed implicitly as features; moreover, parametric non-linearities are introduced in order to tune the geometrical relationship (i.e., the distance) between signals and, consequently, classification performance. In conclusion, we introduce the windowed IN kNN (WINkNN) classifier, whose application is demonstrated comparatively on two benchmark datasets involving, first, electroencephalography (EEG) signals and, second, audio signals. The results of WINkNN are superior in both problems; in addition, no ad hoc data preprocessing is required. Potential future work is discussed.
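A minimal sketch of the windowing idea follows: each series is split into equal-length windows, each window's distribution is summarized, and the resulting feature vector is classified with kNN. Per-window quantiles stand in here for the paper's intervals' numbers; the actual IN representation, its distance, and the parametric non-linearities are not reproduced, and the synthetic data are assumptions.

```python
# Sketch of the windowing idea behind WINkNN: split each series into equal-length
# windows, summarize each window's distribution, classify with kNN. Quantiles are
# a stand-in for the paper's intervals' numbers (INs), not the real IN machinery.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def window_features(series, n_windows=5, quantiles=(0.1, 0.25, 0.5, 0.75, 0.9)):
    """Concatenate per-window quantiles, preserving the time dimension."""
    windows = np.array_split(np.asarray(series), n_windows)
    return np.concatenate([np.quantile(w, quantiles) for w in windows])

rng = np.random.default_rng(1)
# Hypothetical dataset: 40 signals of length 200 with binary labels.
signals = rng.standard_normal((40, 200))
labels = rng.integers(0, 2, size=40)

X = np.vstack([window_features(s) for s in signals])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print(clf.predict(X[:5]))
```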


2009 ◽  
Vol 19 (12) ◽  
pp. 4197-4215 ◽  
Author(s):  
ANGELIKI PAPANA ◽  
DIMITRIS KUGIUMTZIS

We study some of the most commonly used mutual information estimators, based on histograms of fixed or adaptive bin size, k-nearest neighbors and kernels, and focus on the optimal selection of their free parameters. We examine the consistency of the estimators (convergence to a stable value as the time series length increases) and the degree of deviation among the estimators. The optimization of parameters is assessed by quantifying the deviation of the estimated mutual information from its true or asymptotic value as a function of the free parameter. Moreover, some commonly used criteria for parameter selection are evaluated for each estimator. The comparative study is based on Monte Carlo simulations on time series from several linear and nonlinear systems of different lengths and noise levels. The results show that the k-nearest neighbor estimator is the most stable and the least affected by its method-specific parameter. A data-adaptive criterion for optimal binning is suggested for linear systems, but it is found to be rather conservative for nonlinear systems. It turns out that the binning and kernel estimators give the least deviation in identifying the lag of the first minimum of mutual information for nonlinear systems, and are stable in the presence of noise.
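The sketch below shows one common use of a k-NN mutual information estimator in this setting: computing delayed mutual information over a range of lags and locating its first minimum. It relies on scikit-learn's mutual_info_regression, which implements a k-NN (Kraskov-type) estimator; the toy nonlinear series, k = 4, and lag range are illustrative assumptions, not the paper's experimental setup.

```python
# Delayed mutual information via a k-NN estimator, with the first minimum over
# the lag used as a candidate embedding delay. Illustrative data and parameters.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(2)
n = 2000
x = np.zeros(n)
for t in range(1, n):                       # simple nonlinear map plus noise
    x[t] = 0.8 * np.sin(x[t - 1]) + 0.1 * rng.standard_normal()

lags = range(1, 21)
mi = [mutual_info_regression(x[:-lag].reshape(-1, 1), x[lag:],
                             n_neighbors=4, random_state=0)[0]
      for lag in lags]

# First local minimum of MI(lag), if any.
first_min = next((lag for lag, (a, b, c) in
                  zip(list(lags)[1:-1], zip(mi, mi[1:], mi[2:]))
                  if b < a and b < c), None)
print("first minimum of mutual information at lag:", first_min)
```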


2017 ◽  
Vol 52 (3) ◽  
pp. 2019-2037 ◽  
Author(s):  
Francisco Martínez ◽  
María Pilar Frías ◽  
María Dolores Pérez ◽  
Antonio Jesús Rivera

2016 ◽  
Vol 328 ◽  
pp. 42-59 ◽  
Author(s):  
Mabel González ◽  
Christoph Bergmeir ◽  
Isaac Triguero ◽  
Yanet Rodríguez ◽  
José M Benítez

2018 ◽  
Vol 2018 ◽  
pp. 1-8
Author(s):  
John Mashford

Three methods of temporal data upscaling, which may collectively be called the generalized k-nearest neighbor (GkNN) method, are considered. The accuracy of the GkNN simulation of month-by-month yield is examined (where the term yield denotes the dependent variable). The notion of an eventually well-distributed time series is introduced, and on the basis of this assumption some properties of the average annual yield and its variance for a GkNN simulation are computed. The total yield over a planning period is determined, and a general framework for analysing the GkNN algorithm, based on the notion of stochastically dependent time series, is described; it is shown that, for a sufficiently large training set, the GkNN simulation has the same statistical properties as the training data. An example of the application of the methodology is given for the problem of simulating the yield of a rainwater tank given monthly climatic data.
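To give a feel for k-NN time series resampling of this kind, the sketch below simulates a monthly yield by resampling among the k historical months whose climate predictor is closest to the month being simulated. The rainfall/yield variables, uniform neighbor weighting, and k = 5 are assumptions for illustration, not the paper's GkNN formulation.

```python
# Generic k-nearest-neighbor time-series resampling sketch: for each month to
# simulate, find the k historical months with the closest predictor value and
# resample a yield from them (uniform weights for simplicity).
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical training data: monthly rainfall (predictor) and tank yield (response).
rain_hist = rng.gamma(shape=2.0, scale=40.0, size=240)        # 20 years of months
yield_hist = 0.7 * rain_hist + rng.normal(0, 10, size=240)

def knn_resample(rain_new, k=5):
    """Simulate a yield for a new month by resampling among its k nearest neighbors."""
    dist = np.abs(rain_hist - rain_new)
    neighbors = np.argsort(dist)[:k]
    return yield_hist[rng.choice(neighbors)]

simulated = [knn_resample(r) for r in rng.gamma(2.0, 40.0, size=12)]
print(np.round(simulated, 1))
```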


2020 ◽  
Vol 10 (2) ◽  
pp. 152-158
Author(s):  
Iswanto ◽  
Yuliana Melita Pranoto ◽  
Reddy Alexandro Harianto

Even with a sophisticated application, traders often have difficulty deciding whether to BUY or SELL in forex trading. Time series predictions frequently swing between high and low values, so a recommendation system is needed to address this problem. Applying a classification algorithm to a recommendation system that supports BUY-SELL decisions is an appropriate way to do so. The K-Nearest Neighbor (K-NN) algorithm was chosen because it can classify data based on the closest distance and is therefore suitable for building such a recommendation system. The system is designed to assist traders in making BUY-SELL decisions based on predictive data. Over ten trials using prices predicted by ARIMA, the recommendations were compared with market prices against a profit target of 7% per week; the average profit across the ten experiments exceeded this target.
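The sketch below illustrates the general kind of K-NN recommendation step described: label each day BUY or SELL according to whether the next close rises or falls, build lagged-price features, and let a distance-based K-NN vote produce the recommendation. The synthetic price series, lag count, split point, and k are illustrative assumptions, not the system described in the paper.

```python
# Generic K-NN BUY/SELL recommendation sketch on a synthetic price series.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
close = np.cumsum(rng.normal(0, 1, 500)) + 100         # synthetic closing prices

lags = 5
X = np.column_stack([close[i:len(close) - lags + i] for i in range(lags)])
y = (close[lags:] > close[lags - 1:-1]).astype(int)    # 1 = BUY, 0 = SELL

split = 400
clf = KNeighborsClassifier(n_neighbors=7).fit(X[:split], y[:split])
print("recommendations:",
      np.where(clf.predict(X[split:split + 10]) == 1, "BUY", "SELL"))
```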


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Omobolanle Ruth Ogunseiju ◽  
Johnson Olayiwola ◽  
Abiola Abosede Akanmu ◽  
Chukwuma Nnaji

Purpose: Construction action recognition is essential to efficiently manage productivity, health and safety risks. This can be achieved by tracking and monitoring construction work. This study aims to examine the performance of a variant of deep convolutional neural networks (CNNs) for recognizing the actions of construction workers from signal images of time-series data. Design/methodology/approach: This paper adopts Inception v1 to classify actions involved in carpentry and painting activities from images of motion data. Augmented time-series data from wearable sensors attached to workers' lower arms are converted to signal images to train an Inception v1 network. The performance of Inception v1 is compared with that of the highest performing supervised learning classifier, k-nearest neighbor (KNN). Findings: Results show that the performance of the Inception v1 network improved when trained with signal images of the augmented data, but at a high computational cost. The Inception v1 network and KNN achieved accuracies of 95.2% and 99.8%, respectively, when trained with the 50-fold augmented carpentry dataset. The accuracies of Inception v1 and KNN with the 10-fold augmented painting dataset are 95.3% and 97.1%, respectively. Research limitations/implications: Only acceleration data from the lower arm for the two trades were used for action recognition. Each signal image comprises 20 datasets. Originality/value: Little has been reported on recognizing construction workers' actions from signal images. This study adds value to the existing literature, in particular by providing insights into the extent to which a deep CNN can classify subtasks from patterns in signal images compared to a traditional best-performing shallow network.
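As a rough sketch of the shallow baseline side of such a comparison, the code below summarizes fixed-length windows of tri-axial acceleration with simple statistics and classifies the action with K-NN. The synthetic signals, window length, two action labels, and feature set are assumptions; the paper's signal-image and Inception v1 pipeline is not reproduced here.

```python
# K-NN action recognition sketch on synthetic tri-axial acceleration windows.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)

def make_window(action, length=100):
    """Hypothetical lower-arm acceleration window for one of two actions."""
    freq = 1.0 if action == 0 else 2.5
    t = np.linspace(0, 2 * np.pi, length)
    return np.sin(freq * t)[:, None] + 0.3 * rng.standard_normal((length, 3))

def features(window):
    # Per-axis mean, standard deviation, and range as simple window features.
    return np.concatenate([window.mean(0), window.std(0), np.ptp(window, axis=0)])

actions = rng.integers(0, 2, size=60)
X = np.vstack([features(make_window(a)) for a in actions])

clf = KNeighborsClassifier(n_neighbors=5).fit(X, actions)
print("accuracy on training windows:", clf.score(X, actions))
```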

