Discriminate Supervised Weighted Scheme for the Classification of Time Series Signals

Author(s): Elangovan Ramanujam, S. Padmavathi

Innovations in time series data mining techniques and their broad applicability have significantly increased researchers' interest in the problem of time series classification. Several algorithms have been proposed for this purpose, commonly categorized as shapelet-, interval-, motif-, and whole-series-based techniques. Among these, the bag-of-words technique, an extensive application of the text mining approach, performs well due to its simplicity and effectiveness. To extend the efficiency of the bag-of-words technique, this paper proposes a discriminate supervised weighted scheme that identifies the characteristic, representative patterns of each class for efficient classification. The paper uses a modified weight matrix that discriminates between representative and non-representative patterns, which enables interpretability in classification. Experiments compare the performance of the proposed technique against state-of-the-art techniques in terms of accuracy and statistical significance.
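As a rough illustration of the general idea behind a supervised, class-discriminative weighting of a bag-of-words representation (this is not the paper's exact scheme; the window size, alphabet, and weighting formula below are all assumptions), one might sketch:

```python
# Hypothetical sketch: SAX-style bag-of-words with a supervised weight per
# (class, word) pair. NOT the paper's scheme -- window, alphabet, and the
# weighting formula are illustrative assumptions.
import numpy as np
from collections import Counter, defaultdict

def sax_words(series, window=16, word_len=4, alphabet=3):
    """Slide a window over the series and emit SAX-style symbolic words."""
    breakpoints = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}[alphabet]
    words = []
    for start in range(0, len(series) - window + 1):
        seg = np.asarray(series[start:start + window], dtype=float)
        seg = (seg - seg.mean()) / (seg.std() + 1e-8)    # z-normalise window
        paa = seg.reshape(word_len, -1).mean(axis=1)     # piecewise means
        words.append("".join(chr(97 + np.searchsorted(breakpoints, v))
                             for v in paa))
    return words

def fit_weights(train, labels):
    """Weight each word by how much its class frequency exceeds its overall one."""
    per_class, overall = defaultdict(Counter), Counter()
    for series, y in zip(train, labels):
        ws = sax_words(series)
        per_class[y].update(ws)
        overall.update(ws)
    total_all = sum(overall.values())
    weights = {}
    for y, counts in per_class.items():
        total = sum(counts.values())
        for w, c in counts.items():
            # representative words score high; common words near zero
            weights[(y, w)] = c / total - overall[w] / total_all
    return weights

def predict(series, weights, classes):
    ws = sax_words(series)
    scores = {y: sum(weights.get((y, w), 0.0) for w in ws) for y in classes}
    return max(scores, key=scores.get)
```

Because the weights are per class, the words that dominate a prediction can be inspected directly, which is the interpretability angle the abstract mentions.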

2021, Vol 11 (1)
Author(s): Tuan D. Pham

Abstract: Automated analysis of physiological time series is used in many clinical applications in medicine and the life sciences. Long short-term memory (LSTM) is a deep recurrent neural network architecture used for the classification of time-series data. Here, time–frequency and time–space properties of time series are introduced as a robust tool for LSTM processing of long sequential data in physiology. Based on classification results obtained from two databases of sensor-induced physiological signals, the proposed approach has the potential for (1) achieving very high classification accuracy, (2) saving tremendous time in data learning, and (3) being cost-effective and comfortable for participants in clinical trials by reducing the number of wearable sensors required for data recording.
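A hedged sketch of why a time–frequency representation helps LSTM processing of long sequences: a long raw signal is compressed into a much shorter sequence of spectral frames. The frame and hop sizes here are illustrative choices, not the paper's settings, and the LSTM itself is omitted.

```python
# Sketch of the time-frequency idea: a long physiological signal becomes a
# short sequence of spectral frames, shrinking the number of time steps an
# LSTM must unroll over. Frame/hop sizes are illustrative assumptions.
import numpy as np

def stft_frames(signal, frame=256, hop=128):
    """Magnitude short-time Fourier transform: (n_frames, frame // 2 + 1)."""
    signal = np.asarray(signal, dtype=float)
    window = np.hanning(frame)                       # taper to reduce leakage
    starts = range(0, len(signal) - frame + 1, hop)
    return np.array([np.abs(np.fft.rfft(signal[s:s + frame] * window))
                     for s in starts])

# A 10-second recording sampled at 1 kHz (10,000 samples) becomes
# 77 frames of 129 spectral bins: two orders of magnitude fewer time steps.
x = np.sin(2 * np.pi * 5 * np.linspace(0, 10, 10_000))
frames = stft_frames(x)
```

Each row of `frames` would then be one input step of the recurrent network, which is where the "saving tremendous time for data learning" claim comes from.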


2021, Vol 352, pp. 109080
Author(s): Joram van Driel, Christian N.L. Olivers, Johannes J. Fahrenfort

1995, Vol 115 (3), pp. 354-360
Author(s): Shigeaki Fukuda, Toshihisa Kosaka, Sigeru Omatsu

Circulation, 2015, Vol 132 (suppl_3)
Author(s): Shaker M Eid, Aiham Albaeni, Rebeca Rios, May Baydoun, Bolanle Akinyele, ...

Background: The intent of the five-yearly Resuscitation Guidelines is to improve outcomes. Previous studies have yielded conflicting reports on the beneficial impact of the 2005 guidelines on out-of-hospital cardiac arrest (OHCA) survival. Using a national database, we examined survival before and after the introduction of both the 2005 and 2010 guidelines.

Methods: We used the 2000 through 2012 National Inpatient Sample database to select patients ≥18 years admitted to hospitals in the United States with non-traumatic OHCA (ICD-9-CM codes 427.5 & 427.41). A quasi-experimental (interrupted time series) design was used to compare monthly survival trends. Outcomes for OHCA were compared pre- and post-release of the 2005 and 2010 resuscitation guidelines as follows: 01/2000-09/2005 vs. 10/2005-09/2010, and 10/2005-09/2010 vs. 10/2010-12/2012. Segmented regression analyses of the interrupted time series data were performed to examine changes in survival to hospital discharge.

Results: For the pre- and post-guidelines periods, 81,600, 69,139, and 36,556 patients, respectively, survived to hospital admission following OHCA. Following the release of the 2005 guidelines, there was a statistically significant worsening in survival trends (β = -0.089, 95% CI -0.163 to -0.016, p = 0.018) until the release of the 2010 guidelines, when a sharp increase in survival was noted that persisted for the period of study (β = 0.054, 95% CI -0.143 to 0.251, p = 0.588) but did not achieve statistical significance (Figure).

Conclusion: National clinical guidelines developed to impact outcomes must include mechanisms to assess whether benefit actually occurs. The worsening in OHCA survival following the 2005 guidelines is thought-provoking, but the improvement following the release of the 2010 guidelines is reassuring and worthy of perpetuation.
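The segmented-regression design described in the Methods can be sketched on synthetic data. The parameterisation below, with a level-change indicator and a post-intervention slope term, is the standard interrupted-time-series formulation; the numbers are invented for illustration and are not the study's.

```python
# Minimal sketch of segmented regression for an interrupted time series:
# fit intercept + baseline slope + level change + slope change at the
# intervention month. Synthetic data, not the study's.
import numpy as np

def segmented_fit(y, intervention):
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention).astype(float)     # level-change indicator
    t_post = post * (t - intervention)           # slope change after the event
    X = np.column_stack([np.ones_like(t), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, baseline slope, level change, slope change]

# Synthetic monthly series: slope -0.1 before month 30, +0.2 after
rng = np.random.default_rng(0)
t = np.arange(60)
y = 50 - 0.1 * t + 0.3 * (t >= 30) * (t - 30) + rng.normal(0, 0.2, 60)
beta = segmented_fit(y, 30)
```

The β values quoted in the Results correspond to coefficients of exactly this kind of model fitted to monthly survival rates.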


Sensors, 2020, Vol 20 (7), pp. 1908
Author(s): Chao Ma, Xiaochuan Shi, Wei Li, Weiping Zhu

In the past decade, time series data have been generated in various fields at a rapid pace, offering a huge opportunity for mining valuable knowledge. As a typical task in time series mining, Time Series Classification (TSC) has attracted much attention from both researchers and domain experts due to its broad applications, ranging from human activity recognition to smart city governance. In particular, there is an increasing need to perform classification on diverse types of time series data in a timely manner without costly hand-crafted feature engineering. Therefore, in this paper we propose a framework named Edge4TSC that allows time series to be processed in the edge environment so that classification results can be instantly returned to end users. Meanwhile, to avoid the costly hand-crafted feature engineering process, deep learning techniques are applied for automatic feature extraction, showing competitive or even superior performance compared to state-of-the-art TSC solutions. However, because time series present complex patterns, even deep learning models may not achieve satisfactory classification accuracy, which motivated us to explore new time series representation methods that help classifiers further improve classification accuracy. In the proposed Edge4TSC framework, a new time series representation method based on a binary distribution tree was designed to address the classification accuracy concern in TSC tasks. Comprehensive experiments on six challenging time series datasets in the edge environment validate the generalization ability and classification accuracy improvements of the proposed framework and yield a number of helpful insights.
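The abstract does not specify how the binary distribution tree is built, so the following is only a loose, hypothetical illustration of a distribution-based time series representation, not the authors' method: values are recursively split at the median, and the fraction of points falling into each node becomes a fixed-length feature vector.

```python
# Hypothetical illustration only -- NOT Edge4TSC's binary distribution tree.
# Recursively split values at the median; per-node mass becomes a feature.
import numpy as np

def distribution_tree_features(series, depth=3):
    series = np.asarray(series, dtype=float)
    feats, nodes = [], [series]
    for _ in range(depth):
        next_nodes = []
        for node in nodes:
            feats.append(len(node) / len(series))   # mass of this node
            if len(node) == 0:                      # keep tree shape fixed
                next_nodes += [node, node]
                continue
            m = np.median(node)
            next_nodes += [node[node < m], node[node >= m]]
        nodes = next_nodes
    return np.array(feats)                          # length 2**depth - 1
```

The point of such a representation is that it is cheap to compute on an edge device and yields a fixed-size vector regardless of series length.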


2020, Vol 497 (4), pp. 4843-4856
Author(s): James S Kuszlewicz, Saskia Hekker, Keaton J Bell

Abstract: Long, high-quality time-series data provided by previous space missions such as CoRoT and Kepler have made it possible to derive the evolutionary state of red giant stars, i.e. whether a star is burning hydrogen in a shell around an inert helium core or is also burning helium in its core, from its individual oscillation modes. We utilize data from the Kepler mission to develop a tool to classify the evolutionary state of the large number of stars being observed in the current era of K2 and TESS, and for the future PLATO mission. These missions pose new challenges for evolutionary state classification, given the large number of stars being observed and the shorter observing duration of the data. We propose a new method, Clumpiness, based upon a supervised classification scheme that uses 'summary statistics' of the time series, combined with distance information from the Gaia mission, to predict the evolutionary state. Applying this to red giants in the APOKASC catalogue, we obtain a classification accuracy of ~91 per cent for the full 4 yr of Kepler data, for those stars that are either only hydrogen-shell burning or also helium-core burning. We also applied the method to shorter Kepler data sets mimicking CoRoT, K2, and TESS, achieving an accuracy >91 per cent even for 27 d time series. This work paves the way towards fast, reliable classification of vast amounts of relatively short-time-span data with a few well-engineered features.
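A minimal illustration of the "summary statistics" idea (not the Clumpiness code; the particular statistics and toy signals below are assumptions): reduce a light curve to a few cheap statistics that separate slow, large-amplitude oscillators from fast, small-amplitude ones, then feed them to any off-the-shelf classifier.

```python
# Illustrative sketch, not the Clumpiness implementation: compress a light
# curve into a handful of summary statistics usable as classifier features.
import numpy as np

def summary_stats(flux):
    flux = np.asarray(flux, dtype=float)
    flux = flux - flux.mean()                    # remove the mean level
    return {
        "variance": flux.var(),                  # overall variability
        "mad": np.median(np.abs(flux - np.median(flux))),
        "zero_crossings": int(np.sum(np.diff(np.sign(flux)) != 0)),
    }

# Toy light curves over a 27-day, TESS-like observing window:
t = np.linspace(0, 27, 1000)
giant = 5.0 * np.sin(2 * np.pi * t / 9.0)        # slow, large oscillation
dwarf = 0.5 * np.sin(2 * np.pi * t * 3.0)        # fast, small oscillation
```

Statistics like these are computable even from a 27 d time series, which is why a summary-statistic approach scales to short-duration K2/TESS data where resolving individual oscillation modes is hard.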

