centroid classifier
Recently Published Documents

TOTAL DOCUMENTS: 24 (five years: 6)
H-INDEX: 5 (five years: 0)

2021 ◽ Vol 2021 ◽ pp. 1-12
Author(s): Magdiel Jiménez-Guarneros, Jonas Grande-Barreto, Jose de Jesus Rangel-Magdaleno

Early detection of fault events in electromechanical systems is one of the most attractive and critical data challenges in modern industry. Although these electromechanical systems tend to experience typical faults, unexpected and unknown faults can also appear during operation. However, current models for automatic detection can learn new faults only at the cost of forgetting previously learned concepts. This article presents a multiclass incremental learning (MCIL) framework based on a 1D convolutional neural network (CNN) for fault detection in induction motors (IMs). The presented framework tackles the forgetting problem by storing in memory a representative exemplar set from past data (known faults). The 1D CNN is then fine-tuned over the selected exemplar set together with data from new faults. Test samples are classified with a nearest centroid classifier (NCC) in the feature space of the 1D CNN. The proposed framework was evaluated and validated on two public datasets for fault detection in IMs: asynchronous motor common fault (AMCF) and Case Western Reserve University (CWRU). Experimental results show the proposed framework to be an effective solution for incorporating and detecting new induction motor faults alongside those already known, with high accuracy across the different incremental phases.
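The exemplar-memory-plus-NCC scheme described above can be sketched classically. In this toy version, the 1D CNN feature extractor is omitted (raw vectors stand in for learned features) and exemplar selection is simplified to keeping the samples closest to each class mean; the class names and parameters are illustrative, not from the paper:

```python
import numpy as np

class IncrementalNCC:
    """Toy multiclass-incremental nearest centroid classifier.

    Stand-in for the paper's pipeline: features would normally come
    from a fine-tuned 1D CNN; here raw vectors are used directly.
    """

    def __init__(self, exemplars_per_class=5):
        self.exemplars = {}          # class label -> (m, d) exemplar array
        self.k = exemplars_per_class

    def learn_class(self, label, X):
        # Keep the k samples closest to the class mean as exemplars.
        mu = X.mean(axis=0)
        order = np.argsort(np.linalg.norm(X - mu, axis=1))
        self.exemplars[label] = X[order[: self.k]]

    def predict(self, X):
        labels = list(self.exemplars)
        centroids = np.stack([self.exemplars[c].mean(axis=0) for c in labels])
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        return np.array([labels[i] for i in d.argmin(axis=1)])


rng = np.random.default_rng(0)
clf = IncrementalNCC()
clf.learn_class("healthy", rng.normal(0.0, 0.3, size=(50, 8)))
clf.learn_class("bearing_fault", rng.normal(2.0, 0.3, size=(50, 8)))
# A new, previously unknown fault class is added without retraining from scratch.
clf.learn_class("rotor_fault", rng.normal(-2.0, 0.3, size=(50, 8)))
print(clf.predict(rng.normal(2.0, 0.3, size=(3, 8))))
```

Because only compact exemplar sets and centroids are kept, adding a class never requires revisiting the full history of past data, which is the essence of the forgetting mitigation.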


2021 ◽ Vol 7 (1)
Author(s): Sonika Johri, Shantanu Debnath, Avinash Mocherla, Alexandros SINGK, Anupam Prakash, ...

Abstract
Quantum machine learning has seen considerable theoretical and practical development in recent years and has become a promising area for finding real-world applications of quantum computers. In pursuit of this goal, here we combine state-of-the-art algorithms and quantum hardware to provide an experimental demonstration of a quantum machine learning application with provable guarantees on its performance and efficiency. In particular, we design a quantum nearest centroid classifier, using techniques for efficiently loading classical data into quantum states and performing distance estimations, and experimentally demonstrate it on an 11-qubit trapped-ion quantum machine, matching the accuracy of classical nearest centroid classifiers on the MNIST handwritten digits dataset and achieving up to 100% accuracy on 8-dimensional synthetic data.
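The decision rule underlying both the classical and quantum nearest centroid classifiers needs only norms and inner products, since ||x − c||² = ||x||² + ||c||² − 2⟨x, c⟩; the inner products are the quantities a quantum distance-estimation subroutine would supply. A classical stand-in (function names are illustrative, not the paper's API):

```python
import numpy as np

def ncc_via_inner_products(X, centroids):
    """Nearest centroid decision using only norms and inner products,
    the same quantities a quantum distance estimator would produce.
    Classical sketch; not the paper's quantum circuit."""
    # ||x - c||^2 = ||x||^2 + ||c||^2 - 2 <x, c>
    x2 = (X ** 2).sum(axis=1, keepdims=True)      # (n, 1)
    c2 = (centroids ** 2).sum(axis=1)             # (k,)
    d2 = x2 + c2 - 2.0 * X @ centroids.T          # (n, k) squared distances
    return d2.argmin(axis=1)

rng = np.random.default_rng(1)
centroids = np.eye(8)[:2] * 3.0                  # two 8-dimensional class centers
X = centroids[[0, 1, 1]] + rng.normal(0, 0.2, size=(3, 8))
print(ncc_via_inner_products(X, centroids))      # -> [0 1 1]
```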


2020 ◽ Vol 5 (1) ◽ pp. 57
Author(s): Aditya Hari Bawono, Fitra Abdurrahman Bahtiar, Ahmad Afif Supianto

Classification methods can be misled by outliers, yet there is little research on classification with outlier removal, especially for the nearest centroid classifier. The proposed methodology consists of two stages. First, the data are preprocessed with outlier removal, discarding points that lie far from their corresponding class centroid. Second, the cleaned data are classified. The experiments cover six data sets with different characteristics. The results indicate that outlier removal as a preprocessing step improves nearest centroid classifier performance on most data sets.
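One simple reading of the preprocessing stage above is per-class distance filtering: within each class, drop the points farthest from that class's centroid before fitting the classifier. A minimal sketch, assuming a fixed keep fraction (an illustrative parameter, not from the paper):

```python
import numpy as np

def remove_outliers(X, y, keep_frac=0.9):
    """Per class, keep only the keep_frac fraction of points closest
    to that class's centroid; the rest are treated as outliers."""
    mask = np.zeros(len(X), dtype=bool)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        keep = idx[np.argsort(d)[: max(1, int(keep_frac * len(idx)))]]
        mask[keep] = True
    return X[mask], y[mask]

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (20, 2)), [[25.0, 25.0]]])  # one gross outlier
y = np.array([0] * 21)
Xc, yc = remove_outliers(X, y, keep_frac=0.95)
print(len(Xc))  # the farthest point (the outlier) is dropped first
```

Because the outlier is removed before centroids are re-estimated for classification, it can no longer drag a class centroid toward itself, which is the failure mode the paper targets.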


Author(s): Venkatachalam K, Karthikeyan NK

<p>Text preprocessing and document classification play a vital role in web services discovery. Nearest centroid classifiers have mostly been employed in high-dimensional applications, including genomics. Feature selection is a major problem for all classifiers, and in this paper we propose an effective feature selection procedure followed by web services discovery through a centroid classifier algorithm. The task is to effectively assign each document to one or more classes. Despite being simple and robust, the centroid classifier is not widely used for document classification because of its computational complexity and large memory requirements. We address these problems through dimensionality reduction and effective feature-set selection before training and testing the classifier. Our preliminary experiments show that the proposed method outperforms other algorithms in the literature, including k-nearest neighbors, the naive Bayes classifier, and support vector machines.</p>

