Multiscale fault classification framework using kernel principal component analysis and k-nearest neighbors for chemical process system

2021 ◽ Vol 287 ◽ pp. 03011
Author(s): Muhammad Nawaz ◽ Abdulhalim Shah Maulud ◽ Haslinda Zabiri

Process monitoring techniques in chemical process systems help to improve product quality and plant safety. Multiscale classification plays a crucial role in the monitoring of chemical processes. However, coping with the high-dimensional, correlated data produced by complex, nonlinear processes remains a problem. Therefore, an improved multiscale fault classification framework is proposed to enhance fault classification in nonlinear chemical process systems. The framework combines the wavelet transform (WT), kernel principal component analysis (KPCA), and a k-nearest neighbors (KNN) classifier. Initially, a moving-window-based WT is used to extract multiscale information from process data, simultaneously in time and frequency and at different scales. The resulting wavelet coefficients are reconstructed and fed into KPCA to produce feature vectors. In the final step, these vectors are used as inputs to the KNN classifier. The performance of the proposed multiscale KPCA-KNN framework is analyzed and compared using a continuous stirred tank reactor (CSTR) system as a case study. The results show that the proposed multiscale KPCA-KNN framework achieves a higher fault classification success rate than the PCA-KNN and KPCA-KNN methods.
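The abstract outlines a three-stage pipeline (moving-window WT, KPCA feature extraction, KNN classification). The following is a minimal sketch of such a pipeline, not the authors' implementation: the wavelet family ('db4'), decomposition level, window length, RBF kernel parameters, k value, and the random placeholder data are all assumptions made for illustration.

```python
# Illustrative multiscale KPCA-KNN sketch (assumed parameters, placeholder data).
import numpy as np
import pywt
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier

def multiscale_features(window, wavelet="db4", level=3):
    """Decompose each process variable in a window, reconstruct the signal
    at every scale, and concatenate the reconstructions into one feature vector."""
    feats = []
    for var in window.T:                          # loop over process variables
        coeffs = pywt.wavedec(var, wavelet, level=level)
        for i in range(len(coeffs)):
            keep = [c if j == i else np.zeros_like(c)
                    for j, c in enumerate(coeffs)]
            feats.append(pywt.waverec(keep, wavelet)[:len(var)])
    return np.concatenate(feats)

def sliding_windows(X, length=64, step=8):
    """Yield moving windows over a multivariate time series X (samples x variables)."""
    for start in range(0, X.shape[0] - length + 1, step):
        yield X[start:start + length]

# Placeholder process data and fault labels (one label per window); replace
# with real CSTR measurements in practice.
X_train = np.random.randn(1000, 5)
n_windows = len(list(sliding_windows(X_train)))
y_win = np.random.randint(0, 3, size=n_windows)

# Stage 1: moving-window wavelet features.
F = np.array([multiscale_features(w) for w in sliding_windows(X_train)])

# Stage 2: nonlinear feature extraction with kernel PCA.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=1e-3)
Z = kpca.fit_transform(F)

# Stage 3: KNN fault classifier on the KPCA feature vectors.
knn = KNeighborsClassifier(n_neighbors=5).fit(Z, y_win)
# At monitoring time, a new window is transformed the same way and passed to knn.predict.
```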

2021 ◽ Vol 15 (2) ◽ pp. 131-144
Author(s): Redha Taguelmimt ◽ Rachid Beghdad

Many intrusion detection systems (IDSs) have been proposed in the literature, and many studies try to identify the features that best detect attacks. This paper presents a new, easy-to-implement approach to intrusion detection, named distance-sum-based k-nearest neighbors (DS-kNN), which is an improved version of the k-NN classifier. Given a data sample to classify, DS-kNN computes the distance sum of the k nearest neighbors of the sample within each of the possible classes of the dataset. The sample is then assigned to the class with the smallest sum. The experimental results show that the DS-kNN classifier performs better than the original k-NN algorithm in terms of accuracy, detection rate, false positive rate, and attack classification. The authors mainly compare DS-kNN with CANN, but also with SVM, S-NDAE, and DBN. The obtained results show that the approach is very competitive.
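The decision rule described in the abstract can be stated compactly in code. Below is a minimal sketch of that rule, not the authors' implementation: the function name, the Euclidean metric, the choice of k, and the toy data are illustrative assumptions.

```python
# Sketch of the DS-kNN decision rule: for each class, sum the distances to the
# k nearest training samples of that class, then predict the class with the
# smallest sum.
import numpy as np

def ds_knn_predict(X_train, y_train, x, k=5):
    """Classify a single sample x with the distance-sum k-NN rule (illustrative)."""
    best_class, best_sum = None, np.inf
    for cls in np.unique(y_train):
        members = X_train[y_train == cls]
        dists = np.linalg.norm(members - x, axis=1)   # Euclidean distances to class members
        k_eff = min(k, len(dists))                    # guard against classes smaller than k
        dist_sum = np.sort(dists)[:k_eff].sum()       # sum of the k nearest distances
        if dist_sum < best_sum:
            best_class, best_sum = cls, dist_sum
    return best_class

# Toy usage with hypothetical two-class data in 2-D:
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [5.0, 5.0], [5.1, 4.9], [4.8, 5.2]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(ds_knn_predict(X_train, y_train, np.array([0.15, 0.1]), k=3))  # -> 0
```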


1996 ◽ Vol 29 (1) ◽ pp. 6072-6077
Author(s): Ben McKay ◽ Justin Elsey ◽ Mark J. Willis ◽ Geoffrey W. Barton
