Feature extraction and wall motion classification of 2D stress echocardiography with relevance vector machines

Author(s):  
Kiryl Chykeyuk ◽  
David A. Clifton ◽  
J. Alison Noble


2020 ◽  
Vol 37 (5) ◽  
pp. 812-822
Author(s):  
Behnam Asghari Beirami ◽  
Mehdi Mokhtarzade

In this paper, a novel feature extraction technique called SuperMNF, an extension of the minimum noise fraction (MNF) transformation, is proposed. In SuperMNF, each superpixel has its own transformation matrix, and the MNF transformation is performed on each superpixel individually. The basic idea behind SuperMNF is that each superpixel has its own signal and noise covariance matrices, which differ from those of adjacent superpixels. The extracted features, which carry spatial-spectral content in a lower-dimensional space, are classified with a maximum likelihood classifier and support vector machines. Experiments conducted on two real hyperspectral images, Indian Pines and Pavia University, demonstrate the efficiency of SuperMNF, which yields more promising results than several other feature extraction methods (MNF, PCA, SuperPCA, KPCA, and MMP).
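As a rough illustration of the per-superpixel idea, the sketch below runs an MNF-style transform independently inside each SLIC superpixel of a hyperspectral cube. The segmentation parameters, the first-difference noise estimator, and the number of retained components are assumptions made for the example, not the settings used in the paper.

```python
# Per-superpixel MNF sketch: assumes a hyperspectral cube of shape (H, W, B).
import numpy as np
from scipy.linalg import eigh
from skimage.segmentation import slic

def mnf(pixels, n_components):
    """MNF on an (N, B) matrix: solve the signal/noise generalized eigenproblem."""
    pixels = pixels - pixels.mean(axis=0)
    noise = np.diff(pixels, axis=0)                      # crude noise estimate
    cov_noise = np.cov(noise, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])
    cov_signal = np.cov(pixels, rowvar=False)
    vals, vecs = eigh(cov_signal, cov_noise)             # generalized eigenproblem
    order = np.argsort(vals)[::-1]                       # highest-SNR components first
    return pixels @ vecs[:, order[:n_components]]

def super_mnf(cube, n_segments=100, n_components=10):
    """Apply MNF separately inside every SLIC superpixel ("SuperMNF"-style)."""
    H, W, B = cube.shape
    labels = slic(cube, n_segments=n_segments, compactness=0.1, channel_axis=2)
    features = np.zeros((H, W, n_components))
    for lab in np.unique(labels):
        mask = labels == lab
        if mask.sum() <= n_components:                   # skip degenerate superpixels
            continue
        features[mask] = mnf(cube[mask], n_components)
    return features
```

The per-superpixel features can then be fed to a maximum likelihood classifier or an SVM, as in the paper.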


2009 ◽  
Vol 140 (1) ◽  
pp. 143-148 ◽  
Author(s):  
Xiaodong Wang ◽  
Meiying Ye ◽  
C.J. Duanmu

Sensors ◽  
2020 ◽  
Vol 20 (8) ◽  
pp. 2403
Author(s):  
Jakub Browarczyk ◽  
Adam Kurowski ◽  
Bozena Kostek

The aim of the study is to compare electroencephalographic (EEG) signal feature extraction methods in terms of the effectiveness of the resulting classification of brain activities. For classification, EEG signals were recorded with an EEG device from 17 subjects in three mental states (relaxation, excitation, and solving a logical task). Blind source separation employing independent component analysis (ICA) was performed on the obtained signals. Welch's method, autoregressive modeling, and the discrete wavelet transform were used for feature extraction. Principal component analysis (PCA) was performed in order to reduce the dimensionality of the feature vectors. k-Nearest Neighbors (kNN), Support Vector Machines (SVM), and Neural Networks (NN) were employed for classification. Precision, recall, and F1 scores are reported, together with a discussion based on statistical analysis. The paper also contains the code used for preprocessing and for the main part of the experiments.
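A minimal sketch of one branch of such a pipeline (Welch band-power features, PCA, and an SVM evaluated with cross-validation) is given below, assuming ICA-cleaned epochs in an array of shape (n_trials, n_channels, n_samples) and integer class labels; the frequency bands and classifier settings are illustrative, not the paper's exact choices.

```python
# Welch -> PCA -> SVM sketch for three-class EEG mental-state classification.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # illustrative bands

def band_power_features(epochs, fs):
    """Mean Welch band power per channel and band -> one feature vector per trial."""
    freqs, psd = welch(epochs, fs=fs, nperseg=2 * fs, axis=-1)
    feats = [psd[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1)  # (n_trials, n_channels)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats, axis=1)                          # stack bands per trial

def evaluate(epochs, y, fs=256):
    """Cross-validated macro F1 of a PCA + RBF-SVM pipeline on band-power features."""
    X = band_power_features(epochs, fs)
    clf = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
    return cross_val_score(clf, X, y, cv=5, scoring="f1_macro")
```

Swapping the SVC for KNeighborsClassifier or MLPClassifier reproduces the kNN and NN comparisons within the same framework.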


2006 ◽  
Author(s):  
Xiaoxia Yin ◽  
Brian W.-H. Ng ◽  
Bernd Fischer ◽  
Bradley Ferguson ◽  
Samuel P. Mickan ◽  
...  

2006 ◽  
Vol 15 (03) ◽  
pp. 411-432 ◽  
Author(s):  
GEORGE GEORGOULAS ◽  
CHRYSOSTOMOS STYLIOS ◽  
PETER GROUMPOS

Since the fetus is not available for direct observation, only indirect information can guide the obstetrician in charge. Electronic Fetal Monitoring (EFM) is widely used for assessing fetal well-being. EFM involves detection of the Fetal Heart Rate (FHR) signal and the Uterine Activity (UA) signal. The most serious fetal incident is hypoxic injury leading to cerebral palsy or even death, a condition that must be predicted and avoided. This work proposes a new integrated method for feature extraction and classification of the FHR signal that is able to associate the FHR with umbilical artery pH values at delivery. The proposed method introduces the use of the Discrete Wavelet Transform (DWT) to extract time-scale dependent features of the FHR signal and the use of Support Vector Machines (SVMs) for the categorization. The proposed methodology is tested on a data set of intrapartum recordings where the FHR categories are associated with umbilical artery pH values. The proposed approach achieved high overall classification performance, proving its merits.
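The sketch below shows how such a DWT-plus-SVM stage might look, assuming equal-length FHR traces in a 2-D array and binary labels derived from umbilical artery pH; the wavelet, decomposition level, and sub-band statistics are illustrative choices rather than those reported in the paper.

```python
# DWT feature extraction + SVM classification sketch for FHR traces.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(signal, wavelet="db4", level=5):
    """Energy and standard deviation of every wavelet sub-band of one FHR trace."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # approximation + detail bands
    feats = []
    for c in coeffs:
        feats.extend([np.sum(c ** 2), np.std(c)])
    return np.array(feats)

def classify(signals, y):
    """Cross-validated accuracy of an RBF-SVM on the wavelet features."""
    X = np.vstack([dwt_features(s) for s in signals])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
    return cross_val_score(clf, X, y, cv=5)
```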


2010 ◽  
Vol 07 (04) ◽  
pp. 347-356
Author(s):  
E. SIVASANKAR ◽  
R. S. RAJESH

In this paper, Principal Component Analysis (PCA) is used for feature extraction, and a statistical-learning-based Support Vector Machine is designed for functional classification of clinical data. Appendicitis data collected from BHEL Hospital, Trichy, are classified into three classes. Feature extraction transforms the data from the high-dimensional space to a space of fewer dimensions. Classification is performed by constructing an optimal hyperplane that separates the members from the non-members of a class. For linearly non-separable data, kernel functions are used to map the data to a higher-dimensional space in which the optimal hyperplane is found. This paper works with SVMs based on radial basis function and polynomial kernels, and their performances are compared.
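A minimal sketch of this PCA-then-SVM comparison is shown below, with a feature matrix and labels standing in for the clinical data (which is not publicly available); the number of principal components and the kernel parameters are illustrative defaults, not the values tuned in the paper.

```python
# PCA feature extraction followed by SVMs with RBF and polynomial kernels.
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def compare_kernels(X, y, n_components=5):
    """Cross-validated accuracy of PCA + SVM for two kernel choices."""
    kernels = {
        "rbf": SVC(kernel="rbf", C=1.0, gamma="scale"),
        "poly": SVC(kernel="poly", degree=3, C=1.0),
    }
    results = {}
    for name, svm in kernels.items():
        clf = make_pipeline(StandardScaler(), PCA(n_components=n_components), svm)
        results[name] = cross_val_score(clf, X, y, cv=5).mean()
    return results
```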

