Ensemble Learning Based Multiple Kernel Principal Component Analysis for Dimensionality Reduction and Classification of Hyperspectral Imagery

2018 ◽  
Vol 2018 ◽  
pp. 1-14 ◽  
Author(s):  
Hamidullah Binol

Classification is one of the most challenging tasks in remotely sensed data processing, particularly for hyperspectral imaging (HSI). Dimension reduction is widely applied as a preprocessing step for classification; however, reducing dimension with conventional methods does not always guarantee a high classification rate. Principal component analysis (PCA) and its nonlinear version, kernel PCA (KPCA), are traditional dimension reduction algorithms. In previous work, a variant of KPCA, denoted Adaptive KPCA (A-KPCA), was proposed to obtain a robust unsupervised feature representation for HSI. That technique runs several KPCAs simultaneously, each with a different candidate kernel, and obtains better feature points from each applied KPCA. Nevertheless, A-KPCA neglects the relative influence of its subkernels by using an unweighted combination. Furthermore, if the set of kernels contains at least one weak kernel, classification performance may drop significantly. To address these problems, this paper proposes an Ensemble Learning (EL) based multiple kernel PCA (M-KPCA) strategy. M-KPCA constructs a weighted combination of kernels with high discriminative ability from a predetermined set of base kernels and then extracts features in an unsupervised fashion. Experiments on two different AVIRIS hyperspectral data sets show that the proposed algorithm achieves satisfactory feature extraction performance on real data.
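
The weighted kernel combination at the heart of such a multiple kernel PCA can be sketched generically. The sketch below is not the paper's M-KPCA: the weights here are fixed by hand, whereas M-KPCA would derive them from the discriminative ability of each base kernel; the RBF widths, data, and function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of the Gaussian (RBF) kernel exp(-gamma * ||xi - xj||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def center_kernel(K):
    # Double-center the Gram matrix, i.e. center the data in feature space.
    n = K.shape[0]
    one = np.ones((n, n)) / n
    return K - one @ K - K @ one + one @ K @ one

def weighted_multi_kernel_pca(X, gammas, weights, n_components=2):
    # Weighted sum of base kernels, then ordinary kernel PCA on the result.
    K = sum(w * rbf_kernel(X, g) for w, g in zip(weights, gammas))
    Kc = center_kernel(K)
    vals, vecs = np.linalg.eigh(Kc)              # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest ones
    # Scores = eigenvectors scaled by the square roots of their eigenvalues.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Z = weighted_multi_kernel_pca(X, gammas=[0.1, 1.0, 10.0], weights=[0.5, 0.3, 0.2])
print(Z.shape)  # (50, 2)
```

A learned weighting would replace the hand-picked `weights` list, e.g. by scoring each base kernel on a downstream classification task.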

2013 ◽  
Vol 303-306 ◽  
pp. 1101-1104 ◽  
Author(s):  
Yong De Hu ◽  
Jing Chang Pan ◽  
Xin Tan

Kernel entropy component analysis (KECA) reveals the structure of the original data through the kernel matrix; this structure is related to the Renyi entropy of the data. KECA preserves the structure of the original data by keeping its Renyi entropy unchanged. This paper describes the original data with several components for the purpose of dimension reduction. KECA is then applied to celestial spectra reduction and compared experimentally with principal component analysis (PCA) and kernel principal component analysis (KPCA). Experimental results show that KECA is a good method for high-dimensional data reduction.
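
KECA's selection rule can be sketched as follows: unlike KPCA, which keeps the eigenpairs with the largest eigenvalues, KECA keeps those that contribute most to the Renyi entropy estimate. This is a minimal illustration, not the paper's implementation; the RBF kernel and all parameter values are assumptions.

```python
import numpy as np

def kernel_eca(K, n_components=2):
    # Kernel ECA keeps the eigenpairs of the (uncentered) Gram matrix that
    # contribute most to the Renyi entropy estimate
    #   V ~ (1/n^2) * sum_i lambda_i * (1^T e_i)^2,
    # rather than simply the largest eigenvalues as in kernel PCA.
    n = K.shape[0]
    vals, vecs = np.linalg.eigh(K)
    contrib = vals * (np.ones(n) @ vecs) ** 2
    idx = np.argsort(contrib)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
sq = np.sum(X ** 2, axis=1)
K = np.exp(-0.5 * np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0))
Z = kernel_eca(K)
print(Z.shape)  # (40, 2)
```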


2007 ◽  
Vol 04 (01) ◽  
pp. 15-26 ◽  
Author(s):  
XIUQING WANG ◽  
ZENG-GUANG HOU ◽  
LONG CHENG ◽  
MIN TAN ◽  
FEI ZHU

The ability to perceive and recognize complex environments is very important for a real autonomous robot. A new scene analysis method for mobile robots, using kernel principal component analysis (kernel PCA) on fused multi-sonar-ranger data, is put forward. The principles of classification by principal component analysis (PCA), kernel PCA, and a BP neural network (NN) approach that extracts the eigenvectors with the largest k eigenvalues are introduced briefly. Next, the details of applying PCA, kernel PCA, and the BP NN method to corridor scene analysis and classification from sonar data are discussed, and the experimental results of those methods are given. In addition, a corridor-scene classifier based on a BP NN is discussed. The experimental results of PCA, kernel PCA, and the BP-NN-based methods are compared, and the robustness of those methods is analyzed. The conclusions are: in corridor scene classification, kernel PCA has an advantage over ordinary PCA, and the BP-NN-based approaches can also obtain satisfactory results; the robustness of kernel PCA is better than that of the BP-NN-based methods.


2020 ◽  
Vol 17 (4) ◽  
pp. 172988141989688
Author(s):  
Liming Li ◽  
Jing Zhao ◽  
Chunrong Wang ◽  
Chaojie Yan

Multivariate statistical methods such as principal component analysis (PCA), based on linear dimension reduction, and kernel principal component analysis (KPCA), its nonlinear modification, are commonly used. Because of the diversity and correlation of robotic global performance indexes, the two methods, PCA and KPCA, can each be used to comprehensively evaluate the global performance of the PUMA560 robot with different dimensions. When using KPCA, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. Because KPCA with a polynomial kernel function is time-consuming and inefficient, a new kernel function based on similarity degree is proposed for big sample data, and it is proved admissible according to Mercer's theorem. Comparing the dimension reduction effects of PCA, KPCA with the polynomial kernel, and KPCA with the new kernel shows that KPCA with the new kernel deals more effectively with the nonlinear relationships among indexes, and its result is more reasonable because it contains more comprehensive information. Simulation shows that KPCA with the new kernel function has the advantages of low time consumption, good real-time performance, and good generalization ability.
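
The abstract does not specify the similarity-degree kernel, but any Mercer-admissible similarity matrix can be fed to kernel PCA as a precomputed kernel. Below is a hedged sketch with cosine similarity as a stand-in for the unspecified kernel; scikit-learn's `KernelPCA` with `kernel="precomputed"` does the rest, and all data and names are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

def similarity_kernel(X):
    # Stand-in similarity matrix: cosine similarity, which is positive
    # semidefinite and therefore admissible under Mercer's theorem.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

X = np.random.default_rng(1).normal(size=(30, 6))
kpca = KernelPCA(n_components=2, kernel="precomputed")
Z = kpca.fit_transform(similarity_kernel(X))
print(Z.shape)  # (30, 2)
```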


2018 ◽  
Author(s):  
Toni Bakhtiar

Kernel Principal Component Analysis (Kernel PCA) is a generalization of ordinary PCA which allows mapping the original data into a high-dimensional feature space. The mapping is expected to address the issues of nonlinearity among variables and separation among classes in the original data space. The key problem in using kernel PCA is estimating the parameters of the kernel function, for which there is still no clear guidance; parameter selection largely depends on the judgment of the researcher. This study exploited the Gaussian kernel function and focused on the ability of kernel PCA to visualize the separation of classified data. Assessments were based on the misclassification rate obtained by Fisher linear discriminant analysis on the first two principal components. The results suggest that, for visualization, kernel PCA with the parameter selected in the interval between the closest and the farthest distances among the objects of the original data is better than ordinary PCA.
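
The parameter rule suggested by the abstract, choosing the Gaussian width within the interval between the closest and farthest pairwise distances in the original data, can be sketched as follows. The interpolation factor `t`, the data, and the function name are illustrative assumptions, not the study's procedure.

```python
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.decomposition import KernelPCA

def gaussian_kpca_in_distance_interval(X, t=0.5, n_components=2):
    # Place the Gaussian width sigma inside [d_min, d_max], the closest and
    # farthest pairwise distances among the objects; t interpolates between them.
    d = pdist(X)
    sigma = (1.0 - t) * d.min() + t * d.max()
    gamma = 1.0 / (2.0 * sigma ** 2)
    return KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma).fit_transform(X)

X = np.random.default_rng(0).normal(size=(25, 4))
Z = gaussian_kpca_in_distance_interval(X)
print(Z.shape)  # (25, 2)
```

Plotting the first two columns of `Z` for several values of `t` would reproduce the kind of visual assessment the study describes.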


Author(s):  
Liming Li ◽  
Jing Zhao

Revealing the relations among the comprehensive performance, topological structure, and dimensions of parallel mechanisms and robots is the basis for mechanism optimization. Because the single performance indexes are correlated and diverse, statistical principles of linear and nonlinear dimension reduction were introduced into the comprehensive performance analysis and evaluation of typical parallel mechanisms and robots. The topological structure and dimensions with the best comprehensive performance could then be selected based on principal component analysis (PCA) and kernel principal component analysis (KPCA), respectively. Comparing the results shows that KPCA can reveal the nonlinear relationships among the single performance indexes, provide more comprehensive performance evaluation information than PCA, and validly indicate the numerical relations among comprehensive performance, topological structure, and dimension.


d'CARTESIAN ◽  
2015 ◽  
Vol 4 (1) ◽  
pp. 95
Author(s):  
Vitawati Bawotong ◽  
Hanny Komalig ◽  
Nelson Nainggolan

Kernel PCA is PCA applied to input data that has been transformed into a feature space. Let Φ: Rⁿ → F be a function mapping every input point xi ∈ Rⁿ, so that Φ(xi) ∈ F. One of the many kernel functions is the power kernel, K(xi, xj) = −||xi − xj||^β with 0 < β ≤ 1. The aim of this research is to study the use of Kernel PCA (KPCA) with the power kernel function to help solve nonlinear multivariate plotting problems, especially those related to clustering. The results show that KPCA with the power kernel function is very helpful for multivariate plotting problems in which the data cannot be grouped by a linear separating line. Keywords: Kernel Principal Component Analysis (KPCA), Multivariate Plot, Power Kernel
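
A minimal kernel PCA with the power kernel K(xi, xj) = −||xi − xj||^β can be sketched from the eigendecomposition of the double-centered kernel matrix. The power kernel is only conditionally positive definite, which is why the centering step matters here; the value of beta and the data are assumptions for illustration.

```python
import numpy as np

def power_kernel(X, beta=0.5):
    # Power kernel K(xi, xj) = -||xi - xj||**beta with 0 < beta <= 1.
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return -np.sqrt(d2) ** beta

def kpca_power(X, beta=0.5, n_components=2):
    # Double-centering turns the conditionally positive definite power
    # kernel into a positive semidefinite matrix suitable for kernel PCA.
    K = power_kernel(X, beta)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

X = np.random.default_rng(0).normal(size=(30, 3))
Z = kpca_power(X, beta=0.8)
print(Z.shape)  # (30, 2)
```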


2017 ◽  
Vol 14 (2) ◽  
Author(s):  
Nora K. Speicher ◽  
Nico Pfeifer

Personalized treatment of patients based on tissue-specific cancer subtypes has strongly increased the efficacy of the chosen therapies. Even though the amount of data measured for cancer patients has increased over the last years, most cancer subtypes are still diagnosed based on individual data sources (e.g. gene expression data). We propose an unsupervised data integration method based on kernel principal component analysis. Principal component analysis is one of the most widely used techniques in data analysis. Unfortunately, the straightforward multiple kernel extension of this method leads to the use of only one of the input matrices, which does not fit the goal of gaining information from all data sources. Therefore, we present a scoring function to determine the impact of each input matrix. The approach enables visualizing the integrated data and subsequent clustering for cancer subtype identification. Due to the nature of the method, no hyperparameters have to be set. We apply the methodology to five different cancer data sets and demonstrate its advantages in terms of results and usability.


Author(s):  
Guang-Ho Cha

Principal component analysis (PCA) is an important tool in many areas including data reduction and interpretation, information retrieval, image processing, and so on. Kernel PCA has recently been proposed as a nonlinear extension of the popular PCA. The basic idea is to first map the input space into a feature space via a nonlinear map and then compute the principal components in that feature space. This paper illustrates the potential of kernel PCA for dimensionality reduction and feature extraction in multimedia retrieval. Using Gaussian kernels, the principal components were computed in the feature space of an image data set and used as new dimensions to approximate image features. Extensive experimental results show that kernel PCA performs better than linear PCA with respect to both retrieval quality and retrieval precision in content-based image retrieval.
Keywords: Principal component analysis, kernel principal component analysis, multimedia retrieval, dimensionality reduction, image retrieval


d'CARTESIAN ◽  
2015 ◽  
Vol 4 (1) ◽  
pp. 76
Author(s):  
Sueharti Maatuil ◽  
Hanny Komalig ◽  
Charles Mongi

The aim of this research is to study the use of kernel PCA with a polynomial kernel function to help solve multivariate plotting problems, especially those related to clustering. The data used in this research are secondary data in the form of multivariate plots. The kernel method is one way to handle nonlinear cases. Kernel PCA is PCA applied to input data that has been transformed into a feature space. Let Φ: Rⁿ → F be a function mapping every input point xi ∈ Rⁿ, so that Φ(xi) ∈ F. One widely used kernel is the polynomial kernel, K(xi, xj′) = (xiᵀ xj′ + h0)^d, where h0 is a scale parameter to be chosen. The results of this research show that Kernel Principal Component Analysis (KPCA) with a polynomial kernel function is very helpful for multivariate plotting problems in which the data cannot be grouped by a linear separating line. Keywords: Kernel PCA, Polynomial Kernel PCA, Multivariate Plot
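
The polynomial kernel (xiᵀ xj + h0)^d maps directly onto scikit-learn's "poly" kernel, (gamma·⟨xi, xj⟩ + coef0)^degree, by fixing gamma = 1. A minimal sketch; the values of h0, d, and the data are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# K(xi, xj) = (xi^T xj + h0)^d expressed through scikit-learn's "poly"
# kernel (gamma * <xi, xj> + coef0)^degree with gamma fixed at 1.
h0, d = 1.0, 3
kpca = KernelPCA(n_components=2, kernel="poly", gamma=1.0, coef0=h0, degree=d)
X = np.random.default_rng(2).normal(size=(40, 4))
Z = kpca.fit_transform(X)
print(Z.shape)  # (40, 2)
```

A scatter plot of the two columns of `Z` is the kind of multivariate plot the study uses to judge whether a linear separating line becomes possible after the mapping.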

