Dimension Reduction Big Data Using Recognition of Data Features Based on Copula Function and Principal Component Analysis

2021 · Vol. 2021 · pp. 1-8
Author(s): Fazel Badakhshan Farahabadi, Kianoush Fathi Vajargah, Rahman Farnoosh

Nowadays, data are generated around the world at high speed; therefore, recognizing data features and reducing dimensionality without losing useful information is of great importance. There are many approaches to dimension reduction, including principal component analysis (PCA), which reduces the dimension of the data by identifying the effective dimensions at an acceptable level. In the usual PCA workflow, the data are assumed to be normal, or they are normalized first, and then PCA is applied; many studies have treated PCA as a data-preparation step in this way. In this paper, we propose a method that improves PCA and makes data analysis easier and more efficient: we first capture the relationships among the variables by fitting a multivariate copula function to the data and simulate new data from the estimated parameters; we then reduce the dimensions of the simulated data with PCA. The aim is to improve the ability of PCA to find the effective dimensions.
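The abstract only outlines the copula-then-PCA pipeline, so the following is a minimal sketch of how such a workflow might look, assuming a Gaussian copula, empirical rank transforms for the marginals, and synthetic data; all names and parameter choices here are illustrative, not the authors' implementation.

```python
# Sketch of a copula-then-PCA pipeline (assumed Gaussian copula, synthetic data).
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0, 0],
                            [[1, .8, .3], [.8, 1, .2], [.3, .2, 1]],
                            size=500)
n, d = X.shape

# 1) Transform each margin to pseudo-observations in (0, 1) via ranks.
U = stats.rankdata(X, axis=0) / (n + 1)

# 2) Fit a Gaussian copula: map to normal scores and estimate the correlation.
Z = stats.norm.ppf(U)
R = np.corrcoef(Z, rowvar=False)          # estimated copula parameter

# 3) Simulate new data from the fitted copula and map back through the
#    empirical marginal quantiles of the original sample.
Z_new = rng.multivariate_normal(np.zeros(d), R, size=n)
U_new = stats.norm.cdf(Z_new)
X_new = np.column_stack([np.quantile(X[:, j], U_new[:, j]) for j in range(d)])

# 4) Reduce the dimension of the simulated data with PCA.
pca = PCA(n_components=0.95)              # keep components explaining ~95% of variance
X_red = pca.fit_transform(X_new)
print(pca.explained_variance_ratio_)
```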

2020 · Vol. 17 (4) · pp. 172988141989688
Author(s): Liming Li, Jing Zhao, Chunrong Wang, Chaojie Yan

Multivariate statistical methods such as principal component analysis (PCA), which performs linear dimension reduction, and kernel principal component analysis (KPCA), a modified PCA that performs nonlinear dimension reduction, are commonly used. Because the global performance indexes of a robot are diverse and correlated, both PCA and KPCA can be used to comprehensively evaluate the global performance of the PUMA560 robot across different dimensions. When KPCA is used, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. Because KPCA with a polynomial kernel is time-consuming and inefficient for large sample data, a new kernel function based on similarity degree is proposed and is shown to satisfy Mercer's theorem. Comparing the dimension reduction achieved by PCA, KPCA with the polynomial kernel, and KPCA with the new kernel shows that KPCA with the new kernel handles the nonlinear relationships among the indexes more effectively, and its results are more reasonable because they retain more comprehensive information. The simulation shows that KPCA with the new kernel function offers low time consumption, good real-time performance, and good generalization ability.
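The paper's similarity-degree kernel is not given in the abstract, so the sketch below only illustrates the general comparison it describes: linear PCA, KPCA with a polynomial kernel, and KPCA with a custom kernel supplied as a precomputed Gram matrix. A cosine-similarity kernel stands in for the proposed kernel, and the performance-index data are synthetic.

```python
# Sketch comparing PCA, polynomial-kernel KPCA, and KPCA with a custom
# precomputed kernel (cosine similarity used as an illustrative stand-in).
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(1)
X = rng.random((200, 6))                  # 200 samples of 6 performance indexes (synthetic)

# Linear PCA baseline.
pca_scores = PCA(n_components=2).fit_transform(X)

# Kernel PCA with the built-in polynomial kernel.
poly_scores = KernelPCA(n_components=2, kernel="poly", degree=3).fit_transform(X)

# Kernel PCA with a custom kernel passed as a precomputed Gram matrix.
K = cosine_similarity(X)                  # symmetric positive semidefinite, so Mercer holds
custom_scores = KernelPCA(n_components=2, kernel="precomputed").fit_transform(K)

print(pca_scores.shape, poly_scores.shape, custom_scores.shape)
```

Passing the Gram matrix with kernel="precomputed" keeps the kernel definition outside the estimator, which is convenient when evaluating several candidate kernels on the same index data.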


2011 · Vol. 26 · pp. 1346-1351
Author(s): Yang Guo-liang, Wang Can-zhao, Wu Shi-yue, Jia Li-qing, Zhang Sheng-zhu
