Nonparametric Discriminant Analysis
Recently Published Documents


TOTAL DOCUMENTS: 39 (FIVE YEARS: 2)
H-INDEX: 11 (FIVE YEARS: 0)

2020 ◽ Vol 142 ◽ pp. 106817 ◽ Author(s): Weichang Yu, Lamiae Azizi, John T. Ormerod

2017 ◽ Vol 24 (10) ◽ pp. 1537-1541 ◽ Author(s): Guanqun Cao, Alexandros Iosifidis, Moncef Gabbouj

Open Physics ◽ 2017 ◽ Vol 15 (1) ◽ pp. 270-279 ◽ Author(s): Quanbao Li, Fajie Wei, Shenghan Zhou

Abstract
Linear discriminant analysis (LDA) is one of the most popular methods for linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently used feature extraction approaches typically require linearity, independence, or large-sample conditions. In real-world applications, however, these assumptions are not always satisfied and often cannot be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept at identifying both complex nonlinear structures and ad hoc rules. Six simulation cases demonstrate that LKNDA combines the advantages of parametric and nonparametric algorithms and achieves higher classification accuracy. The quartic unilateral kernel function may provide better prediction robustness than other kernel functions. LKNDA offers an alternative solution for discriminant problems that involve complex nonlinear feature extraction or features of unknown structure. Finally, an application of LKNDA to complex feature extraction in financial market activities is proposed.
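The abstract does not spell out the LKNDA procedure itself, so the sketch below is only a generic illustration of the underlying idea: replacing LDA's single global linear rule with a kernel-based, nonparametric estimate of each class's structure. The KDEDiscriminant class, the bandwidth value, and the make_moons data are illustrative assumptions, not the authors' method.

```python
# Minimal sketch (not the authors' LKNDA): a kernel-density-based
# nonparametric discriminant classifier compared against plain LDA
# on data whose local structure is nonlinear.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KernelDensity


class KDEDiscriminant:
    """Classify via Bayes' rule with a kernel density estimate per class."""

    def __init__(self, bandwidth=0.3):
        self.bandwidth = bandwidth

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = np.array([np.mean(y == c) for c in self.classes_])
        self.kdes_ = [
            KernelDensity(bandwidth=self.bandwidth).fit(X[y == c])
            for c in self.classes_
        ]
        return self

    def predict(self, X):
        # log p(x | class) + log prior, then pick the argmax per sample
        log_post = np.column_stack(
            [kde.score_samples(X) for kde in self.kdes_]
        ) + np.log(self.priors_)
        return self.classes_[np.argmax(log_post, axis=1)]


# Two-class data with nonlinear local structure (illustrative only).
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
kde_da = KDEDiscriminant(bandwidth=0.3).fit(X_tr, y_tr)

print("LDA accuracy:              ", np.mean(lda.predict(X_te) == y_te))
print("Kernel discriminant accur.:", np.mean(kde_da.predict(X_te) == y_te))
```

On data like this, where the local class structure is strongly nonlinear, a kernel-based rule can typically separate the classes where a single global linear projection cannot, which is the gap the abstract says LKNDA targets.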


2016 ◽ Vol 12 (S325) ◽ pp. 201-204 ◽ Author(s): Graham Barnes, Nicole Schanche, K. D. Leka, Ashna Aggarwal, Kathy Reeves

Abstract
We compare the results of using a Random Forest Classifier with the results of using Nonparametric Discriminant Analysis to classify whether a filament channel (in the case of a filament eruption) or an active region (in the case of a flare) is about to produce an event. A large number of descriptors are considered in each case, but only a small number are needed to obtain most of the improvement in performance over always predicting the majority class. There is little difference in performance between the two classifiers, and neither results in substantial improvements over simply predicting the majority class.
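The flare and filament-channel data behind this study are not reproduced here; as a rough, self-contained illustration of the evaluation the abstract describes, the sketch below scores a Random Forest against a majority-class baseline on a synthetic imbalanced problem and ranks descriptors by importance. The dataset parameters and the 90/10 class ratio are assumptions for illustration only.

```python
# Illustrative sketch only: a synthetic imbalanced problem stands in for the
# flare/filament data. It shows how a Random Forest can be scored against
# the majority-class baseline the abstract uses as its reference point.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Imbalanced two-class problem: ~90% non-events, ~10% events (assumed ratio).
X, y = make_classification(
    n_samples=2000, n_features=20, n_informative=5,
    weights=[0.9, 0.1], random_state=0,
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

acc_base = accuracy_score(y_te, baseline.predict(X_te))
acc_rf = accuracy_score(y_te, forest.predict(X_te))

print(f"Majority-class baseline accuracy: {acc_base:.3f}")
print(f"Random Forest accuracy:           {acc_rf:.3f}")
print(f"Skill over baseline:              {acc_rf - acc_base:+.3f}")

# Rank descriptors by importance; the abstract reports that only a handful
# of descriptors deliver most of the improvement over the baseline.
top = np.argsort(forest.feature_importances_)[::-1][:5]
print("Top-5 descriptor indices by importance:", top)
```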

