Dimension reduction for performing discriminant analysis for microarrays

Author(s):  
Sang Hee Lee ◽  
A. K. Singh ◽  
Laxmi P. Gewali

2021 ◽  
Author(s):  
Rongxiu Lu ◽  
Yingjie Cai ◽  
Jianyong Zhu ◽  
Feiping Nie ◽  
Hui Yang

2017 ◽  
Vol 27 (1) ◽  
pp. 169-180 ◽  
Author(s):  
Marton Szemenyei ◽  
Ferenc Vajda

Abstract Dimension reduction and feature selection are fundamental tools for machine learning and data mining. Most existing methods, however, assume that objects are represented by a single vectorial descriptor. In reality, some description methods assign unordered sets or graphs of vectors to a single object, where each vector is assumed to have the same number of dimensions but is drawn from a different probability distribution. Moreover, some applications (such as pose estimation) may require the recognition of individual vectors (nodes) of an object. In such cases it is essential that the nodes within a single object remain distinguishable after dimension reduction. In this paper we propose new discriminant analysis methods that satisfy two criteria at the same time: separation between classes and separation between the nodes of an object instance. We analyze and evaluate our methods on several different synthetic and real-world datasets.
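The "two criteria at once" idea can be sketched generically. The following is an illustration under assumed definitions, not the authors' actual formulation (the function name `two_criterion_da` and the blending weight `alpha` are hypothetical): build one LDA-style between/within scatter pair over object classes and another over node identities, blend them, and solve a single generalized eigenproblem so the projection separates classes while keeping nodes distinguishable.

```python
import numpy as np

def two_criterion_da(X, class_labels, node_labels, alpha=0.5, n_components=2):
    """Illustrative sketch only (not the paper's exact method): blend two
    LDA-style criteria -- one separating object classes, one separating
    node identities -- so the learned projection serves both at once."""
    def scatter(labels):
        # Standard between-class (Sb) and within-class (Sw) scatter matrices.
        mean = X.mean(axis=0)
        d = X.shape[1]
        Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
        for c in np.unique(labels):
            Xc = X[labels == c]
            diff = (Xc.mean(axis=0) - mean)[:, None]
            Sb += len(Xc) * (diff @ diff.T)
            Xc0 = Xc - Xc.mean(axis=0)
            Sw += Xc0.T @ Xc0
        return Sb, Sw

    Sb_c, Sw_c = scatter(np.asarray(class_labels))
    Sb_n, Sw_n = scatter(np.asarray(node_labels))
    # Blend the two discriminant criteria with a single weight alpha.
    Sb = alpha * Sb_c + (1 - alpha) * Sb_n
    Sw = alpha * Sw_c + (1 - alpha) * Sw_n
    # Generalized eigenproblem via pseudo-inverse (robust to singular Sw).
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]
```

With `alpha = 1` this degenerates to ordinary class-only LDA; with `alpha = 0` it only separates nodes, so intermediate values trade the two criteria off.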


2018 ◽  
Vol 291 ◽  
pp. 136-150 ◽  
Author(s):  
Mika Juuti ◽  
Francesco Corona ◽  
Juha Karhunen

Author(s):  
Rong-Hua Li ◽  
Shuang Liang ◽  
George Baciu ◽  
Eddie Chan

Singularity problems of scatter matrices in Linear Discriminant Analysis (LDA) are challenging and have received considerable attention during the last decade. Linear Discriminant Analysis via QR decomposition (LDA/QR) and Direct Linear Discriminant Analysis (DLDA) are two popular algorithms for solving the singularity problem. This paper establishes an equivalence between LDA/QR and DLDA: both can be regarded as special cases of pseudo-inverse LDA. Like LDA/QR, DLDA can also be viewed as a two-stage LDA method, and, interestingly, its first stage can act as a dimension-reduction algorithm on its own. Experiments compare the LDA/QR and DLDA algorithms, as well as their respective first stages, in terms of classification accuracy and computational complexity on several benchmark datasets. The results confirm the established equivalence and verify the dimension-reduction capability of the first stages.
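The two-stage structure described above can be sketched numerically. This is an illustration of the LDA/QR idea, not a reproduction of either paper's exact algorithm (the function name `lda_qr` is my own): stage one projects the data onto the span of the class centroids via a QR decomposition, which by itself reduces the dimension to at most the number of classes and sidesteps the singularity of the within-class scatter in the original high-dimensional space; stage two solves an ordinary LDA eigenproblem, via a pseudo-inverse, in that small subspace.

```python
import numpy as np

def lda_qr(X, y, n_components=None):
    """Sketch of two-stage LDA/QR. Stage 1: QR decomposition of the class
    centroid matrix gives an orthonormal basis Q for the centroid span;
    projecting onto Q reduces d dims to at most k (number of classes).
    Stage 2: ordinary LDA in the reduced space, using a pseudo-inverse."""
    classes = np.unique(y)
    # Stage 1: d x k centroid matrix and its thin QR factorization.
    C = np.stack([X[y == c].mean(axis=0) for c in classes], axis=1)
    Q, _ = np.linalg.qr(C)        # Q: d x k, orthonormal columns
    Z = X @ Q                     # reduced data: n x k

    # Stage 2: scatter matrices in the (now well-conditioned) small space.
    mean = Z.mean(axis=0)
    k = Q.shape[1]
    Sb, Sw = np.zeros((k, k)), np.zeros((k, k))
    for c in classes:
        Zc = Z[y == c]
        diff = (Zc.mean(axis=0) - mean)[:, None]
        Sb += len(Zc) * (diff @ diff.T)
        Zc0 = Zc - Zc.mean(axis=0)
        Sw += Zc0.T @ Zc0
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    W = evecs[:, order].real
    if n_components is not None:
        W = W[:, :n_components]
    return Q @ W                  # overall d -> n_components projection
```

Note how the stage-1 projection `X @ Q` is already a usable dimension reduction on its own, mirroring the observation about DLDA's first stage in the abstract.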


2015 ◽  
Vol 52 (2) ◽  
pp. 55-74 ◽  
Author(s):  
Jolanta Grala-Michalak

Abstract Sometimes feature representations of measured individuals are better described by spherical coordinates than by Cartesian ones. The author proposes introducing a preprocessing step for LDA based on the arctangent transformation of spherical coordinates. This nonlinear transformation does not change the dimension of the data, but in combination with LDA it leads to a dimension reduction when the raw data are not linearly separable. The method is illustrated with various examples of real and artificial data.
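A minimal sketch in the spirit of this preprocessing step (the paper's exact transform is not reproduced here; `spherical_features` and its angle convention are assumptions): map each Cartesian vector to its radius plus the standard arctangent-based n-sphere angles. The output keeps the same number of dimensions, but the map is nonlinear, so classes that are not linearly separable in the raw coordinates (e.g. concentric rings) can become separable, here already along the single radius feature, which is exactly where a subsequent LDA would achieve the dimension reduction.

```python
import numpy as np

def spherical_features(X):
    """Illustrative spherical-coordinate preprocessing (an assumption, not
    the author's exact transform): replace each Cartesian row vector by
    [radius, phi_1, ..., phi_{d-1}], using the standard n-sphere angles
    phi_i = arctan2(||(x_{i+1}, ..., x_d)||, x_i). Dimension is unchanged:
    d Cartesian coordinates in, 1 radius + (d-1) angles out."""
    r = np.linalg.norm(X, axis=1, keepdims=True)
    cols = [r]
    for i in range(X.shape[1] - 1):
        tail = np.linalg.norm(X[:, i + 1:], axis=1, keepdims=True)
        cols.append(np.arctan2(tail, X[:, i:i + 1]))
    return np.hstack(cols)
```

For two concentric circles, no linear boundary separates the raw 2-D points, but after the transform the radius column alone splits the classes, so LDA on the transformed data effectively needs only one dimension.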

