Some asymptotic results on the effect of autocorrelation on the error rates of the sample linear discriminant function

1983, Vol 16 (1), pp. 119-121
Author(s): C.R.O. Lawoko, G.J. McLachlan

1979, Vol 16 (3), pp. 370-381
Author(s): William R. Dillon

This article reviews the available results on the performance of the linear discriminant function in situations where the assumptions of multivariate normality and equal group dispersion structures are violated. Some new results are discussed for classification using discrete variables and for the case of both binary and continuous variables. In addition, alternative methods that have been proposed and evaluated for estimating misclassification error rates are thoroughly reviewed. Throughout, the material is assessed in terms of practical significance, with particular emphasis on the conditions unfavorable to the performance of each procedure.
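The error-rate estimators compared in reviews of this kind typically include the apparent error rate and cross-validated estimates such as leave-one-out. Below is a minimal Python sketch of both for a linear discriminant rule; the simulated Gaussian data and the use of scikit-learn are illustrative assumptions and are not taken from the article.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Simulated two-group Gaussian data (illustrative only, not from the article).
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.vstack([rng.normal(0.0, 1.0, (n, p)),
               rng.normal(1.0, 1.0, (n, p))])
y = np.repeat([0, 1], n)

lda = LinearDiscriminantAnalysis()

# Apparent error rate: the fitted rule is evaluated on the same data used to
# fit it, so the estimate is optimistically biased.
apparent_error = 1.0 - lda.fit(X, y).score(X, y)

# Leave-one-out error rate: each observation is classified by a rule fitted
# to the remaining n - 1 observations.
loo_error = 1.0 - cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()

print(f"apparent error rate:      {apparent_error:.3f}")
print(f"leave-one-out error rate: {loo_error:.3f}")
```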


2015, Vol 7 (4), pp. 104
Author(s): I. Egbo, M. Egbo, S. I. Onyeagu

This paper focuses on robust classification procedures in two-group discriminant analysis with multivariate binary variables. A data set is generated from a normal distribution using the R statistical software, version 2.15.3; by Bartlett's approximation to chi-square, the data set is found to be homogeneous, and it is then subjected to five linear classifiers, namely the maximum likelihood discriminant function, Fisher's linear discriminant function, the likelihood ratio function, the full multinomial function, and the nearest neighbour rule. To judge the performance of these procedures, the apparent error rate of each is obtained for different sample sizes. The results rank the procedures as follows: Fisher's linear discriminant function, maximum likelihood, full multinomial, likelihood ratio, and nearest neighbour.
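As a rough illustration of how apparent error rates might be tabulated for different sample sizes, here is a short Python sketch using scikit-learn in place of R 2.15.3; the simulated data, the parameter values, and the restriction to two of the five classifiers (Fisher's linear discriminant function and a nearest-neighbour rule) are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

def apparent_error(clf, X, y):
    """Apparent error rate: misclassification rate of the fitted rule on its own training data."""
    return 1.0 - clf.fit(X, y).score(X, y)

# Illustrative two-group data for several per-group sample sizes.
for n in (20, 50, 100):
    X = np.vstack([rng.normal(0.0, 1.0, (n, 4)),
                   rng.normal(0.8, 1.0, (n, 4))])
    y = np.repeat([0, 1], n)
    fisher = apparent_error(LinearDiscriminantAnalysis(), X, y)
    # Note: when scoring on the training data, each point counts among its own
    # neighbours, so the nearest-neighbour apparent error is especially optimistic.
    nn = apparent_error(KNeighborsClassifier(n_neighbors=3), X, y)
    print(f"n per group = {n:3d}:  Fisher LDF {fisher:.3f},  3-NN {nn:.3f}")
```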


2018, Vol 8 (1), pp. 113
Author(s): A. Nanthakumar

The estimation of error rates is of vital importance in classification problems, as it is the basis for choosing the best discriminant function, that is, the one with the minimum misclassification error. The quadratic discriminant function (QDF), the Euclidean distance classifier (EDC), and Fisher's linear discriminant function (FLDC) have long been used for classification. In this paper, we compare the misclassification error rates of the QDF, EDC, and FLDC with those of vine copulas based on Gaussian and Clayton models. The results are obtained for the general case where the means are unequal and the covariance matrices are unequal.
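A minimal sketch of the non-copula part of such a comparison follows, assuming two simulated bivariate Gaussian groups with unequal means and unequal covariance matrices; the distribution parameters and the use of scikit-learn are illustrative assumptions, and the paper's vine-copula classifiers are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import NearestCentroid

# Two Gaussian groups with unequal means and unequal covariances
# (illustrative parameters only, not taken from the paper).
rng = np.random.default_rng(2)
mean0, mean1 = np.array([0.0, 0.0]), np.array([1.0, 0.5])
cov0 = np.array([[1.0, 0.3], [0.3, 1.0]])
cov1 = np.array([[2.0, -0.4], [-0.4, 0.5]])

def sample(n):
    X = np.vstack([rng.multivariate_normal(mean0, cov0, n),
                   rng.multivariate_normal(mean1, cov1, n)])
    y = np.repeat([0, 1], n)
    return X, y

X_train, y_train = sample(500)
X_test, y_test = sample(500)   # independent test sample for an honest error estimate

classifiers = {
    "QDF (quadratic discriminant)": QuadraticDiscriminantAnalysis(),
    "EDC (nearest group mean)":     NearestCentroid(),  # Euclidean distance classifier
    "FLDC (Fisher's linear rule)":  LinearDiscriminantAnalysis(),
}
for name, clf in classifiers.items():
    error = 1.0 - clf.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: test error {error:.3f}")
```

On simulated data like this, the QDF typically benefits from the unequal covariance matrices, which is the setting the paper's comparison targets.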

