FEATURE FUSION AND MODEL SELECTION BASED ON INFORMATION CRITERION
We describe a k-NN adaptive metric learning procedure for face recognition that combines asymptotic variance estimation with fine adjustment of the metric parameters. Metric learning models based on the Mahalanobis distance suffer from degraded performance when the number of available training samples is limited. We propose feature fusion methods that assume local distributions of the feature patterns for parameter estimation and learning. First, the MDL criterion is formulated to decide the trade-off between the accuracy and the complexity of an asymptotic statistical model; the within-class variance is minimized using asymptotic variance estimation. Second, optimal metric parameters are derived by minimizing the negative log-likelihood function for the representation of the synthesized feature patterns; the between-class variance is increased using the simulated annealing method. We present simulation results on the ORL and UMIST face databases.
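As a minimal illustration of the classification step underlying the method, the following sketch performs k-NN classification under a Mahalanobis metric d(a, b) = sqrt((a - b)^T M (a - b)). The metric matrix M is passed in directly; in the paper it would be tuned (e.g., by simulated annealing under the MDL criterion). The data, function name, and parameters here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mahalanobis_knn_predict(X_train, y_train, x, M, k=3):
    """Classify x by majority vote among its k nearest training samples,
    with distance measured by the Mahalanobis metric induced by M.

    M: symmetric positive-definite matrix (learned elsewhere; here supplied
    directly as an assumption for illustration).
    """
    diffs = X_train - x                                 # (n, d) differences
    sq_dists = np.einsum("nd,de,ne->n", diffs, M, diffs)  # diff^T M diff per sample
    nearest = np.argsort(sq_dists)[:k]                  # indices of k closest samples
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                    # majority vote

# Toy two-class data in 2-D (illustrative only, not from the ORL/UMIST databases)
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
X1 = rng.normal([3.0, 3.0], 0.3, size=(20, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)

# With M = I the metric reduces to ordinary Euclidean k-NN
M = np.eye(2)
pred = mahalanobis_knn_predict(X_train, y_train, np.array([2.8, 3.1]), M, k=5)
print(pred)  # a query near the class-1 cluster is assigned label 1
```

A learned non-identity M reweights and correlates feature dimensions, which is what the metric-parameter adjustment in the procedure above optimizes.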