Design and implementation of optimized nearest neighbor classifiers for handwritten digit recognition


1995 ◽  
Vol 06 (04) ◽  
pp. 417-423 ◽  
Author(s):  
Hong Yan

The basic Nearest Neighbor Classifier (NNC) is often inefficient in memory space and computing time if all training samples are used as prototypes. These problems can be solved by reducing the number of prototypes with a clustering algorithm and optimizing the prototypes with a special neural network model. In this paper, we compare the performance of a multilayer neural network and an Optimized Nearest Neighbor Classifier (ONNC) for handwritten digit recognition applications. We show that an ONNC can achieve the same recognition performance as an equivalent neural network classifier. The ONNC can be implemented efficiently using prototype and variable ranking, partial summation, and triangle-inequality-based pruning strategies. It requires the same memory space as the neural network, but less training time and less classification time.
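The paper's own implementation is not reproduced here, but the partial-summation idea it names can be illustrated with a minimal sketch: while accumulating a squared Euclidean distance term by term, the summation is abandoned as soon as it exceeds the best distance found so far, so a non-nearest prototype is rejected without computing its full distance. The function name and squared-distance choice below are illustrative assumptions, not the authors' code.

```python
def classify(x, prototypes, labels):
    """Nearest-prototype classification with partial-summation pruning.

    Illustrative sketch (not the paper's implementation): the squared
    distance to each prototype is accumulated one dimension at a time,
    and the loop breaks early once the partial sum already exceeds the
    best full distance seen so far, since it can only grow further.
    """
    best_dist = float("inf")
    best_label = None
    for p, lab in zip(prototypes, labels):
        d = 0.0
        for xi, pi in zip(x, p):
            d += (xi - pi) ** 2
            if d >= best_dist:   # prune: this prototype cannot be nearest
                break
        else:                    # summation completed: new nearest prototype
            best_dist = d
            best_label = lab
    return best_label
```

For high-dimensional digit images with many prototypes, most candidates are pruned after only a few terms, which is one source of the classification-time savings the abstract describes.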


1991 ◽  
Vol 3 (3) ◽  
pp. 440-449 ◽  
Author(s):  
Yuchun Lee

Results of recent research suggest that carefully designed multilayer neural networks with local “receptive fields” and shared weights may be unique in providing low error rates on handwritten digit recognition tasks. This study, however, demonstrates that these networks, radial basis function (RBF) networks, and k-nearest-neighbor (kNN) classifiers all provide similar low error rates on a large handwritten digit database. The backpropagation network is superior overall in memory usage and classification time but can produce “false positive” classifications when the input is not a digit. The backpropagation network also has the longest training time. The RBF classifier requires more memory and more classification time, but less training time. When high accuracy is warranted, the RBF classifier can generate a more effective confidence judgment for rejecting ambiguous inputs. The simple kNN classifier can also perform handwritten digit recognition, but it requires a prohibitively large amount of memory and is much slower at classification. Nevertheless, the simplicity of the algorithm and its fast training characteristics make the kNN classifier an attractive candidate for hardware-assisted classification tasks. These results on a large, high-input-dimensional problem demonstrate that practical constraints, including training time, memory usage, and classification time, often constrain classifier selection more strongly than small differences in overall error rate.
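The simple kNN scheme the abstract refers to stores every training sample and classifies by a majority vote among the k closest ones; its memory cost and per-query distance computations grow with the training set, which is exactly the drawback noted above. A minimal sketch, assuming Euclidean distance and the hypothetical names below:

```python
import numpy as np
from collections import Counter

def knn_classify(x, train_x, train_y, k=3):
    """Plain k-nearest-neighbor vote (illustrative sketch).

    Every training sample is kept in memory; classification computes the
    distance from x to all of them, takes the k closest, and returns the
    majority label among those neighbors.
    """
    dists = np.linalg.norm(train_x - x, axis=1)  # distance to every sample
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

Because there is no training beyond storing the data, this classifier "trains" almost instantly, which is the property that makes it attractive despite its classification cost.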

