CLASSIFYING APPLES BY MEANS OF FLUORESCENCE IMAGING
Classifying harvested apples is an important task when predicting their storage potential. This paper describes how chlorophyll a fluorescence images, taken in blue light through a red filter, can be used to classify apples. In such an image, fluorescence appears as a relatively homogeneous area broken by a number of small nonfluorescing spots, corresponding to normal corky tissue patches, to lenticels, and to damaged areas that lower the quality of the apple. Damaged regions appear elongated, curved, or boat-shaped, in contrast to the roundish, regular lenticels. We propose an apple classification method that employs a hierarchy of two neural networks. The first network classifies each spot according to geometrical criteria, and the second network uses this information together with global attributes to classify the apple. The system reached 95% accuracy on test material classified by an expert into "good" and "bad" apples.
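The two-stage hierarchy described above can be sketched as follows. This is a minimal illustrative sketch, not the trained system from the paper: the feature names (elongation, curvature), the global attributes, and all weights are hypothetical placeholders, and simple logistic units stand in for the two neural networks.

```python
import numpy as np

def spot_classifier(features):
    # Hypothetical stand-in for the first network: labels each spot
    # as damage (1) or lenticel (0) from geometric descriptors
    # [elongation, curvature]. Weights are illustrative, not trained.
    w = np.array([1.5, 1.0])
    b = -2.0
    score = features @ w + b
    return (1.0 / (1.0 + np.exp(-score)) > 0.5).astype(int)

def apple_classifier(spot_labels, global_attrs):
    # Hypothetical stand-in for the second network: combines the
    # per-spot verdicts (here summarized as the fraction of spots
    # judged damaged) with global attributes of the apple.
    damage_fraction = spot_labels.mean() if spot_labels.size else 0.0
    x = np.concatenate([[damage_fraction], global_attrs])
    w = np.array([6.0, 0.5, -0.2])  # illustrative weights
    score = x @ w - 1.0
    return "bad" if score > 0 else "good"

# Example: three spots described by [elongation, curvature]
spots = np.array([[0.2, 0.1],   # roundish, regular -> lenticel
                  [2.5, 1.8],   # elongated, curved -> damage
                  [0.3, 0.2]])  # roundish, regular -> lenticel
labels = spot_classifier(spots)
verdict = apple_classifier(labels, global_attrs=np.array([0.4, 1.2]))
```

In this toy run the elongated, curved spot is flagged as damage by the first stage, and the second stage then classifies the apple as "bad"; in the actual system both stages would be trained networks rather than fixed linear units.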