Direct computation of shape cues by scale-space operations

Author(s):  
Tony Lindeberg

Abstract

This paper presents a hybrid approach between scale-space theory and deep learning, in which a deep learning architecture is constructed by coupling parameterized scale-space operations in cascade. By sharing the learnt parameters between multiple scale channels, and by using the transformation properties of the scale-space primitives under scaling transformations, the resulting network becomes provably scale covariant. By additionally performing max pooling over the multiple scale channels, or some other permutation-invariant pooling over scales, the resulting network architecture for image classification also becomes provably scale invariant. We investigate the performance of such networks on the MNIST Large Scale dataset, which contains rescaled images from the original MNIST dataset, spanning a factor of 4 in scale for the training data and a factor of 16 for the test data. It is demonstrated that the resulting approach allows for scale generalization, enabling good performance when classifying patterns at scales not spanned by the training data.
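The core mechanism described above, applying the same learnt parameters within multiple scale channels and then max pooling over scales, can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the paper's actual cascaded architecture: each channel here is a single Gaussian smoothing step (a scale-space operation) followed by a shared linear readout, where `sigmas`, `scale_channel`, and `scale_invariant_score` are hypothetical names for illustration.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # 1-D sampled Gaussian, normalized to unit sum.
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def scale_channel(signal, sigma, weights):
    # One scale channel: Gaussian smoothing at scale sigma
    # followed by a linear readout whose weights are SHARED
    # across all channels (the learnt parameters).
    radius = int(3 * sigma) + 1
    smoothed = np.convolve(signal, gaussian_kernel(sigma, radius), mode="same")
    return float(weights @ smoothed)

def scale_invariant_score(signal, weights, sigmas=(1.0, 2.0, 4.0, 8.0)):
    # Max pooling over the scale channels: a permutation-invariant
    # reduction over scales, as in the architecture sketched above.
    return max(scale_channel(signal, s, weights) for s in sigmas)

rng = np.random.default_rng(0)
signal = rng.standard_normal(64)
weights = rng.standard_normal(64)
print(scale_invariant_score(signal, weights))
```

The essential point this sketch captures is that only one weight vector exists, no matter how many scale channels are evaluated, and that the pooling over scales does not depend on the order of the channels.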

