Effective Filter Pruning Method Using Additional Downsampled Image for Global Pooling Applied CNN
Deep Convolutional Neural Networks (CNNs) show remarkable performance in many areas. However, most applications require huge computational cost and massive memory, which are hard to obtain on resource-constrained devices such as embedded systems. To reduce the computational cost while preserving the performance of a trained deep CNN, we propose a new filter pruning method that uses an additional dataset derived by downsampling the original dataset. Our method exploits the fact that information in high-resolution images is lost during downsampling, and that each trained convolutional filter reacts differently to this information loss. Based on this, the importance of each filter is evaluated by comparing the gradients obtained from the two input resolutions. We validate the superiority of our filter evaluation method using a VGG-16 model trained on the CIFAR-10 and CUB-200-2011 datasets. On CUB-200-2011, the network pruned with our method shows, on average, 2.66% higher accuracy than existing pruning methods when about 75% of the parameters are removed.
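The gradient-comparison idea in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact procedure: it assumes each filter's importance score is the L2 norm of the discrepancy between its gradients from a full-resolution pass and a downsampled pass, and that filters with the smallest scores are pruned first. The function names (`filter_importance`, `prune_mask`) and the score/pruning direction are hypothetical.

```python
import numpy as np

def filter_importance(grads_hi, grads_lo):
    """Hypothetical per-filter importance score: L2 norm of the
    difference between gradients from high- and low-resolution inputs.

    grads_hi, grads_lo: arrays of shape (num_filters, ...), each row
    holding one filter's gradient from the two backward passes.
    """
    diff = (grads_hi - grads_lo).reshape(len(grads_hi), -1)
    return np.sqrt((diff ** 2).sum(axis=1))

def prune_mask(scores, prune_ratio):
    """Boolean keep-mask: remove the prune_ratio fraction of filters
    with the lowest importance scores (assumed direction)."""
    num_remove = int(len(scores) * prune_ratio)
    order = np.argsort(scores)              # lowest scores first
    mask = np.ones(len(scores), dtype=bool)
    mask[order[:num_remove]] = False        # mark lowest-scoring filters
    return mask

# Toy example: 8 filters with 3x3x3 gradient tensors from each resolution.
rng = np.random.default_rng(0)
g_hi = rng.normal(size=(8, 3, 3, 3))
g_lo = rng.normal(size=(8, 3, 3, 3))
scores = filter_importance(g_hi, g_lo)
keep = prune_mask(scores, prune_ratio=0.5)  # keeps half of the filters
```

In a real pipeline the two gradient sets would come from backpropagating the same loss through the network once with the original images and once with their downsampled counterparts.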