AugFPN: Improving Multi-Scale Feature Learning for Object Detection

Author(s):  
Chaoxu Guo ◽  
Bin Fan ◽  
Qian Zhang ◽  
Shiming Xiang ◽  
Chunhong Pan
2021 ◽  
pp. 81-89
Author(s):  
Zhenyu Zhao ◽  
Yachao Fang ◽  
Qing Zhang ◽  
Xiaowei Chen ◽  
Meng Dai ◽  
...  

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 182105-182116
Author(s):  
Pengyu Zhang ◽  
Zhe Zhang ◽  
Yanpeng Hao ◽  
Zhiheng Zhou ◽  
Bing Luo ◽  
...  

2020 ◽  
Vol 194 ◽  
pp. 102881
Author(s):  
Michael Edwards ◽  
Xianghua Xie ◽  
Robert I. Palmer ◽  
Gary K.L. Tam ◽  
Rob Alcock ◽  
...  

2020 ◽  
Vol 16 (3) ◽  
pp. 132-145
Author(s):  
Gang Liu ◽  
Chuyi Wang

Neural network models have been widely used in the field of object detection. Region proposal methods are common in current object detection networks and have achieved good performance. Conventional region proposal methods search for objects by generating thousands of candidate boxes. Compared with these methods, the region proposal network (RPN) improves both accuracy and detection speed using only several hundred candidate boxes. However, because its feature maps contain insufficient information, the ability of RPN to detect and locate small objects is poor. To address this problem, this article proposes a novel multi-scale feature fusion method for the region proposal network, called the multi-scale region proposal network (MS-RPN), which generates feature maps better suited to region proposal. In MS-RPN, feature maps selected at multiple scales are fine-tuned individually and compressed into a uniform space. The resulting fusion feature maps, called refined fusion features (RFFs), incorporate rich detail and context information and are fed to the RPN to generate better region proposals. The proposed approach is evaluated on the PASCAL VOC 2007 and MS COCO benchmarks, where MS-RPN obtains significant improvements over comparable state-of-the-art detection models.
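The fusion step described above (fine-tuning feature maps at several scales and compressing them into a uniform space before fusion) can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the nearest-neighbour upsampling, the 1×1-convolution channel compression, and the element-wise sum are assumptions about one plausible realisation of multi-scale fusion.

```python
import numpy as np

def upsample_nearest(fmap, factor):
    """Nearest-neighbour upsampling of a (C, H, W) feature map."""
    return fmap.repeat(factor, axis=1).repeat(factor, axis=2)

def compress_channels(fmap, weight):
    """1x1 convolution: project C_in channels to C_out at every position."""
    c_in, h, w = fmap.shape
    flat = fmap.reshape(c_in, -1)              # (C_in, H*W)
    return (weight @ flat).reshape(-1, h, w)   # (C_out, H, W)

def fuse_multi_scale(features, weights, target_hw):
    """Resize each scale to target_hw, compress every map into the same
    channel space, and sum element-wise into one fused map (an 'RFF')."""
    fused = None
    for fmap, w in zip(features, weights):
        factor = target_hw // fmap.shape[1]
        proj = compress_channels(upsample_nearest(fmap, factor), w)
        fused = proj if fused is None else fused + proj
    return fused

rng = np.random.default_rng(0)
# three pyramid levels with different channel counts and resolutions
feats = [rng.standard_normal((c, s, s)) for c, s in [(64, 8), (128, 4), (256, 2)]]
# one hypothetical 1x1-conv weight per level, all projecting to 32 channels
ws = [rng.standard_normal((32, c)) * 0.01 for c in (64, 128, 256)]
rff = fuse_multi_scale(feats, ws, target_hw=8)
print(rff.shape)  # (32, 8, 8)
```

Whatever the exact fusion operator, the key property shown here is that maps of differing resolution and channel width end up in one uniform space before the RPN consumes them.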


2021 ◽  
Author(s):  
Jesús García Fernández ◽  
Siamak Mehrkanoon

2020 ◽  
Vol 100 ◽  
pp. 107149 ◽  
Author(s):  
Wenchi Ma ◽  
Yuanwei Wu ◽  
Feng Cen ◽  
Guanghui Wang

2019 ◽  
Vol 11 (7) ◽  
pp. 755 ◽  
Author(s):  
Xiaodong Zhang ◽  
Kun Zhu ◽  
Guanzhou Chen ◽  
Xiaoliang Tan ◽  
Lifei Zhang ◽  
...  

Object detection on very-high-resolution (VHR) remote sensing imagery has attracted considerable attention in the field of automatic image interpretation. Region-based convolutional neural networks (CNNs) have been widely adopted in this domain: they first generate candidate regions and then accurately classify and locate the objects within them. However, the very large images, complex backgrounds, and uneven size and quantity distribution of training samples make detection challenging, especially for small and dense objects. To solve these problems, this paper proposes an effective region-based VHR remote sensing imagery object detection framework named Double Multi-scale Feature Pyramid Network (DM-FPN), which utilizes the inherent multi-scale pyramidal features and simultaneously combines strong-semantic, low-resolution features with weak-semantic, high-resolution features. DM-FPN consists of a multi-scale region proposal network and a multi-scale object detection network; the two modules share convolutional layers and can be trained end-to-end. Several multi-scale training strategies are proposed to increase the diversity of the training data and overcome the size restrictions of the input images, together with multi-scale inference and adaptive categorical non-maximum suppression (ACNMS) strategies that improve detection performance, especially for small and dense objects. Extensive experiments and comprehensive evaluations on the large-scale DOTA dataset demonstrate the effectiveness of the proposed framework, which achieves a mean average precision (mAP) of 0.7927 on the validation set and a best mAP of 0.793 on the test set.

