End-to-end Automatic Ship Detection and Recognition in High-Resolution Gaofen-3 Spaceborne SAR Images

Author(s):  
Xiyue Hou ◽  
Wei Ao ◽  
Feng Xu
2020 ◽  
Vol 17 (2) ◽  
pp. 247-251 ◽  
Author(s):  
Huiping Lin ◽  
Hang Chen ◽  
Kan Jin ◽  
Liang Zeng ◽  
Jian Yang

2014 ◽  
Vol 1044-1045 ◽  
pp. 1040-1044
Author(s):  
Qing Ping Wang ◽  
Hong Zhu ◽  
Wei Wei Wu ◽  
Chang Zhu ◽  
Nai Chang Yuan

An improved algorithm for ship detection in high-resolution synthetic aperture radar (SAR) images is proposed in this paper. The algorithm first applies an image pre-processing step to suppress speckle noise. Then, ship regions of interest (ROIs) are extracted with the MSER (Maximally Stable Extremal Region) method, which provides a preliminary set of ship candidates. Finally, an improved CFAR (Constant False Alarm Rate) detector is applied to the candidate regions for accurate detection, accelerating the whole process and reducing false alarms. Experimental results show that the method achieves effective ship detection in high-resolution SAR images while also speeding up the detection process, which favours practical implementation.
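
As an illustration of the pipeline described above, the following Python sketch combines OpenCV's MSER region extraction with a simple two-parameter CFAR decision on each candidate. It is a minimal, hedged example: the function name, window sizes, and the multiplier k are illustrative assumptions, not the authors' exact detector.

import cv2
import numpy as np

def detect_ship_candidates(sar_img, k=3.0, guard=8, bg=24):
    """Illustrative MSER + two-parameter CFAR pipeline (a sketch, not the paper's detector).

    sar_img : 2-D uint8 image (despeckled SAR intensity scaled to 0-255)
    k       : CFAR multiplier on the background standard deviation
    guard   : half-width of the guard area around a candidate (pixels)
    bg      : half-width of the background window (pixels)
    """
    # 1) Preliminary ROI extraction with MSER
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(sar_img)

    detections = []
    for pts in regions:
        x, y, w, h = cv2.boundingRect(pts)
        cx, cy = x + w // 2, y + h // 2

        # 2) Background ring: outer window minus a guard window around the candidate
        y0, y1 = max(cy - bg, 0), min(cy + bg, sar_img.shape[0])
        x0, x1 = max(cx - bg, 0), min(cx + bg, sar_img.shape[1])
        outer = sar_img[y0:y1, x0:x1].astype(np.float64)
        mask = np.ones_like(outer, dtype=bool)
        gy0, gy1 = max(cy - guard, y0) - y0, min(cy + guard, y1) - y0
        gx0, gx1 = max(cx - guard, x0) - x0, min(cx + guard, x1) - x0
        mask[gy0:gy1, gx0:gx1] = False
        clutter = outer[mask]
        if clutter.size == 0:
            continue

        # 3) Two-parameter CFAR decision: candidate mean must exceed mu + k*sigma of the clutter
        threshold = clutter.mean() + k * clutter.std()
        if sar_img[y:y + h, x:x + w].mean() > threshold:
            detections.append((x, y, w, h))
    return detections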


2021 ◽  
Vol 2021 ◽  
pp. 1-19
Author(s):  
Yao Chen ◽  
Tao Duan ◽  
Changyuan Wang ◽  
Yuanyuan Zhang ◽  
Mo Huang

Ship detection in synthetic aperture radar (SAR) imagery has many valuable applications in both civil and military fields and has received extraordinary attention in recent years. Traditional detection methods are insensitive to multiscale ships and usually time-consuming, resulting in low detection accuracy and limited suitability for real-time processing. To balance accuracy and speed, an end-to-end ship detection method for complex inshore and offshore scenes based on deep convolutional neural networks (CNNs) is proposed in this paper. First, the SAR images are divided into grids, and anchor boxes are predefined on the responsible grids for dense ship prediction. Then, Darknet-53 with residual units is adopted as the backbone to extract features, and a top-down pyramid structure is added for multiscale feature fusion by concatenation. In this way, abundant hierarchical features containing both spatial and semantic information are extracted. Meanwhile, strategies such as soft non-maximum suppression (Soft-NMS), mix-up and mosaic data augmentation, multiscale training, and hybrid optimization are used for performance enhancement. In addition, the model is trained from scratch to avoid the learning-objective bias introduced by pretraining. The proposed one-stage method performs end-to-end inference with a single network, so its concise paradigm keeps the detection speed high. Extensive experiments are performed on the public SAR ship detection dataset (SSDD), and the results show that the method detects both inshore and offshore ships with higher accuracy than other mainstream methods, yielding an average accuracy of 95.52% and a detection speed of about 72 frames per second (FPS). Real Sentinel-1 and Gaofen-3 data are used for verification, and the detection results further demonstrate the effectiveness and robustness of the method.
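
Of the strategies listed above, Soft-NMS is well defined enough to sketch. The following Python/NumPy function implements Gaussian Soft-NMS as it is commonly described in the literature; the parameter names sigma and score_thresh are our own illustrative choices, and this is not the authors' exact implementation.

import numpy as np

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay the scores of overlapping boxes instead of removing them.

    boxes  : (N, 4) array of [x1, y1, x2, y2]
    scores : (N,) detection confidences
    Returns the indices of the boxes kept, in the order they were selected.
    """
    boxes = boxes.astype(np.float64)
    scores = scores.astype(np.float64).copy()
    idxs = np.arange(len(scores))
    keep = []

    while len(idxs) > 0:
        # Pick the highest-scoring remaining box
        top = np.argmax(scores[idxs])
        best = idxs[top]
        keep.append(best)
        idxs = np.delete(idxs, top)
        if len(idxs) == 0:
            break

        # IoU between the selected box and the remaining boxes
        x1 = np.maximum(boxes[best, 0], boxes[idxs, 0])
        y1 = np.maximum(boxes[best, 1], boxes[idxs, 1])
        x2 = np.minimum(boxes[best, 2], boxes[idxs, 2])
        y2 = np.minimum(boxes[best, 3], boxes[idxs, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_best = (boxes[best, 2] - boxes[best, 0]) * (boxes[best, 3] - boxes[best, 1])
        area_rest = (boxes[idxs, 2] - boxes[idxs, 0]) * (boxes[idxs, 3] - boxes[idxs, 1])
        iou = inter / (area_best + area_rest - inter)

        # Gaussian decay: heavily overlapping boxes lose confidence but are not discarded
        scores[idxs] *= np.exp(-(iou ** 2) / sigma)
        idxs = idxs[scores[idxs] > score_thresh]

    return keep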


Author(s):  
Ruochen Wu

Synthetic Aperture Radar (SAR) is an active form of microwave remote sensing. With its microwave imaging system, it can monitor land and the global ocean around the clock in any weather conditions. Target detection in SAR images is one of the main requirements of radar image interpretation. In this paper, an improved two-parameter CFAR algorithm based on the Rayleigh distribution and morphological processing is proposed for ship detection and recognition in high-resolution SAR images. Simulation experiments provide a comprehensive study of the two algorithms for target detection in high-resolution SAR images. Finally, the ship detection results are compared and analyzed, and detection performance is evaluated under the Rayleigh distribution model.
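
To make the Rayleigh-based CFAR idea concrete, here is a minimal Python sketch of a sliding-window CFAR threshold derived from the Rayleigh CDF, followed by morphological cleanup. The window size, false-alarm rate, and minimum-area parameter are illustrative assumptions, and the code leans on SciPy's ndimage module; it is not the paper's exact two-parameter detector.

import numpy as np
from scipy import ndimage

def rayleigh_cfar(amplitude, pfa=1e-4, win=51, min_area=20):
    """Illustrative sliding-window CFAR under a Rayleigh clutter model with morphological cleanup.

    amplitude : 2-D float array of SAR amplitude values
    pfa       : desired probability of false alarm
    win       : side length of the local background estimation window (pixels)
    min_area  : connected components smaller than this are discarded as residual speckle
    """
    # Local Rayleigh scale estimate: sigma^2 = E[x^2] / 2 (maximum-likelihood estimator)
    mean_sq = ndimage.uniform_filter(amplitude ** 2, size=win)
    sigma = np.sqrt(mean_sq / 2.0)

    # CFAR threshold from the Rayleigh CDF: P(X > T) = exp(-T^2 / (2 sigma^2)) = pfa
    threshold = sigma * np.sqrt(-2.0 * np.log(pfa))
    raw_mask = amplitude > threshold

    # Morphological processing: close small gaps, then drop tiny isolated detections
    closed = ndimage.binary_closing(raw_mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(closed)
    sizes = ndimage.sum(closed, labels, index=np.arange(1, n + 1))
    keep_labels = np.nonzero(sizes >= min_area)[0] + 1
    return np.isin(labels, keep_labels)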

