Three-dimensional geometrical feature estimation for ship classification through SAR images

Author(s): Chongwen Duan, Weidong Hu, Xiaoyong Du
2014, Vol 2 (12), pp. 7383-7408
Author(s): W. Liu, F. Yamazaki, M. Matsuoka, T. Nonaka, T. Sasagawa

Abstract. The Tohoku-Oki earthquake on 11 March 2011 caused significant, widespread crustal movements. In a previous study, we proposed a method for capturing two-dimensional (2-D) surface displacements from a pair of pre- and post-event TerraSAR-X (TSX) intensity images. However, it is difficult to detect three-dimensional (3-D) displacements from a single pair of TSX images. In this study, three pairs of pre- and post-event TSX images taken on different paths were used to estimate 3-D crustal movements. The relationship between the actual 3-D displacements and the corresponding 2-D movements observed in the SAR images was derived from the observation model of a SAR sensor. The 3-D movements were then calculated from the three sets of 2-D movements detected from image pairs acquired within a short time period. Compared with GPS observations, the proposed method was found to be capable of detecting the 3-D crustal movements with sub-pixel accuracy.
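
The core of this approach is that each image pair supplies two scalar projections of the same 3-D ground displacement, one along the slant-range (line-of-sight) direction and one along the azimuth (flight) direction, so three pairs acquired on different paths give six equations for the three unknown displacement components. The following is a minimal sketch of that least-squares step, not the authors' implementation; the angle conventions, sign choices, function names and numerical values are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of recovering a 3-D displacement
# vector from three pairs of 2-D SAR pixel-offset measurements via least squares.
# Angle and sign conventions below are illustrative assumptions.
import numpy as np

def los_unit_vector(incidence_deg, heading_deg):
    """Line-of-sight (slant-range) unit vector in (east, north, up)."""
    theta = np.radians(incidence_deg)   # incidence angle
    alpha = np.radians(heading_deg)     # platform heading measured from north
    return np.array([-np.sin(theta) * np.cos(alpha),
                      np.sin(theta) * np.sin(alpha),
                     -np.cos(theta)])

def azimuth_unit_vector(heading_deg):
    """Along-track (azimuth) unit vector in (east, north, up)."""
    alpha = np.radians(heading_deg)
    return np.array([np.sin(alpha), np.cos(alpha), 0.0])

def solve_3d_displacement(observations):
    """observations: (incidence_deg, heading_deg, d_range, d_azimuth) per image pair."""
    A, b = [], []
    for inc, head, d_rg, d_az in observations:
        A.append(los_unit_vector(inc, head)); b.append(d_rg)   # slant-range projection
        A.append(azimuth_unit_vector(head));  b.append(d_az)   # azimuth projection
    # Six projection equations, three unknowns -> least-squares solution
    d_enu, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return d_enu   # (east, north, up) displacement

# Hypothetical offsets (metres) from three TSX pairs on different paths
pairs = [(37.3, 350.0, -1.2, 3.8), (41.0, 190.0, 0.9, 3.6), (44.5, 352.0, -1.4, 3.9)]
print(solve_3d_displacement(pairs))
```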


Sensors, 2018, Vol 18 (9), pp. 2929
Author(s): Yuanyuan Wang, Chao Wang, Hong Zhang

With the capability to automatically learn discriminative features, deep learning has achieved great success on natural images but has rarely been explored for ship classification in high-resolution SAR images, because small datasets create a training bottleneck. In this paper, convolutional neural networks (CNNs) are applied to ship classification using small datasets of SAR images. First, ship chips are constructed from high-resolution SAR images and split into training and validation datasets. Second, a ship classification model is built on very deep convolutional networks (VGG). The VGG network is pretrained on ImageNet and then fine-tuned to train our model. Six scenes of COSMO-SkyMed images are used to evaluate the proposed model in terms of classification accuracy. The experimental results show that (1) the proposed ship classification model trained by fine-tuning achieves more than 95% average classification accuracy, even under five-fold cross-validation; and (2) compared with other models, the VGG16-based ship classification model achieves classification accuracies at least 2% higher. These results demonstrate the effectiveness of the proposed method.
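
A minimal sketch of the transfer-learning step described above, i.e. loading an ImageNet-pretrained VGG16, replacing its final classification layer, and fine-tuning it on SAR ship chips, is given below in PyTorch. The dataset path, class count, and hyper-parameters are illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch (not the paper's released code) of fine-tuning an
# ImageNet-pretrained VGG16 on a small set of SAR ship chips (PyTorch,
# torchvision >= 0.13 assumed).  Paths, class count, and hyper-parameters
# are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 3   # e.g. cargo / tanker / other -- assumed, not from the paper

# Replicate single-channel SAR chips to 3 channels to match VGG16's input
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("ship_chips/train", transform=preprocess)  # hypothetical path
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Load VGG16 pretrained on ImageNet and replace only the final classifier layer
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_CLASSES)

# Fine-tune the whole network with a small learning rate
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```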


2020, Vol 57 (16), pp. 161022
Author(s): Ren Yongmei (任永梅), Yang Jie (杨杰), Guo Zhiqiang (郭志强), Chen Yilei (陈奕蕾)

2018, Vol 47 (4), pp. 551-562
Author(s): Foroogh Sharifzadeh, Gholamreza Akbarizadeh, Yousef Seifi Kavian

Sensors, 2019, Vol 19 (23), pp. 5062
Author(s): Liu, Xiao

Traditional indirect geolocation methods for determining the geolocation of a pixel in spaceborne synthetic aperture radar (SAR) images incur great computational cost. In this paper, a fast, three-dimensional, indirect geolocation method that requires no ground control points (GCPs) is presented. First, the Range-Doppler (RD) geolocation model, with all equations expressed in the Earth-centered rotating (ECR) coordinate system, is introduced. Using an iterative analytical geolocation method (IAGM), the locations of the corner points of a quadrangular SAR image on the Earth's surface are obtained. A three-dimensional (3D) grid is then built from the digital surface model (DSM) data within this quadrangle. Through a proportional relationship, the azimuth time of every pixel in the 3D grid can be estimated, which is the key to reducing the time spent computing the Doppler centroid. Once the precise azimuth time is acquired, the range location follows easily. The results show that the proposed method is about 12 times faster than the traditional method while maintaining geolocation accuracy, so the spaceborne SAR image can be geolocated precisely to the Earth's surface based on high-resolution DSM data.
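
For context, classical indirect geolocation solves, for each ground point taken from the DSM, a Range-Doppler system in which the azimuth time is found iteratively and the slant range then follows from the sensor-to-point distance; the speed-up described above replaces this per-pixel iteration with a proportional interpolation of azimuth time over a coarse 3D grid. The sketch below illustrates only the per-point iterative step; the orbit model, Doppler centroid, and constants are toy assumptions, not the authors' parameters.

```python
# A minimal sketch of indirect Range-Doppler geolocation for one ground point:
# find the azimuth time at which the Doppler equation is satisfied, then take
# the slant range as |S(t) - P|.  The orbit model, Doppler centroid, and
# constants below are illustrative assumptions.
import numpy as np

WAVELENGTH = 0.031          # X-band wavelength in metres (assumed)
F_DC = 0.0                  # Doppler centroid; zero-Doppler geometry assumed

def orbit_state(t):
    """Toy circular orbit in an Earth-centred rotating frame (illustration only)."""
    r, omega = 7.07e6, 2 * np.pi / 5900.0
    pos = np.array([r * np.cos(omega * t), r * np.sin(omega * t), 0.0])
    vel = np.array([-r * omega * np.sin(omega * t), r * omega * np.cos(omega * t), 0.0])
    return pos, vel

def doppler_residual(t, ground_point):
    """Residual of the Doppler equation f_dc = -2/(lambda*R) * V.(S - P)."""
    pos, vel = orbit_state(t)
    los = pos - ground_point
    r = np.linalg.norm(los)
    return F_DC + 2.0 / (WAVELENGTH * r) * np.dot(vel, los)

def geolocate_indirect(ground_point, t0=0.0, steps=20, dt=1e-3):
    """Newton iteration on azimuth time; the slant range then follows directly."""
    t = t0
    for _ in range(steps):
        f = doppler_residual(t, ground_point)
        dfdt = (doppler_residual(t + dt, ground_point) - f) / dt   # numerical derivative
        t -= f / dfdt
    pos, _ = orbit_state(t)
    slant_range = np.linalg.norm(pos - ground_point)
    return t, slant_range

# Hypothetical DSM point in ECR coordinates (metres)
t_az, rng = geolocate_indirect(np.array([6.37e6, 2.0e5, 1.0e5]))
print(t_az, rng)
```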

