Classification of remote sensing image data fusion considering spatial information

Author(s):  
Zhaofu Wu ◽  
Fei Gao
2021 ◽  
Vol 336 ◽  
pp. 06030
Author(s):  
Fengbing Jiang ◽  
Fang Li ◽  
Guoliang Yang

Convolutional neural networks for remote sensing image scene classification consume considerable time and storage space to train, test and save the model. In this paper, an elasticity variable is first defined for each convolution-layer filter; by combining filter elasticity with the batch normalization scaling factor, a compound pruning method for convolutional neural networks is proposed. Only one hyperparameter, the pruning rate, needs to be adjusted during training. In the training process, the performance of the model can be further improved by means of transfer learning. Algorithm tests are carried out on the NWPU-RESISC45 remote sensing image dataset to verify the effectiveness of the proposed method. The experimental results show that the proposed method not only effectively reduces the number of model parameters and the amount of computation, but also maintains the accuracy of the algorithm in remote sensing image classification.
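As a rough illustration of the compound criterion described in this abstract, the sketch below combines a filter-level magnitude term (here approximated by each filter's L1 weight norm, standing in for the paper's elasticity variable, whose exact definition the abstract does not give) with the batch normalization scaling factor, then prunes the lowest-scoring fraction of filters according to the single pruning-rate hyperparameter. The function name and the scoring formula are illustrative assumptions, not the authors' published method.

```python
def compound_prune(filter_l1_norms, bn_gammas, prune_rate):
    """Score each filter as (L1 norm of its weights) * (its BN scaling
    factor gamma) and return the indices of filters to keep and to prune.

    filter_l1_norms -- list of floats, one per convolution filter
    bn_gammas       -- list of floats, BN gamma for each filter's channel
    prune_rate      -- fraction of filters to remove (0.0 .. 1.0)
    """
    assert len(filter_l1_norms) == len(bn_gammas)
    # Compound importance score: filters that are both small in weight
    # magnitude and scaled down by BN contribute least to the output.
    scores = [w * g for w, g in zip(filter_l1_norms, bn_gammas)]
    n_prune = int(len(scores) * prune_rate)
    # Rank filters by score, ascending, and prune the lowest n_prune.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    pruned = set(order[:n_prune])
    kept = [i for i in range(len(scores)) if i not in pruned]
    return kept, sorted(pruned)

# Example: a layer with 4 filters, pruning rate 0.5.
kept, pruned = compound_prune([0.9, 0.1, 0.5, 0.05], [1.0, 0.2, 0.8, 0.1], 0.5)
# kept == [0, 2], pruned == [1, 3]
```

In an actual network the kept indices would then be used to slice the convolution weights and the corresponding BN parameters before fine-tuning the smaller model.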


2021 ◽  
Vol 13 (4) ◽  
pp. 1917
Author(s):  
Alma Elizabeth Thuestad ◽  
Ole Risbøl ◽  
Jan Ingolf Kleppe ◽  
Stine Barlindhaug ◽  
Elin Rose Myrvoll

What can remote sensing contribute to archaeological surveying in subarctic and arctic landscapes? The pros and cons of remote sensing data vary, as do areas of utilization and methodological approaches. We assessed the applicability of remote sensing for archaeological surveying of northern landscapes using airborne laser scanning (LiDAR) and satellite and aerial images to map archaeological features as a basis for (a) assessing the pros and cons of the different approaches and (b) assessing the potential detection rate of remote sensing. Interpretation of the images and a LiDAR-based bare-earth digital terrain model (DTM) was based on visual analyses aided by processing and visualization techniques. A total of 368 features were identified in the aerial images, 437 in the satellite images and 1186 in the DTM. LiDAR yielded the best results, especially for hunting pits. Image data proved suitable for dwellings and settlement sites. Feature characteristics proved a key factor for detectability, in both LiDAR and image data. This study has shown that LiDAR and remote sensing image data are highly applicable for archaeological surveying in northern landscapes, and that a multi-sensor approach contributes to high detection rates. Our results have improved the inventory of archaeological sites in a non-destructive and minimally invasive manner.


2021 ◽  
Vol 13 (4) ◽  
pp. 747
Author(s):  
Yanghua Di ◽  
Zhiguo Jiang ◽  
Haopeng Zhang

Fine-grained visual categorization (FGVC) is an important and challenging problem due to large intra-class differences and small inter-class differences caused by deformation, illumination, viewing angles, etc. Although major advances have been achieved on natural images in the past few years thanks to the release of popular datasets such as CUB-200-2011, Stanford Cars and the Aircraft dataset, fine-grained ship classification in remote sensing images has rarely been studied because of the relative scarcity of publicly available datasets. In this paper, we investigate a large amount of remote sensing imagery of ships at sea and determine the 42 most common categories for fine-grained visual categorization. Based on our previous DSCR dataset, a dataset for ship classification in remote sensing images, we collect additional remote sensing images containing warships and civilian ships of various scales from Google Earth and other popular remote sensing image datasets, including DOTA, HRSC2016 and NWPU VHR-10. We call our dataset FGSCR-42, meaning a dataset for Fine-Grained Ship Classification in Remote sensing images with 42 categories. The whole FGSCR-42 dataset contains 9320 images of the most common types of ships. We evaluate popular object classification algorithms and fine-grained visual categorization algorithms to build a benchmark. Our FGSCR-42 dataset is publicly available at our webpages.
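To make the benchmarking step concrete, the fragment below computes two metrics commonly reported for fine-grained classification benchmarks of this kind: overall (top-1) accuracy and mean per-class accuracy, the latter being informative when the 42 ship categories are imbalanced. The function name and the choice of metrics are illustrative assumptions, not the authors' published evaluation code.

```python
from collections import defaultdict

def benchmark_metrics(predictions, labels):
    """Return (overall top-1 accuracy, mean per-class accuracy) for one
    benchmark run, given predicted and ground-truth class ids."""
    assert len(predictions) == len(labels) and len(labels) > 0
    correct = defaultdict(int)  # per-class correct counts
    total = defaultdict(int)    # per-class sample counts
    for pred, lab in zip(predictions, labels):
        total[lab] += 1
        if pred == lab:
            correct[lab] += 1
    overall = sum(correct.values()) / len(labels)
    per_class = [correct[c] / total[c] for c in total]
    return overall, sum(per_class) / len(per_class)

# Example with three hypothetical ship classes:
overall, mean_pc = benchmark_metrics([0, 1, 1, 2], [0, 1, 2, 2])
# overall == 0.75; mean per-class accuracy == (1.0 + 1.0 + 0.5) / 3
```

Mean per-class accuracy weights every category equally, so a classifier that ignores rare ship types is penalized even when its overall accuracy looks high.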

