Thin cloud removal in optical remote sensing images based on generative adversarial networks and physical model of cloud distortion
2020 · Vol 166 · pp. 373-389
Author(s): Jun Li, Zhaocong Wu, Zhongwen Hu, Jiaqi Zhang, Mingliang Li, ...
2014 · Vol 96 · pp. 224-235
Author(s): Huanfeng Shen, Huifang Li, Yan Qian, Liangpei Zhang, Qiangqiang Yuan

2020 · Vol 12 (24) · pp. 4162
Author(s): Anna Hu, Zhong Xie, Yongyang Xu, Mingyu Xie, Liang Wu, ...

A major limitation of optical remote-sensing imagery is degradation by bad weather, such as haze, which significantly reduces the accuracy of satellite image interpretation. To address this problem, this paper proposes a novel unsupervised method for removing haze from high-resolution optical remote-sensing images. The proposed method, based on cycle generative adversarial networks, is called the edge-sharpening cycle-consistent adversarial network (ES-CCGAN). Unlike existing methods, this approach requires no prior information: training is unsupervised, which eases preparation of the training data set. To strengthen the extraction of ground-object information, the generative network replaces a residual neural network (ResNet) with a dense convolutional network (DenseNet). An edge-sharpening loss function is designed for the deep-learning model to recover clear ground-object edges and obtain more detailed information from hazy images. For the high-frequency information extraction model, this study re-trained the Visual Geometry Group (VGG) network on remote-sensing images. Experimental results show that the proposed method successfully recovers various kinds of scenes from hazy images with excellent color consistency. Moreover, its ability to recover clear edges and rich texture information makes it superior to existing methods.
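The abstract does not give the exact formulation of the edge-sharpening loss, but one plausible minimal sketch is to compare edge maps of the generated and reference images, so that blurred or displaced ground-object edges are penalized. The NumPy sketch below (with Sobel filtering and an L1 penalty, both assumptions rather than the paper's confirmed design) illustrates the idea:

```python
import numpy as np

def sobel_edges(img):
    """Approximate edge-magnitude map of a 2-D grayscale image via Sobel filters."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")  # replicate borders so output size matches input
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()  # horizontal gradient
            gy[i, j] = (patch * ky).sum()  # vertical gradient
    return np.hypot(gx, gy)  # gradient magnitude = edge strength

def edge_sharpening_loss(generated, target):
    """Mean absolute difference between the edge maps of two images.

    A hypothetical stand-in for ES-CCGAN's edge-sharpening term: it is zero
    when edges match exactly and grows as the generated image's edges blur
    or drift away from the reference.
    """
    return np.abs(sobel_edges(generated) - sobel_edges(target)).mean()
```

For example, the loss is zero for identical images and strictly positive when the generated image is a blurred (edge-free) version of the target, which is the behavior a dehazing generator would be trained to minimize.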

