A Method of Visibility Detection Based on Transfer Learning
Abstract: Atmospheric visibility is an important element of meteorological observation. With existing methods, it is difficult to define image features that reflect visibility accurately and comprehensively. This paper proposes a visibility detection method based on transfer learning with deep convolutional neural networks (DCNN), which addresses the lack of sufficiently large labeled visibility datasets. In the proposed method, each image is first divided into several subregions, which are encoded to extract visual features using a pretrained no-reference image quality assessment neural network. A support vector regression (SVR) model is then trained to map the extracted features to visibility. The fusion weight of each subregion is evaluated according to an error analysis of the regression model. Finally, the neural network is fine-tuned with the current detection results in turn, so that it better fits the visibility detection task. Experimental results demonstrate that the detection accuracy of the proposed method exceeds 90% and satisfies the requirements of daily observation applications.
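The per-subregion regression and error-weighted fusion described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature vectors are synthetic stand-ins for the outputs of the pretrained image quality assessment network, the labels are simulated, and the inverse-error weighting scheme is one plausible reading of "fusion weight evaluated according to the error analysis".

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in: each image yields one feature vector per subregion,
# as the pretrained IQA network would produce.
n_images, n_subregions, n_features = 200, 4, 16
features = rng.normal(size=(n_images, n_subregions, n_features))

# Simulated visibility labels (arbitrary units), loosely tied to the features.
w_true = rng.normal(size=n_features)
visibility = features.mean(axis=1) @ w_true + rng.normal(scale=0.1, size=n_images)

# One SVR per subregion, mapping that subregion's features to visibility.
models = [
    SVR(kernel="rbf", C=10.0).fit(features[:, r, :], visibility)
    for r in range(n_subregions)
]

# Assumed weighting: each subregion is weighted inversely to its
# mean absolute regression error, then the weights are normalized.
errors = np.array([
    np.mean(np.abs(m.predict(features[:, r, :]) - visibility))
    for r, m in enumerate(models)
])
weights = (1.0 / errors) / np.sum(1.0 / errors)

# Fused estimate: weighted sum of the per-subregion predictions.
preds = np.stack(
    [m.predict(features[:, r, :]) for r, m in enumerate(models)], axis=1
)
fused = preds @ weights
```

In a full system the fused estimates would then be fed back to fine-tune the feature network, which the snippet above does not attempt.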