Multi-scale deep adversarial network for particle detection in liquid crystal module

2021, pp. 108326
Author(s): Yuanyuan Wang, Ling Ma, Lihua Jian, Chengshuai Fan, Zhipeng Zhang, ...
2020, Vol. 40 (18), pp. 1810001
Author(s): Jia Ruiming, Li Tong, Liu Shengjie, Cui Jiali, Yuan Fei

Electronics, 2019, Vol. 8 (11), pp. 1370
Author(s): Tingzhu Sun, Weidong Fang, Wei Chen, Yanxin Yao, Fangming Bi, ...

Although image inpainting based on the generative adversarial network (GAN) has made great breakthroughs in accuracy and speed in recent years, such methods can only process low-resolution images because of memory limitations and training difficulty. For high-resolution images, the inpainted regions become blurred and unpleasant boundaries become visible. Building on current advanced image generation networks, we propose a novel high-resolution image inpainting method based on a multi-scale neural network. The method is a two-stage network comprising content reconstruction and texture detail restoration. After obtaining a visually believable coarse texture, we further restore the finer details to produce a smoother, clearer, and more coherent inpainting result. We then propose a special application scenario for image inpainting: deleting redundant pedestrians in an image while ensuring realistic background restoration. This involves detecting pedestrians, identifying the redundant ones, and filling them in with seemingly correct content. To improve inpainting accuracy in this scenario, we propose a new mask dataset, which collects the person instances in the COCO dataset as masks. Finally, we evaluate our method on the COCO and VOC datasets. The experimental results show that our method produces clearer and more coherent inpainting results, especially for high-resolution images, and that the proposed mask dataset yields better inpainting results in the special application scenario.
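The mask dataset described above collects person shapes from COCO as inpainting masks. A minimal sketch of that idea, assuming COCO-style polygon segmentations: rasterize a person polygon into a binary mask and blank out the masked pixels so a generator can fill them in. All function names here are illustrative, not from the paper.

```python
# Hedged sketch: turn a COCO-style "person" polygon into a binary inpainting
# mask. COCO stores segmentations as polygons in pixel coordinates; here a
# polygon is a list of (x, y) vertices. Names are hypothetical.

def point_in_polygon(x, y, poly):
    """Even-odd rule test for whether point (x, y) lies inside poly."""
    inside = False
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        # Edge crosses the horizontal line through y?
        if (y0 > y) != (y1 > y):
            x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < x_cross:
                inside = not inside
    return inside

def polygon_to_mask(poly, height, width):
    """Binary mask: 1 inside the person region (to be inpainted), 0 elsewhere.
    Pixel (r, c) is tested at its center (c + 0.5, r + 0.5)."""
    return [[1 if point_in_polygon(c + 0.5, r + 0.5, poly) else 0
             for c in range(width)]
            for r in range(height)]

def apply_mask(image, mask, fill=0):
    """Blank out masked pixels; the inpainting network would then fill them."""
    return [[fill if mask[r][c] else image[r][c]
             for c in range(len(image[0]))]
            for r in range(len(image))]
```

In practice the masks would be rasterized with an image library at full resolution; this pure-Python version just makes the dataset-construction step concrete.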

