Infrared and Visible Image Fusion Combining Pulse-Coupled Neural Network and Guided Filtering

2019 ◽ Vol 39 (11) ◽ pp. 1110003
Author(s): Zhou Xiaoling ◽ Jiang Zetao

2020 ◽ Vol 57 (20) ◽ pp. 201007
Author(s): Shen Yu ◽ Chen Xiaopeng ◽ Yuan Yubin ◽ Wang Lin ◽ Zhang Hongguo

IEEE Access ◽ 2019 ◽ Vol 7 ◽ pp. 98290-98305
Author(s): Li Yin ◽ Mingyao Zheng ◽ Guanqiu Qi ◽ Zhiqin Zhu ◽ Fu Jin ◽ ...

Sensors ◽ 2020 ◽ Vol 20 (10) ◽ pp. 2764
Author(s): Xiaojun Li ◽ Haowen Yan ◽ Weiying Xie ◽ Lu Kang ◽ Yi Tian

Pulse-coupled neural networks (PCNNs) and their modified models are well suited to multi-focus and medical image fusion tasks. Unfortunately, PCNNs are difficult to apply directly to multispectral image fusion, especially when spectral fidelity must be preserved. A key problem is that most PCNN-based fusion methods focus on a selection mechanism, either in the spatial domain or in the transform domain, rather than on a detail-injection mechanism, which is of utmost importance in multispectral image fusion. Thus, a novel pansharpening PCNN model for multispectral image fusion is proposed. The new model is designed to achieve spectral fidelity in terms of human visual perception. Experimental results on several kinds of datasets show the suitability of the proposed model for pansharpening.
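As background for the selection mechanism the abstract refers to, a minimal simplified PCNN can be sketched as follows. Each pixel is a neuron whose internal activity is its intensity modulated by the firings of its neighbours; a neuron spikes when activity exceeds a dynamic threshold that decays over time and jumps after each spike. The accumulated firing map then serves as an activity measure for pixel selection. This is an illustrative sketch only: the function name `pcnn_fire_map` and all parameter values are assumptions, not the model proposed in the paper.

```python
import math

def pcnn_fire_map(img, iters=10, beta=0.2, alpha_theta=0.2, v_theta=20.0):
    """Simplified PCNN: return per-pixel spike counts after `iters` steps.

    img: 2-D list of intensities (roughly in [0, 1]).
    beta: linking strength; alpha_theta: threshold decay rate;
    v_theta: threshold jump after a spike. All values are illustrative.
    """
    h, w = len(img), len(img[0])
    Y = [[0.0] * w for _ in range(h)]       # firing map from the previous step
    theta = [[1.0] * w for _ in range(h)]   # dynamic per-pixel thresholds
    fires = [[0] * w for _ in range(h)]     # accumulated spike counts

    for _ in range(iters):
        newY = [[0.0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                # Linking input: sum of 8-neighbour firings from the last step.
                L = sum(Y[i + di][j + dj]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if (di or dj) and 0 <= i + di < h and 0 <= j + dj < w)
                # Internal activity: feeding (the pixel itself) modulated by linking.
                U = img[i][j] * (1.0 + beta * L)
                if U > theta[i][j]:
                    newY[i][j] = 1.0
                    fires[i][j] += 1
                # Threshold decays exponentially, then jumps after a spike.
                theta[i][j] = (math.exp(-alpha_theta) * theta[i][j]
                               + v_theta * newY[i][j])
        Y = newY
    return fires
```

Brighter pixels exceed the decaying threshold sooner and therefore fire earlier and more often, so comparing the firing maps of two source images pixel-by-pixel gives the kind of selection rule the abstract describes; the proposed model's point is that such selection alone is insufficient for pansharpening without a detail-injection step.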
