Multi-scale saliency measure and orthogonal space for visible and infrared image fusion

2021 ◽ Vol 118 ◽ pp. 103916
Author(s): Yaochen Liu ◽ Lili Dong ◽ Wei Ren ◽ Wenhai Xu
2016 ◽ Vol 66 (3) ◽ pp. 266
Author(s): Sandhya Kumari Teku ◽ S. Koteswara Rao ◽ I. Santhi Prabha

The objective of multi-modal image fusion is to combine complementary information obtained from multiple modalities into a single representation with increased reliability and interpretability. The two modalities considered for fusion are images from low-light visible cameras, which capture fine scene details, and images from infrared cameras, which provide high-contrast details. In this paper, low-light images with low target contrast are enhanced using the phenomenon of stochastic resonance prior to fusion. Entropy is used as a measure to iteratively tune the coefficients via the bistable system parameters. The combined advantages of a multi-scale decomposition approach and principal component analysis are exploited to fuse the enhanced low-light visible and infrared images. Experiments were carried out on different image datasets, and an analysis of the proposed method is presented.
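Since no code accompanies the abstract, the following is a minimal Python sketch of the two stages it describes: a discretized bistable stochastic-resonance iteration tuned by entropy, followed by PCA-weighted fusion. The bistable parameters (a, b, dt, n_iter), the function names, and the omission of the multi-scale decomposition are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bistable_sr_enhance(img, a=2.0, b=1.0, dt=0.01, n_iter=200):
    """Enhance a low-light image by iterating a discretized bistable
    stochastic-resonance update x += dt * (a*x - b*x**3 + s) and keeping
    the iterate with the highest Shannon entropy.  Parameter values are
    illustrative, not taken from the paper."""
    s = img.astype(np.float64) / 255.0 - 0.5        # zero-mean input signal
    x = s.copy()
    best, best_h = img.astype(np.float64), -np.inf
    for _ in range(n_iter):
        x = x + dt * (a * x - b * x ** 3 + s)       # bistable system step
        u = (x - x.min()) / (x.max() - x.min() + 1e-12)
        hist, _ = np.histogram(u, bins=256, range=(0.0, 1.0))
        p = hist[hist > 0] / hist.sum()
        h = -np.sum(p * np.log2(p))                 # entropy of current iterate
        if h > best_h:                              # keep the most informative one
            best, best_h = u * 255.0, h
    return best.astype(np.uint8)

def pca_fusion(vis, ir):
    """Fuse two co-registered images with weights from the leading principal
    component of their joint covariance (a common PCA fusion rule; the paper
    additionally applies it within a multi-scale decomposition, omitted here)."""
    data = np.stack([vis.ravel(), ir.ravel()]).astype(np.float64)
    w = np.linalg.eigh(np.cov(data))[1][:, -1]      # eigenvector of largest eigenvalue
    w = np.abs(w) / np.abs(w).sum()                 # normalize to fusion weights
    return np.clip(w[0] * vis + w[1] * ir, 0, 255).astype(np.uint8)

# Hypothetical usage on co-registered uint8 arrays:
# enhanced = bistable_sr_enhance(low_light_visible)
# fused = pca_fusion(enhanced, infrared)
```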


2020 ◽ Vol 13 (6) ◽ pp. 1-10
Author(s): ZHOU Wen-zhou ◽ FAN Chen ◽ HU Xiao-ping ◽ HE Xiao-feng ◽ et al.

Author(s): Liu Xian-Hong ◽ Chen Zhi-Bin

Background: A multi-scale, multi-directional image fusion method is proposed that introduces the Nonsubsampled Directional Filter Bank (NSDFB) into a multi-scale edge-preserving decomposition based on the fast guided filter. Methods: The proposed method preserves edges and extracts directional information simultaneously. To obtain better fused sub-band coefficients, a Convolutional Sparse Representation (CSR) based fusion rule is introduced for the approximation sub-bands, and a Pulse Coupled Neural Network (PCNN) based fusion strategy, with the New Sum of Modified Laplacian (NSML) as the external input, is presented for the detail sub-bands. Results: Experimental results demonstrate the superiority of the proposed method over conventional methods in terms of visual quality and objective evaluation metrics. Conclusion: By combining the fast guided filter and the nonsubsampled directional filter bank, this paper proposes a multi-scale directional edge-preserving image fusion method that preserves edges while extracting directional information.
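As a rough illustration of the detail sub-band fusion rule, the sketch below computes the classical sum-of-modified-Laplacian activity measure and applies a choose-max rule. It is a stand-in under stated assumptions: the abstract does not give the exact NSML weighting, and the NSDFB decomposition, CSR approximation rule, and PCNN firing-map selection are omitted; the function names are hypothetical.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

def sum_modified_laplacian(img, win=3):
    """Classical sum-of-modified-Laplacian (SML) activity measure, used here
    as a stand-in for the paper's NSML, whose exact weighting is not given
    in the abstract."""
    img = img.astype(np.float64)
    kx = np.array([[0., 0., 0.], [-1., 2., -1.], [0., 0., 0.]])  # horizontal ML kernel
    ky = kx.T                                                     # vertical ML kernel
    ml = np.abs(convolve(img, kx, mode='reflect')) \
       + np.abs(convolve(img, ky, mode='reflect'))
    return uniform_filter(ml, size=win) * win * win   # sum over a win x win window

def fuse_detail_subbands(d_vis, d_ir, win=3):
    """Choose-max fusion of two detail sub-bands driven by the SML activity
    map: at each pixel, keep the coefficient whose source shows more local
    activity.  The paper instead feeds NSML to a PCNN and selects by firing
    maps; that machinery is replaced by this simpler rule for illustration."""
    act_vis = sum_modified_laplacian(d_vis, win)
    act_ir = sum_modified_laplacian(d_ir, win)
    return np.where(act_vis >= act_ir, d_vis, d_ir)
```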


2021 ◽ pp. 1-1
Author(s): Jun Chen ◽ Xuejiao Li ◽ Linbo Luo ◽ Jiayi Ma
