Spectral Image Fusion From Compressive Measurements Using Spectral Unmixing and a Sparse Representation of Abundance Maps
2019 · Vol 57 (7) · pp. 5043-5053
Author(s): Edwin Vargas, Henry Arguello, Jean-Yves Tourneret
Tecnura · 2020 · Vol 24 (66) · pp. 62-75
Author(s): Edwin Vargas, Kevin Arias, Fernando Rojas, Henry Arguello

Objective: Hyperspectral (HS) imaging systems are commonly used in a diverse range of applications involving detection and classification tasks. However, the low spatial resolution of hyperspectral images may limit performance in such applications. In recent years, fusing an HS image with high-spatial-resolution multispectral (MS) or panchromatic (PAN) images has been widely studied as a way to enhance spatial resolution. Image fusion has been formulated as an inverse problem whose solution is an HS image that is assumed to be sparse in an analytic or learned dictionary. This work proposes a non-local centralized sparse representation model on a set of learned dictionaries in order to regularize the conventional fusion problem.

Methodology: The dictionaries are learned from the estimated abundance data, taking advantage of the strong correlation between abundance maps and the non-local self-similarity over the spatial domain. Then, conditionally on these dictionaries, the fusion problem is solved by an alternating iterative numerical algorithm.

Results: Experimental results with real data show that the proposed method outperforms state-of-the-art methods under different quantitative assessments.

Conclusions: In this work, we propose a hyperspectral and multispectral image fusion method based on a non-local centralized sparse representation on abundance maps. This model allows us to include the non-local redundancy of abundance maps in the fusion problem using spectral unmixing, improving the performance of sparsity-based fusion approaches.
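The sparse-coding step at the heart of such sparsity-regularized fusion models can be illustrated with a minimal sketch. Assuming a fixed, already-learned dictionary `D` (the paper learns its dictionaries from abundance data; here `D` and the synthetic signal are illustrative only), the sparse codes of a signal can be estimated by ISTA-style iterative soft thresholding; the names `ista` and `lam` are hypothetical, not from the paper:

```python
import numpy as np

def ista(D, y, lam=0.01, n_iter=500):
    """Sparse coding of signal y in dictionary D via ISTA (soft thresholding)."""
    L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the data-term gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - (D.T @ (D @ x - y)) / L     # gradient step on the least-squares term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage (L1 proximal step)
    return x

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)              # unit-norm dictionary atoms
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 1.0]      # sparse ground-truth code
y = D @ x_true
x_hat = ista(D, y)
print(np.linalg.norm(D @ x_hat - y) / np.linalg.norm(y))
```

In the full fusion algorithm, a step of this kind alternates with an image-update step until convergence; this sketch shows only the sparse-coding half.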


Author(s): Liu Xian-Hong, Chen Zhi-Bin

Background: A multi-scale, multidirectional image fusion method is proposed that introduces the Nonsubsampled Directional Filter Bank (NSDFB) into a multi-scale edge-preserving decomposition based on the fast guided filter.

Methods: The proposed method preserves edges and extracts directional information simultaneously. To obtain better fused sub-band coefficients, a Convolutional Sparse Representation (CSR) based fusion rule is introduced for the approximation sub-bands, and a Pulse Coupled Neural Network (PCNN) based fusion strategy, with the New Sum of Modified Laplacian (NSML) as the external input, is presented for the detail sub-bands.

Results: Experimental results demonstrate the superiority of the proposed method over conventional methods in terms of visual effects and objective evaluations.

Conclusion: In this paper, combining the fast guided filter and the nonsubsampled directional filter bank, a multi-scale directional edge-preserving image fusion method is proposed. The method both preserves edges and extracts directional information.
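The decompose-fuse-reconstruct structure shared by such multi-scale fusion methods can be sketched in a deliberately simplified form. This toy version substitutes a plain box filter for the fast guided filter and a per-pixel max-absolute-detail rule for the CSR/PCNN sub-band rules, so it illustrates only the pipeline's shape, not the paper's actual method:

```python
import numpy as np

def box_blur(img, r=2):
    """Box filter of radius r (a crude stand-in for the fast guided filter)."""
    k = 2 * r + 1
    pad = np.pad(img, r, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(a, b, r=2):
    """Two-scale fusion: average the base layers, keep the larger detail per pixel."""
    base_a, base_b = box_blur(a, r), box_blur(b, r)
    det_a, det_b = a - base_a, b - base_b
    base = 0.5 * (base_a + base_b)                              # approximation-layer rule
    detail = np.where(np.abs(det_a) >= np.abs(det_b), det_a, det_b)  # detail-layer rule
    return base + detail

rng = np.random.default_rng(1)
a, b = rng.random((32, 32)), rng.random((32, 32))
fused = fuse(a, b)
print(fused.shape)
```

Note that fusing an image with itself returns the image unchanged, a basic sanity property any decompose-fuse-reconstruct scheme should satisfy.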


2021 · Vol 25 (6) · pp. 4393-4407
Author(s): Qiu Hu, Shaohai Hu, Fengzhen Zhang

2021 · Vol 224 · pp. 107087
Author(s): Xiaosong Li, Fuqiang Zhou, Haishu Tan

2014
Author(s): Yoonsuk Choi, Ershad Sharifahmadian, Shahram Latifi
