Edge-preserving smoothing pyramid based multi-scale exposure fusion

Author(s): Fei Kou, Zhengguo Li, Changyun Wen, Weihai Chen

Author(s): Liu Xian-Hong, Chen Zhi-Bin

Background: A multi-scale, multidirectional image fusion method is proposed that introduces the Nonsubsampled Directional Filter Bank (NSDFB) into a multi-scale edge-preserving decomposition based on the fast guided filter. Methods: The proposed method preserves edges and extracts directional information simultaneously. To obtain better fused sub-band coefficients, a Convolutional Sparse Representation (CSR) based fusion rule is introduced for the approximation sub-bands, and a Pulse Coupled Neural Network (PCNN) based fusion strategy, with the New Sum of Modified Laplacian (NSML) as the external input, is presented for the detail sub-bands. Results: Experimental results demonstrate the superiority of the proposed method over conventional methods in terms of both visual effects and objective evaluations. Conclusion: By combining the fast guided filter with the nonsubsampled directional filter bank, this paper proposes a multi-scale directional edge-preserving image fusion method that preserves edges while extracting directional information.
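The abstract does not give the decomposition in detail; as a minimal sketch of the kind of edge-preserving base/detail split it builds on, the following implements a self-guided (box-filter) guided filter and a multi-level decomposition in NumPy. This is an illustration of the standard guided filter, not the authors' fast variant, and it omits the NSDFB stage entirely; all function names are my own.

```python
import numpy as np

def box(x, r):
    """Mean over a (2r+1)x(2r+1) window, edge-padded, via integral images."""
    k = 2 * r + 1
    xp = np.pad(x, r, mode='edge')
    c = np.cumsum(np.cumsum(xp, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))                 # zero row/col for the integral image
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_smooth(p, r=2, eps=1e-3):
    """Self-guided filter: smooths flat regions while keeping strong edges."""
    m = box(p, r)
    v = box(p * p, r) - m * m                       # local variance
    a = v / (v + eps)                               # near 1 at edges, near 0 in flat areas
    b = m * (1.0 - a)
    return box(a, r) * p + box(b, r)                # q = mean(a) * p + mean(b)

def decompose(img, levels=2):
    """Multi-scale edge-preserving split: returns [detail_1, ..., detail_n, base]."""
    layers, cur = [], img
    for _ in range(levels):
        base = guided_smooth(cur)
        layers.append(cur - base)                   # detail layer at this scale
        cur = base
    layers.append(cur)                              # final base layer
    return layers
```

By construction the layers sum back to the input exactly, which is the property multi-scale fusion schemes rely on: each sub-band can be fused independently and the results recombined by summation.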


Optik, 2020, Vol. 223, p. 165494
Author(s): Yadong Xu, Beibei Sun

2018, Vol. 101 (1–4), pp. 105–117
Author(s): Haiyong Chen, Yafei Ren, Junqi Cao, Weipeng Liu, Kun Liu

2008, Vol. 27 (3), pp. 1–10
Author(s): Zeev Farbman, Raanan Fattal, Dani Lischinski, Richard Szeliski

Sensors, 2021, Vol. 22 (1), p. 24
Author(s): Yan-Tsung Peng, He-Hao Liao, Ching-Fu Chen

In contrast to conventional digital images, high-dynamic-range (HDR) images have a broader intensity range between the darkest and brightest regions, capturing more details in a scene. Such images are produced by fusing images of the same scene taken with different exposure values (EVs). Most existing multi-scale exposure fusion (MEF) algorithms assume that the input images are multi-exposed with small EV intervals. However, with the emergence of spatially multiplexed exposure technology, which can capture a short-exposure and a long-exposure image simultaneously, it has become essential to handle two-exposure image fusion. To bring out more well-exposed content, we generate a more helpful intermediate virtual image for fusion using the proposed Optimized Adaptive Gamma Correction (OAGC), improving contrast, saturation, and well-exposedness. Fusing the input images with the enhanced virtual image works well even when both inputs are underexposed or overexposed, a case other state-of-the-art fusion methods cannot handle. Experimental results show that our method performs favorably against state-of-the-art image fusion methods in generating high-quality fusion results.
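The abstract does not spell out the OAGC optimization. As a rough stand-in that illustrates the idea of a gamma-corrected virtual image, the sketch below picks a single gamma from the global mean so that the image is pushed toward mid-tone exposure; this is a common heuristic, not the paper's method, and the function name and target value are my own assumptions.

```python
import numpy as np

def adaptive_gamma_virtual(img, target=0.5):
    """Hypothetical stand-in for OAGC: choose one gamma so that the global
    mean brightness moves toward `target` (mid-gray), then apply it per pixel.
    `img` is assumed to be normalized to [0, 1]."""
    m = float(np.clip(img.mean(), 1e-6, 1 - 1e-6))  # guard against log(0)/log(1)
    gamma = np.log(target) / np.log(m)              # solves m ** gamma == target
    return np.clip(img, 1e-6, 1.0) ** gamma
```

An underexposed input yields gamma < 1 (brightening) and an overexposed one gamma > 1 (darkening); the actual OAGC additionally optimizes contrast and saturation terms rather than matching the mean alone.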


Sensors, 2020, Vol. 20 (16), p. 4614
Author(s): Yi Yang, Zhengguo Li, Shiqian Wu

Capturing high-quality images with mobile devices in low-light or backlit conditions is very challenging. In this paper, a new single-image brightening algorithm is proposed to enhance an image captured in low-light conditions. Two virtual images with larger exposure times are generated to increase brightness and enhance fine details in the underexposed regions. To reduce brightness change, the virtual images are generated via intensity mapping functions (IMFs) computed from available camera response functions (CRFs). To avoid possible color distortion in the virtual images due to one-to-many mapping, a least-squares minimization problem is formulated to determine brightening factors for all pixels in the underexposed regions. In addition, an edge-preserving smoothing technique is adopted to prevent noise in the underexposed regions from being amplified in the virtual images. The final brightened image is obtained by fusing the original image and the two virtual images via gradient-domain guided image filtering (GGIF) based multi-scale exposure fusion (MEF) with properly defined weights for all the images. Experimental results show that the proposed algorithm better preserves relative brightness and color, and that details in bright regions are also preserved well in the final image. The proposed algorithm is expected to be useful for computational photography on smartphones.
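The final step fuses the original and virtual images with per-pixel weights. As a simplified single-scale stand-in for the GGIF-based MEF, the sketch below weights each frame by the well-exposedness measure of Mertens et al. (distance of each pixel from mid-gray); the paper's actual weight definition and multi-scale pyramid are not reproduced here.

```python
import numpy as np

def fuse_well_exposed(images, sigma=0.2):
    """Single-scale weighted fusion: at each pixel, favor the frame whose
    value is closest to mid-gray (a stand-in for the GGIF-based MEF).
    `images` is a list of same-shape arrays normalized to [0, 1]."""
    stack = np.stack(images)                              # (N, H, W)
    w = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))  # well-exposedness weight
    w = w / (w.sum(axis=0, keepdims=True) + 1e-12)        # normalize across frames
    return (w * stack).sum(axis=0)
```

Fusing at a single scale like this can produce halos near weight-map discontinuities, which is precisely why the paper smooths and fuses the weights inside a multi-scale, edge-preserving (GGIF) pyramid instead.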

