Performance Assessment of Edge Preserving Filters

Author(s):  
Kamireddy Rasool Reddy ◽  
Madhava Rao Ch ◽  
Nagi Reddy Kalikiri

Denoising is one of the important aspects of image processing applications: the process of removing noise from a noisy image. In most cases, noise accumulates at edges, so suppressing noise while preserving edges is one of the most prominent problems. Numerous edge-preserving approaches exist; among them, the Gaussian filter, the bilateral filter, and non-local means filtering are popular, but the images they denoise suffer from blurring. To overcome these problems, this article proposes a Gaussian/bilateral filtering (G/BF) scheme combined with a wavelet thresholding approach for better image denoising. The performance of the proposed work is compared with that of edge-preserving filter algorithms such as the bilateral filter and the non-local means filter using objective quality metrics. The simulation results show that the proposed method outperforms both the bilateral filter and the non-local means filter.
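As background for the filters compared above, the bilateral filter can be sketched in a few lines. This is an illustrative NumPy implementation, not the article's G/BF code; the parameter names (`radius`, `sigma_s`, `sigma_r`) are our own.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Naive bilateral filter: each output pixel is a weighted mean
    whose weights combine spatial closeness (Gaussian in distance)
    and range similarity (Gaussian in intensity difference), so
    averaging does not cross strong edges."""
    pad = np.pad(img, radius, mode="reflect")
    out = np.empty_like(img)
    # Spatial Gaussian kernel, computed once.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma_s ** 2))
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weight: near zero for pixels across an edge.
            rng = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r ** 2))
            w = spatial * rng
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out
```

On a noisy step image, the filter smooths the flat regions while leaving the step intact, which is exactly the edge-preserving behavior (and the residual blurring near edges) the article discusses.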

2018 ◽  
Vol 8 (10) ◽  
pp. 1985 ◽  
Author(s):  
Yoshihiro Maeda ◽  
Norishige Fukushima ◽  
Hiroshi Matsuo

In this paper, we propose acceleration methods for edge-preserving filtering. These filters natively produce denormalized numbers, which are defined in IEEE Standard 754. Processing denormalized numbers has a higher computational cost than processing normal numbers; thus, the computational performance of edge-preserving filtering is severely diminished. We propose approaches that prevent the occurrence of denormalized numbers for acceleration. Moreover, we verify effective vectorizations of edge-preserving filtering across central processing unit microarchitectures by carefully treating kernel weights. The experimental results show that the proposed methods are up to five times faster than straightforward implementations of bilateral filtering and non-local means filtering while maintaining high accuracy. In addition, we show effective vectorization for each central processing unit microarchitecture; our implementation of the bilateral filter is up to 14 times faster than that of OpenCV. The proposed methods and the vectorization are practical for real-time tasks such as image editing.
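The denormalized-number issue can be demonstrated directly: Gaussian kernel weights underflow into the float32 subnormal range for distant pixels or very different intensities. The sketch below is an assumed illustration in the spirit of the paper's idea, not the authors' code; it flushes sub-normal weights to exact zeros before they enter the accumulation.

```python
import numpy as np

def gaussian_weights(d2, sigma, flush=True):
    """Gaussian kernel weights exp(-d2 / (2*sigma^2)) in float32.
    For large squared distances d2 the weight underflows into the
    subnormal (denormalized) range, which many CPUs process slowly;
    flush=True replaces such weights with exact zeros."""
    w = np.exp(-d2.astype(np.float64) / (2.0 * sigma * sigma))
    w = w.astype(np.float32)
    if flush:
        tiny = np.finfo(np.float32).tiny  # smallest *normal* float32
        w[w < tiny] = 0.0                 # flush subnormals to zero
    return w

# exp(-100) ~ 3.7e-44 is subnormal in float32 (normals start near 1.2e-38).
d2 = np.array([0.0, 1.0, 200.0], dtype=np.float32)
w = gaussian_weights(d2, sigma=1.0)
```

Since the flushed weights were negligible anyway, the filter output is essentially unchanged, but the accumulation loop no longer touches subnormal operands.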


2018 ◽  
Vol 27 (3) ◽  
pp. 1462-1474 ◽  
Author(s):  
Christina Karam ◽  
Keigo Hirakawa

Author(s):  
Susant Kumar Panigrahi ◽  
Supratim Gupta

Thresholding of Curvelet coefficients for image denoising drains subtle signal components into the noise subspace. In effect, it also produces ringing artifacts near edges. We found that the noise sensitivity of Curvelet phases, in contrast to their magnitudes, decreases at higher noise levels. Thus, we preserve the phase of the coefficients below the threshold at coarser scales and estimate the corresponding magnitudes with a Joint Bilateral Filtering (JBF) technique. In contrast to traditional hard thresholding, the coefficients at the finest scale are estimated using Bilateral Filtering (BF). The proposed filtering approach at the finest scale exhibits better connectedness among the edges while removing the granular artifacts that hard thresholding introduces into the denoised image. Finally, applying a Guided Image Filter (GIF) to the Curvelet-based reconstruction (the initial denoised image in the spatial domain) preserves small image details with sharper edges and texture in the final denoised image. The lower noise sensitivity of the Curvelet phase at higher noise strengths lets the proposed method outperform several state-of-the-art techniques, while providing comparable outcomes at lower noise levels.
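Since Curvelet toolboxes are not standard library code, the phase-preserving idea can be illustrated with FFT coefficients as a stand-in transform: magnitudes are shrunk, while phases are kept intact. The shrinkage rule and the 0.2 retention factor below are hypothetical placeholders for the paper's JBF-based magnitude estimate.

```python
import numpy as np

def phase_preserving_threshold(img, thresh):
    """Shrink transform-coefficient magnitudes while keeping their
    phase, instead of zeroing sub-threshold coefficients outright
    as hard thresholding would."""
    coeffs = np.fft.fft2(img)
    mag = np.abs(coeffs)
    phase = np.angle(coeffs)
    # Sub-threshold coefficients retain 20% of their magnitude and,
    # crucially, their original phase (an arbitrary placeholder for
    # the JBF-based magnitude estimate described above).
    new_mag = np.maximum(mag - thresh, 0.2 * mag)
    return np.fft.ifft2(new_mag * np.exp(1j * phase)).real
```

Because the phase is copied through unchanged, edge positions (which are encoded largely in the phase) survive the shrinkage, which is the intuition behind preserving sub-threshold phases at coarser scales.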


2021 ◽  
Vol 9 (3) ◽  
pp. 174-179
Author(s):  
Maya Fitria ◽  
Cosmin Adrian Morariu ◽  
Josef Pauli ◽  
Ramzi Adriman

It is necessary to conserve important information, such as edges, details, and textures, in CT images of aortic dissection, as this helps the radiologist examine and diagnose the disease. Hence, a less noisy image is required to support medical experts in making better diagnoses. In this work, the non-local means (NLM) method is applied to minimize the noise in CT images of aortic dissection patients as a preprocessing step toward accurate aortic segmentation. The method is implemented in an existing segmentation system using six different kernel functions, and the evaluation assesses the DSC, precision, and recall of the segmentation results. Furthermore, the visual quality of the denoised images is also taken into account, and a comparative analysis between NLM and other denoising methods is carried out. The results show that NLM yields encouraging segmentation results, even though the visual quality of the denoised images is unacceptable. Applying the NLM algorithm with the flat kernel function provides the highest DSC, precision, and recall values of 0.937101, 0.954835, and 0.920517, respectively.
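The reported metrics (DSC, precision, recall) can be computed from binary segmentation masks as follows. This is a minimal sketch with hypothetical mask arrays, independent of the segmentation system used in the paper.

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Dice similarity coefficient, precision, and recall for
    binary segmentation masks (True = aorta pixel)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.sum(pred & gt)    # predicted aorta, actually aorta
    fp = np.sum(pred & ~gt)   # predicted aorta, actually background
    fn = np.sum(~pred & gt)   # missed aorta pixels
    dsc = 2 * tp / (2 * tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return dsc, precision, recall
```

Note that DSC is the harmonic mean of precision and recall, which is why a kernel that balances the two (here, the flat kernel) also maximizes DSC.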

