EPLL: An Image Denoising Method Using a Gaussian Mixture Model Learned on a Large Set of Patches

2018 ◽  
Vol 8 ◽  
pp. 465-489 ◽  
Author(s):  
Samuel Hurault ◽  
Thibaud Ehret ◽  
Pablo Arias
2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Hui Wei ◽  
Wei Zheng

An image denoising method based on an improved Gaussian mixture model is proposed to reduce noise and enhance image quality. Unlike traditional image denoising methods, the proposed method models the pixel information in the neighborhood around each pixel in the image. A Gaussian mixture model is fitted to each neighborhood, and the similarity between two pixels is measured by the L2 norm between their corresponding Gaussian mixture models. The Gaussian mixture model captures statistical information such as the mean and variance of the pixel intensities in the image region, and the L2 norm between two such models reflects the difference in local grayscale intensity and in the richness of detail around the two pixels. In this sense, the L2 norm between Gaussian mixture models measures the similarity between pixels more accurately. Experimental results show that the proposed method improves denoising performance while retaining the detailed information of the image.
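In one dimension, the L2 distance between two Gaussian mixtures has a closed form, because the product of two Gaussian densities integrates to a Gaussian evaluated at the difference of the means. As an illustrative sketch of the pixel-similarity measure described above (function names are invented here, and this is not the paper's implementation):

```python
import math

def gauss(diff, var):
    """Zero-mean Gaussian density with variance `var`, evaluated at `diff`."""
    return math.exp(-diff * diff / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def gmm_l2_distance(w1, mu1, var1, w2, mu2, var2):
    """Closed-form L2 distance between two 1-D Gaussian mixtures.

    Uses the identity  int N(x; m1, v1) N(x; m2, v2) dx = N(m1 - m2; 0, v1 + v2),
    so ||f - g||_2^2 expands into three double sums over the components.
    """
    def cross(wa, ma, va, wb, mb, vb):
        return sum(
            a * b * gauss(m_a - m_b, v_a + v_b)
            for a, m_a, v_a in zip(wa, ma, va)
            for b, m_b, v_b in zip(wb, mb, vb)
        )
    sq = (cross(w1, mu1, var1, w1, mu1, var1)
          - 2.0 * cross(w1, mu1, var1, w2, mu2, var2)
          + cross(w2, mu2, var2, w2, mu2, var2))
    return math.sqrt(max(sq, 0.0))  # clamp tiny negative round-off
```

Two pixels whose neighborhood mixtures coincide get distance zero; mixtures with shifted means or different spreads get a strictly positive distance, which is what makes the measure usable as a similarity weight.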


2018 ◽  
Vol 2018 ◽  
pp. 1-9
Author(s):  
Aidong Xu ◽  
Wenqi Huang ◽  
Peng Li ◽  
Huajun Chen ◽  
Jiaxiao Meng ◽  
...  

To improve noise reduction for mechanical vibration signals, a Gaussian mixture model (SGMM) and a quantum-inspired standard deviation (QSD) are proposed and applied to a denoising method based on a thresholding function in the wavelet domain. First, the SGMM is presented and used as a local distribution to approximate the wavelet-coefficient distribution in each subband. Then, within a Bayesian framework, the maximum a posteriori (MAP) estimator is employed to derive a thresholding function with a conventional standard deviation (CSD), which is calculated by the expectation-maximization (EM) algorithm. However, the CSD ignores the interscale dependency between wavelet coefficients. To address this limitation, quantum theory is adopted to analyze the interscale dependency between coefficients in adjacent subbands, and the QSD for noise-free wavelet coefficients is derived from quantum mechanics. Next, the QSD is substituted for the CSD in the thresholding function to shrink noisy coefficients. Finally, an application to mechanical vibration signal processing illustrates the denoising technique. The experimental study shows that the SGMM models the distribution of wavelet coefficients accurately and that the QSD captures the interscale dependency of the wavelet coefficients of the true signal, so the denoising method using the SGMM and QSD outperforms the alternatives considered.
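The SGMM/QSD construction is specific to this paper, but the surrounding pipeline — transform to the wavelet domain, shrink detail coefficients with a thresholding function driven by a standard-deviation estimate, transform back — can be sketched with a plain one-level Haar transform and the universal threshold as a stand-in (the transform, threshold choice, and all names here are assumptions, not the authors' method):

```python
import math

def haar_dwt(x):
    """One-level Haar transform (len(x) must be even): approximation and detail bands."""
    approx = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform."""
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / math.sqrt(2))
        x.append((a - d) / math.sqrt(2))
    return x

def soft(c, t):
    """Soft-thresholding: shrink |c| by t, zeroing anything below t."""
    return math.copysign(max(abs(c) - t, 0.0), c)

def denoise(x, sigma):
    """Shrink the detail band with the universal threshold sigma * sqrt(2 ln n)."""
    approx, detail = haar_dwt(x)
    t = sigma * math.sqrt(2.0 * math.log(len(x)))
    return haar_idwt(approx, [soft(d, t) for d in detail])
```

The paper's contribution replaces the fixed universal threshold with a MAP-derived thresholding function whose spread parameter (the QSD) also accounts for dependency between adjacent subbands, which a single-level scheme like this cannot express.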


2021 ◽  
Author(s):  
Guohua Gao ◽  
Jeroen Vink ◽  
Fredrik Saaf ◽  
Terence Wells

Abstract When formulating history matching within the Bayesian framework, we may quantify the uncertainty of model parameters and production forecasts using conditional realizations sampled from the posterior probability density function (PDF). It is quite challenging to sample such a posterior PDF. Some methods, e.g., Markov chain Monte Carlo (MCMC), are very expensive, while others are cheaper but may generate biased samples. In this paper, we propose an unconstrained Gaussian Mixture Model (GMM) fitting method to approximate the posterior PDF and investigate new strategies to further enhance its performance. To reduce the CPU time of handling bound constraints, we reformulate the GMM fitting formulation such that an unconstrained optimization algorithm can be applied to find the optimal solution of unknown GMM parameters. To obtain a sufficiently accurate GMM approximation with the lowest number of Gaussian components, we generate random initial guesses, remove components with very small or very large mixture weights after each GMM fitting iteration, and prevent their reappearance using a dedicated filter. To prevent overfitting, we add a new Gaussian component only if the quality of the GMM approximation on a (large) set of blind-test data sufficiently improves. The unconstrained GMM fitting method with the new strategies proposed in this paper is validated using nonlinear toy problems and then applied to a synthetic history matching example. It can construct a GMM approximation of the posterior PDF that is comparable to the MCMC method, and it is significantly more efficient than the constrained GMM fitting formulation, e.g., reducing the CPU time by a factor of 800 to 7300 for problems we tested, which makes it quite attractive for large-scale history matching problems.
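A standard way to eliminate bound constraints so that an unconstrained optimizer can be used, as the abstract describes, is to reparameterize each bounded variable with a logit transform. The sketch below illustrates that general idea only; the paper's actual reparameterization is not given here, so the function names and the specific transform are assumptions:

```python
import math

def to_unconstrained(x, lo, hi):
    """Map x in the open interval (lo, hi) to an unbounded variable via a logit transform."""
    return math.log((x - lo) / (hi - x))

def to_constrained(y, lo, hi):
    """Inverse map: any real y lands strictly inside (lo, hi),
    so an optimizer working in y never violates the bounds."""
    return lo + (hi - lo) / (1.0 + math.exp(-y))
```

An optimizer that searches freely over y (e.g., a quasi-Newton method) then implicitly respects the bounds on x, with no projection or penalty step per iteration — which is where the CPU-time savings over a constrained formulation come from.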


2018 ◽  
Vol 11 (4) ◽  
pp. 2568-2609 ◽  
Author(s):  
Charles-Alban Deledalle ◽  
Shibin Parameswaran ◽  
Truong Q. Nguyen

SPE Journal ◽  
2021 ◽  
pp. 1-20
Author(s):  
Guohua Gao ◽  
Jeroen Vink ◽  
Fredrik Saaf ◽  
Terence Wells

Summary When formulating history matching within the Bayesian framework, we may quantify the uncertainty of model parameters and production forecasts using conditional realizations sampled from the posterior probability density function (PDF). It is quite challenging to sample such a posterior PDF. Some methods [e.g., Markov chain Monte Carlo (MCMC)] are very expensive, whereas other methods are cheaper but may generate biased samples. In this paper, we propose an unconstrained Gaussian mixture model (GMM) fitting method to approximate the posterior PDF and investigate new strategies to further enhance its performance. To reduce the central processing unit (CPU) time of handling bound constraints, we reformulate the GMM fitting formulation such that an unconstrained optimization algorithm can be applied to find the optimal solution of unknown GMM parameters. To obtain a sufficiently accurate GMM approximation with the lowest number of Gaussian components, we generate random initial guesses, remove components with very small or very large mixture weights after each GMM fitting iteration, and prevent their reappearance using a dedicated filter. To prevent overfitting, we add a new Gaussian component only if the quality of the GMM approximation on a (large) set of blind-test data sufficiently improves. The unconstrained GMM fitting method with the new strategies proposed in this paper is validated using nonlinear toy problems and then applied to a synthetic history-matching example. It can construct a GMM approximation of the posterior PDF that is comparable to the MCMC method, and it is significantly more efficient than the constrained GMM fitting formulation (e.g., reducing the CPU time by a factor of 800 to 7,300 for problems we tested), which makes it quite attractive for large-scale history-matching problems. NOTE: This paper is published as part of the 2021 SPE Reservoir Simulation Special Issue.
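The component-pruning strategy described above — removing components with very small or very large mixture weights after each fitting iteration and filtering out their reappearance — might be sketched as follows (a hypothetical helper with invented names and thresholds, not the authors' code):

```python
def prune_components(weights, params, w_min=1e-3, w_max=0.999, banned=None):
    """Drop mixture components whose weights fall outside [w_min, w_max],
    then renormalize the survivors.

    `banned` collects the parameters of removed components so a later fitting
    iteration can refuse to reintroduce them (the 'dedicated filter').
    """
    if banned is None:
        banned = set()
    kept_w, kept_p = [], []
    for w, p in zip(weights, params):
        if w < w_min or w > w_max or p in banned:
            banned.add(p)          # remember this component; never let it back
        else:
            kept_w.append(w)
            kept_p.append(p)
    total = sum(kept_w)
    kept_w = [w / total for w in kept_w]  # weights must again sum to 1
    return kept_w, kept_p, banned
```

Renormalizing after each prune keeps the remaining weights a valid probability vector, so the reduced mixture is still a proper density going into the next fitting iteration.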

