On posterior distribution of Bayesian wavelet thresholding

2011 ◽ Vol 141 (1) ◽ pp. 318-324 ◽ Author(s): Heng Lian
2009 ◽ Vol 2009 ◽ pp. 1-10 ◽ Author(s): Nawrès Khlifa, Najla Gribaa, Imen Mbazaa, Kamel Hamrouni

Nuclear images are often used to study the functionality of organs. Unfortunately, these images have poor contrast and weak resolution, and they exhibit fluctuations due to radioactive decay. To enhance their quality, physicians must increase the quantity of injected radioactive material and the acquisition time. In this paper, we propose an alternative solution: a software framework that enhances nuclear image quality and reduces statistical fluctuations. Since these images are modeled as realizations of a Poisson process, the proposed framework applies variance stabilization to the Poisson data before applying an adapted Bayesian wavelet shrinkage. The method has been applied to real images and has demonstrated its performance.
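The abstract does not name the variance stabilizer or the exact shrinkage rule, so the following is a minimal sketch of the general pipeline it describes, assuming the standard choices: the Anscombe transform for stabilization and a BayesShrink-style soft threshold (via numpy and PyWavelets). The authors' adapted Bayesian estimator may differ.

```python
import numpy as np
import pywt

def anscombe(x):
    # Variance-stabilizing transform for Poisson counts: output noise is
    # approximately Gaussian with unit variance.
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    # Simple algebraic inverse (bias-corrected inverses are often preferred).
    return (y / 2.0) ** 2 - 3.0 / 8.0

def bayes_shrink_threshold(d, sigma=1.0):
    # BayesShrink-style threshold sigma^2 / sigma_x, with the signal scale
    # sigma_x estimated from the detail subband.
    var_y = np.mean(d ** 2)
    sigma_x = np.sqrt(max(var_y - sigma ** 2, 1e-12))
    return sigma ** 2 / sigma_x

def denoise_poisson_image(img, wavelet="db4", level=3):
    stabilized = anscombe(img.astype(float))
    coeffs = pywt.wavedec2(stabilized, wavelet, level=level)
    new_coeffs = [coeffs[0]]  # keep the approximation band untouched
    for detail in coeffs[1:]:
        new_coeffs.append(tuple(
            pywt.threshold(d, bayes_shrink_threshold(d), mode="soft")
            for d in detail
        ))
    return inverse_anscombe(pywt.waverec2(new_coeffs, wavelet))
```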


Geophysics ◽ 2015 ◽ Vol 80 (2) ◽ pp. M15-M31 ◽ Author(s): Astrid K. Lundsgaard, Hanno Klemm, Adam J. Cherrett

We addressed the problem of the well-to-seismic tie as a Bayesian inversion for the wavelet and well path in the impedance domain. The result of the joint inversion is a set of wavelets for multiple angle stacks and a corresponding well path. The wavelet optimally links the impedance data along the well to the seismic data along the optimized well path in the seismic time domain. Starting with prior distributions for the wavelet and well path, the method calculates the posterior distribution by conditioning the priors on the seismic and well-log data. This is done by iteratively inverting the seismic data with the current wavelet to obtain an impedance cube around the well. In a second step, the seismic impedances are projected onto the well path. By minimizing the misfit between the inverted seismic impedances and the impedances derived from the well log, the wavelet and well path are optimized. Comparing the well and seismic data in the impedance domain enables the method to work on short and noisy well logs. Another advantage of this method is its ability to derive wavelets for multiple angle stacks and multiple well locations simultaneously. We tested the method on synthetic and real data examples. The algorithm performed well in the synthetic examples, in which we had control over the modeling wavelet, and the wavelets derived for a real data example showed consistently good seismic-to-well ties for six angle stacks and seven wells. The main algorithm we developed linearizes the problem. We compared the posterior distribution of the linearized result with a sampling-based result in a real data example and found good agreement.
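The paper's joint inversion for wavelet and well path is considerably richer than what fits here, but the core Bayesian wavelet-estimation step can be sketched under a linear convolutional model (seismic trace = convolution matrix times wavelet, plus Gaussian noise) with a Gaussian prior on the wavelet samples. All parameter names and values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def map_wavelet(reflectivity, seismic, nw=51, sigma_n=0.1, sigma_w=1.0):
    """MAP wavelet estimate under seismic = R @ w + Gaussian noise."""
    n = len(seismic)
    half = nw // 2
    # Convolution matrix R: column j applies a shift of (j - half) samples.
    R = np.zeros((n, nw))
    for i in range(n):
        for j in range(nw):
            k = i - (j - half)
            if 0 <= k < n:
                R[i, j] = reflectivity[k]
    # Zero-mean Gaussian prior N(0, sigma_w^2 I) on the wavelet samples
    # and i.i.d. Gaussian noise of scale sigma_n (both assumed here).
    A = R.T @ R / sigma_n**2 + np.eye(nw) / sigma_w**2
    b = R.T @ seismic / sigma_n**2
    w_map = np.linalg.solve(A, b)   # posterior mean of the wavelet
    post_cov = np.linalg.inv(A)     # posterior covariance of the wavelet
    return w_map, post_cov
```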


Author(s): Cecilia Viscardi, Michele Boreale, Fabio Corradi

We consider the problem of sample degeneracy in Approximate Bayesian Computation (ABC). It arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such "poor" parameter proposals do not contribute at all to the representation of the parameter's posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose an algorithm, referred to as the Large Deviations Weighted Approximate Bayesian Computation algorithm, in which, via Sanov's Theorem, strictly positive weights are computed for all proposed parameters, thus avoiding the rejection step altogether. In order to derive a computable asymptotic approximation from Sanov's result, we adopt the information-theoretic "method of types" formulation of Large Deviations theory, thus restricting our attention to models for i.i.d. discrete random variables. Finally, we experimentally evaluate our method through a proof-of-concept implementation.
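The exact weighting scheme is developed in the paper; as a hedged illustration only, the method-of-types form of Sanov's Theorem says that the probability that n i.i.d. draws from a discrete pmf p produce the empirical type q decays like exp(-n D(q || p)), which suggests weights of the following shape. The function names and the simple rejection-free loop below are hypothetical, not the authors' algorithm.

```python
import numpy as np

def kl_divergence(q, p, eps=1e-12):
    # D(q || p) for discrete distributions on a shared finite support.
    q, p = np.asarray(q, float), np.asarray(p, float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / np.maximum(p[mask], eps))))

def ld_weighted_abc(prior_sampler, model_pmf_fn, observed_counts, n_props=1000):
    # Every proposal gets a strictly positive weight: no rejection step.
    n = observed_counts.sum()
    q_obs = observed_counts / n          # empirical type of the observed data
    thetas, log_w = [], []
    for _ in range(n_props):
        theta = prior_sampler()
        # Sanov-style log-weight: -n * D(q_obs || p_theta).
        log_w.append(-n * kl_divergence(q_obs, model_pmf_fn(theta)))
        thetas.append(theta)
    log_w = np.array(log_w)
    w = np.exp(log_w - log_w.max())      # normalize in the log domain
    return thetas, w / w.sum()
```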


2020 ◽ pp. 1-11 ◽ Author(s): Hui Wang, Huang Shiwang

The traditional financial supervision and management system can no longer meet current needs, and further improvement is urgently needed. In this paper, the low-frequency data are treated as high-frequency data with missing observations, and a mixed-frequency VAR model is adopted. To overcome the problems caused by the large number of VAR parameters, this paper adopts a Bayesian estimation method based on the Minnesota prior to obtain the posterior distribution of each parameter of the VAR model. Moreover, this paper uses methods based on Kalman filtering and Kalman smoothing to obtain the posterior distribution of the latent state variables. Then, according to the posterior distributions of the VAR model parameters and of the latent state variables, this paper uses Gibbs sampling to obtain the mixed-frequency Bayesian vector autoregressive model and estimates of the state variables. Finally, this article studies the influence of Internet finance on monetary policy with examples. The results show that the proposed method is effective.
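The abstract names the Minnesota prior without giving its form. A common parameterization (one of several variants; the decay exponent, hyperparameter values, and residual scales below are illustrative, not necessarily the authors' settings) shrinks each VAR equation toward a random walk:

```python
import numpy as np

def minnesota_prior(n_vars, n_lags, lam1=0.2, lam2=0.5, sigma=None):
    # Prior mean: 1 on each variable's own first lag, 0 elsewhere (random walk).
    # Prior variance: own lags get (lam1 / lag)^2; cross-variable lags are
    # shrunk harder by lam2 and rescaled by the residual-scale ratio.
    if sigma is None:
        sigma = np.ones(n_vars)  # usually residual scales from univariate ARs
    k = n_vars * n_lags
    mean = np.zeros((n_vars, k))
    var = np.zeros((n_vars, k))
    for i in range(n_vars):              # equation i
        for lag in range(1, n_lags + 1):
            for j in range(n_vars):      # coefficient on lag of variable j
                col = (lag - 1) * n_vars + j
                if lag == 1 and i == j:
                    mean[i, col] = 1.0
                if i == j:
                    var[i, col] = (lam1 / lag) ** 2
                else:
                    var[i, col] = ((lam1 * lam2 / lag) ** 2
                                   * (sigma[i] / sigma[j]) ** 2)
    return mean, var
```

In the mixed-frequency setting described above, draws of the VAR coefficients under this prior would alternate in a Gibbs loop with Kalman filter/smoother draws of the latent high-frequency states.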

