On designing approximate inference algorithms for multiply sectioned Bayesian networks

Author(s):  
Karen H. Jin ◽  
Dan Wu ◽  
Libing Wu

Author(s):  
ANDRÉS CANO ◽  
MANUEL GÓMEZ-OLMEDO ◽  
CORA B. PÉREZ-ARIZA ◽  
ANTONIO SALMERÓN

We present an efficient procedure for factorising probabilistic potentials represented as probability trees. This new procedure is able to detect some regularities that cannot be captured by existing methods. In cases where an exact decomposition is not achievable, we propose a heuristic way to carry out approximate factorisations guided by a parameter called the factorisation degree, which is fast to compute. We show how this parameter can be used to control the trade-off between complexity and accuracy in approximate inference algorithms for Bayesian networks.
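An illustrative sketch (not the paper's algorithm) of the kind of regularity such a factorisation procedure exploits: a potential phi(x, y) over two binary variables factorises exactly as f(x) * g(y) precisely when the rows of its table are proportional, and a small deviation from proportionality suggests an approximate factorisation may be acceptable. The error measure below is a hypothetical stand-in for the paper's factorisation degree.

```python
def proportionality_error(row_a, row_b):
    """Residual norm of the best proportional fit row_b ~= c * row_a."""
    num = sum(a * b for a, b in zip(row_a, row_b))
    den = sum(a * a for a in row_a)
    if den == 0.0:
        return 0.0 if all(b == 0.0 for b in row_b) else float("inf")
    c = num / den  # least-squares scale factor
    return sum((b - c * a) ** 2 for a, b in zip(row_a, row_b)) ** 0.5

# Rows of phi(x, y), indexed by x. Here they are proportional (c = 2),
# so phi decomposes exactly into a product f(x) * g(y).
exact = proportionality_error([0.1, 0.3], [0.2, 0.6])

# Rows that are only roughly proportional: exact decomposition fails,
# but the small residual indicates a usable approximate factorisation.
approx = proportionality_error([0.1, 0.3], [0.25, 0.6])
```

A procedure of this kind can rank candidate decompositions by their residual and accept only those below a tolerance, trading accuracy for a smaller representation.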


2011 ◽  
Vol 52 (1) ◽  
pp. 49-62 ◽  
Author(s):  
Andrés Cano ◽  
Manuel Gómez-Olmedo ◽  
Serafín Moral

2019 ◽  
Vol 9 (10) ◽  
pp. 2055 ◽  
Author(s):  
Cheol Young Park ◽  
Kathryn Blackmond Laskey ◽  
Paulo C. G. Costa ◽  
Shou Matsumoto

Hybrid Bayesian Networks (HBNs), which contain both discrete and continuous variables, arise naturally in many application areas (e.g., image understanding, data fusion, medical diagnosis, fraud detection). This paper concerns inference in an important subclass of HBNs, the conditional Gaussian (CG) networks, in which all continuous random variables have Gaussian distributions and all children of continuous random variables must be continuous. Inference in CG networks can be NP-hard even for special-case structures, such as poly-trees, where inference in discrete Bayesian networks can be performed in polynomial time. Therefore, approximate inference is required. In approximate inference, it is often necessary to trade off accuracy against solution time. This paper presents an extension to the Hybrid Message Passing inference algorithm for general CG networks and an algorithm for optimizing its accuracy given a bound on computation time. The extended algorithm uses Gaussian mixture reduction to prevent an exponential increase in the number of Gaussian mixture components. The trade-off algorithm performs pre-processing to find optimal run-time settings for the extended algorithm. Experimental results for four CG networks compare performance of the extended algorithm with existing algorithms and show the optimal settings for these CG networks.
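The Gaussian mixture reduction step the abstract mentions can be sketched as moment-preserving pairwise merging of components. This is an illustrative 1-D version with a deliberately simple merge criterion (closest means); the paper's actual reduction algorithm and cost function may differ.

```python
def merge(c1, c2):
    """Merge two (weight, mean, variance) components, preserving the
    first two moments of the sub-mixture they form."""
    w1, m1, v1 = c1
    w2, m2, v2 = c2
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return (w, m, v)

def reduce_mixture(components, max_components):
    """Repeatedly merge the pair of components with closest means until
    at most max_components remain."""
    comps = list(components)
    while len(comps) > max_components:
        i, j = min(
            ((i, j) for i in range(len(comps)) for j in range(i + 1, len(comps))),
            key=lambda ij: abs(comps[ij[0]][1] - comps[ij[1]][1]),
        )
        merged = merge(comps[i], comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    return comps

# A 3-component mixture reduced to 2: the two near-identical components
# around 0 are merged; the overall mean of the mixture is unchanged.
mix = [(0.4, 0.0, 1.0), (0.3, 0.1, 1.0), (0.3, 5.0, 2.0)]
reduced = reduce_mixture(mix, 2)
```

Capping the component count this way is what prevents the exponential blow-up in mixture size during message passing, at the cost of some approximation error per merge.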


2008 ◽  
Vol 32 ◽  
pp. 879-900 ◽  
Author(s):  
Y. Wang ◽  
N. L. Zhang ◽  
T. Chen

We propose a novel method for approximate inference in Bayesian networks (BNs). The idea is to sample data from a BN, learn a latent tree model (LTM) from the data offline, and, when online, make inference with the LTM instead of the original BN. Because LTMs are tree-structured, inference takes linear time. At the same time, they can represent complex relationships among leaf nodes, so the approximation accuracy is often good. Empirical evidence shows that our method can achieve good approximation accuracy at low online computational cost.
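The offline sampling step of this pipeline can be sketched as forward (ancestral) sampling from a discrete BN: each variable is drawn given its already-sampled parents. The three-node chain A → B → C and its probabilities below are hypothetical; learning the latent tree model from the resulting samples is not shown.

```python
import random

def sample_bn(n, seed=0):
    """Draw n joint samples from a hypothetical chain BN A -> B -> C
    by ancestral sampling (parents before children)."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        a = rng.random() < 0.3                    # P(A=1) = 0.3
        b = rng.random() < (0.8 if a else 0.2)    # P(B=1 | A)
        c = rng.random() < (0.9 if b else 0.1)    # P(C=1 | B)
        data.append((int(a), int(b), int(c)))
    return data

data = sample_bn(10000)
# Empirical marginal P(A=1) should be close to the true value 0.3.
p_a = sum(r[0] for r in data) / len(data)
```

A tree model fitted to such samples then replaces the original BN at query time, which is where the linear-time inference advantage comes from.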

