Complex System Testability Analysis Based on Bayesian Networks under Small Sample

2014 ◽  
Vol 602-605 ◽  
pp. 1772-1777
Author(s):  
Xi Shan Zhang ◽  
Kao Li Huang ◽  
Peng Cheng Yan ◽  
Guang Yao Lian

Complex system tests accumulate a large amount of prior information. To use this prior information for quantitative testability analysis of complex systems, a new Bayesian-network-based testability modeling and analysis method is presented. First, the complex system's testability model is built from various kinds of prior information using a Bayesian network learning algorithm. Then, a way of assessing the testability of a complex system is provided using a Bayesian network inference algorithm. Finally, examples are given to demonstrate the method's validity.
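The abstract does not give concrete formulas, but the idea of assessing testability by inference on a Bayesian network can be illustrated with a toy model. The sketch below assumes a single fault node and two tests with made-up detection and false-alarm probabilities; the quantities computed (fault detection rate, posterior fault probability) are common testability measures, not taken from the paper.

```python
# Toy testability model: one fault node F and two tests T1, T2.
# All numbers are assumed for illustration.
p_fault = 0.1                      # prior P(F = 1)
p_t1_given_f = {1: 0.9, 0: 0.05}   # P(T1 fires | F): detection / false alarm
p_t2_given_f = {1: 0.8, 0: 0.02}   # P(T2 fires | F)

# Fault detection rate: P(at least one test fires | fault present).
fdr = 1 - (1 - p_t1_given_f[1]) * (1 - p_t2_given_f[1])

def posterior(t1, t2):
    """Posterior fault probability given test outcomes, by Bayes' rule."""
    def lik(f):
        p1 = p_t1_given_f[f] if t1 else 1 - p_t1_given_f[f]
        p2 = p_t2_given_f[f] if t2 else 1 - p_t2_given_f[f]
        return p1 * p2
    num = lik(1) * p_fault
    return num / (num + lik(0) * (1 - p_fault))

print(round(fdr, 3))             # 0.98
print(round(posterior(1, 0), 3)) # 0.29
```

With richer prior information the conditional tables would be learned from test data rather than assumed, which is the role of the BN learning algorithm in the paper.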

Author(s):  
Amel Alhussan ◽  
Khalil El Hindi

In this work, we propose a Selective Fine-Tuning algorithm for Bayesian Networks (SFTBN). The aim is to enhance the accuracy of Bayesian Network (BN) classifiers by finding better estimates of the probability terms used by the classifiers. The algorithm augments a BN learning algorithm with a fine-tuning stage that aims to estimate these probability terms more accurately. If the value of a probability term causes a misclassification of a training instance and falls outside its valid range, then we update (fine-tune) that value. The amount of the update is proportional to the distance between the value and its valid range. We use the algorithm to fine-tune several forms of BNs: the Naive Bayes (NB), Tree Augmented Naive Bayes (TAN), and Bayesian Augmented Naive Bayes (BAN) models. Our empirical experiments indicate that the SFTBN algorithm improves the classification accuracy of BN classifiers. We also generalized the original fine-tuning algorithm for Naive Bayes (FTNB) to BN models. We empirically compare the two algorithms, and the results show that while FTNB is more accurate than SFTBN for fine-tuning NB classifiers, SFTBN is more accurate for fine-tuning BNs than the adapted version of FTNB.
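The abstract states the core update rule in words: when a probability term lies outside its valid range and contributes to a misclassification, it is nudged by an amount proportional to its distance from that range. A schematic of that single update step, with an assumed step factor `eta` (the paper's exact update formula is not given here):

```python
def fine_tune(p, lo, hi, eta=0.5):
    """Nudge a probability term toward its valid range [lo, hi].

    Schematic of the SFTBN idea: the update size is proportional to
    the distance between the current value and the range. The step
    factor eta is an assumed illustration value, not from the paper.
    """
    if p < lo:
        return p + eta * (lo - p)
    if p > hi:
        return p - eta * (p - hi)
    return p  # already inside the valid range: no update

print(fine_tune(0.10, 0.30, 0.60))  # 0.2  (moved halfway toward 0.30)
print(fine_tune(0.45, 0.30, 0.60))  # 0.45 (unchanged)
```

In the full algorithm this step would be applied selectively, only to the terms implicated in misclassified training instances.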


2013 ◽  
Vol 427-429 ◽  
pp. 1614-1619
Author(s):  
Shao Jin Han ◽  
Jian Xun Li

Traditional structure learning algorithms mainly assume a large sample dataset, but in practice the available sample set is often small. To address this, we introduce Probability Density Kernel Estimation (PDKE), which expands the original sample set. The K2 algorithm is then used to learn the Bayesian network structure. By optimizing the kernel function and window width, PDKE achieves an effective expansion of the original dataset. After the variable ordering is determined from mutual information, a Bayesian structure learning algorithm for small sample sets is established. Finally, simulation results confirm that the new algorithm is effective and practical.
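The expansion step can be sketched with a standard way of drawing new samples from a Gaussian kernel density estimate: pick an original data point at random, then perturb it with Gaussian noise whose standard deviation is the window width. The window width `h` below is an assumed value; the paper optimizes both the kernel and the width.

```python
import random

def kde_expand(samples, n_new, h=0.3, seed=0):
    """Expand a small 1-D sample set by sampling from a Gaussian KDE:
    choose an original point uniformly, then add N(0, h^2) noise.
    h is the window width (0.3 is an assumed illustration value)."""
    rng = random.Random(seed)
    return [rng.choice(samples) + rng.gauss(0.0, h) for _ in range(n_new)]

small = [1.2, 1.9, 2.4, 3.1]
expanded = small + kde_expand(small, 20)
print(len(expanded))  # 24
```

The enlarged dataset would then be discretized as needed and handed to K2, which searches for the highest-scoring parent set of each variable under the mutual-information-based ordering.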


2019 ◽  
Vol 31 (6) ◽  
pp. 1183-1214 ◽  
Author(s):  
Suwa Xu ◽  
Bochao Jia ◽  
Faming Liang

Bayesian networks have been widely used in many scientific fields for describing the conditional independence relationships for a large set of random variables. This letter proposes a novel algorithm, the so-called p-learning algorithm, for learning moral graphs for high-dimensional Bayesian networks. The moral graph is a Markov network representation of the Bayesian network and also the key to construction of the Bayesian network for constraint-based algorithms. The consistency of the p-learning algorithm is justified under the small-n, large-p scenario. The numerical results indicate that the p-learning algorithm significantly outperforms the existing ones, such as the PC, grow-shrink, incremental association, semi-interleaved hiton, hill-climbing, and max-min hill-climbing. Under the sparsity assumption, the p-learning algorithm has a computational complexity of O(p²) even in the worst case, while the existing algorithms have a computational complexity of O(p³) in the worst case.
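The object being learned here, the moral graph, has a simple definition when the DAG is known: keep every edge (undirected) and "marry" the co-parents of each node. The p-learning algorithm recovers this graph from data; the sketch below only illustrates the moralization operation itself on a hypothetical collider structure.

```python
from itertools import combinations

def moralize(parents):
    """Moral graph of a DAG given as {node: list of parents}:
    undirect every edge and connect (marry) each node's co-parents."""
    edges = set()
    for child, ps in parents.items():
        for p in ps:
            edges.add(frozenset((p, child)))        # drop edge direction
        for a, b in combinations(ps, 2):            # marry the parents
            edges.add(frozenset((a, b)))
    return edges

# Collider X -> Z <- Y: moralization adds the X-Y edge.
g = moralize({"X": [], "Y": [], "Z": ["X", "Y"]})
print(sorted(tuple(sorted(e)) for e in g))
# [('X', 'Y'), ('X', 'Z'), ('Y', 'Z')]
```

Because the moral graph is a Markov network, each node is conditionally independent of all non-neighbors given its neighbors, which is the property constraint-based algorithms exploit when reconstructing the Bayesian network.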

