An Optimization Algorithm of Bayesian Network Classifiers by Derivatives of Conditional Log Likelihood

2012 · Vol. 35 (2) · pp. 364-374
Author(s): Zhong-Feng WANG, Zhi-Hai WANG
2021 · Vol. 25 (3) · pp. 641-667
Author(s): Limin Wang, Sikai Qi, Yang Liu, Hua Lou, Xin Zuo

Bagging has attracted much attention due to its simple implementation and the popularity of bootstrapping. By learning diverse classifiers from resampled datasets and averaging their outcomes, bagging explores the possibility of substantially improving on the classification performance of a single base classifier. Diversity has been recognized as a very important characteristic in bagging. This paper presents an efficient and effective bagging approach that learns a set of independent Bayesian network classifiers (BNCs) from disjoint data subspaces. The number of bits needed to describe the data is measured in terms of log likelihood, and redundant edges are identified to optimize the topologies of the learned BNCs. Our extensive experimental evaluation on 54 publicly available datasets from the UCI machine learning repository reveals that the proposed algorithm achieves competitive classification performance compared with state-of-the-art BNCs that do or do not use bagging procedures, such as tree-augmented naive Bayes (TAN), the k-dependence Bayesian classifier (KDB), bagging NB, and bagging TAN.
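The core idea above — learning independent classifiers from disjoint data subspaces and averaging their outputs — can be sketched as follows. This is a minimal, didactic illustration using plain categorical naive Bayes as the base BNC, not the paper's actual algorithm (which also optimizes BNC topologies via log likelihood); the function names and the binary-attribute smoothing are my assumptions.

```python
import random
from collections import Counter, defaultdict

def train_nb(rows, labels, alpha=1.0):
    """Train a categorical naive Bayes model with Laplace smoothing.
    Illustrative stand-in for the base BNC; assumes binary attributes
    in the smoothing denominator."""
    classes = sorted(set(labels))
    prior = Counter(labels)
    # cond[c][j] counts attribute-j values observed within class c
    cond = {c: defaultdict(Counter) for c in classes}
    for x, y in zip(rows, labels):
        for j, v in enumerate(x):
            cond[y][j][v] += 1
    n = len(labels)

    def predict_proba(x):
        scores = {}
        for c in classes:
            p = (prior[c] + alpha) / (n + alpha * len(classes))
            for j, v in enumerate(x):
                # "+ alpha * 2" assumes two possible values per attribute
                p *= (cond[c][j][v] + alpha) / (prior[c] + alpha * 2)
            scores[c] = p
        z = sum(scores.values())
        return {c: s / z for c, s in scores.items()}
    return predict_proba

def bagged_nb(rows, labels, k=3, seed=0):
    """Learn k independent NB classifiers on disjoint subsets of the
    data (not bootstrap resamples) and average their class posteriors."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)
    parts = [idx[i::k] for i in range(k)]          # disjoint subspaces
    members = [train_nb([rows[i] for i in p], [labels[i] for i in p])
               for p in parts]

    def predict(x):
        votes = defaultdict(float)
        for m in members:
            for c, p in m(x).items():
                votes[c] += p / k                  # average the outcomes
        return max(votes, key=votes.get)
    return predict
```

Because the subsets are disjoint rather than bootstrapped, the ensemble members see entirely different samples, which is one direct way of enforcing the diversity the abstract emphasizes.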


Author(s):  
Andy Shih ◽  
Arthur Choi ◽  
Adnan Darwiche

We propose an approach for explaining Bayesian network classifiers, which is based on compiling such classifiers into decision functions that have a tractable and symbolic form. We introduce two types of explanations for why a classifier may have classified an instance positively or negatively and suggest algorithms for computing these explanations. The first type of explanation identifies a minimal set of the currently active features that is responsible for the current classification, while the second type of explanation identifies a minimal set of features whose current state (active or not) is sufficient for the classification. We consider in particular the compilation of Naive and Latent-Tree Bayesian network classifiers into Ordered Decision Diagrams (ODDs), providing a context for evaluating our proposal using case studies and experiments based on classifiers from the literature.
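The second type of explanation described above — a minimal set of features whose current state is sufficient to fix the classification — can be illustrated by brute force. The sketch below enumerates feature subsets by increasing size and returns the first one that forces the decision for every completion of the remaining features; it is exponential in the number of features and only a didactic stand-in for the tractable ODD compilation the authors propose. The function name and binary-feature assumption are mine.

```python
from itertools import combinations, product

def sufficient_explanation(classify, instance, n_features):
    """Find a minimal set of features whose current values alone are
    sufficient for the classification, assuming binary features.
    classify maps a full instance (tuple of 0/1) to a label."""
    target = classify(instance)
    for size in range(n_features + 1):
        for subset in combinations(range(n_features), size):
            free = [j for j in range(n_features) if j not in subset]
            # The subset is sufficient iff the decision is unchanged
            # under every assignment to the remaining (free) features.
            if all(classify(tuple(instance[j] if j in subset
                                  else fill[free.index(j)]
                                  for j in range(n_features))) == target
                   for fill in product((0, 1), repeat=len(free))):
                return subset
    return tuple(range(n_features))
```

For a 3-bit majority classifier and the instance (1, 1, 0), the two active bits {0, 1} already force a positive classification whatever bit 2 is, so they form the minimal sufficient explanation.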

