Evolutionary structure learning algorithm for Bayesian network and Penalized Mutual Information metric

Author(s):  
Gang Li ◽  
Fu Tong ◽  
Honghua Dai

Entropy ◽  
2019 ◽  
Vol 21 (11) ◽  
pp. 1102 ◽  
Author(s):  
Jing Chen ◽  
Jun Feng ◽  
Jingzhao Hu ◽  
Xia Sun

Over the past few years, online learning has exploded in popularity due to potentially unlimited enrollment, the absence of geographical limitations, and the free accessibility of many courses. However, learners are prone to poor performance because of the unconstrained learning environment, lack of academic pressure, and low interactivity. Personalized intervention design that accounts for learners' background and learning behavior factors may improve their performance. Causality strictly distinguishes cause factors from outcome factors and plays an irreplaceable role in designing guiding interventions. The goal of this paper is to construct a Bayesian network for causal analysis and then provide personalized interventions that help different learners improve their learning. This paper first constructs a Bayesian network over background and learning behavior factors, combining expert knowledge with a structure learning algorithm. The important factors in the constructed network are then selected using entropy-based mutual information. Finally, we identify learners with poor performance through inference and propose personalized interventions, which may support successful applications in education. Experimental results verify the effectiveness of the proposed method and demonstrate the impact of the factors on learning performance.
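The entropy-based mutual information selection described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the factor names and learner records below are hypothetical.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy H(X), in bits, of a discrete sample."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete samples."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Hypothetical learner records: each factor is a discrete variable
# observed alongside a pass/fail performance label.
factors = {
    "age_band":    ["<25", "<25", "25-35", "25-35", ">35", ">35"],
    "video_hours": ["low", "high", "low", "high", "low", "high"],
}
performance = ["fail", "pass", "fail", "pass", "fail", "pass"]

# Rank factors by mutual information with performance; factors with
# higher scores are the most informative about the outcome.
ranked = sorted(factors,
                key=lambda f: mutual_information(factors[f], performance),
                reverse=True)
```

In this toy data `video_hours` determines the outcome exactly, so it carries the full one bit of outcome entropy, while `age_band` is independent of the outcome and scores zero.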


2013 ◽  
Vol 427-429 ◽  
pp. 1614-1619
Author(s):  
Shao Jin Han ◽  
Jian Xun Li

Traditional structure learning algorithms are designed mainly for large sample datasets, but in practice the available sample is often small. To address this, we introduce Probability Density Kernel Estimation (PDKE), which expands the original sample set. The K2 algorithm is then used to learn the Bayesian network structure. By optimizing the kernel function and window width, PDKE achieves an effective expansion of the original dataset. After the variable ordering is determined from mutual information, a Bayesian structure learning algorithm for small sample sets is established. Finally, simulation results confirm that the new algorithm is effective and practical.
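The sample-expansion idea can be sketched for the 1-D Gaussian-kernel case: drawing from a kernel density estimate is equivalent to picking an original point uniformly at random and perturbing it with kernel-width noise. This is a sketch under assumptions, not the paper's optimized kernel; the Silverman rule below is one common window-width choice, used here only for illustration.

```python
import random
import statistics

def silverman_bandwidth(data):
    """Silverman's rule-of-thumb window width for a Gaussian kernel."""
    return 1.06 * statistics.stdev(data) * len(data) ** (-1 / 5)

def kde_expand(data, m, h=None, rng=None):
    """Draw m synthetic points from a Gaussian-KDE fit of 1-D data.

    Sampling from the KDE amounts to choosing an original point
    uniformly at random and adding N(0, h^2) noise.
    """
    rng = rng or random.Random()
    if h is None:
        h = silverman_bandwidth(data)
    return [rng.choice(data) + rng.gauss(0.0, h) for _ in range(m)]

# Expand a small sample before running structure learning (e.g. K2).
small_sample = [1.2, 1.9, 2.4, 3.1, 3.8]
expanded = small_sample + kde_expand(small_sample, 50, rng=random.Random(0))
```

The window width `h` controls the trade-off the abstract alludes to: too small and the synthetic points merely duplicate the originals, too large and they blur the underlying density.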


2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Ruo-Hai Di ◽  
Ye Li ◽  
Ting-Peng Li ◽  
Lian-Dong Wang ◽  
Peng Wang

Dynamic programming is difficult to apply to large-scale Bayesian network structure learning. To address this, this article proposes a BN structure learning algorithm based on dynamic programming that integrates an improved MMPC (max-min parents and children) algorithm and the MWST (maximum weight spanning tree). First, we use the maximum weight spanning tree to obtain the maximum number of parent nodes for each network node. Second, the MMPC algorithm is improved by exploiting the symmetry relationship to reduce false-positive nodes and obtain the set of candidate parent-child nodes. Finally, with the maximum number of parent nodes and the set of candidate parents as constraints, we prune the parent graph of dynamic programming, reducing the number of scoring calculations and the complexity of the algorithm. Experiments show that when an appropriate significance level α is selected, the MMPCDP algorithm greatly reduces the number of scoring calculations and the running time while maintaining accuracy.
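The maximum weight spanning tree step can be sketched with Kruskal's algorithm over pairwise weights (typically mutual information between variables). This is an assumed implementation for illustration; the edge weights below are hypothetical, and the paper does not specify its own construction.

```python
from collections import Counter

def max_weight_spanning_tree(n, edges):
    """Kruskal's algorithm on weighted edges (w, u, v), maximizing total
    weight. Returns the tree edges; each node's degree in this tree can
    then serve as a bound on its parent count when pruning the parent
    graph of dynamic programming."""
    parent = list(range(n))  # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges, reverse=True):  # heaviest edges first
        ru, rv = find(u), find(v)
        if ru != rv:  # accept only edges joining two components
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Hypothetical mutual-information weights between 4 variables.
mi_edges = [(3.0, 0, 1), (2.0, 1, 2), (1.0, 0, 2), (4.0, 2, 3)]
tree = max_weight_spanning_tree(4, mi_edges)

# Degree of each node in the tree = candidate bound on its parent count.
degree = Counter(u for e in tree for u in e)
```

On this toy input the tree keeps the three heaviest edges that do not form a cycle, giving node 1 and node 2 a degree (and hence parent bound) of 2.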

