An efficient dynamic Bayesian network classifier structure learning algorithm: application to sport epidemiology

2020 ◽  
Vol 8 (4) ◽  
Author(s):  
Kyle D Peterson

Abstract Exposing an athlete to intense physical exertion when their body is not ready to mobilize such resources can lead to musculoskeletal injury. In turn, sport practitioners regularly monitor athlete readiness in hopes of mitigating such injuries. Rapid developments in athlete monitoring technologies have thus led sport practitioners to seek meaningful insight from high-throughput datasets. However, revealing the temporal sequence of biological adaptation while yielding accurate probabilistic predictions of an event demands computationally efficient and accurate algorithms. The purpose of the present study is to create a model in the form of the intuitively appealing dynamic Bayesian network (DBN). Existing DBN approaches fall into two varieties: those that are computationally burdensome and thus unscalable, and those that impose structural constraints to increase scalability. This article introduces a novel algorithm, 'rapid incremental search for time-varying associations' (Rista), designed to be time-efficient without imposing structural constraints. Furthermore, it offers this flexibility and computational efficiency without compromising prediction performance: the proposed algorithm achieves classification accuracy comparable to contemporary algorithms while maintaining superior speed.
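The abstract gives no algorithmic detail, but the general idea of incrementally searching for time-lagged parents in a two-time-slice DBN can be sketched as follows. This is a minimal illustration under an assumed linear-Gaussian BIC score, not the Rista algorithm itself; the function names and scoring choice are assumptions.

```python
# Minimal, illustrative sketch of greedy time-lagged parent selection for a
# two-time-slice DBN. This is NOT the Rista algorithm from the abstract;
# function and variable names here are assumptions for illustration.
import numpy as np

def bic_gaussian(y, X):
    """BIC of a linear-Gaussian model y ~ X (X includes an intercept column)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = max(resid.var(), 1e-12)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return loglik - 0.5 * k * np.log(n)

def greedy_lagged_parents(data, target, max_parents=3):
    """Greedily pick time-lagged parents (columns at t-1) for one target column at t."""
    past, present = data[:-1], data[1:]          # align t-1 rows with t rows
    y = present[:, target]
    chosen, current = [], bic_gaussian(y, np.ones((len(y), 1)))
    while len(chosen) < max_parents:
        best_gain, best_var = 0.0, None
        for v in range(data.shape[1]):
            if v in chosen:
                continue
            X = np.column_stack([np.ones(len(y)), past[:, chosen + [v]]])
            gain = bic_gaussian(y, X) - current
            if gain > best_gain:
                best_gain, best_var = gain, v
        if best_var is None:
            break                                # no remaining parent improves the score
        chosen.append(best_var)
        current += best_gain
    return chosen

# Example: 200 time points of 5 monitored variables (synthetic data).
rng = np.random.default_rng(0)
series = rng.normal(size=(200, 5))
series[1:, 0] += 0.8 * series[:-1, 2]            # variable 0 depends on lagged variable 2
print(greedy_lagged_parents(series, target=0))   # expected to recover [2]
```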

2013 ◽  
Vol 427-429 ◽  
pp. 1614-1619
Author(s):  
Shao Jin Han ◽  
Jian Xun Li

Traditional structure learning algorithms assume a large sample dataset, whereas in practice the available sample is often small. To address this, we introduce Probability Density Kernel Estimation (PDKE) to expand the original sample set, and then use the K2 algorithm to learn the Bayesian network structure. By optimizing the kernel function and window width, PDKE achieves an effective expansion of the original dataset. After the variable ordering is determined from mutual information, a Bayesian structure learning algorithm for small sample sets is established. Finally, simulation results confirm that the new algorithm is effective and practical.
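A minimal sketch of the two ideas described above, assuming continuous data: a Gaussian kernel density estimate enlarges the small sample, and a mutual-information-based variable ordering is derived as would be supplied to K2. The paper's bandwidth optimization and the K2 search itself are omitted; function names are illustrative assumptions.

```python
# Illustrative sketch (not the paper's exact PDKE procedure): enlarge a small
# continuous sample with a Gaussian kernel density estimate, then derive a
# variable ordering from pairwise mutual information, as one might feed to K2.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.metrics import mutual_info_score

def expand_with_kde(samples, n_new, bandwidth=None):
    """Fit a Gaussian KDE to the small sample (rows = cases) and draw extra cases."""
    kde = gaussian_kde(samples.T, bw_method=bandwidth)   # scipy expects shape (d, n)
    extra = kde.resample(n_new).T
    return np.vstack([samples, extra])

def mi_ordering(data, bins=5):
    """Order variables by their total pairwise mutual information (descending)."""
    d = data.shape[1]
    disc = np.stack([np.digitize(data[:, j], np.histogram_bin_edges(data[:, j], bins))
                     for j in range(d)], axis=1)
    totals = [sum(mutual_info_score(disc[:, i], disc[:, j])
                  for j in range(d) if j != i) for i in range(d)]
    return list(np.argsort(totals)[::-1])

rng = np.random.default_rng(1)
small = rng.normal(size=(30, 4))                  # a small sample of 4 variables
small[:, 1] += 0.9 * small[:, 0]                  # induce one dependency
expanded = expand_with_kde(small, n_new=300)      # 330 cases after expansion
print(mi_ordering(expanded))                      # strongly linked variables come first
```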


2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Ruo-Hai Di ◽  
Ye Li ◽  
Ting-Peng Li ◽  
Lian-Dong Wang ◽  
Peng Wang

Dynamic programming is difficult to apply to large-scale Bayesian network structure learning. In view of this, this article proposes a BN structure learning algorithm based on dynamic programming, which integrates an improved MMPC (max-min parents and children) and MWST (maximum weight spanning tree). First, we use the maximum weight spanning tree to obtain the maximum number of parent nodes for each network node. Second, the MMPC algorithm is improved using the symmetry relationship to reduce false-positive nodes and obtain the set of candidate parent-child nodes. Finally, with the maximum number of parent nodes and the set of candidate parent nodes as constraints, we prune the parent graph of dynamic programming to reduce the number of scoring calculations and the complexity of the algorithm. Experiments show that, when an appropriate significance level α is selected, the MMPCDP algorithm can greatly reduce the number of scoring calculations and the running time while maintaining its accuracy.
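A rough sketch of two of the building blocks named above, assuming discrete data: a maximum weight spanning tree over pairwise mutual information (whose maximum degree can serve as a parent-count bound) and a symmetry (AND) filter on candidate parent-child sets. Function names are assumptions; this is not the authors' MMPCDP implementation.

```python
# Illustrative sketch: MWST over pairwise mutual information plus a symmetry
# filter on candidate parent-child sets. Not the authors' MMPCDP code.
import itertools
import networkx as nx
import numpy as np
from sklearn.metrics import mutual_info_score

def mwst_skeleton(data):
    """Maximum weight spanning tree with pairwise mutual information as edge weights."""
    d = data.shape[1]
    g = nx.Graph()
    for i, j in itertools.combinations(range(d), 2):
        g.add_edge(i, j, weight=mutual_info_score(data[:, i], data[:, j]))
    return nx.maximum_spanning_tree(g)

def max_parent_bound(tree):
    """Use the tree's maximum degree as a cap on the number of parents per node."""
    return max(dict(tree.degree()).values())

def symmetry_filter(candidates):
    """Keep X in PC(T) only if T is also in PC(X): removes false-positive candidates."""
    return {t: {x for x in pc if t in candidates.get(x, set())}
            for t, pc in candidates.items()}

# Example with 4 discrete variables (synthetic data).
rng = np.random.default_rng(2)
data = rng.integers(0, 3, size=(500, 4))
data[:, 1] = (data[:, 0] + rng.integers(0, 2, size=500)) % 3   # variable 1 depends on 0
tree = mwst_skeleton(data)
print(sorted(tree.edges()), "max parents:", max_parent_bound(tree))
print(symmetry_filter({0: {1, 2}, 1: {0}, 2: {3}, 3: {2}}))     # drops 2 from PC(0)
```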

