Attribute selection for decision tree learning with class constraint

2017, Vol. 163, pp. 16-23
Author(s): Huaining Sun, Xuegang Hu
Entropy, 2019, Vol. 21 (2), 198
Author(s): Huaining Sun, Xuegang Hu, Yuhong Zhang

Uncertainty evaluation based on statistical information entropy is a common mechanism for constructing heuristics in decision tree learning, and the deviation of the entropy kernel is potentially linked to the classification performance of the resulting tree. This paper presents a decision tree learning algorithm based on constrained gain and depth-induction optimization. First, the uncertainty distributions of information entropy for single- and multi-valued events are calculated and analyzed, yielding an enhanced property of the single-valued event entropy kernel and of the multi-valued event entropy peaks, as well as a reciprocal relationship between the peak location and the number of possible events. Second, an estimation method for information entropy is proposed in which the entropy kernel is replaced by a peak-shift sine function, and on this basis a constrained-gain decision tree (CGDT) learning algorithm is established. Finally, by combining branch convergence and fan-out indices under the induction depth of a decision tree, a constrained-gain and depth-induction improved decision tree (CGDIDT) learning algorithm is built. Experimental results show the benefits of the CGDT and CGDIDT algorithms.
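The core idea of the estimation step is to replace the Shannon entropy kernel −p·log₂(p) with a sine-shaped surrogate whose peak is shifted away from the uniform point. A minimal Python sketch of that idea follows; the `alpha` peak-shift exponent and the scaling are illustrative assumptions, not the paper's exact peak-shift sine function:

```python
import math

def shannon_entropy(probs):
    """Standard Shannon entropy: sum of -p*log2(p) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def sine_kernel(p, alpha=0.5):
    """Illustrative sine surrogate for the entropy kernel -p*log2(p).

    alpha is a hypothetical peak-shift exponent: sin(pi * p**alpha)
    peaks where p**alpha = 0.5, i.e. at p = 0.25 for alpha = 0.5,
    mimicking how -p*log2(p) peaks at p = 1/e rather than p = 0.5.
    The 1/pi scale keeps the kernel in a comparable range.
    """
    return math.sin(math.pi * p ** alpha) / math.pi

def sine_entropy(probs):
    """Entropy estimate with the sine kernel replacing -p*log2(p)."""
    return sum(sine_kernel(p) for p in probs if p > 0)

# Both measures vanish at pure nodes and are largest for mixed
# class distributions, which is what a split heuristic needs.
for p in (0.25, 0.5, 0.75, 1.0):
    probs = [p, 1 - p]
    print(f"p={p:.2f}  H={shannon_entropy(probs):.3f}  "
          f"H_sin={sine_entropy(probs):.3f}")
```

In an attribute-selection heuristic, either measure would be plugged into the usual information-gain formula (parent entropy minus the weighted entropy of the child partitions); the constrained-gain variant described above additionally bounds that gain, whose exact constraint form is given in the paper.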


1998, Vol. 23 (6), pp. 111-120
Author(s): Gou Masuda, Norihiro Sakamoto, Kazuo Ushijima

2012, Vol. 12 (7), pp. 2384-2391
Author(s): Junghwan Cho, Xiaopeng Li, Zhiyong Gu, Pradeep U. Kurup
