Two feature weighting approaches for naive Bayes text classifiers

2016 ◽ Vol 100 ◽ pp. 137-144 ◽ Author(s): Lungan Zhang, Liangxiao Jiang, Chaoqun Li, Ganggang Kong
IEEE Access ◽ 2020 ◽ Vol 8 ◽ pp. 20151-20159 ◽ Author(s): Shufen Ruan, Hongwei Li, Chaoqun Li, Kunfang Song

Author(s): Han-joon Kim

This chapter introduces two practical techniques for improving Naïve Bayes text classifiers, which are widely used for text classification. Naïve Bayes has proven to be a practical text classification algorithm because of its simple classification model, reasonable classification accuracy, and the ease with which its model can be updated. Researchers therefore have a strong incentive to improve it by combining it with meta-learning approaches such as EM (Expectation Maximization) and Boosting. The EM approach combines Naïve Bayes with the EM algorithm, while the Boosting approach uses Naïve Bayes as the base classifier in the AdaBoost algorithm. Both approaches employ a special uncertainty measure suited to Naïve Bayes learning. Within the Naïve Bayes learning framework, these approaches offer practical solutions to the shortage of labeled training documents in text classification systems.
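To make the two approaches concrete, below are minimal Python sketches of each. Neither is the chapter's exact method: the toy corpora, hyperparameters, and the use of scikit-learn's MultinomialNB are illustrative assumptions, and the EM loop uses hard (most-probable) label assignments rather than the chapter's uncertainty-weighted formulation.

```python
# Hard-EM sketch for semi-supervised Naive Bayes text classification.
# Assumptions (not from the chapter): scikit-learn's MultinomialNB as the
# Naive Bayes model, a toy corpus, and hard label assignment in the E-step.
import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

labeled_docs = ["cheap meds online", "win a free prize",
                "meeting agenda attached", "quarterly report draft"]
labels = np.array([1, 1, 0, 0])                  # 1 = spam, 0 = ham
unlabeled_docs = ["free meds prize", "agenda for the quarterly meeting"]

vec = CountVectorizer()
X_lab = vec.fit_transform(labeled_docs)
X_unl = vec.transform(unlabeled_docs)

clf = MultinomialNB()
clf.fit(X_lab, labels)                           # initialize on labeled data only

prev = None
for _ in range(10):                              # iterate until assignments stabilize
    guessed = clf.predict(X_unl)                 # E-step: label the unlabeled docs
    if prev is not None and np.array_equal(guessed, prev):
        break
    prev = guessed
    # M-step: retrain on labeled + provisionally labeled documents.
    clf.fit(vstack([X_lab, X_unl]), np.concatenate([labels, guessed]))

print(clf.predict(vec.transform(["free prize meds"])))   # likely [1] (spam)
```

Each EM iteration effectively enlarges the training set with machine-labeled documents, which is where the benefit under scarce labeled data comes from. The Boosting approach is even shorter to sketch: Naïve Bayes serves as the base classifier inside AdaBoost, which works because MultinomialNB accepts the per-document sample weights that AdaBoost uses to emphasize misclassified documents. The sketch assumes scikit-learn >= 1.2, where the base-model parameter is named `estimator`; the data and `n_estimators` value are illustrative.

```python
# Boosting sketch: Naive Bayes as the base classifier inside AdaBoost.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["cheap meds online", "win a free prize",
        "meeting agenda attached", "quarterly report draft"]
y = [1, 1, 0, 0]

X = CountVectorizer().fit_transform(docs)
boosted = AdaBoostClassifier(estimator=MultinomialNB(), n_estimators=10)
boosted.fit(X, y)   # AdaBoost reweights documents that earlier rounds misclassify
```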

