Dynamic Ensemble Selection
Recently Published Documents

Total documents: 73 (five years: 26)
H-index: 12 (five years: 3)

2022 ◽ Vol 139 ◽ pp. 368-382
Author(s): Yi Feng, Yunqiang Yin, Dujuan Wang, Lalitha Dhamotharan

2021 ◽ Vol 2021 ◽ pp. 1-11
Author(s): Tinofirei Museba, Fulufhelo Nelwamondo, Khmaies Ouahada

In recent years, technological advances have led to an enormous and ever-increasing amount of data that is commonly available in a streaming fashion. In such nonstationary environments, the process generating the data stream is characterized by an intrinsic evolving or drifting phenomenon known as concept drift. Given the increasingly common applications whose data-generation mechanisms are susceptible to change, the need for effective and efficient algorithms for learning from and adapting to evolving or drifting environments can hardly be overstated. In dynamic environments affected by concept drift, learning models are frequently updated to adapt to changes in the underlying probability distribution of the data. Much of the work on learning in nonstationary environments focuses on updating the predictive model to optimize recovery from concept drift and convergence to new concepts, by adjusting parameters and discarding poorly performing models, while little effort has been dedicated to investigating which type of learning model is suitable at any given time for different types of concept drift. In this paper, we investigate the impact of heterogeneous online ensemble learning based on online model selection for predictive modeling in dynamic environments. We propose a novel heterogeneous ensemble approach, based on online dynamic ensemble selection, that adaptively interchanges between different types of base models in an ensemble to enhance its predictive performance in nonstationary environments. The approach, Heterogeneous Dynamic Ensemble Selection based on Accuracy and Diversity (HDES-AD), uses models generated by different base learners to increase diversity, thereby circumventing the loss of diversity that existing dynamic ensemble classifiers may suffer when base learners generated by different base algorithms are excluded. The algorithm is evaluated on artificial and real-world datasets against well-known homogeneous online ensemble approaches such as DDD, AFWE, and OAUE. The results show that HDES-AD performs significantly better than the other three homogeneous online ensemble approaches in nonstationary environments.
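The abstract gives no implementation details, so the following is a minimal sketch of the general dynamic-ensemble-selection idea with a heterogeneous pool: for each query point, base models are ranked by their accuracy in the query's local region of competence and the best performers vote. This is not the authors' HDES-AD algorithm; the pool composition, region size, and function names below are illustrative assumptions.

```python
# Sketch of heterogeneous dynamic ensemble selection (NOT the HDES-AD algorithm):
# select the locally most accurate base models per query and majority-vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors, KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=42)

# Heterogeneous pool: different base algorithms increase diversity.
pool = [
    DecisionTreeClassifier(max_depth=5, random_state=42).fit(X_train, y_train),
    GaussianNB().fit(X_train, y_train),
    KNeighborsClassifier(n_neighbors=7).fit(X_train, y_train),
    LogisticRegression(max_iter=1000).fit(X_train, y_train),
]

# Region of competence: the k nearest validation points around each query.
k = 7
roc_index = NearestNeighbors(n_neighbors=k).fit(X_val)

def predict_des(x, n_select=2):
    """Select the n_select locally most accurate models and majority-vote."""
    _, idx = roc_index.kneighbors(x.reshape(1, -1))
    region_X, region_y = X_val[idx[0]], y_val[idx[0]]
    local_acc = [np.mean(m.predict(region_X) == region_y) for m in pool]
    selected = np.argsort(local_acc)[-n_select:]  # best local performers
    votes = [pool[i].predict(x.reshape(1, -1))[0] for i in selected]
    return np.bincount(votes).argmax()

preds = np.array([predict_des(x) for x in X_test])
print("DES accuracy:", np.mean(preds == y_test))
```

In a streaming, concept-drift setting such as the one studied in the paper, the pool and the validation window would additionally be refreshed over time; this static sketch omits that part.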


Author(s): Thi Ngoc Anh Nguyen, Quynh Pham Nhu, Vijender Kumar Solanki

Background: Ensemble selection is one of the most researched topics in ensemble learning. Researchers are drawn to selecting a subset of base classifiers that may perform better than the full ensemble. Dynamic Ensemble Selection (DES) is one of the most effective techniques for classification problems: a DES system selects the most appropriate classifiers from a pool of candidate classifiers for each query instance, and ensembles that balance diversity and accuracy during training perform better than using all classifiers.
Objective: In this paper, a novel technique is proposed that combines a Noise Filter (NF) with Dynamic Ensemble Selection (DES) to achieve better predictive accuracy. In other words, the noise filter makes the data cleaner, and DES improves the classification performance.
Methods: The proposed NF-DES model was evaluated on twelve datasets, including three credit-scoring datasets, with accuracy as the performance measure.
Results: The results show that the proposed model performs better than the other models.
Conclusion: A novel noise filter combined with dynamic ensemble learning, aimed at improving classification ability, is presented. The noise filter moves noisy data toward the correct class, and the dynamic ensemble learning then chooses an appropriate subset of classifiers from the pool of base classifiers.
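The abstract describes the pipeline only at a high level, so below is a minimal sketch of one plausible noise-filter stage: a k-NN based filter that relabels training points whose label disagrees with their neighbourhood majority, before the cleaned data is used to build the DES pool (as in the previous sketch). This is not the authors' NF-DES filter; the relabelling rule, threshold, and names are assumptions for illustration.

```python
# Sketch of a k-NN based noise filter (an assumed filter, not the paper's exact
# NF): training points whose label disagrees with the majority of their
# neighbours are relabelled toward that majority class, i.e. "moved toward the
# correct class". The cleaned set would then feed the DES classifier pool.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_noise_filter(X, y, k=5):
    """Relabel suspected label-noise points to their neighbourhood majority."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)              # idx[:, 0] is the point itself
    y_clean = y.copy()
    for i, neighbours in enumerate(idx[:, 1:]):
        counts = np.bincount(y[neighbours], minlength=y.max() + 1)
        majority = counts.argmax()
        # Relabel only when the neighbourhood strongly disagrees with the label.
        if counts[majority] >= k - 1 and majority != y[i]:
            y_clean[i] = majority
    return X, y_clean

# Usage: clean the training data only, then fit the DES pool on (X_tr, y_tr).
# X_tr, y_tr = knn_noise_filter(X_tr, y_tr, k=5)
```

Applying the filter only to the training split keeps the test evaluation unbiased; how aggressively to relabel (or to discard instead of relabel) is a design choice the paper's method would pin down.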

