Hierarchical Reduction Algorithm of Rough Sets

Author(s):  
Li Yurong ◽  
Qiao Bin

2016 ◽  
Vol 9 (8) ◽  
pp. 333-346 ◽  
Author(s):  
Khaled Alwesabi ◽  
Weihua Gui ◽  
Chunhua Yang ◽  
Hamdi Rajeh

Complexity ◽  
2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Tengfei Zhang ◽  
Fumin Ma ◽  
Jie Cao ◽  
Chen Peng ◽  
Dong Yue

Parallel attribute reduction is one of the most important topics in current research on rough set theory. Although several parallel algorithms have been well documented, most still face challenges in effectively handling complex heterogeneous data that mixes categorical and numerical attributes. To address this problem, a novel attribute reduction algorithm based on neighborhood multigranulation rough sets was developed to process massive heterogeneous data in parallel. A MapReduce-based parallelization method for attribute reduction was proposed within the framework of neighborhood multigranulation rough sets. To improve reduction efficiency, hashing Map/Reduce functions were designed to speed up the positive-region calculation, and a fast parallel attribute reduction algorithm using MapReduce was then developed. The effectiveness and superiority of this parallel algorithm were demonstrated through theoretical analysis and comparison experiments.
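The hashing map/reduce idea behind the positive-region calculation can be illustrated with a minimal, single-machine sketch in Python: the map step hashes each object's condition-attribute values so that identical signatures fall into the same group, and the reduce step keeps only the groups whose decision value is consistent. The function names, data layout, and restriction to exact-match equivalence classes are illustrative assumptions; the paper's neighborhood multigranulation treatment of numerical attributes is not reproduced here.

```python
from collections import defaultdict

def map_phase(objects, cond_attrs, decision_attr):
    """Map: emit (hash of condition-attribute values, decision value) for each object."""
    for obj in objects:
        key = hash(tuple(obj[a] for a in cond_attrs))  # hashing groups identical condition signatures
        yield key, obj[decision_attr]

def reduce_phase(pairs):
    """Reduce: an equivalence class belongs to the positive region iff its decision is unique."""
    groups = defaultdict(list)
    for key, dec in pairs:
        groups[key].append(dec)
    return sum(len(decs) for decs in groups.values() if len(set(decs)) == 1)

# Usage: size of the positive region of decision 'd' with respect to {'a1', 'a2'}
data = [
    {'a1': 0, 'a2': 'x', 'd': 'yes'},
    {'a1': 0, 'a2': 'x', 'd': 'yes'},
    {'a1': 1, 'a2': 'y', 'd': 'no'},
    {'a1': 1, 'a2': 'y', 'd': 'yes'},  # inconsistent class, excluded from the positive region
]
print(reduce_phase(map_phase(data, ['a1', 'a2'], 'd')))  # 2
```

In a real MapReduce deployment the map output would be shuffled by key across reducers, so the hashing directly determines how the workload is partitioned.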


2014 ◽  
Vol 1049-1050 ◽  
pp. 665-668
Author(s):  
Hong Li Lv

This paper studies power transformer fault diagnosis using rough set theory and a neural network. Rough set reduction, based on an attribute-significance reduction algorithm, serves as the preprocessing unit of the neural network. The paper describes the reduction algorithm and its implementation in detail. Training and testing results on practical data show that the attribute-significance reduction algorithm reduces the number of network inputs, speeds up training, and improves diagnostic accuracy. The algorithm is therefore feasible and effective for application in a power transformer fault diagnosis system.
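One common form of attribute-significance reduction, greedy forward selection on the dependency degree, can be sketched as follows. The data layout and function names are illustrative assumptions, and the significance measure used in the paper may differ; the point of the sketch is that the resulting reduct defines the (smaller) set of inputs passed on to the neural network.

```python
from collections import defaultdict

def dependency(objects, attrs, decision):
    """gamma(attrs -> decision): fraction of objects whose equivalence class
    under `attrs` carries a single, consistent decision value."""
    if not attrs:
        return 0.0
    counts, decisions = defaultdict(int), defaultdict(set)
    for obj in objects:
        key = tuple(obj[a] for a in attrs)
        counts[key] += 1
        decisions[key].add(obj[decision])
    consistent = sum(counts[k] for k in counts if len(decisions[k]) == 1)
    return consistent / len(objects)

def significance_reduct(objects, cond_attrs, decision):
    """Greedy forward selection: repeatedly add the attribute with the largest
    significance (dependency gain) until the reduct is as informative as all attributes."""
    target = dependency(objects, cond_attrs, decision)
    reduct = []
    while dependency(objects, reduct, decision) < target:
        best = max(
            (a for a in cond_attrs if a not in reduct),
            key=lambda a: dependency(objects, reduct + [a], decision),
        )
        reduct.append(best)
    return reduct  # only these attributes are fed to the neural network as inputs
```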

