Scalability of Data Decomposition Based Algorithms: Attribute Reduction Problem

Author(s): Piotr Hońko
2015, Vol 46 (3), pp. 599-628

Author(s): Mehmet Hacibeyoglu, Mohammad Shukri Salman, Murat Selek, Sirzat Kahramanli
2016, Vol 21 (20), pp. 6159-6173

Author(s): Anhui Tan, Weizhi Wu, Yuzhi Tao
2015, Vol 14 (4), pp. 3-10

Author(s): Demetrovics Janos, Vu Duc Thi, Nguyen Long Giang

Abstract The problem of finding reducts plays an important role in processing information in decision tables. The objective of the attribute reduction problem is to reject redundant attributes in order to retain a core set of attributes for data processing. Attribute reduction in decision tables is the process of finding a minimal subset of conditional attributes that preserves the classification ability of the decision table. In this paper we establish the time complexity of the problem of finding all reducts of a consistent decision table. We prove that this complexity is exponential in the number of attributes of the decision table. Our proof proceeds in two steps. The first step shows that there exists an exponential-time algorithm that finds all reducts. The second step proves that the time complexity of finding all reducts of a decision table is at least exponential.
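The exponential enumeration underlying the first step of the proof can be illustrated with a brute-force sketch. This is not the paper's algorithm, only a minimal illustration: the dict-based table representation and function names are assumptions. A table with n conditional attributes has 2^n candidate subsets, and every minimal subset that preserves the classification is a reduct.

```python
from itertools import combinations

def partition(table, attrs):
    """Group rows by their values on the given attributes."""
    blocks = {}
    for row in table:
        blocks.setdefault(tuple(row[a] for a in attrs), []).append(row)
    return blocks

def preserves_classification(table, attrs, decision):
    """True if rows identical on `attrs` always share the same decision."""
    return all(
        len({row[decision] for row in block}) == 1
        for block in partition(table, attrs).values()
    )

def all_reducts(table, cond_attrs, decision):
    """Enumerate all reducts: minimal subsets of conditional attributes
    preserving the classification of a consistent decision table.
    Examines up to 2^n subsets, hence exponential time."""
    reducts = []
    # Ascending subset size guarantees any proper subset of a candidate
    # that is itself a reduct has already been recorded.
    for k in range(1, len(cond_attrs) + 1):
        for subset in combinations(cond_attrs, k):
            if preserves_classification(table, subset, decision):
                if not any(set(r) < set(subset) for r in reducts):
                    reducts.append(subset)
    return reducts
```

For example, in a four-row table where either {a, b} or {b, c} suffices to determine the decision d, the enumeration returns exactly those two minimal subsets and discards the non-minimal superset {a, b, c}.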


2013, Vol 2013, pp. 1-12
Author(s): Hong Zhao, Fan Min, William Zhu

Measurement error with a normal distribution is ubiquitous in applications. In general, a smaller measurement error requires a better instrument and a higher test cost. In decision making, we therefore select an attribute subset with appropriate measurement errors so as to minimize the total test cost. Recently, an error-range-based covering rough set with uniformly distributed error was proposed to investigate this issue. However, measurement errors typically follow a normal distribution rather than a uniform one, which is too simple for most applications. In this paper, we introduce normally distributed measurement errors into the covering-based rough set model and address the test-cost-sensitive attribute reduction problem in this new model. The major contributions of this paper are fourfold. First, we build a new data model based on normally distributed measurement errors. Second, the covering-based rough set model with measurement errors is constructed through the “3-sigma” rule of the normal distribution. With this model, coverings are constructed from the data rather than assigned by users. Third, the test-cost-sensitive attribute reduction problem is redefined on this covering-based rough set. Fourth, a heuristic algorithm is proposed to solve the problem. Experimental results show that the algorithm is more effective and efficient than the existing one. This study suggests new research directions in cost-sensitive learning.
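The idea of constructing coverings from data via the 3-sigma rule can be sketched as follows. This is a simplified illustration under assumed conventions, not the paper's exact formulation: the per-attribute indistinguishability threshold 3σ_a, the dict-based data representation, and the function names are all illustrative assumptions.

```python
def three_sigma_neighborhood(data, sigmas, x_idx, attrs):
    """Objects whose measured values are indistinguishable from object x
    under normally distributed errors: by the 3-sigma rule, two readings
    on attribute a are treated as possibly equal when they differ by at
    most 3 * sigma_a (an assumed threshold for illustration)."""
    x = data[x_idx]
    return {
        i for i, y in enumerate(data)
        if all(abs(x[a] - y[a]) <= 3 * sigmas[a] for a in attrs)
    }

def covering_from_data(data, sigmas, attrs):
    """Build the covering induced by the measurement errors: one
    neighborhood block per object. Blocks may overlap, so the result
    is a covering of the universe rather than a partition."""
    return [
        three_sigma_neighborhood(data, sigmas, i, attrs)
        for i in range(len(data))
    ]
```

With σ_a = 0.05, readings 0.0 and 0.1 fall in each other's neighborhoods (difference 0.1 ≤ 0.15) while a reading of 1.0 forms its own block, so the covering comes directly from the data and the error model, with no user-assigned blocks.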

