A New Criterion for Attribute Reduction Based on Variable Precision Rough Set Model

2011 ◽  
Vol 121-126 ◽  
pp. 1579-1584
Author(s):  
Hai Zhong Tan

The rule sets acquired via rough set theory can be classified into two categories: deterministic rules and probabilistic rules. Traditional attribute reduction definitions in the variable precision rough set model cannot guarantee that these rule properties, deterministic or probabilistic, are preserved. In this paper, a new criterion for attribute reduction based on the variable precision rough set model is put forward, under which rule properties are preserved during attribute reduction. The relationships between the new reduct definition and existing definitions, including Ziarko's reduct definition and the β-lower distribution reduct definition, are also discussed.
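To make the deterministic/probabilistic distinction concrete, here is a minimal Python sketch, not the paper's criterion: each equivalence class induced by a set of condition attributes is labeled by its inclusion degree in the target concept. The threshold name beta and all helper names are illustrative assumptions.

    # Minimal sketch (not the paper's criterion): label the rule generated
    # by each equivalence class as deterministic or probabilistic under VPRS.

    def equivalence_classes(universe, attrs, value):
        """Partition `universe` by the tuple of values taken on `attrs`."""
        blocks = {}
        for obj in universe:
            key = tuple(value(obj, a) for a in attrs)
            blocks.setdefault(key, set()).add(obj)
        return list(blocks.values())

    def inclusion_degree(block, target):
        """|block ∩ target| / |block|, the conditional inclusion used by VPRS."""
        return len(block & target) / len(block)

    def classify_rules(universe, attrs, value, target, beta=0.8):
        """deterministic: degree 1; probabilistic: degree >= beta (0.5 < beta <= 1)."""
        labels = {}
        for block in equivalence_classes(universe, attrs, value):
            d = inclusion_degree(block, target)
            if d == 1.0:
                labels[frozenset(block)] = "deterministic"
            elif d >= beta:
                labels[frozenset(block)] = "probabilistic"
            else:
                labels[frozenset(block)] = "unsupported"
        return labels

Under a criterion of the kind the paper proposes, a reduct would keep each class's label unchanged when attributes are removed, which the traditional reduct definitions do not guarantee.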

2012 ◽  
Vol 9 (3) ◽  
pp. 1-17 ◽  
Author(s):  
D. Calvo-Dmgz ◽  
J. F. Gálvez ◽  
D. Glez-Peña ◽  
S. Gómez-Meire ◽  
F. Fdez-Riverola

DNA microarrays have contributed to the exponential growth of genomic and experimental data over the last decade. This large amount of gene expression data has been used by researchers seeking to diagnose diseases such as cancer using machine learning methods. In parallel, explicit biological knowledge about gene functions has also grown tremendously. This paper presents a novel model for microarray data classification that integrates such knowledge, provided as gene sets, into the classification process by means of variable precision rough set theory (VPRS), and that can highlight which part of the provided biological knowledge was important for the classification. Based on this knowledge, the input microarray data are transformed into supergenes, and rough set theory is then applied to select the most promising supergenes and to derive a set of easily interpretable classification rules. The model is evaluated on three breast cancer microarray datasets, obtaining successful results compared with classical classification techniques. The experimental results show no significant differences between our model and the classical techniques, but our model can additionally provide a biologically interpretable explanation of how it classifies new samples.
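The supergene step can be pictured with a short, hedged sketch: the abstract does not fix the aggregation function, so the mean expression over each gene set is used here purely as an assumption, and the names gene_sets and gene_index are illustrative.

    # Hedged sketch: build one supergene column per gene set by averaging
    # member-gene expression. The paper's actual aggregation may differ.
    import numpy as np

    def to_supergenes(expression, gene_index, gene_sets):
        """expression: samples x genes array; gene_index: gene name -> column;
        gene_sets: dict of set name -> list of member gene names.
        Returns (samples x supergenes array, list of retained set names)."""
        cols, names = [], []
        for name, genes in gene_sets.items():
            idx = [gene_index[g] for g in genes if g in gene_index]
            if idx:  # skip gene sets with no measured genes on the chip
                cols.append(expression[:, idx].mean(axis=1))
                names.append(name)
        return np.column_stack(cols), names

Rough-set attribute reduction then operates on supergene columns instead of raw genes, which is what lets the selected attributes map back to named biological knowledge.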


Author(s):  
Malcolm J. Beynon ◽  
Benjamin Griffiths

This chapter considers, and elucidates, the general methodology of rough set theory (RST), a nascent approach to rule-based classification associated with soft computing. The elucidation undertaken here has two parts: first, the levels of pre-processing that may be necessary when undertaking an RST-based analysis; and second, the presentation of an analysis using variable precision rough sets (VPRS), a development of the original RST that allows for misclassification in the constructed “if … then …” decision rules. Throughout the chapter, bespoke software underpins the pre-processing and VPRS analysis undertaken, including screenshots of its output. The problem of US bank credit ratings allows a pertinent demonstration of the soft computing approaches described throughout.
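As a reference point for the VPRS analysis the chapter presents, here is a minimal sketch of Ziarko-style β-approximations, the mechanism that admits misclassification into the rules; the function names are assumptions, not the bespoke software's API.

    # Minimal sketch of symmetric VPRS approximations (0.5 < beta <= 1):
    # a block enters the beta-lower approximation when at least a beta
    # fraction of it lies in the target concept.

    def vprs_approximations(blocks, target, beta=0.8):
        """blocks: equivalence classes (sets); target: concept to approximate."""
        lower, upper = set(), set()
        for block in blocks:
            d = len(block & target) / len(block)
            if d >= beta:
                lower |= block      # included, tolerating up to 1 - beta error
            if d > 1 - beta:
                upper |= block      # cannot be confidently excluded
        return lower, upper

With beta = 1 this collapses to the classical Pawlak lower and upper approximations.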


2011 ◽  
Vol 63-64 ◽  
pp. 664-667
Author(s):  
Hong Sheng Xu ◽  
Ting Zhong Wang

Formal concept lattices and rough set theory are two complementary mathematical tools for data analysis and data processing. An algorithm for concept lattice reduction based on the variable precision rough set model is proposed by combining the β-upper and β-lower distribution reduction algorithms of variable precision rough sets. The traditional algorithms for β-value selection, attribute reduction based on the discernibility matrix, and rule extraction in VPRS are discussed; defects in these traditional algorithms are identified and improved upon. Finally, a generation system for concept lattices based on the variable precision rough set model is designed to verify the validity of the improved algorithm, and a case study demonstrates the whole process of concept lattice construction.
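One of the ingredients named above, attribute reduction via the discernibility matrix, can be sketched as follows; this is the classical construction, not the paper's improved variant, and the data layout is an assumption.

    # Classical discernibility matrix: for each pair of objects with
    # different decisions, record the condition attributes that tell them
    # apart. Reducts are then read off as prime implicants of the
    # conjunction of these entries.

    def discernibility_matrix(objects, cond_attrs, decision):
        """objects: list of dicts mapping attribute -> value."""
        matrix = {}
        for i in range(len(objects)):
            for j in range(i + 1, len(objects)):
                if objects[i][decision] != objects[j][decision]:
                    matrix[(i, j)] = {a for a in cond_attrs
                                      if objects[i][a] != objects[j][a]}
        return matrix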


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Hua Li ◽  
Deyu Li ◽  
Yanhui Zhai ◽  
Suge Wang ◽  
Jing Zhang

Owing to the high dimensionality of multilabel data, feature selection in multilabel learning is necessary in order to reduce redundant features and improve the performance of multilabel classification. Rough set theory, as a valid mathematical tool for data analysis, has been widely applied to feature selection (also called attribute reduction). In this study, we propose a variable precision attribute reduct for multilabel data based on rough set theory, called the δ-confidence reduct, which can correctly capture the uncertainty implied among labels. Furthermore, judgement theorems and a discernibility matrix associated with the δ-confidence reduct are also introduced, from which an approach to knowledge reduction in multilabel decision tables is obtained.
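The abstract does not spell out the δ-confidence condition, but the underlying idea can be hedged into a short sketch: within each equivalence class, a label counts as confident when its within-class frequency reaches δ; a reduct would then be an attribute subset preserving these confident label sets. All names and the exact threshold semantics are assumptions.

    # Hedged sketch of label confidence for multilabel decision tables.
    from collections import Counter

    def confident_labels(block, labels_of, delta=0.7):
        """block: set of objects; labels_of: object -> set of labels.
        Returns labels carried by at least a `delta` fraction of the block."""
        counts = Counter(lab for obj in block for lab in labels_of(obj))
        return {lab for lab, c in counts.items() if c / len(block) >= delta}

Under this reading, a candidate attribute subset qualifies when every equivalence class it induces yields the same confident label set as the full attribute set does.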


2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Yanqing Zhu ◽  
William Zhu

Classical rough set theory is a technique of granular computing for handling the uncertainty, vagueness, and granularity in information systems. Covering-based rough sets were proposed to generalize this theory to covering data. By introducing a concept of misclassification rate functions, an extended variable precision covering-based rough set model is proposed in this paper. In addition, we define the f-lower and f-upper approximations in terms of neighborhoods in the extended model and study their properties. In particular, two coverings with the same reductions are proved to generate the same f-lower and f-upper approximations. Finally, we discuss the relationships between the new model and some other variable precision rough set models.
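A hedged sketch of the neighborhood machinery: the neighborhood of x is taken here as the intersection of all covering blocks containing x, and f maps each object to an admissible misclassification rate; the paper's exact definitions of the f-approximations may differ.

    # Hedged sketch: variable precision approximations over a covering,
    # with a per-object misclassification rate function f. Every x is
    # assumed to lie in at least one block (covering property).

    def neighborhood(x, covering):
        """Intersection of every block of the covering containing x."""
        blocks = [b for b in covering if x in b]
        nbhd = set(blocks[0])
        for b in blocks[1:]:
            nbhd &= b
        return nbhd

    def f_approximations(universe, covering, target, f):
        """f-lower: neighborhood errs on at most an f(x) fraction w.r.t.
        target; f-upper: overlap with target exceeds an f(x) fraction."""
        lower, upper = set(), set()
        for x in universe:
            n = neighborhood(x, covering)
            miss = len(n - target) / len(n)
            if miss <= f(x):
                lower.add(x)
            if len(n & target) / len(n) > f(x):
                upper.add(x)
        return lower, upper

With the constant function f = lambda x: 0.0, this collapses to the usual neighborhood-based covering approximations.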

