The Research of Constructing Rough Concept Lattices Model

2011 ◽  
Vol 63-64 ◽  
pp. 664-667
Author(s):  
Hong Sheng Xu ◽  
Ting Zhong Wang

Formal concept lattices and rough set theory are two complementary mathematical tools for data analysis and data processing. An algorithm for concept lattice reduction based on the variable precision rough set (VPRS) model is proposed by combining the β-upper and β-lower distribution reduction algorithms of VPRS. The traditional algorithms for β-value selection, attribute reduction based on the discernibility matrix, and rule extraction in VPRS are discussed; defects in these traditional algorithms are identified and corrected. Finally, a concept lattice generation system based on variable precision rough sets is designed to verify the validity of the improved algorithm, and a case study demonstrates the whole process of concept lattice construction.
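The β-upper and β-lower distribution ideas the abstract combines can be illustrated with Ziarko-style β-approximations: an equivalence class is placed in the β-lower approximation of a concept when its inclusion degree reaches β, and in the β-upper approximation when the degree exceeds 1 − β. A minimal sketch (the decision table and attribute names are illustrative, not from the paper):

```python
from collections import defaultdict

def vprs_approximations(objects, attrs, target, beta=0.75):
    """Compute the beta-lower and beta-upper approximations of `target`
    under the indiscernibility relation induced by `attrs`.

    objects: dict mapping object id -> dict of attribute values
    attrs:   condition attributes defining the equivalence classes
    target:  set of object ids (the concept X)
    beta:    precision threshold, 0.5 < beta <= 1
    """
    # Group objects into equivalence classes by attribute signature.
    classes = defaultdict(set)
    for oid, row in objects.items():
        classes[tuple(row[a] for a in attrs)].add(oid)

    lower, upper = set(), set()
    for block in classes.values():
        ratio = len(block & target) / len(block)  # inclusion degree
        if ratio >= beta:        # confidently inside X
            lower |= block
        if ratio > 1 - beta:     # not confidently outside X
            upper |= block
    return lower, upper

# Toy decision table: objects 1, 2, 4 are indiscernible, but 4 lies
# outside the concept X = {1, 2, 3}.
table = {
    1: {"a": 0, "b": 1},
    2: {"a": 0, "b": 1},
    3: {"a": 1, "b": 0},
    4: {"a": 0, "b": 1},
    5: {"a": 1, "b": 1},
}
low, up = vprs_approximations(table, ("a", "b"), {1, 2, 3}, beta=0.75)
# low == {3}: only the pure class clears beta = 0.75
# up  == {1, 2, 3, 4}: the mixed class (inclusion 2/3) still exceeds 1 - beta
```

With β = 1 this collapses to the classical Pawlak lower approximation; lowering β toward 0.5 admits classes with a tolerated level of misclassification, which is what the distribution-reduction algorithms exploit.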

2011 ◽  
Vol 58-60 ◽  
pp. 1664-1670
Author(s):  
Hong Sheng Xu ◽  
Rui Ling Zhang

Formal concept analysis (FCA) is based on a formalization of the philosophical understanding of a concept as a unit of thought constituted by its extent and intent. The rough set philosophy is founded on the assumption that some information is associated with every object of the universe of discourse. This paper deals with approaches to knowledge reduction in the generalized consistent decision formal context. In order to obtain concept lattices with relatively fewer attributes and objects, we study the reduction of concept lattices based on FCA and rough set theory. Finally, a new semantic web system model based on FCA and rough sets is proposed, which preserves more of the structural and featural information of the concept lattice. The experimental results indicate that this method holds great promise.
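The extent/intent duality mentioned above rests on the two FCA derivation operators: for a context (G, M, I), A′ is the set of attributes shared by all objects in A, and B′ is the set of objects possessing all attributes in B; a formal concept is a pair (A, B) with A′ = B and B′ = A. A minimal sketch with a made-up context:

```python
def derive_attrs(context, objs):
    """A' : attributes shared by every object in objs."""
    attrs = set().union(*context.values())  # M, the full attribute set
    for g in objs:
        attrs &= context[g]
    return attrs

def derive_objs(context, attrs):
    """B' : objects that possess every attribute in attrs."""
    return {g for g, m in context.items() if attrs <= m}

# Toy formal context: object -> set of attributes it has.
ctx = {
    "g1": {"a", "b"},
    "g2": {"a", "c"},
    "g3": {"a", "b", "c"},
}
intent = derive_attrs(ctx, {"g1", "g3"})  # -> {'a', 'b'}
extent = derive_objs(ctx, intent)         # -> {'g1', 'g3'}
# (extent, intent) is a formal concept: applying ' again reproduces each side.
```

Composing the two operators gives the closure A ↦ A′′, and the fixed points of this closure are exactly the concepts that lattice-reduction methods operate on.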


2011 ◽  
Vol 121-126 ◽  
pp. 1579-1584
Author(s):  
Hai Zhong Tan

The rule set acquired based on rough set theory can be classified into two categories: deterministic rules and probabilistic rules. Traditional attribute reduction definitions in the variable precision rough set model cannot guarantee the rule properties, namely deterministic or probabilistic. In this paper, a new criterion for attribute reduction based on the variable precision rough set model is put forward, under which the rule properties are preserved during the process of attribute reduction. The relationships between the new reduct definition and existing definitions, including Ziarko’s reduct definition and the β-lower distribution reduct definition, are also discussed.
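The deterministic/probabilistic distinction can be made concrete: a condition class whose objects all share one decision induces a deterministic rule, while a class whose majority decision has confidence at least β induces a probabilistic rule. A minimal sketch, assuming a simple decision-table representation (not the paper's notation):

```python
from collections import defaultdict

def classify_rules(rows, cond_attrs, dec_attr, beta=0.7):
    """Group objects by condition-attribute values and label each induced
    rule 'deterministic' (pure class) or 'probabilistic' (majority class
    with confidence >= beta); classes below beta yield no rule."""
    groups = defaultdict(list)
    for row in rows:
        groups[tuple(row[a] for a in cond_attrs)].append(row[dec_attr])

    rules = {}
    for cond, decisions in groups.items():
        best = max(set(decisions), key=decisions.count)
        conf = decisions.count(best) / len(decisions)
        if conf == 1.0:
            rules[cond] = (best, "deterministic")
        elif conf >= beta:
            rules[cond] = (best, "probabilistic")
    return rules

data = [
    {"a": 0, "d": "yes"}, {"a": 0, "d": "yes"},
    {"a": 1, "d": "yes"}, {"a": 1, "d": "yes"},
    {"a": 1, "d": "yes"}, {"a": 1, "d": "no"},
]
rules = classify_rules(data, ("a",), "d", beta=0.7)
# a=0 is pure -> deterministic; a=1 has confidence 3/4 -> probabilistic
```

The criterion the paper proposes is one that keeps each rule's label (deterministic vs. probabilistic) invariant while attributes are removed, which the classical β-reduct definitions do not guarantee.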


2011 ◽  
Vol 219-220 ◽  
pp. 202-205
Author(s):  
Hong Sheng Xu ◽  
Jia Song

The variable precision rough set (VPRS) model and formal concept analysis are studied in this paper, including algorithms for attribute reduction and rule extraction. The traditional algorithms for attribute reduction based on the discernibility matrix and for rule extraction in VPRS are discussed; problems in these traditional algorithms are identified and corrected. A rough concept lattice model is proposed by integrating the VPRS model with formal concept analysis, and is used to reduce the formal context. A domain ontology model of e-business is built in combination with domain-expert knowledge, taking the United Nations Standard Products and Services Classification Code as the core ontology in order to enhance system robustness and efficiency.
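The discernibility-matrix approach to attribute reduction referenced above starts from a matrix whose entry for each pair of objects with different decisions lists the condition attributes that distinguish them; reducts are then read off as minimal hitting sets of these entries. A minimal sketch of building the matrix (toy table, illustrative attribute names):

```python
from itertools import combinations

def discernibility_matrix(rows, cond_attrs, dec_attr):
    """For each pair of objects with different decision values, record
    the condition attributes on which the two objects differ."""
    matrix = {}
    for (i, x), (j, y) in combinations(enumerate(rows), 2):
        if x[dec_attr] != y[dec_attr]:
            matrix[(i, j)] = {a for a in cond_attrs if x[a] != y[a]}
    return matrix

rows = [
    {"a": 0, "b": 1, "d": "p"},
    {"a": 0, "b": 0, "d": "n"},
    {"a": 1, "b": 1, "d": "n"},
]
m = discernibility_matrix(rows, ("a", "b"), "d")
# m[(0, 1)] == {'b'}, m[(0, 2)] == {'a'}; pair (1, 2) agrees on d, so no entry.
# Any reduct must hit every entry, so here both 'a' and 'b' are indispensable.
```

In the VPRS variant the paper improves, pairs whose classes already meet the β threshold can be dropped from the matrix, which is what shrinks the reducts relative to the classical construction.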


Information ◽  
2019 ◽  
Vol 10 (2) ◽  
pp. 78 ◽  
Author(s):  
Jingpu Zhang ◽  
Ronghui Liu ◽  
Ligeng Zou ◽  
Licheng Zeng

Formal concept analysis has proven to be a very effective method for data analysis and rule extraction, but how to build formal concept lattices efficiently remains a difficult and active research topic. In this paper, an efficient and rapid incremental concept lattice construction algorithm is proposed. The algorithm, named FastAddExtent, can be seen as a modification of AddIntent in which we improve two fundamental procedures: fixing the covering relation and searching for the canonical generator. The proposed algorithm can locate the desired concept quickly by adding data fields to every concept. The algorithm is described in detail, using a formal context to show how it works, and time and space complexity issues are discussed. We also present an experimental evaluation of its performance in comparison with AddExtent. Experimental results show that FastAddExtent improves efficiency over the original AddExtent algorithm.
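To make concrete what these incremental algorithms compute, here is a deliberately naive lattice builder that closes every subset of objects; it is exponential and serves only to define the target output. FastAddExtent/AddIntent-style algorithms produce the same concept set incrementally, one object or attribute at a time, without this blow-up. The context is a made-up example:

```python
from itertools import combinations

def all_concepts(context):
    """Enumerate all formal concepts of a small context by closing every
    subset of objects. Exponential -- illustration only; incremental
    algorithms (AddIntent, AddExtent, FastAddExtent) avoid this."""
    objs = list(context)
    attrs = set().union(*context.values())

    def intent_of(A):
        m = set(attrs)
        for g in A:
            m &= context[g]
        return frozenset(m)

    def extent_of(B):
        return frozenset(g for g in objs if B <= context[g])

    concepts = set()
    for r in range(len(objs) + 1):
        for A in combinations(objs, r):
            B = intent_of(A)           # A'
            concepts.add((extent_of(B), B))  # (A'', A') is a concept
    return concepts

ctx = {"g1": {"a"}, "g2": {"a", "b"}}
lattice = all_concepts(ctx)
# Two concepts: ({g1, g2}, {a}) and ({g2}, {a, b}).
```

The "data fields added to every concept" in FastAddExtent are bookkeeping that lets the algorithm jump straight to the canonical generator instead of searching the partial lattice, which is where the speedup over plain AddExtent comes from.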


2012 ◽  
Vol 9 (3) ◽  
pp. 1-17 ◽  
Author(s):  
D. Calvo-Dmgz ◽  
J. F. Gálvez ◽  
D. Glez-Peña ◽  
S. Gómez-Meire ◽  
F. Fdez-Riverola

DNA microarrays have contributed to the exponential growth of genomic and experimental data over the last decade. This large amount of gene expression data has been used by researchers seeking to diagnose diseases such as cancer using machine learning methods. In turn, explicit biological knowledge about gene functions has also grown tremendously over the same period. This paper presents a novel model for microarray data classification that integrates explicit biological knowledge, provided as gene sets, into the classification process by means of variable precision rough set theory (VPRS). Based on this knowledge, we transform the input microarray data into supergenes, and then apply rough set theory to select the most promising supergenes and to derive a set of easily interpretable classification rules. The proposed model is thereby able to highlight which part of the provided biological knowledge has been important for classification. The model is evaluated on three breast cancer microarray datasets, obtaining successful results compared to classical classification techniques. The experimental results show no significant differences in accuracy between our model and classical techniques, but our model is able to provide a biologically interpretable explanation of how it classifies new samples.
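The supergene transformation can be pictured as collapsing a per-gene expression profile into one value per gene set. The abstract does not specify the aggregation used, so the sketch below assumes a simple mean over the member genes; the gene and set names are invented:

```python
def to_supergenes(sample, gene_sets):
    """Collapse a per-gene expression profile into per-set 'supergene'
    values by averaging the member genes (one plausible aggregation;
    the paper's exact transform may differ)."""
    return {
        name: sum(sample[g] for g in genes) / len(genes)
        for name, genes in gene_sets.items()
    }

# Hypothetical expression profile and gene sets (illustrative only).
expr = {"TP53": 2.0, "BRCA1": 4.0, "MYC": 6.0}
sets = {"repair": ["TP53", "BRCA1"], "growth": ["MYC"]}
sg = to_supergenes(expr, sets)
# sg == {"repair": 3.0, "growth": 6.0}
```

Whatever the aggregation, the key point is that the rough set selection then operates over named gene sets rather than anonymous probes, which is what makes the resulting rules biologically interpretable.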


Author(s):  
Dong Xu ◽  
Xin Wang ◽  
Yulong Meng ◽  
Ziying Zhang

Discretization of multidimensional attributes can improve the training speed and accuracy of machine learning algorithms. Current discretization algorithms perform poorly in this setting, and most are single-attribute algorithms that ignore potential associations between attributes. To address this, we propose a discretization algorithm based on forest optimization and rough sets (FORDA). To discretize multidimensional attributes, the algorithm designs an appropriate value function according to variable precision rough set theory, then constructs a forest optimization network and iteratively searches for the optimal subset of breakpoints. Experimental results on UCI datasets show that, compared with current mainstream discretization algorithms, FORDA avoids local optima and significantly improves the classification accuracy of an SVM classifier; its overall discretization performance is better, which verifies the effectiveness of the algorithm.
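The core of such a search is a fitness that scores a candidate breakpoint set. A rough-set-style choice is decision consistency: the fraction of objects whose discretized condition vector determines a unique decision. This is a sketch of that idea, not the paper's exact value function:

```python
import bisect
from collections import defaultdict

def discretize(value, cuts):
    """Map a numeric value to the index of its interval among sorted cuts."""
    return bisect.bisect_right(cuts, value)

def consistency(rows, cuts_per_attr, dec):
    """Fraction of objects whose discretized condition vector determines
    a single decision -- the kind of fitness a breakpoint search like
    FORDA could maximize (illustrative, not the paper's formula)."""
    blocks = defaultdict(set)
    keys = []
    for row in rows:
        key = tuple(discretize(row[a], cuts)
                    for a, cuts in cuts_per_attr.items())
        keys.append(key)
        blocks[key].add(row[dec])
    consistent = sum(1 for key in keys if len(blocks[key]) == 1)
    return consistent / len(rows)

rows = [
    {"x": 0.1, "d": "n"}, {"x": 0.4, "d": "n"},
    {"x": 0.6, "d": "p"}, {"x": 0.9, "d": "p"},
]
good = consistency(rows, {"x": [0.5]}, "d")  # cut at 0.5 separates classes
bad = consistency(rows, {"x": []}, "d")      # no cut: one mixed block
# good == 1.0, bad == 0.0
```

Because the fitness is evaluated on the full vector of discretized attributes at once, cuts on different attributes are scored jointly, capturing the inter-attribute associations that single-attribute discretizers miss.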


2010 ◽  
Vol 40-41 ◽  
pp. 443-447
Author(s):  
Peng Hong ◽  
Wang Cong

In view of current application deficiencies of neuro-fuzzy networks, a new optimization method for neuro-fuzzy networks based on variable precision rough sets is presented, and its application in complex-system modeling is discussed. The method takes the β classification accuracy of variable precision rough set theory as the information function for selecting condition attributes; the modeling data are then discretized at a suitably chosen precision to form a decision table. Finally, the significant attributes and key attribute values are extracted from the decision table using a variable-precision-based reduction algorithm and mapped into fuzzy rules. This simplifies the fuzzy rules and therefore effectively optimizes the structure of the neuro-fuzzy network, greatly reducing the training time of the neural network and improving training precision. The method has been applied to the modeling of a nonlinear time-delay system with a large number of sampling data; its validity and feasibility are demonstrated by a modeling example.

