A new method for obtaining the inconsistent elements in a decision table based on dominance principle

2012 ◽  
Vol 204-208 ◽  
pp. 4904-4908
Author(s):  
Yi Jie Dun ◽  
Ya Bin Shao ◽  
Shuang Liang Tian

This paper uses knowledge granularity to present a new method for mining rules based on granules. First, an importance measure is applied to the attributes to obtain a granulation of the universe; the procedure is then repeated on every granule until the decision attribute takes only one value on each granule, at which point every granule is described to yield a rule. Analysis of the algorithm and experiments show that the proposed method is effective and reliable.

Classification rules are the main target of association rule mining, decision trees and rough sets. A new algorithm mines classification rules based on the importance of attribute values: the importance of a value is viewed as the number of tuple pairs it can discern, and the rules obtained from the constructed decision tree are equivalent to those obtained from ID3, which can be proved by the idea of rule fusion. The method has a low computational cost and is therefore better suited to large databases.

Rough set theory is a technique applied to data mining problems. This paper presents a new method to extract classification rules efficiently from a decision table. The new model uses rough set theory to reduce the computational effort needed for building a decision tree by means of a reduct algorithm, and a rule set (knowledge) is generated from the decision table. A reliable classifier architecture is obtained, and its effectiveness is verified by experiments comparing it with traditional rough set approaches. Data mining research has devoted much effort to applying various mining algorithms efficiently to large databases.
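The granule-based rule induction described above can be sketched roughly as follows. This is a minimal sketch only: the decision-table layout (one dict per row), the attribute names, and the importance measure (a simple count of discernible tuple pairs) are illustrative assumptions, not the paper's exact definitions.

# Minimal sketch of granule-based rule induction, assuming a decision table
# given as a list of dicts. The importance measure and the toy data are
# illustrative assumptions, not the paper's definitions.
from collections import defaultdict

def importance(rows, attr, decision):
    # Hypothetical measure: number of row pairs with different decision
    # values that this attribute can discern (separate).
    pairs = 0
    for i, r in enumerate(rows):
        for s in rows[i + 1:]:
            if r[decision] != s[decision] and r[attr] != s[attr]:
                pairs += 1
    return pairs

def induce_rules(rows, attrs, decision, conditions=()):
    # Stop when the granule is pure (one decision value) or no attributes
    # remain, and describe the granule as a single rule.
    decisions = {r[decision] for r in rows}
    if len(decisions) == 1 or not attrs:
        return [(dict(conditions), decisions)]
    # Split on the most important remaining attribute and recurse on
    # every granule of the resulting granulation.
    best = max(attrs, key=lambda a: importance(rows, a, decision))
    granules = defaultdict(list)
    for r in rows:
        granules[r[best]].append(r)
    rest = [a for a in attrs if a != best]
    rules = []
    for value, granule in granules.items():
        rules += induce_rules(granule, rest, decision,
                              conditions + ((best, value),))
    return rules

if __name__ == "__main__":
    table = [
        {"outlook": "sunny", "windy": "no", "play": "yes"},
        {"outlook": "sunny", "windy": "yes", "play": "no"},
        {"outlook": "rainy", "windy": "no", "play": "yes"},
    ]
    for cond, dec in induce_rules(table, ["outlook", "windy"], "play"):
        print(cond, "->", dec)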


Author(s):  
Nguyễn Long Giang ◽  
Nguyễn Thanh Tùng ◽  
Vũ Đức Thi

2014 ◽  
Vol 584-586 ◽  
pp. 2640-2643
Author(s):  
Zhi Ding Chen ◽  
Hai Man Gao ◽  
Qi Guo

Rough set theory is a new method for analyzing and processing data. Using this theory, we propose a risk assessment algorithm based on rough sets, which is described in detail in this paper; with it the decision table can be simplified and redundant attributes removed. A method of inference based on rough set knowledge is given, together with an example showing how the rules for new decisions are acquired, which gives the method practical value and makes it worth promoting.
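The step of simplifying the decision table and discarding redundant attributes is commonly realized with the rough-set dependency degree. The sketch below is a generic illustration of that idea, not the paper's specific risk-assessment algorithm; the function names and the dict-per-row table layout are assumptions.

# Illustrative sketch: drop condition attributes whose removal does not
# lower the rough-set dependency degree gamma(C, D). A generic heuristic,
# not necessarily the algorithm used in the paper.
from collections import defaultdict

def partition(rows, attrs):
    # Equivalence classes of the indiscernibility relation on attrs.
    blocks = defaultdict(list)
    for r in rows:
        blocks[tuple(r[a] for a in attrs)].append(r)
    return list(blocks.values())

def dependency(rows, cond_attrs, decision):
    # Fraction of objects whose equivalence class has a single decision
    # value (size of the positive region divided by the universe size).
    pos = 0
    for block in partition(rows, cond_attrs):
        if len({r[decision] for r in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def drop_redundant(rows, cond_attrs, decision):
    # Keep removing attributes as long as the dependency degree is preserved.
    kept = list(cond_attrs)
    full = dependency(rows, cond_attrs, decision)
    for a in cond_attrs:
        trial = [b for b in kept if b != a]
        if trial and dependency(rows, trial, decision) == full:
            kept = trial
    return kept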


2014 ◽  
Vol 926-930 ◽  
pp. 3718-3721
Author(s):  
Yang Liu ◽  
Cong Hua Lan ◽  
Zhan Hong Tang

The paper proposes a new attribute value reduction algorithm that uses attribute unions; it needs to scan the decision table only once, and all concise rules can be obtained through simple operations. To avoid repeated and expensive passes over the decision information table, the algorithm introduces a new way of calculating rule metrics together with rule extraction methods, which not only yields concise decision rules but also keeps their accuracy unchanged. Example analysis demonstrates the feasibility of the algorithm: it handles both consistent and inconsistent decision tables effectively while preserving the information in the decision table.
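The single-scan idea can be illustrated with a rough sketch that gathers rule support and confidence in one pass over the decision table. The function names and the dict-per-row layout are assumptions, and the paper's attribute-union value reduction itself is not reproduced here.

# Hedged sketch: collect rule metrics in a single scan of the decision
# table by counting each (condition-values, decision) combination once.
from collections import Counter

def rule_metrics(rows, cond_attrs, decision):
    cond_counts = Counter()
    rule_counts = Counter()
    for r in rows:                        # single scan of the table
        cond = tuple(r[a] for a in cond_attrs)
        cond_counts[cond] += 1
        rule_counts[(cond, r[decision])] += 1
    n = len(rows)
    metrics = {}
    for (cond, dec), count in rule_counts.items():
        metrics[(cond, dec)] = {
            "support": count / n,                    # how often the rule fires
            "confidence": count / cond_counts[cond], # accuracy of the rule
        }
    return metrics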


2016 ◽  
Vol 13 (10) ◽  
pp. 7726-7730
Author(s):  
M. El Sayed

This paper proposes eliminating attributes (columns) and duplicate rows and removing superfluous attribute values. We obtain an incomplete decision table, which differs from the original decision table and contains only the values necessary for making decisions. We also introduce a new method to generate a topology from a decision table using the degree of dependency between the condition attributes and the decision attribute, together with a reduction based on simply open sets. In addition, we introduce the new concepts of simply open sets and minimal simply open sets, as well as a simply approximation space.
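As a rough illustration of generating a topology from a decision table, the sketch below takes the indiscernibility classes of the condition attributes as a base and forms all of their unions. The paper's simply open sets and simply approximation space are not modelled here, and all names and the row layout are assumptions.

# Sketch only: topology generated from a decision table by taking the
# indiscernibility classes of the condition attributes as a base.
# Exponential in the number of classes, so intended for tiny tables.
from itertools import combinations

def indiscernibility_classes(rows, attrs):
    classes = {}
    for i, r in enumerate(rows):
        classes.setdefault(tuple(r[a] for a in attrs), set()).add(i)
    return list(classes.values())

def topology_from_base(base):
    # Open sets are the empty set plus every union of base elements.
    opens = {frozenset()}
    for k in range(1, len(base) + 1):
        for combo in combinations(base, k):
            opens.add(frozenset().union(*combo))
    return opens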


Author(s):  
C. C. Clawson ◽  
L. W. Anderson ◽  
R. A. Good

Investigations which require electron microscope examination of a few specific areas of non-homogeneous tissues make random sampling of small blocks an inefficient and unrewarding procedure. Therefore, several investigators have devised methods that allow sample blocks for electron microscopy to be obtained from regions of tissue previously identified by light microscopy. We present here techniques which make possible: 1) sampling tissue for electron microscopy from selected areas previously identified by light microscopy of relatively large pieces of tissue; 2) dehydrating and embedding large numbers of individually identified blocks while keeping each one separate; 3) a new method of maintaining specific orientation of blocks during embedding; 4) special light microscopic staining or fluorescent procedures and electron microscopy on immediately adjacent small areas of tissue.


1960 ◽  
Vol 23 ◽  
pp. 227-232 ◽  
Author(s):  
P WEST ◽  
G LYLES