Processing GIS Data Using Decision Trees and an Inductive Learning Method

2021
Vol 11 (6)
pp. 393-398
Author(s): Dana Mihai, Mihai Mocanu

Author(s): Malcolm Beynon

The general fuzzy decision tree approach combines the benefits of an inductive learning technique for classifying objects, exploiting the richness of the data being considered, with the readability and interpretability that accompany its operation in a fuzzy environment. This chapter describes fuzzy decision tree research, including the exposition of small and large fuzzy decision trees to demonstrate their construction and practicality. The two large fuzzy decision trees described are associated with a real application: the identification of workplace establishments in the United Kingdom that pay a noticeable proportion of their employees less than the legislated minimum wage. Two separate fuzzy decision tree analyses are undertaken on a low-pay database, using different numbers of membership functions to fuzzify the continuous attributes describing the investigated establishments. The findings demonstrate how sensitive the results are to changes in the compactness of the fuzzy representation of the associated data.
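The fuzzification step the abstract refers to can be sketched as follows. This is an illustrative example only: the triangular membership functions, the three linguistic terms, and the "average hourly pay" attribute are hypothetical choices, not taken from the chapter.

```python
# Illustrative sketch: triangular membership functions used to fuzzify
# a continuous attribute, as in fuzzy decision tree induction.
# Breakpoints and term names are hypothetical, not from the chapter.

def triangular(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(x, terms):
    """Map a crisp value to membership degrees over linguistic terms."""
    return {name: triangular(x, *abc) for name, abc in terms.items()}

# Hypothetical fuzzification of an "average hourly pay" attribute.
terms = {
    "low":    (0.0, 4.0, 8.0),
    "medium": (4.0, 8.0, 12.0),
    "high":   (8.0, 12.0, 16.0),
}

degrees = fuzzify(6.0, terms)
# A value of 6.0 belongs partly to "low" and partly to "medium".
```

Using more membership functions per attribute gives a more compact fuzzy representation of each value, which is the sensitivity the abstract's two analyses compare.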


1994
Author(s): James W. Pascoe, Harry H. Robertshaw, David H. Kiel

2003
Vol 32 (2-3)
pp. 131-152
Author(s): Mario Drobics, Ulrich Bodenhofer, Erich Peter Klement

2003
Vol 24 (1-3)
pp. 273-282
Author(s): George V. Lashkia, Laurence Anthony

2021
Author(s): Shiyou Lian

Starting from the problem of finding the approximate value of a function, this work introduces a measure of the approximation degree between two numerical values, proposes the concepts of "strict approximation" and "strict approximation region", derives the corresponding one-dimensional interpolation methods and formulas, and then presents a calculation model called the "sum-times-difference formula" for high-dimensional interpolation, thereby developing a new interpolation approach: ADB interpolation. ADB interpolation has been applied to the interpolation of actual functions with satisfactory results. In both principle and effect, the approach is novel; it offers simple calculation, stable accuracy, and suitability for parallel processing, is well suited to high-dimensional interpolation, and is easily extended to the interpolation of vector-valued functions. Applying the approach to instance-based learning yields a new instance-based learning method: learning using ADB interpolation. This method has a definite mathematical basis, implicit distance weights, avoidance of misclassification, high efficiency, and a wide range of applications, and it is interpretable. In principle, it is a kind of learning by analogy, which can complement deep learning (a form of inductive learning); for some problems the two can even achieve "different approaches but equal results" in big data and cloud computing environments. Learning using ADB interpolation can thus also be regarded as a kind of "wide learning" that is dual to deep learning.
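The abstract does not reproduce the approximation-degree measure or the sum-times-difference formula, so the sketch below is NOT the ADB method itself. It is a generic inverse-distance-weighted interpolation, shown only to illustrate the shared idea of instance-based learning: predicting a query point from stored instances with distance-based weights.

```python
# Generic inverse-distance-weighted interpolation sketch. This is a
# stand-in for illustration, not the paper's ADB formula.

def idw_predict(query, instances, eps=1e-9):
    """Predict f(query) from (x, y) instance pairs by inverse-distance
    weighting; closer instances contribute more to the estimate."""
    num = den = 0.0
    for x, y in instances:
        d = abs(x - query)
        if d < eps:          # query coincides with a stored instance
            return y
        w = 1.0 / d
        num += w * y
        den += w
    return num / den

# Stored instances sampled from f(x) = x**2 (hypothetical data).
data = [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0), (3.0, 9.0)]
est = idw_predict(1.5, data)
```

The weights here are explicit; one advantage the abstract claims for ADB interpolation is that its distance weights are implicit in the formula itself.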


1997
Vol 21 (2)
pp. 61-73
Author(s): Bingchiang Jeng, Yung-Mo Jeng, Ting-Peng Liang

Author(s): Eva Armengol, Susana Puig

In this chapter, the authors propose an approach for building a model characterizing malignant melanomas. A common way to build a domain model is to use an inductive learning method; the resulting model is a generalization of the known examples. However, in domains where there is no clear difference among the classes, the inductive model can be too general. The approach taken in this chapter instead uses lazy learning methods to build what the authors call a lazy domain theory. The main difference between the inductive and lazy theories is that the former is complete whereas the latter is not: the lazy domain theory may not cover the whole space of known examples. The authors' experiments show that, despite this, the lazy domain theory performs better than the inductive theory.
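The lazy-versus-inductive contrast above can be sketched minimally: a nearest-neighbour "lazy theory" answers only when a query is close enough to a known example, and otherwise abstains, leaving part of the space uncovered. The similarity threshold and the toy data below are hypothetical, not from the chapter.

```python
# Minimal sketch of an incomplete lazy domain theory: classify by the
# nearest stored example, or abstain (return None) when even the
# nearest example is farther than a similarity threshold.

def lazy_classify(query, examples, threshold):
    """Return the label of the nearest example, or None when the query
    falls outside the covered region."""
    label, dist = min(
        ((lbl, abs(x - query)) for x, lbl in examples),
        key=lambda pair: pair[1],
    )
    return label if dist <= threshold else None

# Toy 1-D examples: feature value -> diagnosis label (hypothetical).
examples = [(0.2, "benign"), (0.3, "benign"), (0.8, "malignant")]

near = lazy_classify(0.25, examples, threshold=0.1)   # covered region
far = lazy_classify(0.55, examples, threshold=0.1)    # not covered
```

An inductive model, by contrast, would commit to a decision boundary over the whole feature space and so would always return a label, even for queries like the second one.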

