Adaptive estimation of multivariate piecewise polynomials and bounded variation functions by optimal decision trees

2021 ◽  
Vol 49 (5) ◽  
Author(s):  
Sabyasachi Chatterjee ◽  
Subhajit Goswami
2003 ◽  
Vol 2003 (31) ◽  
pp. 2003-2009 ◽  
Author(s):  
Vijay Gupta ◽  
Niraj Kumar

Guo (1988) introduced the integral modification of the Meyer-König and Zeller operators $\hat{M}_n$ and studied the rate of convergence for functions of bounded variation. Gupta (1995) gave a sharp estimate for the operators $\hat{M}_n$. Zeng (1998) gave the exact bound and claimed to improve the results of Guo and Gupta, but there is a major mistake in Zeng's paper. In the present note, we give the correct estimate for the rate of convergence for functions of bounded variation.
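For reference, the classical Meyer-König and Zeller operators take the form below; this is a sketch of the standard definition from the literature (the exact form of Guo's integral modification $\hat{M}_n$, which replaces the point evaluation $f(k/(n+k))$ by a weighted integral of $f$, varies slightly across papers):

$$M_n(f;x) \;=\; \sum_{k=0}^{\infty} \binom{n+k}{k} x^k (1-x)^{n+1} \, f\!\left(\frac{k}{n+k}\right), \qquad x \in [0,1).$$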


10.37236/1900 ◽  
2005 ◽  
Vol 12 (1) ◽  
Author(s):  
Jakob Jonsson

We consider topological aspects of decision trees on simplicial complexes, concentrating on how to use decision trees as a tool in topological combinatorics. By Robin Forman's discrete Morse theory, the number of evasive faces of a given dimension $i$ with respect to a decision tree on a simplicial complex is greater than or equal to the $i$th reduced Betti number (over any field) of the complex. Under certain favorable circumstances, a simplicial complex admits an "optimal" decision tree such that equality holds for each $i$; we may hence read off the homology directly from the tree. We provide a recursive definition of the class of semi-nonevasive simplicial complexes with this property. A certain generalization turns out to yield the class of semi-collapsible simplicial complexes that admit an optimal discrete Morse function in the analogous sense. In addition, we develop some elementary theory about semi-nonevasive and semi-collapsible complexes. Finally, we provide explicit optimal decision trees for several well-known simplicial complexes.
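In symbols, the relation described above reads as follows (a restatement of the abstract's claim, with $e_i(T)$ denoting the number of evasive faces of dimension $i$ with respect to a decision tree $T$ on a simplicial complex $\Delta$, and $\tilde{\beta}_i$ the $i$th reduced Betti number over a field $\mathbb{F}$):

$$e_i(T) \;\ge\; \tilde{\beta}_i(\Delta; \mathbb{F}) \quad \text{for all } i, \qquad \text{with equality for every } i \text{ when } T \text{ is optimal.}$$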


Author(s):  
Hélène Verhaeghe ◽  
Siegfried Nijssen ◽  
Gilles Pesant ◽  
Claude-Guy Quimper ◽  
Pierre Schaus

Decision trees are among the most popular classification models in machine learning. Traditionally, they are learned using greedy algorithms. However, such algorithms have their disadvantages: it is difficult to limit the size of the decision trees while maintaining good classification accuracy, and it is hard to impose additional constraints on the models that are learned. For these reasons, there has been recent interest in exact and flexible algorithms for learning decision trees. In this paper, we introduce a new approach to learning decision trees using constraint programming. Compared to earlier approaches, we show that our approach obtains better performance while still being sufficiently flexible to allow for the inclusion of constraints. Our approach builds on three key building blocks: (1) the use of AND/OR search, (2) the use of caching, and (3) the use of the CoverSize global constraint, recently proposed for the problem of itemset mining. This allows our constraint programming approach to deal much more efficiently with the decompositions in the learning problem.
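A minimal sketch of how AND/OR search and caching interact in this setting, on a toy binary dataset; this illustrates the decomposition idea only and is not the authors' constraint programming model (the dataset and all names are illustrative assumptions):

```python
from functools import lru_cache

# Toy binary dataset: each row is a tuple of binary features with a 0/1 label.
DATA = [
    (1, 0, 1), (1, 1, 0), (0, 1, 1), (0, 0, 0),
    (1, 1, 1), (0, 1, 0), (1, 0, 0), (0, 0, 1),
]
LABELS = [1, 1, 0, 0, 1, 0, 1, 0]
N_FEATURES = 3

def leaf_error(cover):
    """Misclassifications of the best constant-class leaf on this cover."""
    ones = sum(LABELS[i] for i in cover)
    return min(ones, len(cover) - ones)

@lru_cache(maxsize=None)
def best_error(cover, depth):
    """Minimum misclassifications of any tree of depth <= `depth` on `cover`.

    `cover` is a frozenset of example indices. Caching on (cover, depth)
    means each subproblem exposed by the AND/OR decomposition is solved
    only once, even if many branchings lead to the same set of examples.
    """
    best = leaf_error(cover)            # OR choice: stop with a leaf...
    if depth == 0 or best == 0:
        return best
    for f in range(N_FEATURES):         # ...or branch on some feature f.
        left = frozenset(i for i in cover if DATA[i][f] == 0)
        right = cover - left
        if not left or not right:
            continue                    # this split separates nothing
        # AND decomposition: the two child subproblems are independent,
        # so their optimal errors simply add up.
        best = min(best,
                   best_error(left, depth - 1) + best_error(right, depth - 1))
    return best

if __name__ == "__main__":
    root = frozenset(range(len(DATA)))
    print("optimal depth-2 error:", best_error(root, 2))
```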


Author(s):  
Gaël Aglin ◽  
Siegfried Nijssen ◽  
Pierre Schaus

Decision Trees (DTs) are widely used Machine Learning (ML) models with a broad range of applications. Interest in these models has increased even further in the context of Explainable AI (XAI), as decision trees of limited depth are highly interpretable models. However, traditional algorithms for learning DTs are heuristic in nature; they may produce trees of suboptimal quality under depth constraints. We introduce PyDL8.5, a Python library to infer depth-constrained Optimal Decision Trees (ODTs). PyDL8.5 provides an interface for DL8.5, an efficient algorithm for inferring depth-constrained ODTs. The library provides an easy-to-use scikit-learn compatible interface. It can be used not only for classification tasks, but also for regression, clustering, and other tasks. We introduce an interface that allows users to easily implement these other learning tasks, and we provide a number of examples of how to use this library.
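A minimal usage sketch based on the scikit-learn compatibility described above; the import path (`pydl85`), the class name `DL85Classifier`, and the `max_depth` parameter are assumptions rather than verified API, so consult the library's documentation for the exact names:

```python
import numpy as np

# Import path and class name are assumptions inferred from the abstract,
# not a verified API; check the PyDL8.5 documentation for exact names.
from pydl85 import DL85Classifier

# DL8.5 searches over binary features, so a toy binary dataset is used.
X = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1],
              [0, 0, 0]])
y = np.array([1, 1, 0, 0])

clf = DL85Classifier(max_depth=2)  # depth-constrained optimal tree
clf.fit(X, y)                      # scikit-learn style fit/predict
print(clf.predict(X))
```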


2019 ◽  
Vol 157 ◽  
pp. 173-180 ◽  
Author(s):  
R. González Perea ◽  
E. Camacho Poyato ◽  
P. Montesinos ◽  
J.A. Rodríguez Díaz

2019 ◽  
Vol 165 (3-4) ◽  
pp. 245-261 ◽  
Author(s):  
Abdulla Aldilaijan ◽  
Mohammad Azad ◽  
Mikhail Moshkov

Fractals ◽  
2017 ◽  
Vol 25 (05) ◽  
pp. 1750048 ◽  
Author(s):  
Y. S. LIANG

The present paper mainly investigates the definition and classification of one-dimensional continuous functions on closed intervals. Continuous functions can be classified as differentiable functions and nondifferentiable functions. All differentiable functions are of bounded variation. Nondifferentiable functions comprise both bounded variation functions and unbounded variation functions. The fractal dimension of every continuous function of bounded variation is 1. A one-dimensional unbounded variation continuous function may have finitely or infinitely many unbounded variation points, and when infinitely many, the set of such points may be countable or uncountable. Examples of the different kinds of one-dimensional continuous functions are given in this paper. Thus, one-dimensional continuous functions comprise differentiable functions, nondifferentiable continuous functions of bounded variation, continuous functions with finitely many unbounded variation points, continuous functions with infinitely but countably many unbounded variation points, and continuous functions with uncountably many unbounded variation points. At the end of the paper, we give an example of a one-dimensional continuous function that is of unbounded variation everywhere.
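As a standard illustration of one of these classes (not one of the paper's own examples), the following continuous function on $[0,1]$ has exactly one unbounded variation point, at $x = 0$:

$$f(x) = \begin{cases} x \sin\!\left(\dfrac{1}{x}\right), & 0 < x \le 1, \\[4pt] 0, & x = 0, \end{cases}$$

since the oscillations of $\sin(1/x)$ near the origin contribute a divergent, harmonic-like sum to the total variation, while $f$ is of bounded variation on every interval $[\delta, 1]$ with $\delta > 0$.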

