An Instance Based Model for Scalable θ-Subsumption

2018 ◽  
Vol 27 (07) ◽  
pp. 1860011
Author(s):  
Hippolyte Léger ◽  
Dominique Bouthinon ◽  
Mustapha Lebbah ◽  
Hanene Azzag

The θ-subsumption test is known to be a bottleneck in Inductive Logic Programming, and the state-of-the-art learning systems in this field are hardly scalable. Last year, we created a distributed θ-subsumption process based on an Actor Model, with the aim of deciding subsumption on very large clauses. That model was correct and complete, but also very slow. We therefore introduce ANTS (Actor Network based Theta-Subsumption), a new model also based on an actor network, which is significantly faster than its predecessor.
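
For readers unfamiliar with the operation being distributed here, the following is a minimal, sequential sketch of the θ-subsumption test itself: clause C θ-subsumes clause D if some substitution θ maps every literal of C onto a literal of D. The clause encoding, the uppercase-variable convention, and the naive backtracking strategy are assumptions for illustration only; they are not the ANTS actor-based implementation.

```python
# Minimal sketch of a sequential theta-subsumption test (illustrative only).
# Literals are (predicate, [args]) tuples; variables start with an uppercase
# letter, constants are lowercase strings.

def literals_match(lit_c, lit_d, theta):
    """Try to extend substitution theta so that lit_c under theta equals lit_d."""
    pred_c, args_c = lit_c
    pred_d, args_d = lit_d
    if pred_c != pred_d or len(args_c) != len(args_d):
        return None
    new_theta = dict(theta)
    for a, b in zip(args_c, args_d):
        if a[0].isupper():                  # variable in the subsuming clause
            if a in new_theta and new_theta[a] != b:
                return None                 # conflicting binding
            new_theta[a] = b
        elif a != b:                        # constants must match exactly
            return None
    return new_theta


def theta_subsumes(clause_c, clause_d, theta=None):
    """True if some substitution maps every literal of clause_c into clause_d."""
    theta = theta or {}
    if not clause_c:
        return True
    head, *rest = clause_c
    for lit_d in clause_d:
        extended = literals_match(head, lit_d, theta)
        if extended is not None and theta_subsumes(rest, clause_d, extended):
            return True
    return False


if __name__ == "__main__":
    # p(X, Y) :- q(X), r(Y)   subsumes   p(a, b) :- q(a), r(b), s(a)
    c = [("q", ["X"]), ("r", ["Y"])]
    d = [("q", ["a"]), ("r", ["b"]), ("s", ["a"])]
    print(theta_subsumes(c, d))  # True
```

The combinatorial blow-up of this backtracking search on very large clauses is what motivates spreading the work over a network of actors.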

Author(s):  
Andrew Cropper ◽  
Sebastijan Dumančić

A major challenge in inductive logic programming (ILP) is learning large programs. We argue that a key limitation of existing systems is that they use entailment to guide the hypothesis search. This approach is limited because entailment is a binary decision: a hypothesis either entails an example or it does not, with no intermediate position. To address this limitation, we go beyond entailment and use 'example-dependent' loss functions to guide the search, so that a hypothesis can partially cover an example. We implement our idea in Brute, a new ILP system that uses best-first search, guided by an example-dependent loss function, to incrementally build programs. Our experiments on three diverse program synthesis domains (robot planning, string transformations, and ASCII art) show that Brute can substantially outperform existing ILP systems, both in predictive accuracy and in learning time, and can learn programs 20 times larger than state-of-the-art systems.
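
As a rough illustration of loss-guided search, the sketch below runs a best-first search over hypotheses ordered by an example-dependent loss (here, edit distance between a hypothesis's output and the target output), so a hypothesis that is only partially wrong still ranks above one that is entirely wrong. The hypothesis representation, the refinement operator, and the toy string-transformation task are assumptions for illustration; they are not Brute's actual machinery.

```python
# Sketch of best-first search guided by an example-dependent loss (illustrative).
import heapq
import itertools


def edit_distance(a, b):
    """Levenshtein distance: a loss that rewards partially correct outputs."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1, curr[j - 1] + 1,
                            prev[j - 1] + (ca != cb)))
        prev = curr
    return prev[-1]


def best_first_search(initial, refine, run, examples, max_steps=10_000):
    """Expand hypotheses in order of total loss over all examples and
    return the first hypothesis whose loss is zero."""
    def loss(hyp):
        return sum(edit_distance(run(hyp, x), y) for x, y in examples)

    counter = itertools.count()              # tie-breaker for equal losses
    frontier = [(loss(initial), next(counter), initial)]
    seen = {initial}
    for _ in range(max_steps):
        if not frontier:
            break
        score, _, hyp = heapq.heappop(frontier)
        if score == 0:
            return hyp
        for child in refine(hyp):
            if child not in seen:
                seen.add(child)
                heapq.heappush(frontier, (loss(child), next(counter), child))
    return None


if __name__ == "__main__":
    # Toy string-transformation task: learn the suffix to append ("!").
    alphabet = "!?abc"
    examples = [("foo", "foo!"), ("bar", "bar!")]
    found = best_first_search(
        initial="",
        refine=lambda h: (h + c for c in alphabet),
        run=lambda h, x: x + h,
        examples=examples,
    )
    print(found)  # "!"
```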


Author(s):  
Farhad Shakerin ◽  
Gopal Gupta

We present a heuristic-based algorithm to induce nonmonotonic logic programs that explain the behavior of XGBoost-trained classifiers. We use a technique based on the LIME approach to locally select the most important features contributing to each classification decision. Then, to explain the model's global behavior, we propose the LIME-FOLD algorithm, a heuristic-based inductive logic programming (ILP) algorithm capable of learning nonmonotonic logic programs, which we apply to a transformed dataset produced by LIME. Our proposed approach is agnostic to the choice of ILP algorithm. Our experiments with standard UCI benchmarks suggest a significant improvement in classification evaluation metrics, while the number of induced rules decreases dramatically compared to ALEPH, a state-of-the-art ILP system.
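
The sketch below illustrates the data-transformation step the abstract describes, assuming the standard xgboost and lime Python packages: an XGBoost classifier is trained, LIME is asked for the locally most important features of each instance, and those features are recorded in a transformed dataset that a rule learner could consume. The dataset, the value of top_k, the subset of instances, and the binary encoding are illustrative assumptions, and the rule-induction step (LIME-FOLD or ALEPH) is not shown.

```python
# Sketch of LIME-based feature selection over an XGBoost model (illustrative).
import numpy as np
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier
from lime.lime_tabular import LimeTabularExplainer

data = load_breast_cancer()
X, y = data.data, data.target

# Train the black-box classifier to be explained.
model = XGBClassifier(n_estimators=50).fit(X, y)

explainer = LimeTabularExplainer(
    X,
    feature_names=list(data.feature_names),
    class_names=list(data.target_names),
    discretize_continuous=True,
)

# For each instance, keep only the top_k locally most important features.
top_k = 5
subset = X[:50]                                   # small subset to keep this fast
transformed = np.zeros_like(subset, dtype=int)    # 1 = locally important feature
for i, row in enumerate(subset):
    exp = explainer.explain_instance(row, model.predict_proba, num_features=top_k)
    for feat_idx, _weight in exp.as_map()[1]:
        transformed[i, feat_idx] = 1

# 'transformed' together with the model's predictions would then be handed
# to a rule learner such as LIME-FOLD or ALEPH.
labels = model.predict(subset)
```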


2002 ◽  
Vol 16 ◽  
pp. 135-166 ◽  
Author(s):  
H. Blockeel ◽  
L. Dehaspe ◽  
B. Demoen ◽  
G. Janssens ◽  
J. Ramon ◽  
...  

Inductive logic programming, or relational learning, is a powerful paradigm for machine learning or data mining. However, in order for ILP to become practically useful, the efficiency of ILP systems must improve substantially. To this end, the notion of a query pack is introduced: it structures sets of similar queries. Furthermore, a mechanism is described for executing such query packs. A complexity analysis shows that considerable efficiency improvements can be achieved through the use of this query pack execution mechanism. This claim is supported by empirical results obtained by incorporating support for query pack execution in two existing learning systems.
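
The following is a much-simplified sketch of the query-pack idea: queries that share a prefix are arranged in a tree so the shared literals are evaluated once per binding rather than once per query. The fact base, the literal encoding, and the naive evaluator are assumptions for illustration; the paper's mechanism operates at the level of Prolog execution.

```python
# Toy query-pack evaluator over a set of ground facts (illustrative only).

FACTS = {
    ("parent", ("ann", "bob")),
    ("parent", ("bob", "carl")),
    ("male", ("bob",)),
    ("male", ("carl",)),
}


def solutions(literal, theta):
    """Yield extensions of substitution theta that make `literal` a fact."""
    pred, args = literal
    for fact_pred, fact_args in FACTS:
        if fact_pred != pred or len(fact_args) != len(args):
            continue
        new_theta = dict(theta)
        ok = True
        for a, b in zip(args, fact_args):
            if a[0].isupper():                        # variable
                if new_theta.setdefault(a, b) != b:
                    ok = False
                    break
            elif a != b:                              # constant mismatch
                ok = False
                break
        if ok:
            yield new_theta


def run_pack(node, theta, succeeded):
    """node = (literal_or_None, children, query_id_or_None).
    The root carries literal None; a leaf carries the id of the query that
    ends there. Each shared literal is evaluated once per substitution and
    its bindings are reused by every child branch."""
    _literal, children, query_id = node
    if query_id is not None:
        succeeded.add(query_id)
    for child in children:
        for new_theta in solutions(child[0], theta):
            run_pack(child, new_theta, succeeded)


if __name__ == "__main__":
    # Pack for  q1 = parent(X, Y), male(Y)  and  q2 = parent(X, Y), male(X):
    # the shared literal parent(X, Y) appears once at the top of the pack.
    pack = (None, [
        (("parent", ("X", "Y")), [
            (("male", ("Y",)), [], "q1"),
            (("male", ("X",)), [], "q2"),
        ], None),
    ], None)
    succeeded = set()
    run_pack(pack, {}, succeeded)
    print(sorted(succeeded))  # ['q1', 'q2']
```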


1996 ◽  
Vol 9 (4) ◽  
pp. 157-206 ◽  
Author(s):  
Nada Lavrač ◽  
Irene Weber ◽  
Darko Zupanič ◽  
Dimitar Kazakov ◽  
Olga Štěpánková ◽  
...  

Author(s):  
Rinaldo Lima ◽  
Bernard Espinasse ◽  
Hilário Oliveira ◽  
Rafael Ferreira ◽  
Luciano Cabral ◽  
...  
