Proceedings of the Seventeenth International Conference on Principles of Knowledge Representation and Reasoning
Latest Publications


TOTAL DOCUMENTS: 94 (five years: 94)
H-INDEX: 1 (five years: 1)
Published By: International Joint Conferences On Artificial Intelligence Organization
ISBN: 9780999241172

Author(s): Alessandro Umbrico, Gabriella Cortellessa, Andrea Orlandini, Amedeo Cesta

A key aspect of robotic assistants is their ability to contextualize their behavior according to the different needs of assistive scenarios. This work presents an ontology-based knowledge representation and reasoning approach supporting the synthesis of personalized behavior for robotic assistants. It introduces an ontological model of the health state and functioning of persons based on the International Classification of Functioning, Disability and Health. Moreover, it borrows the concepts of affordance and function from the robotics and manufacturing literature and adapts them to the domain of robotic (physical and cognitive) assistance. Knowledge reasoning mechanisms are developed on top of the resulting ontological model to reason about the stimulation capabilities of a robot and the health state of a person in order to identify action opportunities and achieve personalized assistance. Experimental tests assess the performance of the proposed approach and its ability to deal with different profiles and stimuli.
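As a purely illustrative sketch (all names and data below are hypothetical, not taken from the paper), the core matching step can be pictured as intersecting the functions a robot's stimulation actions can target with the functions that are impaired in a person's ICF-based profile:

```python
# Illustrative sketch only: hypothetical capability and profile data.
robot_capabilities = {
    "verbal_reminder": {"memory", "orientation"},   # functions this action can stimulate
    "guided_exercise": {"mobility"},
}
person_profile = {"memory": "moderate_impairment", "mobility": "no_impairment"}

def action_opportunities(capabilities, profile):
    """Return, per action, the impaired functions it could stimulate."""
    impaired = {f for f, level in profile.items() if level != "no_impairment"}
    return {action: targets & impaired
            for action, targets in capabilities.items()
            if targets & impaired}

print(action_opportunities(robot_capabilities, person_profile))
# {'verbal_reminder': {'memory'}}
```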


Author(s): Yaniv Aspis, Krysia Broda, Alessandra Russo, Jorge Lobo

We introduce a novel approach for the computation of stable and supported models of normal logic programs in continuous vector spaces by a gradient-based search method. Specifically, the application of the immediate consequence operator of a program reduct can be computed in a vector space. To do this, Herbrand interpretations of a propositional program are embedded as 0-1 vectors in $\mathbb{R}^N$ and program reducts are represented as matrices in $\mathbb{R}^{N \times N}$. Using these representations we prove that the underlying semantics of a normal logic program is captured through matrix multiplication and a differentiable operation. As supported and stable models of a normal logic program can now be seen as fixed points in a continuous space, non-monotonic deduction can be performed using an optimisation process such as Newton's method. We report the results of several experiments using synthetically generated programs that demonstrate the feasibility of the approach and highlight how different parameter values can affect the behaviour of the system.
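To make the matrix view concrete, here is a minimal sketch (not the paper's exact embedding, which is designed to be differentiable) of computing the immediate consequence operator of a small definite program with matrix operations over 0-1 vectors:

```python
import numpy as np

# Atoms indexed 0..N-1; each rule is (head_atom, [body_atoms]).
atoms = ["p", "q", "r"]
idx = {a: i for i, a in enumerate(atoms)}
rules = [("p", []),          # p.
         ("q", ["p"]),       # q :- p.
         ("r", ["p", "q"])]  # r :- p, q.

N, M = len(atoms), len(rules)
B = np.zeros((M, N))   # B[j, i] = 1 iff atom i occurs in the body of rule j
H = np.zeros((M, N))   # H[j, i] = 1 iff atom i is the head of rule j
for j, (head, body) in enumerate(rules):
    H[j, idx[head]] = 1
    for b in body:
        B[j, idx[b]] = 1
body_size = B.sum(axis=1)

def tp(x):
    """One application of T_P: a rule fires iff its whole body is true."""
    fired = (B @ x >= body_size).astype(float)
    return np.minimum(H.T @ fired, 1.0)

# Iterate to the least fixed point, here {p, q, r}.
x = np.zeros(N)
for _ in range(N + 1):
    x = tp(x)
print(dict(zip(atoms, x)))
```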


Author(s): Nico Potyka

Bipolar abstract argumentation frameworks allow modeling decision problems by defining pro and contra arguments and their relationships. In some popular bipolar frameworks, there is an inherent tendency to favor either attack or support relationships. However, for some applications, it seems sensible to treat attack and support equally. Roughly speaking, turning an attack edge into a support edge should just invert its meaning. We look at a recently introduced bipolar argumentation semantics and two novel alternatives and discuss their semantic and computational properties. Interestingly, the two novel semantics correspond to stable semantics if no support relations are present and maintain the computational complexity of stable semantics in general bipolar frameworks.
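For the support-free baseline mentioned above, stable semantics is easy to state operationally: a set of arguments is stable if it is conflict-free and attacks every argument outside it. The following small sketch (with a hypothetical example framework) enumerates stable extensions by brute force:

```python
from itertools import chain, combinations

# Hypothetical abstract argumentation framework with no support edges.
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "a"), ("a", "c")}

def is_stable(S):
    S = set(S)
    conflict_free = not any((x, y) in attacks for x in S for y in S)
    attacks_rest = all(any((x, y) in attacks for x in S) for y in args - S)
    return conflict_free and attacks_rest

subsets = chain.from_iterable(combinations(args, r) for r in range(len(args) + 1))
print([set(S) for S in subsets if is_stable(S)])   # {a} and {b, c} here
```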


Author(s): Markus Krötzsch

To reason with existential rules (a.k.a. tuple-generating dependencies), one often computes universal models. Among the many such models of different structure and cardinality, the core is arguably the “best”. Especially for finitely satisfiable theories, where the core is the unique smallest universal model, it has advantages in query answering, non-monotonic reasoning, and data exchange. Unfortunately, computing cores is difficult and not supported by most reasoners. We therefore propose ways of computing cores using practically implemented methods from rule reasoning and answer set programming. Our focus is on cases where the standard chase algorithm produces a core. We characterise this desirable situation in general terms that apply to a large class of cores, derive concrete approaches for decidable special cases, and generalise these approaches to non-monotonic extensions of existential rules.
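As background for the standard chase mentioned above, here is a much-simplified sketch (single-atom bodies and heads only, which is far more restrictive than the paper's setting) of the restricted chase, which fires an existential rule only when its head is not already satisfied:

```python
import itertools

fresh = itertools.count()   # source of fresh labelled nulls

def match(variables, args):
    """Unify a tuple of variables with a tuple of constants/nulls."""
    sigma = {}
    for v, a in zip(variables, args):
        if v in sigma and sigma[v] != a:
            return None
        sigma[v] = a
    return sigma

def head_satisfied(instance, hpred, hvars, sigma):
    """Does some atom match the head under an extension of sigma to the existentials?"""
    for pred, args in instance:
        if pred != hpred or len(args) != len(hvars):
            continue
        ext, ok = dict(sigma), True
        for v, a in zip(hvars, args):
            if v in ext and ext[v] != a:
                ok = False
                break
            ext[v] = a
        if ok:
            return True
    return False

def chase(instance, rules, max_rounds=100):
    """Restricted chase; it need not terminate in general, hence max_rounds."""
    instance = set(instance)
    for _ in range(max_rounds):
        new_atoms = set()
        for (bpred, bvars), (hpred, hvars) in rules:
            for pred, args in instance:
                if pred != bpred or len(args) != len(bvars):
                    continue
                sigma = match(bvars, args)
                if sigma is None or head_satisfied(instance | new_atoms, hpred, hvars, sigma):
                    continue
                ext = dict(sigma)
                for v in hvars:
                    ext.setdefault(v, f"_n{next(fresh)}")   # fresh labelled null
                new_atoms.add((hpred, tuple(ext[v] for v in hvars)))
        if not new_atoms:
            break
        instance |= new_atoms
    return instance

# Person(x) -> exists y. hasParent(x, y)
rules = [(("Person", ("x",)), ("hasParent", ("x", "y")))]
print(chase({("Person", ("alice",))}, rules))
```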


Author(s): Zeynep G. Saribatur, Thomas Eiter

The recently introduced notion of ASP abstraction concerns reducing the vocabulary of a program while ensuring an over-approximation of its answer sets, with a focus on a syntactic operator that constructs the abstract program. It has been shown that such a notion has potential for program analysis at the abstract level: it removes details irrelevant to problem solving while preserving the structure, which aids in explaining the solutions. We take a further look at ASP abstraction here, focusing on abstraction by omission, with the aim of obtaining a better understanding of the notion. We identify the key conditions for omission abstraction, which sheds light on its differences from the well-studied notion of forgetting. We demonstrate how omission abstraction fits into the overall spectrum by also investigating its behavior on the semantics of a program in the framework of HT logic.
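The over-approximation requirement itself is simple to state: every answer set of the original program, projected onto the non-omitted atoms, must occur among the answer sets of the abstract program. A minimal check of that property, with hypothetical answer sets supplied as inputs rather than computed here:

```python
def project(interpretation, omitted):
    """Project an interpretation onto the kept (non-omitted) atoms."""
    return frozenset(interpretation) - frozenset(omitted)

def over_approximates(abstract_answer_sets, concrete_answer_sets, omitted):
    abstract = {frozenset(a) for a in abstract_answer_sets}
    return all(project(i, omitted) in abstract for i in concrete_answer_sets)

omitted = {"c"}
concrete = [{"a", "c"}, {"b"}]          # hypothetical answer sets of the original program
abstract = [{"a"}, {"b"}, {"a", "b"}]   # hypothetical answer sets of the abstract program
print(over_approximates(abstract, concrete, omitted))   # True
```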


Author(s): Jens Claßen, James Delgrande

With the advent of artificial agents in everyday life, it is important that these agents are guided by social norms and moral guidelines. Notions of obligation, permission, and the like have traditionally been studied in the field of Deontic Logic, where deontic assertions generally refer to what an agent should or should not do; that is, they refer to actions. In Artificial Intelligence, the Situation Calculus is (arguably) the best known and most studied formalism for reasoning about action and change. In this paper, we integrate these two areas by incorporating deontic notions into Situation Calculus theories. We do this by treating deontic assertions as constraints, expressed as a set of conditionals, which apply to complex actions expressed as GOLOG programs. These constraints induce a ranking of "ideality" over possible future situations. This ranking in turn is used to guide an agent in its planning deliberation towards a course of action that best adheres to the deontic constraints. We present a formalization that includes a wide class of (dyadic) deontic assertions, lets us distinguish prima facie from all-things-considered obligations, and particularly addresses contrary-to-duty scenarios. We furthermore present results on compiling the deontic constraints directly into the Situation Calculus action theory, so as to obtain an agent that respects the given norms but works solely with standard reasoning and planning techniques.
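As a very rough, purely illustrative sketch (hypothetical conditionals and runs, not the paper's formalization), ranking candidate courses of action by how many deontic conditionals they violate and then preferring a most ideal one might look like this:

```python
# Each conditional is (condition on the current state, obligatory action).
conditionals = [
    (lambda state: state["at_door"], "knock"),
    (lambda state: state["carrying_fragile"], "move_slowly"),
]

def violations(run):
    """Count violated conditionals along a run, given as (state, action) pairs."""
    return sum(1 for state, action in run
               for cond, obliged in conditionals
               if cond(state) and action != obliged)

candidate_runs = {
    "polite": [({"at_door": True, "carrying_fragile": False}, "knock")],
    "rushed": [({"at_door": True, "carrying_fragile": False}, "enter")],
}
best = min(candidate_runs, key=lambda name: violations(candidate_runs[name]))
print(best, {n: violations(r) for n, r in candidate_runs.items()})
```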


Author(s): Marco Console, Matthias Hofer, Leonid Libkin

In a variety of reasoning tasks, one estimates the likelihood of events by means of the volumes of the sets they define. Such sets need to be measurable, which is usually achieved by putting bounds, sometimes ad hoc, on them. We address the question of how unbounded or unmeasurable sets can nonetheless be measured. Intuitively, we want to know how likely a randomly chosen point is to be in a given set, even in the absence of a uniform distribution over the entire space. To address this, we follow a recently proposed approach of taking the intersection of a set with balls of increasing radius and defining the measure by means of the asymptotic behavior of the proportion of such balls taken by the set. We show that this approach works for every set definable in first-order logic with the usual arithmetic over the reals (addition, multiplication, exponentiation, etc.), and every uniform measure over the space, of which the usual Lebesgue measure (area, volume, etc.) is an example. In fact, we establish a correspondence between the good asymptotic behavior and the finiteness of the VC dimension of definable families of sets. Towards computing the measure thus defined, we show how to avoid the asymptotics and characterize it via a specific subset of the unit sphere. Using the definability of this set, and known techniques for sampling from the unit sphere, we give two algorithms for estimating our measure of unbounded or unmeasurable sets, with deterministic and probabilistic guarantees, the latter being more efficient. Finally, we show that a discrete analog of this measure exists and is similarly well-behaved.
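The ball-based definition is easy to simulate. The sketch below (a naive Monte Carlo estimate, not one of the paper's algorithms) estimates the proportion of a disc of growing radius occupied by the unbounded half-plane {(x, y) : x >= 1}, whose limiting proportion is 1/2:

```python
import numpy as np

rng = np.random.default_rng(0)

def proportion_in_set(radius, n_samples=200_000):
    """Monte Carlo estimate of the fraction of the disc B(0, radius)
    occupied by the half-plane {(x, y) : x >= 1}."""
    pts = rng.uniform(-radius, radius, size=(n_samples, 2))
    pts = pts[np.linalg.norm(pts, axis=1) <= radius]   # keep points inside the disc
    return float(np.mean(pts[:, 0] >= 1.0))

for r in (2, 10, 100, 1000):
    print(r, round(proportion_in_set(r), 3))   # tends to the limit measure 0.5
```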


Author(s): Meghyn Bienvenu, Camille Bourgaux

In this paper, we explore the issue of inconsistency handling over prioritized knowledge bases (KBs), which consist of an ontology, a set of facts, and a priority relation between conflicting facts. In the database setting, a closely related scenario has been studied and led to the definition of three different notions of optimal repairs (global, Pareto, and completion) of a prioritized inconsistent database. After transferring the notions of globally-, Pareto- and completion-optimal repairs to our setting, we study the data complexity of the core reasoning tasks: query entailment under inconsistency-tolerant semantics based upon optimal repairs, existence of a unique optimal repair, and enumeration of all optimal repairs. Our results provide a nearly complete picture of the data complexity of these tasks for ontologies formulated in common DL-Lite dialects. The second contribution of our work is to clarify the relationship between optimal repairs and different notions of extensions for (set-based) argumentation frameworks. Among our results, we show that Pareto-optimal repairs correspond precisely to stable extensions (and often also to preferred extensions), and we propose a novel semantics for prioritized KBs which is inspired by grounded extensions and enjoys favourable computational properties. Our study also yields some results of independent interest concerning preference-based argumentation frameworks.
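For intuition on the repair notions involved, the following minimal sketch (with hypothetical facts and a single conflict) enumerates the repairs of a set of facts, i.e., its maximal conflict-free subsets. The global, Pareto, and completion notions from the paper then select among these using the priority relation, which is omitted here:

```python
from itertools import chain, combinations

facts = {"f1", "f2", "f3"}
conflicts = {frozenset({"f1", "f2"})}   # hypothetical: f1 and f2 clash

def consistent(S):
    """A subset is consistent if it contains no conflict in full."""
    return not any(c <= set(S) for c in conflicts)

subsets = [set(S) for S in chain.from_iterable(
    combinations(facts, r) for r in range(len(facts) + 1)) if consistent(S)]
repairs = [S for S in subsets if not any(S < T for T in subsets)]
print(repairs)   # the two repairs: {f1, f3} and {f2, f3}
```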


Author(s): Tuomo Lehtonen, Johannes P. Wallner, Matti Järvisalo

A major research direction in AI argumentation is the study and development of practical computational techniques for reasoning in different argumentation formalisms. Compared to abstract argumentation, developing algorithmic techniques for structured argumentation formalisms, such as assumption-based argumentation and the general ASPIC+ framework, is more challenging. At present, there is a lack of efficient approaches to reasoning in ASPIC+. We develop a direct declarative approach based on answer set programming (ASP) to reasoning in an instantiation of the ASPIC+ framework. We establish formal foundations for direct declarative encodings for reasoning in ASPIC+ without preferences for several central argumentation semantics, and detail ASP encodings of semantics for which reasoning about acceptance is NP-hard in ASPIC+. Empirically, the ASP approach scales up to frameworks of significant size, thereby addressing the current lack of practical computational approaches to reasoning in ASPIC+ and providing a promising basis for capturing further generalizations within ASPIC+.
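To illustrate the flavor of a direct declarative ASP encoding, here is a sketch targeting stable extensions of an abstract argumentation framework, a much simpler setting than the paper's ASPIC+ encodings; it assumes the clingo Python package is installed:

```python
import clingo

program = """
arg(a). arg(b). arg(c).
att(a,b). att(b,a). att(a,c).

% guess a labelling and enforce stability
in(X) :- arg(X), not out(X).
out(X) :- arg(X), not in(X).
:- in(X), in(Y), att(X,Y).
defeated(X) :- in(Y), att(Y,X).
:- out(X), not defeated(X).

#show in/1.
"""

ctl = clingo.Control(["0"])          # "0" = enumerate all answer sets
ctl.add("base", [], program)
ctl.ground([("base", [])])
ctl.solve(on_model=lambda m: print(m.symbols(shown=True)))
```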


Author(s): Gianluca Cima, Maurizio Lenzerini, Antonella Poggi

In Ontology-Based Data Access (OBDA), a domain ontology is linked to the data sources of an organization in order to query, integrate, and manage data through the concepts and relations of the domain of interest, thus abstracting from the technical details of the data layer implementation. While the great majority of contributions to OBDA in the last decade have been concerned with computing the answers to queries expressed over the ontology, recent papers address a different problem, namely that of providing suitable abstractions of data services, i.e., characterizing or explaining the semantics of queries over the sources in terms of queries over the domain ontology. Current work on this subject expresses abstractions as unions of conjunctive queries (UCQs) over the ontology. In this paper, we advocate the use of a non-monotonic language for this task. As a first contribution, we present a simple extension of UCQs with non-monotonic features and show that non-monotonicity provides more expressive power in characterizing the semantics of data services. A second contribution is to prove that, similarly to the case of monotonic abstractions, depending on the expressive power of the languages used to specify the various components of the OBDA system, there are cases where neither perfect nor approximated abstractions exist for a given data service. As a third contribution, we single out interesting special cases where the existence of abstractions is guaranteed, and we present algorithms for computing such abstractions in these cases.

