Comparing Weak Admissibility Semantics to their Dung-style Counterparts (Extended Abstract)

Author(s):  
Ringo Baumann ◽  
Gerhard Brewka ◽  
Markus Ulbricht

Semantics based on weak admissibility were recently introduced to overcome a problem with self-defeating arguments that had remained unsolved for more than 25 years. The recursive definition of weak admissibility relies mainly on the notion of the reduct with respect to a set E, which contains only those arguments that are neither in E nor attacked by E. At first glance, the reduct seems tailored only to the weaker versions of Dung-style semantics. In this paper we show that standard Dung semantics can be naturally reformulated using the reduct, revealing that this concept is already implicit in them. We further identify a new abstract principle for semantics, so-called modularization, which describes how to obtain further extensions from an initial one. Its importance for the study of abstract argumentation semantics is demonstrated by its ability to provide alternative characterizations of classical and non-classical semantics. Moreover, we tackle the notion of strong equivalence via characterizing kernels and give a complete classification of the weak versions with respect to well-known properties and postulates from the literature.
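
The reduct mentioned above has a direct operational reading: given an AF with arguments A and attack relation R, the reduct with respect to a set E keeps exactly those arguments that are neither in E nor attacked by E, together with the attacks among them. A minimal Python sketch of this construction (representing an AF as a set of argument names plus a set of attack pairs is an illustrative choice made here, not the paper's notation):

def reduct(arguments, attacks, E):
    """Compute the reduct of (arguments, attacks) with respect to E:
    keep the arguments neither in E nor attacked by E, and the attacks among them."""
    attacked_by_E = {b for (a, b) in attacks if a in E}
    remaining = set(arguments) - set(E) - attacked_by_E
    return remaining, {(a, b) for (a, b) in attacks
                       if a in remaining and b in remaining}

# Tiny example: a -> b -> c; the reduct with respect to {a} keeps only c.
print(reduct({"a", "b", "c"}, {("a", "b"), ("b", "c")}, {"a"}))   # ({'c'}, set())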


10.37236/1900 ◽  
2005 ◽  
Vol 12 (1) ◽  
Author(s):  
Jakob Jonsson

We consider topological aspects of decision trees on simplicial complexes, concentrating on how to use decision trees as a tool in topological combinatorics. By Robin Forman's discrete Morse theory, the number of evasive faces of a given dimension $i$ with respect to a decision tree on a simplicial complex is greater than or equal to the $i$th reduced Betti number (over any field) of the complex. Under certain favorable circumstances, a simplicial complex admits an "optimal" decision tree such that equality holds for each $i$; we may hence read off the homology directly from the tree. We provide a recursive definition of the class of semi-nonevasive simplicial complexes with this property. A certain generalization turns out to yield the class of semi-collapsible simplicial complexes that admit an optimal discrete Morse function in the analogous sense. In addition, we develop some elementary theory about semi-nonevasive and semi-collapsible complexes. Finally, we provide explicit optimal decision trees for several well-known simplicial complexes.
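
Stated in symbols, the inequality from discrete Morse theory referred to above reads as follows (the notation $e_i(T)$ for the number of evasive $i$-dimensional faces with respect to a decision tree $T$ on a complex $\Delta$ is introduced here only for illustration):

\[
  e_i(T) \;\ge\; \tilde{\beta}_i(\Delta;\mathbb{F}) \qquad \text{for every } i \text{ and every field } \mathbb{F},
\]

and a decision tree is optimal in the above sense precisely when equality holds for every $i$, so that the homology can be read off directly from the tree.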


2019 ◽  
Vol 66 ◽  
pp. 503-554 ◽  
Author(s):  
Andreas Niskanen ◽  
Johannes Wallner ◽  
Matti Järvisalo

Argumentation is today a topical area of artificial intelligence (AI) research. Abstract argumentation, with argumentation frameworks (AFs) as the underlying knowledge representation formalism, is a central viewpoint on argumentation in AI. Indeed, from the perspective of AI and computer science, understanding computational and representational aspects of AFs is key in the study of argumentation. Realizability of AFs has recently been proposed as a central notion for analyzing the expressive power of AFs under different semantics. In this work, we propose and study the AF synthesis problem as a natural extension of realizability, addressing some of the shortcomings arising from the relatively stringent definition of realizability. In particular, realizability provides a means of establishing exact conditions under which a given collection of subsets of arguments has an AF with exactly the given collection as its set of extensions under a specific argumentation semantics. However, in various settings within the study of the dynamics of argumentation, including revision and aggregation of AFs, non-realizability can naturally occur. To accommodate such settings, our notion of AF synthesis seeks to construct, or synthesize, AFs that are semantically closest to the knowledge at hand even when no AF exactly representing the knowledge exists. Going beyond defining the AF synthesis problem, we study both theoretical and practical aspects of the problem. In particular, we (i) prove NP-completeness of AF synthesis under several semantics, (ii) study basic properties of the problem in relation to realizability, (iii) develop algorithmic solutions to NP-hard AF synthesis using the constraint optimization paradigms of maximum satisfiability and answer set programming, (iv) empirically evaluate our algorithms on different forms of AF synthesis instances, and (v) discuss variants and generalizations of AF synthesis.
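
The paper's algorithmic solutions use MaxSAT and answer set programming; purely as an illustration of the optimization objective, the following brute-force Python sketch synthesizes an attack relation whose stable extensions disagree with the fewest examples. The unweighted cost, the choice of stable semantics, and all function names are assumptions made here for a tiny toy setting, not the paper's encoding:

from itertools import combinations, product

def stable_extensions(args, attacks):
    """All stable extensions: conflict-free sets that attack every outside argument."""
    exts = set()
    for r in range(len(args) + 1):
        for E in combinations(sorted(args), r):
            E = frozenset(E)
            conflict_free = not any((a, b) in attacks for a in E for b in E)
            attacks_rest = all(any((a, b) in attacks for a in E) for b in set(args) - E)
            if conflict_free and attacks_rest:
                exts.add(E)
    return exts

def synthesize(args, positive, negative):
    """Pick the attack relation whose stable extensions violate the fewest examples."""
    pairs = [(a, b) for a in args for b in args]
    best = None
    for bits in product([False, True], repeat=len(pairs)):
        attacks = {p for p, keep in zip(pairs, bits) if keep}
        exts = stable_extensions(args, attacks)
        cost = sum(E not in exts for E in positive) + sum(E in exts for E in negative)
        if best is None or cost < best[0]:
            best = (cost, attacks)
    return best

# Toy instance: {a} should be an extension, {a, b} should not.
print(synthesize({"a", "b"}, positive=[frozenset({"a"})], negative=[frozenset({"a", "b"})]))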


2002 ◽  
Vol 9 (14) ◽  
Author(s):  
Ulrich Berger ◽  
Paulo B. Oliva

We introduce a variant of Spector's bar recursion (called "modified bar recursion'') in finite types to give a realizability interpretation of the classical axiom of countable choice allowing for the extraction of witnesses from proofs of Sigma_1 formulas in classical analysis. As a second application of modified bar recursion we present a bar recursive definition of the fan functional. Moreover, we show that modified bar recursion exists in M (the model of strongly majorizable functionals) and is not S1-S9 computable in C (the model of total functionals). Finally, we show that modified bar recursion defines Spector's bar recursion primitive recursively.
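
For orientation, here is a minimal Python sketch of Spector's original bar recursion at the lowest type (finite sequences of naturals, extended by zeros); the paper's modified variant, roughly speaking, drops the explicit test of the bar condition Y(s_hat) < |s| and lets the functional Y itself control termination. Everything below is an illustrative toy instance, not the paper's formal definition in finite types:

def bar_recursion(Y, G, H, s=()):
    """Spector-style bar recursion on finite sequences of naturals.
    s_hat extends the finite sequence s by zeros to a total function on indices."""
    s_hat = lambda k: s[k] if k < len(s) else 0
    if Y(s_hat) < len(s):                 # the "bar" has been reached
        return G(s)
    return H(s, lambda a: bar_recursion(Y, G, H, s + (a,)))

# Toy example: a constant Y makes the recursion stop at sequence length 3.
Y = lambda f: 2
G = lambda s: sum(s)
H = lambda s, rec: rec(len(s))
print(bar_recursion(Y, G, H))   # 0 + 1 + 2 = 3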


Author(s):  
Bettina Fazzinga ◽  
Sergio Flesca ◽  
Filippo Furfaro

We revisit the notion of i-extension, i.e., the adaptation of the fundamental notion of extension to the case of incomplete Abstract Argumentation Frameworks. We show that the definition of i-extension raises some concerns in the "possible" variant: for example, it allows even conflicting arguments to be collectively considered as members of an (i-)extension. Thus, we introduce the alternative notion of i*-extension, which overcomes the highlighted problems, and provide a thorough complexity characterization of the corresponding verification problem. Interestingly, we show that the revisitation has beneficial effects not only for the semantics but also for the complexity: under various semantics, the verification problem under the possible perspective drops from NP-complete to P.
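
For reference, the conflict-freeness requirement at issue (the property that, as noted above, the original possible i-extension definition can fail to enforce) is the standard Dung-style one; a minimal Python sketch for an ordinary AF, leaving the incomplete-AF machinery aside:

def conflict_free(E, attacks):
    """A set E is conflict-free if no argument in E attacks another member of E."""
    return not any((a, b) in attacks for a in E for b in E)

print(conflict_free({"a", "b"}, {("a", "b")}))   # False: a attacks b inside the set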


2016 ◽  
Vol 8 (1) ◽  
pp. 41-62
Author(s):  
Imre Kilián

The backward-chaining inference strategy of Prolog is inefficient for a number of problems. The article proposes Contralog: a Prolog-conforming, forward-chaining language and an inference engine implemented as a preprocessor-compiler to Prolog. The target model is Prolog, which allows switching from Contralog to Prolog and back. The Contralog compiler is implemented using Prolog's de facto standardized macro expansion capability. The article describes the target model in detail. We first introduce a simple application example for Contralog. The next section then shows how the recursive definition of certain problems is automatically executed in a dynamic-programming manner by their Contralog formulation. Two examples are given: the well-known matrix chain multiplication problem and the Warshall algorithm. After this, the inferential target model of Prolog/Contralog programs is introduced, and the possibility of implementing the ReALIS natural language parsing technology, relying heavily on Contralog's forward-chaining inference engine, is described. Finally, the article discusses some practical questions of Contralog program development.
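
The paper's dynamic-programming examples are written in Contralog; as a language-neutral illustration of the underlying idea (a recursive definition whose overlapping subproblems are evaluated once rather than re-derived by backward chaining), here is a minimal memoized Python sketch of matrix chain multiplication, one of the two examples named above:

from functools import lru_cache

def matrix_chain_order(dims):
    """Minimum number of scalar multiplications needed to multiply a chain of
    matrices, where matrix i has shape dims[i] x dims[i + 1]."""
    @lru_cache(maxsize=None)
    def cost(i, j):                        # cheapest way to multiply matrices i..j
        if i == j:
            return 0
        return min(cost(i, k) + cost(k + 1, j) + dims[i] * dims[k + 1] * dims[j + 1]
                   for k in range(i, j))
    return cost(0, len(dims) - 2)

print(matrix_chain_order([10, 30, 5, 60]))   # 4500: multiply the first two matrices first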


2003 ◽  
Vol 9 (3) ◽  
pp. 273-298 ◽  
Author(s):  
Akihiro Kanamori

For the modern set theorist the empty set Ø, the singleton {a}, and the ordered pair 〈x, y〉 are at the beginning of the systematic, axiomatic development of set theory, both as a field of mathematics and as a unifying framework for ongoing mathematics. These notions are the simplest building blocks in the abstract, generative conception of sets advanced by the initial axiomatization of Ernst Zermelo [1908a] and are quickly assimilated long before the complexities of Power Set, Replacement, and Choice are broached in the formal elaboration of the ‘set of’ {} operation. So it is surprising that, while these notions are unproblematic today, they were once sources of considerable concern and confusion among leading pioneers of mathematical logic like Frege, Russell, Dedekind, and Peano. In the development of modern mathematical logic out of the turbulence of 19th century logic, the emergence of the empty set, the singleton, and the ordered pair as clear and elementary set-theoretic concepts serves as a motif that reflects and illuminates larger and more significant developments in mathematical logic: the shift from the intensional to the extensional viewpoint, the development of type distinctions, the logical vs. the iterative conception of set, and the emergence of various concepts and principles as distinctively set-theoretic rather than purely logical. Here there is a loose analogy with Tarski's recursive definition of truth for formal languages: the mathematical interest lies mainly in the procedure of recursion and the attendant formal semantics in model theory, whereas the philosophical interest lies mainly in the basis of the recursion, truth and meaning at the level of basic predication. Circling back to the beginning, we shall see how central the empty set, the singleton, and the ordered pair were, after all.

