ProtoQA: A Question Answering Dataset for Prototypical Common-Sense Reasoning

Author(s):  
Michael Boratko ◽  
Xiang Li ◽  
Tim O’Gorman ◽  
Rajarshi Das ◽  
Dan Le ◽  
...  
Author(s):  
Troels Andreasen ◽  
Henrik Bulskov ◽  
Jørgen Fischer Nilsson

This paper describes principles and structure for a software system that implements a dialect of natural logic for knowledge bases. Natural logics are formal logics that resemble stylized natural language fragments, and whose reasoning rules reflect common-sense reasoning. Natural logics may be seen as forms of extended syllogistic logic. The paper proposes and describes the realization of deductive querying functionalities using a previously specified natural logic dialect called Natura-Log. The focus here is the engineering of an inference engine that employs relational database operations as a key feature. The inference steps are thereby computed in bulk, enabling scaling up to large knowledge bases. Accordingly, the system is eventually to be realized as a general-purpose database application package, with the database turned into a logical knowledge base.
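The bulk, set-at-a-time style of inference described in this abstract can be sketched in a few lines. The following is an illustrative simplification, not the Natura-Log implementation: it computes the transitive closure of syllogistic "every X is a Y" facts the way a relational self-join would, deriving all consequences per pass rather than one at a time.

```python
# Illustrative sketch (not the actual Natura-Log engine): syllogistic
# transitivity ("every X is a Y, every Y is a Z => every X is a Z")
# computed set-at-a-time, as a relational self-join would in bulk.

def transitive_closure(isa):
    """isa: set of (sub, sup) pairs; returns its transitive closure."""
    closure = set(isa)
    while True:
        # Join the relation with itself on the middle term -- one bulk step
        # derives every new consequence available at this iteration.
        derived = {(a, c)
                   for (a, b) in closure
                   for (b2, c) in closure
                   if b == b2}
        new = derived - closure
        if not new:
            return closure
        closure |= new

kb = {("penguin", "bird"), ("bird", "animal"), ("animal", "organism")}
facts = transitive_closure(kb)
```

In a database realization this loop corresponds to iterating a join-and-insert query (e.g. a recursive common table expression) until no new rows appear, which is what lets the inference scale to large knowledge bases.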


Author(s):  
John Horty

The task of formalizing common-sense reasoning within a logical framework can be viewed as an extension of the programme of formalizing mathematical and scientific reasoning that has occupied philosophers throughout much of the twentieth century. The most significant progress in applying logical techniques to the study of common-sense reasoning has been made, however, not by philosophers, but by researchers in artificial intelligence, and the logical study of common-sense reasoning is now a recognized sub-field of that discipline. The work involved in this area is similar to what one finds in philosophical logic, but it tends to be more detailed, since the ultimate goal is to encode the information that would actually be needed to drive a reasoning agent. Still, the formal study of common-sense reasoning is not just a matter of applied logic, but has led to theoretical advances within logic itself. The most important of these is the development of a new field of ‘non-monotonic’ logic, in which the conclusions supported by a set of premises might have to be withdrawn as the premise set is supplemented with new information.
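The defining behaviour of non-monotonic logic mentioned at the end of this abstract can be shown with a deliberately tiny example (hypothetical, tied to no specific formalism): a default conclusion is licensed by one premise set and withdrawn once that set is supplemented.

```python
# Minimal illustration of non-monotonicity: the default "birds fly"
# yields a conclusion that is retracted when the premise set grows.

def flies(premises):
    """Default rule: conclude 'flies' from 'bird', unless the premises
    also contain the exceptional fact 'penguin'."""
    return "bird" in premises and "penguin" not in premises

assert flies({"bird"}) is True                # conclusion drawn
assert flies({"bird", "penguin"}) is False    # withdrawn on new information
```

In a classical (monotonic) logic, adding premises can never remove conclusions; the retraction above is exactly what the new field formalizes.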


2020 ◽  
Vol 132 ◽  
pp. 53-65
Author(s):  
Min Yang ◽  
Lei Chen ◽  
Ziyu Lyu ◽  
Junhao Liu ◽  
Ying Shen ◽  
...  

2020 ◽  
Vol 34 (05) ◽  
pp. 8082-8090
Author(s):  
Tushar Khot ◽  
Peter Clark ◽  
Michal Guerquin ◽  
Peter Jansen ◽  
Ashish Sabharwal

Composing knowledge from multiple pieces of text is a key challenge in multi-hop question answering. We present a multi-hop reasoning dataset, Question Answering via Sentence Composition (QASC), that requires retrieving facts from a large corpus and composing them to answer a multiple-choice question. QASC is the first dataset to offer two desirable properties: (a) the facts to be composed are annotated in a large corpus, and (b) the decomposition into these facts is not evident from the question itself. The latter makes retrieval challenging, as the system must introduce new concepts or relations in order to discover potential decompositions. Further, the reasoning model must then learn to identify valid compositions of these retrieved facts using common-sense reasoning. To help address these challenges, we provide annotation for supporting facts as well as their composition. Guided by these annotations, we present a two-step approach to mitigate the retrieval challenges. We use other multiple-choice datasets as additional training data to strengthen the reasoning model. Our proposed approach improves over current state-of-the-art language models by 11% (absolute). The reasoning and retrieval problems, however, remain unsolved, as this model still lags behind human performance by 20%.
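The two-step retrieval idea in this abstract, where the second hop must pivot on a concept the question itself never mentions, can be sketched as follows. All names and the token-overlap scoring are illustrative assumptions, not the paper's actual system.

```python
# Hypothetical two-step retrieval sketch inspired by the QASC setup:
# pick a first fact that overlaps the question, then a second fact
# that bridges a concept introduced by the first fact.

def tokens(text):
    return set(text.lower().split())

def two_step_retrieve(question, corpus):
    q = tokens(question)
    # Step 1: the fact with the most word overlap with the question.
    f1 = max(corpus, key=lambda f: len(tokens(f) & q))
    # New concepts f1 introduces that the question does not mention --
    # these are the bridge terms retrieval could not have guessed upfront.
    bridge = tokens(f1) - q
    # Step 2: the other fact sharing the most bridge terms.
    f2 = max((f for f in corpus if f != f1),
             key=lambda f: len(tokens(f) & bridge))
    return f1, f2

corpus = [
    "differential heating of air produces wind",
    "wind is used for producing electricity by turbines",
    "water freezes at zero degrees celsius",
]
f1, f2 = two_step_retrieve(
    "what does differential heating of air help produce electricity", corpus)
```

Here the bridge concept is "wind": it appears in neither the question nor the answer options, which is exactly why the decomposition is not evident from the question alone.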

