Genie: a generator of natural language semantic parsers for virtual assistant commands

Author(s):  
Giovanni Campagna ◽  
Silei Xu ◽  
Mehrad Moradshahi ◽  
Richard Socher ◽  
Monica S. Lam
2011 ◽  
Vol 181-182 ◽  
pp. 236-241
Author(s):  
Xian Yi Cheng ◽  
Chen Cheng ◽  
Qian Zhu

As a formal tool for knowledge representation, Description Logics have been successfully applied in information systems, software engineering, natural language processing, and other areas. Description Logics also play a key role in text representation, natural-language semantic interpretation, and the description of language ontologies. They form the logical basis of OWL, the ontology language recommended by the W3C. This paper discusses the basic ideas of Description Logics with respect to vocabulary semantics, context meaning, domain knowledge, and background knowledge.
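A minimal illustration (not taken from the paper) of the kind of terminological axiom Description Logics provide and that OWL class definitions mirror; the concept and role names are hypothetical:

```latex
% "A mother is exactly a woman who has at least one child that is a person."
\mathit{Mother} \equiv \mathit{Woman} \sqcap \exists\, \mathit{hasChild}.\mathit{Person}
```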


2022 ◽  
Vol 14 (2) ◽  
pp. 1-24
Author(s):  
Bin Wang ◽  
Pengfei Guo ◽  
Xing Wang ◽  
Yongzhong He ◽  
Wei Wang

Aspect-level sentiment analysis identifies fine-grained sentiment toward target words. There are three major issues in current models of aspect-level sentiment analysis. First, few models consider the natural language semantic characteristics of the texts. Second, many models consider the location characteristics of the target words, but ignore the relationships among the target words and among the overall sentences. Third, many models lack transparency in data collection, data processing, and result generation in sentiment analysis. In order to resolve these issues, we propose an aspect-level sentiment analysis model that combines a bidirectional Long Short-Term Memory (LSTM) network and a Graph Convolutional Network (GCN) based on dependency syntax analysis (Bi-LSTM-DGCN). Our model integrates the dependency syntax analysis of the texts, and explicitly considers the natural language semantic characteristics of the texts. It further fuses the representations of the target words and of the overall sentences. Extensive experiments are conducted on four benchmark datasets, i.e., Restaurant14, Laptop, Restaurant16, and Twitter. The experimental results demonstrate that our model outperforms other models like Target-Dependent LSTM (TD-LSTM), Attention-based LSTM with Aspect Embedding (ATAE-LSTM), LSTM+SynATT+TarRep and Convolution over a Dependency Tree (CDT). Our model is further applied to aspect-level sentiment analysis on “government” and “lockdown” in 1,658,250 tweets about “#COVID-19” that we collected from March 1, 2020 to July 1, 2020. The experimental results show that Twitter users’ positive and negative sentiments fluctuated over time. Through the transparency analysis of data collection, data processing, and result generation, we discuss the reasons for the evolution of users’ emotions over time based on the tweets and on our models.
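A minimal sketch (not the authors' implementation) of the kind of model the abstract describes: a bidirectional LSTM produces contextual word states, one graph-convolution step propagates them along a dependency-parse adjacency matrix, and the states at the target-word (aspect) positions are pooled for classification. The layer sizes, the single GCN layer, and the mean pooling over aspect positions are illustrative assumptions.

```python
# Hedged sketch of a Bi-LSTM + dependency-GCN aspect classifier (illustrative only).
import torch
import torch.nn as nn

class BiLSTMDepGCN(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=128, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.gcn_w = nn.Linear(2 * hidden, 2 * hidden)   # one graph-convolution layer
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, token_ids, dep_adj, aspect_mask):
        # token_ids:   (batch, seq)       word indices
        # dep_adj:     (batch, seq, seq)  dependency adjacency (with self-loops)
        # aspect_mask: (batch, seq)       1.0 at target-word positions, else 0.0
        h, _ = self.bilstm(self.embed(token_ids))             # contextual word states
        deg = dep_adj.sum(dim=-1, keepdim=True).clamp(min=1)  # degree normalization
        h = torch.relu(self.gcn_w(dep_adj @ h / deg))         # propagate along dependency arcs
        pooled = (h * aspect_mask.unsqueeze(-1)).sum(1) / aspect_mask.sum(1, keepdim=True).clamp(min=1)
        return self.classifier(pooled)                        # sentiment logits

# toy usage with self-loop-only adjacency and one aspect position
model = BiLSTMDepGCN(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 12)),
               torch.eye(12).repeat(2, 1, 1),
               torch.zeros(2, 12).index_fill_(1, torch.tensor([3]), 1.0))
print(logits.shape)  # (2, 3)
```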


Author(s):  
Patrick Duffley ◽  
Maryse Arseneau

Abstract This study investigates temporal and control interpretations with verbs of risk followed by non-finite complements in English. It addresses two questions: Why does the gerund-participle show variation in the temporal relation between the event it denotes and that of the main verb whereas the to-infinitive manifests a constant temporal relation? Why does the gerund-participle construction allow variation in control while the to-infinitive shows constant subject control readings? The study is based on a corpus of 1345 attested uses. The explanation is framed in a natural-language semantics involving the meanings of the gerund-participle, the infinitive, the preposition to, and the meaning-relation between the matrix and its complement. Temporal and control interpretations are shown to arise as implications grounded in the semantic content of what is linguistically expressed. It is argued that the capacity of a natural-language semantic approach to account for the data obviates the need to have recourse to purely syntactic operations to account for control.


2019 ◽  
Vol 43 (3) ◽  
pp. 499-532
Author(s):  
Patrick Duffley

Abstract This article argues that the logical paraphrases used to describe the meanings of must, need, may, and can obscure the natural-language semantic interaction between these verbs and negation. The purported non-negatability of must is argued to be an illusion created by the indicative-mood paraphrase ‘is necessary’, which treats the necessity as a reality rather than a non-reality. It is proposed that negation coalesces with the modality that must itself expresses to produce a negatively-charged version of must’s modality: the subject of mustn’t is represented as being in a state of constraint in which the only possibility open to the subject is oriented in the opposite direction to the realization of the infinitive’s event. The study also constitutes an argument against a lexicalization analysis: in the combination mustn’t, must and not each contribute their own meaning to the resultant sense, but according to their conceptual status as inherently irrealis notions.
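For orientation only, and precisely the kind of logical paraphrase the article argues is misleading: the scope facts at issue are conventionally rendered with necessity outscoping negation for mustn't and negation outscoping necessity for needn't.

```latex
% conventional modal-logic paraphrases (the article argues these obscure the semantics)
\textit{You mustn't leave} \;\approx\; \Box\,\neg\,\mathrm{leave}   % necessity over negation
\textit{You needn't leave} \;\approx\; \neg\,\Box\,\mathrm{leave}   % negation over necessity
```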


2021 ◽  
pp. 200-207
Author(s):  
Zhu Ping ◽  

Natural language semantic engineering faces the challenges of unknown inputs and knowledge-intensive problems. In order to adapt to the features of natural language semantic engineering, the AI programming language needs to be extended: 1) using multiple means to improve the spatial distribution and coverage of instances; 2) keeping different abstract function versions running at the same time; 3) providing a large number of knowledge configuration files and supporting functions to deal with knowledge-intensive problems; 4) using a highest-possibility-priority call to solve the problem of traversing multiple running branches. This paper introduces the ideas of unknown-oriented programming, the formulation of basic strategies, the language design, and simulated running examples. It provides a new method for the incremental research and development of large-scale natural language semantic engineering applications. Finally, the paper summarizes the work and puts forward directions for further research.
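A minimal sketch (not the author's language or code) of two of the extensions named above, assuming a Python host: several versions of one abstract function stay registered at the same time, and a call tries the most probable branch first, falling back to lower-priority versions when a branch declines the input. The registry, the probability scores, and the toy parse_time example are illustrative assumptions.

```python
# Hedged sketch: multiple coexisting function versions with most-probable-branch-first dispatch.
from typing import Callable, List, Tuple

class AbstractFunction:
    def __init__(self, name: str):
        self.name = name
        self.versions: List[Tuple[float, Callable]] = []   # (estimated probability, implementation)

    def register(self, probability: float):
        def wrap(impl: Callable):
            self.versions.append((probability, impl))
            self.versions.sort(key=lambda v: v[0], reverse=True)   # highest priority first
            return impl
        return wrap

    def __call__(self, *args, **kwargs):
        for _, impl in self.versions:            # try the most probable branch first
            try:
                return impl(*args, **kwargs)
            except NotImplementedError:          # branch declines the input; fall back
                continue
        raise ValueError(f"{self.name}: no version could handle the input")

parse_time = AbstractFunction("parse_time")

@parse_time.register(0.8)
def _iso(text):
    if "-" not in text:
        raise NotImplementedError
    return tuple(int(p) for p in text.split("-"))

@parse_time.register(0.2)
def _year_only(text):
    return (int(text), 1, 1)

print(parse_time("2021-05-01"))   # handled by the high-probability version
print(parse_time("2021"))         # falls back to the year-only version
```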


2018 ◽  
Vol 57 (3) ◽  
pp. 603-619 ◽  
Author(s):  
Suzhen Wang ◽  
Lu Zhang ◽  
Yanpiao Zhang ◽  
Jieli Sun ◽  
Chaoyi Pang ◽  
...  
