Computational Semantics
Recently Published Documents

TOTAL DOCUMENTS: 93 (five years: 16)
H-INDEX: 8 (five years: 0)

Author(s): Du Duc Nguyen, Phong Dinh Pham

The Fuzzy Rule-Based Classifier (FRBC) design problem has been widely studied because of its many practical applications. Hedge Algebras based Classifier Design Methods (HACDMs) are outstanding and effective approaches because they rest on a mathematical formalism that allows fuzzy-set-based computational semantics to be generated from the inherent qualitative semantics of linguistic terms. HACDMs comprise a two-phase optimization process: the first phase optimizes the semantic parameter values by applying an optimization algorithm; the second phase then extracts the optimal fuzzy rule-based system for the FRBC from the optimal semantic parameter values supplied by the first phase. The performance of FRBC design methods therefore depends on the quality of the optimization algorithms applied. This paper presents a co-optimization Particle Swarm Optimization (PSO) algorithm for designing FRBCs with trapezoidal-fuzzy-set-based computational semantics generated by Enlarged Hedge Algebras (EHAs). Experiments over 23 real-world datasets show that the EHA-based classifier with the proposed co-optimization PSO algorithm outperforms both the existing classifiers designed with the two-phase EHA methodology and the existing fuzzy-set-theory-based classifiers.
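As a rough sketch of how a PSO loop optimizes a parameter vector, the following minimal Python illustration uses a stand-in quadratic objective in place of the classifier-accuracy objective the paper would use; all names, bounds, and hyperparameter settings here are illustrative, not the authors' configuration:

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (minimization); illustrative only."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)  # clamp to bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
# Stand-in objective: in the paper this would be classifier error as a
# function of the semantic parameter values; here, a simple quadratic bowl.
best, err = pso(lambda p: sum((x - 0.5) ** 2 for x in p), dim=4, bounds=(0.0, 1.0))
```

In a co-optimization setting, the position vector would bundle the semantic parameters together with rule-selection variables rather than optimizing them in two separate phases.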


2021
Author(s): Maria Francisca Alonso-Sanchez, Sabrina D Ford, Michael MacKinley, Angelica M Silva, Roberto Limongi, ...

Computational semantics, a branch of computational linguistics, involves automated meaning analysis that relies on how words occur together in natural language, offering a promising tool for studying schizophrenia. At present, we do not know whether these word-level choices in speech are sensitive to illness stage (i.e. acute untreated vs. stable established state), track cognitive deficits in major domains (e.g. cognitive control, processing speed), or relate to established dimensions of formal thought disorder. Here we study samples of descriptive discourse in patients with an untreated first episode of schizophrenia (x̅ 2.8 days of lifetime daily dose exposure) and healthy subjects (246 samples of 1-minute speech; n=82, FES=46, HC=36) using a co-occurrence-based vector embedding of words. We obtained six-month follow-up data in a subsample (99 speech samples, n=33, FES=20, HC=13). At baseline, the evidence for higher semantic similarity during descriptive discourse in FES, compared to a null difference, was substantial (Bayes Factor = 6 for the full description; 32 for a 10-word window). Moreover, there was a linear increase in semantic similarity with time in FES compared to HC (Bayes Factor = 6). Higher semantic similarity was related to lower Stroop performance (accuracy and interference, response time) and was present irrespective of the severity of clinically ascertained thought disorder. Automated analysis of non-intrusive 1-minute speech samples provides a window on cognitive control deficits and role functioning, and tracks latent progression in schizophrenia.
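The windowed-similarity measure this kind of study relies on can be sketched in a few lines of NumPy; this is a minimal illustration that assumes word vectors are already available, not the authors' pipeline, and the toy vectors are invented:

```python
import numpy as np

def mean_pairwise_similarity(vectors):
    """Mean cosine similarity over all word pairs in a window of speech."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)  # unit-normalize rows
    sims = v @ v.T                                    # pairwise cosine matrix
    iu = np.triu_indices(len(v), k=1)                 # upper triangle, no diagonal
    return float(sims[iu].mean())

# Repetitive word choices (similar vectors) score higher than varied ones.
repetitive = [[1.0, 0.1], [0.9, 0.2], [1.0, 0.0]]
varied = [[1.0, 0.0], [0.0, 1.0], [0.5, -0.5]]
```

Sliding this measure over, say, 10-word windows of a transcript and averaging yields a single per-sample similarity score of the kind compared between groups.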


2021
Vol 9 (3A)
Author(s): Salah Alnajem, A. M. Mutawa, Hanan AlMeer, Aseel AlQemlas, ...

This paper introduces a computational approach to Arabic syntax within the Lexical Functional Grammar (LFG) framework. Semantic networks and frames handle the computational semantics using lambda notation. The system is implemented in Prolog, with Definite Clause Grammar (DCG) as the formalism for analyzing and generating syntactic structure.
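The lambda-notation style of semantic composition can be illustrated in miniature; the toy below is in Python rather than the authors' Prolog, and the denotations are invented for illustration:

```python
# Toy lambda-style composition: an intransitive verb denotes a function
# from entities to truth values; the sentence meaning is function application.
RUNNERS = {"john"}               # hypothetical model: the set of running entities

john = "john"                    # a proper name denotes an entity (type e)
runs = lambda x: x in RUNNERS    # verb denotation, type e -> t

sentence_meaning = runs(john)    # "John runs": apply verb meaning to subject
```

In a DCG-based system, the same composition is carried along the parse: each grammar rule pairs a syntactic category with a lambda term, and parsing a sentence beta-reduces the terms into its truth-conditional meaning.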


Studia Logica
2021
Author(s): Cosimo Perini Brogi

Abstract: This paper introduces a natural deduction calculus for the intuitionistic logic of belief $\mathsf{IEL}^{-}$, which is easily turned into a modal $\lambda$-calculus giving a computational semantics for deductions in $\mathsf{IEL}^{-}$. Using that interpretation, it is also proved that $\mathsf{IEL}^{-}$ has good proof-theoretic properties. The correspondence between deductions and typed terms is then extended to a categorical semantics for identity of proofs in $\mathsf{IEL}^{-}$, showing the general structure of such a modality for belief in an intuitionistic framework.


Author(s): Satyen M. Parikh, Mitali K. Shah

Natural language processing (NLP) is an application of computational semantics. Any opinion expressed through attitudes, feelings, and thoughts can be identified as sentiment, and the views of people on specific events, brands, products, or organizations can be recognized through sentiment analysis. Each premise can be grouped into one of three categories: positive, negative, or neutral. Twitter, the most commonly used microblogging platform, is used to gather data for this research; Tweepy is used to access Twitter's data source, and the classification algorithms are implemented in Python. Sentiment analysis involves two steps: feature extraction and classification. Features are extracted using an n-gram modeling methodology, and a supervised machine learning algorithm grades the sentiment as positive, negative, or neutral. Support vector machine (SVM) and k-nearest neighbor (KNN) classification models are applied and compared.
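The two-step pipeline the abstract describes (n-gram feature extraction, then classification) can be sketched without external libraries; the tiny corpus, the cosine measure, and the nearest-neighbor variant below are illustrative stand-ins, not the authors' setup:

```python
from collections import Counter

def ngram_features(text, n=2):
    """Bag of unigrams plus word n-grams (the n-gram modelling step)."""
    tokens = text.lower().split()
    feats = Counter(tokens)
    feats.update(" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    return feats

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in set(a) | set(b))
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def knn_predict(train, text, k=3):
    """Classify by majority vote among the k most similar training texts."""
    feats = ngram_features(text)
    ranked = sorted(train, key=lambda tv: cosine(ngram_features(tv[0]), feats),
                    reverse=True)
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

train = [
    ("i love this great phone", "positive"),
    ("best purchase ever love it", "positive"),
    ("terrible battery awful screen", "negative"),
    ("awful service would not recommend", "negative"),
]
pred = knn_predict(train, "love this phone so much", k=1)
```

An SVM would replace the vote with a learned separating hyperplane over the same n-gram feature vectors; the feature-extraction step is shared between the two classifiers being compared.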


English Today
2019
Vol 36 (4)
pp. 33-39
Author(s): Yaqian Shi, Lei Lei

Semantic shifts have been explored via a range of methods (Allan & Robinson, 2012). Traditionally, they were noted or described through methods such as literature review or dictionary checking (e.g. Blank & Koch, 1999; Stockwell & Minkova, 2001; Williams, 1976), which are labour-intensive and time-consuming. More recently developed methods involve sociolinguistic interviews (Robinson, 2012; Sandow & Robinson, 2018). However, with the development of large corpora and computational semantics, diachronic semantic shifts have begun to be captured in a data-driven way (Kutuzov et al., 2018). Recently, the word-embeddings technique (Mikolov et al., 2013) has proven to be a promising tool for tracking semantic shifts (e.g. Hamilton, Leskovec & Jurafsky, 2016a, 2016b; Kulkarni et al., 2015; Kutuzov et al., 2017). For example, Hamilton et al. (2016b) exemplified how to use the technique to capture the subjectification process of the word 'actually' during the 20th century.
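The core measurement behind embedding-based shift detection can be sketched with toy vectors; the vectors below are invented for illustration, whereas real studies train an embedding per time period and align the spaces before comparing:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical aligned embeddings of one word in two time periods.
vec_1900 = np.array([0.9, 0.1, 0.2])   # early usage
vec_2000 = np.array([0.3, 0.8, 0.5])   # later usage, drifted in meaning

shift = 1.0 - cosine(vec_1900, vec_2000)   # larger value = larger semantic shift
```

Ranking all words by this shift score is how data-driven studies surface candidates such as 'actually' for closer qualitative analysis.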


2019
Vol 57 (2)
pp. 233
Author(s): Nguyen Thu Anh, Tran Thai Son

The real-world-semantics interpretability concept of fuzzy systems introduced in [1] is new in both methodology and application. It answers the need for a mathematical basis on which to construct the computational semantics of linguistic words, so that a method developed by handling the computational semantics of linguistic terms, simulating a human method that handles words directly, can produce outputs similar to those of the human method. Since the real world of each application problem has its own structure, described by certain linguistic expressions, this requirement can be ensured by imposing constraints on the interpretation that assigns computational objects in an appropriate computational structure to the words, so that the relationships between the computational semantics in that structure mirror the relationships between the real-world objects described by the word expressions. This study discusses the concept of real-world-semantics interpretability in more detail and points out that this requirement poses a challenge to the study of the interpretability of fuzzy systems, especially for approaches within the fuzzy-set framework. The methodological challenge is that both the computational expression representing a given linguistic fuzzy rule base and the approximate reasoning method working on that expression must preserve the real-world semantics of the application problem. Fortunately, the hedge algebra (HA) based approach demonstrates that the graphical representation of fuzzy-system rules and the interpolation reasoning method on them are able to preserve the real-world semantics of the given application problem's real-world counterpart.

