Measuring Entity Relatedness via Entity and Text Joint Embedding

2018 ◽  
Vol 50 (2) ◽  
pp. 1861-1875 ◽  
Author(s):  
Weixin Zeng ◽  
Jiuyang Tang ◽  
Xiang Zhao
1985 ◽  
Vol 50 (3) ◽  
pp. 604-610
Author(s):  
Francoise Point

The starting point of this work was Saracino and Wood's description of the finitely generic abelian ordered groups [S-W]. We generalize the result of Saracino and Wood to a class ∑UH of subdirect products of substructures of elements of a class ∑, which has some relationships with the discriminator variety V(∑t) generated by ∑. More precisely, let ∑ be an elementary class of L-algebras with theory T. Burris and Werner have shown that if ∑ has a model companion, then the existentially closed models in the discriminator variety V(∑t) form an elementary class, which they have axiomatized. In general, it is not the case that the existentially closed elements of ∑UH form an elementary class: take, for instance, for ∑ the class ∑0 of linearly ordered abelian groups (see [G-P]). We determine the finitely generic elements of ∑UH via the following three conditions on T: (1) there is an open L-formula which says, in any element of ∑UH, that the complements of equalizers do not overlap; (2) there is an existentially closed element of ∑UH which is an L-reduct of an element of V(∑t) and whose L-extensions respect the relationships between the complements of the equalizers; (3) for any models A, B of T, there exists a model C of TUH such that A and B embed in C. (Condition (3) is weaker than "T has the joint embedding property". It is satisfied, for example, if every model of T has a one-element substructure. Condition (3) implies that ∑UH has the joint embedding property, and therefore that the class of finitely generic elements of ∑UH is complete.)
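For reference, condition (3) and the joint embedding property it weakens can be stated side by side; this is a standard formulation, not taken verbatim from the paper:

```latex
% Joint embedding property (JEP) for a theory T:
% any two models of T embed into a common model of T.
\forall A, B \models T \;\; \exists C \models T \;\;
  A \hookrightarrow C \ \text{and}\ B \hookrightarrow C
% Condition (3) weakens this by only requiring the common
% extension C to be a model of T_{UH} rather than of T:
\forall A, B \models T \;\; \exists C \models T_{UH} \;\;
  A \hookrightarrow C \ \text{and}\ B \hookrightarrow C
```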


2014 ◽  
Vol 44 (6) ◽  
pp. 793-804 ◽  
Author(s):  
Chenping Hou ◽  
Feiping Nie ◽  
Xuelong Li ◽  
Dongyun Yi ◽  
Yi Wu

2022 ◽  
Vol 40 (3) ◽  
pp. 1-30
Author(s):  
Procheta Sen ◽  
Debasis Ganguly ◽  
Gareth J. F. Jones

Reducing user effort in finding relevant information is one of the key objectives of search systems. Existing approaches have been shown to effectively exploit the context of a user's current search session to automatically suggest queries that reduce search effort. However, these approaches do not accomplish the end goal of a search system: retrieving a set of potentially relevant documents for the evolving information need during a search session. This article takes the problem of query prediction one step further by investigating the problem of contextual recommendation within a search session. More specifically, given the partial context information of a session in the form of a small number of queries, we investigate how a search system can effectively predict the documents that a user would have been presented with had they continued the search session by submitting subsequent queries. To address the problem, we propose a model of contextual recommendation that seeks to capture the underlying semantics of information-need transitions in the current user's search context. This model leverages information from past interactions of other users with similar sessions in an existing search log. To identify similar interactions, as a novel contribution, we propose an embedding approach that jointly learns representations of individual query terms and of whole queries from search log data by leveraging session-level containment relationships. Our experiments, conducted on a large query log, namely the AOL log, demonstrate that using a joint embedding of queries and their terms within our proposed document retrieval framework outperforms a number of text-only and sequence-modeling-based baselines.
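To make the containment idea concrete, here is a minimal sketch of embedding whole queries and their individual terms into one vector space. Each session is treated as a bag of tokens that contains both the full query strings and their constituent terms (the containment link), and the token co-occurrence matrix is factorized with a truncated SVD. The sessions, tokens, and SVD factorization below are illustrative assumptions; the paper's actual model is a neural embedding trained on the AOL log, not this sketch.

```python
import numpy as np

# Toy sessions: each contains whole queries plus their terms.
sessions = [
    ["cheap flights", "cheap", "flights", "flight deals", "flight", "deals"],
    ["python tutorial", "python", "tutorial", "learn python", "learn"],
    ["flight deals", "flight", "deals", "cheap flights", "cheap", "flights"],
]

vocab = sorted({tok for s in sessions for tok in s})
index = {tok: i for i, tok in enumerate(vocab)}

# Co-occurrence: tokens sharing a session co-occur once per session.
C = np.zeros((len(vocab), len(vocab)))
for s in sessions:
    for a in s:
        for b in s:
            if a != b:
                C[index[a], index[b]] += 1.0

# Truncated SVD yields low-dimensional vectors for queries and
# terms in the same space.
U, S, _ = np.linalg.svd(C)
k = 4
emb = U[:, :k] * S[:k]

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

# A whole query can now be compared directly with another whole
# query, or with a single term.
sim_related = cosine(emb[index["cheap flights"]], emb[index["flight deals"]])
sim_unrelated = cosine(emb[index["cheap flights"]], emb[index["python tutorial"]])
```

Because "cheap flights" and "flight deals" share sessions while "python tutorial" never does, the related pair ends up closer in the embedding space.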


2015 ◽  
Vol 65 (4) ◽  
Author(s):  
Tommaso Flaminio ◽  
Matteo Bianchi

Abstract: In this short paper we discuss saturated and κ-saturated models of many-valued (t-norm based) fuzzy logics. Using these particular structures, we show a representation theorem à la Di Nola for several classes of algebras, including MV, Gödel, product, BL, NM and WNM-algebras. Then, still using (κ-)saturated algebras, we finally show that some relevant subclasses of algebras related to many-valued logics also enjoy the joint embedding property and the amalgamation property.
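For reference, the amalgamation property mentioned here can be stated as follows; this is the standard formulation for a class K of algebras, not specific to this paper:

```latex
% Amalgamation property (AP) for a class K: any two embeddings
% f: A -> B and g: A -> C over a common base A can be completed
% to a commuting square inside K.
\forall A, B, C \in K,\;
\forall f\colon A \hookrightarrow B,\;
 g\colon A \hookrightarrow C,\;
\exists D \in K,\;
\exists f'\colon B \hookrightarrow D,\;
 g'\colon C \hookrightarrow D
\quad\text{with}\quad f' \circ f = g' \circ g.
```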


2020 ◽  
Vol 30 (9) ◽  
pp. 3226-3237 ◽  
Author(s):  
Jianjun Lei ◽  
Yuxin Song ◽  
Bo Peng ◽  
Zhanyu Ma ◽  
Ling Shao ◽  
...  

2020 ◽  
Vol 34 (03) ◽  
pp. 2950-2958
Author(s):  
Guanglin Niu ◽  
Yongfei Zhang ◽  
Bo Li ◽  
Peng Cui ◽  
Si Liu ◽  
...  

Representation learning on a knowledge graph (KG) aims to embed the entities and relations of a KG into low-dimensional continuous vector spaces. Early KG embedding methods attend only to the structured information encoded in triples, which limits their performance due to the structural sparseness of KGs. Some recent attempts consider path information to expand the structure of KGs, but lack explainability in the process of obtaining the path representations. In this paper, we propose a novel Rule and Path-based Joint Embedding (RPJE) scheme, which takes full advantage of the explainability and accuracy of logic rules, the generalization ability of KG embedding, and the supplementary semantic structure of paths. Specifically, logic rules of different lengths (the number of relations in the rule body), in the form of Horn clauses, are first mined from the KG and elaborately encoded for representation learning. Then, rules of length 2 are applied to compose paths accurately, while rules of length 1 are explicitly employed to create semantic associations among relations and to constrain relation embeddings. Moreover, the confidence level of each rule is considered in optimization to guarantee the availability of applying the rule to representation learning. Extensive experimental results illustrate that RPJE outperforms state-of-the-art baselines on the KG completion task, demonstrating the superiority of utilizing logic rules as well as paths for improving the accuracy and explainability of representation learning.
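The rule-guided path composition step can be sketched symbolically: a length-2 Horn rule (r1, r2) ⇒ r with a confidence score rewrites a two-hop relation path into a single relation, and only rules above a confidence threshold are applied. The rule names, data, and greedy rewriting strategy below are invented for illustration; RPJE itself learns continuous embeddings alongside this symbolic step.

```python
# Length-2 Horn rules mined from the KG:
# body (r1, r2) maps to (head relation, confidence).
RULES_LEN2 = {
    ("born_in", "city_of"): ("nationality", 0.9),
}

def compose_path(relations, min_conf=0.5):
    """Greedily rewrite adjacent relation pairs using length-2 rules
    whose confidence is at least min_conf."""
    rels = list(relations)
    changed = True
    while changed and len(rels) > 1:
        changed = False
        for i in range(len(rels) - 1):
            rule = RULES_LEN2.get((rels[i], rels[i + 1]))
            if rule and rule[1] >= min_conf:
                head, _conf = rule
                rels[i:i + 2] = [head]  # replace the pair by the rule head
                changed = True
                break
    return rels

path = ["born_in", "city_of"]
composed = compose_path(path)  # -> ["nationality"]
```

When no rule matches (or its confidence is too low), the path is left uncomposed, mirroring the paper's reliance on rule confidence to gate rule application.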


Author(s):  
Zhichun Wang ◽  
Rongyu Wang ◽  
Danlu Wen ◽  
Yong Huang ◽  
Chu Li
