semantic encoding
Recently Published Documents


TOTAL DOCUMENTS: 87 (FIVE YEARS: 6)

H-INDEX: 21 (FIVE YEARS: 0)

Author(s):  
Haonan Li ◽  
Ehsan Hamzei ◽  
Ivan Majic ◽  
Hua Hua ◽  
Jochen Renz ◽  
...  

Existing question answering systems struggle to answer factoid questions when geospatial information is involved. This is because most systems cannot accurately detect geospatial semantic elements in natural language questions, or capture the semantic relationships between those elements. In this paper, we propose a geospatial semantic encoding schema and a semantic graph representation which capture the semantic relations and dependencies in geospatial questions. We demonstrate that our proposed graph representation approach aids in the translation from natural language to a formal, executable expression in a query language. To decrease the need for people to provide explanatory information as part of their question and to make the translation fully automatic, we treat the semantic encoding of the question as a sequential tagging task, and the graph generation of the query as a semantic dependency parsing task. We apply neural network approaches to automatically encode geospatial questions into spatial semantic graph representations. Compared with current template-based approaches, our method generalises to a broader range of questions, including those with complex syntax and semantics. Our proposed approach achieves better results on GeoData201 than existing methods.
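The two-stage pipeline described in this abstract, token-level semantic tagging followed by assembly of the tags into a semantic graph, can be illustrated with a minimal sketch. The tag set, the relation labels, and the toy rule-based tagger below are assumptions for illustration only; the paper's actual encoding schema and neural models are not reproduced here.

```python
# Illustrative sketch only: (1) tag the tokens of a geospatial question with
# semantic labels, (2) assemble the tagged tokens into a small semantic graph.
# Tag names ("PLACE", "SPATIAL_REL", "TYPE") are assumed, not the paper's schema.

from dataclasses import dataclass, field

@dataclass
class TaggedToken:
    text: str
    tag: str  # e.g. "PLACE", "SPATIAL_REL", "TYPE", or "O" for other

@dataclass
class SemanticGraph:
    nodes: dict = field(default_factory=dict)  # node id -> label
    edges: list = field(default_factory=list)  # (head id, relation, dependent id)

def tag_question(tokens):
    """Toy lexicon-based tagger standing in for the neural sequence tagger."""
    lexicon = {"restaurants": "TYPE", "near": "SPATIAL_REL", "Melbourne": "PLACE"}
    return [TaggedToken(t, lexicon.get(t, "O")) for t in tokens]

def build_graph(tagged):
    """Toy graph builder standing in for the semantic dependency parser."""
    graph = SemanticGraph()
    # Entities and types become nodes; spatial relations become edge labels.
    for i, tok in enumerate(tagged):
        if tok.tag not in ("O", "SPATIAL_REL"):
            graph.nodes[i] = f"{tok.tag}:{tok.text}"
    for i, tok in enumerate(tagged):
        if tok.tag == "SPATIAL_REL":
            left = next((j for j in range(i - 1, -1, -1) if j in graph.nodes), None)
            right = next((j for j in range(i + 1, len(tagged)) if j in graph.nodes), None)
            if left is not None and right is not None:
                graph.edges.append((left, tok.text, right))
    return graph

if __name__ == "__main__":
    question = "Which restaurants are near Melbourne".split()
    tagged = tag_question(question)
    graph = build_graph(tagged)
    print([(t.text, t.tag) for t in tagged])
    print(graph.nodes, graph.edges)
```

In the paper's approach, both steps are learned models rather than lexicon lookups and adjacency heuristics; the sketch only shows the shape of the intermediate representation that makes the subsequent translation to an executable query tractable.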


PLoS ONE ◽  
2021 ◽  
Vol 16 (4) ◽  
pp. e0248044
Author(s):  
Ross Lawrence ◽  
Xiaoqian J. Chai

Information that is encoded in relation to the self has been shown to be better remembered, yet reports have disagreed on whether the memory benefit from self-referential encoding extends to source memory (the context in which information was learned). In this study, we investigated the self-referential effect on source memory in recollection and familiarity-based memory. Using a Remember/Know paradigm, we compared source memory accuracy under self-referential encoding and semantic encoding. Two types of source information were included: a "peripheral" source that was not inherent to the encoding activity, and source information about the encoding context. We observed facilitation of item memory under self-referential encoding compared with semantic encoding in recollection, but not in familiarity-based memory. The self-referential benefit to source accuracy was observed in recollection memory, with source memory for the encoding context being stronger in the self-referential condition. No significant self-referential effect was observed for peripheral source information (information the participant was not required to focus on), suggesting that not all source information benefits from self-referential encoding. Self-referential encoding also resulted in a higher ratio of "Remember" to "Know" responses than semantic encoding, denoting stronger recollection. These results suggest that self-referential encoding creates a richer, more detailed memory trace that can be recollected later on.


Author(s):  
Xiaoyan Meng ◽  
Guoliang Zhang ◽  
Songmin Jia ◽  
Xiuzhi Li ◽  
Xiangyin Zhang

2019 ◽  
Author(s):  
Ross Lawrence ◽  
Xiaoqian Chai

Self-referential encoding has previously been shown to enhance memory. The self-referential facilitation effect has also been found in source memory (memory with contextual details). In this study, we investigated how subjective recollection interacts with the self-referential effect for source memory. Using a remember/know paradigm, we compared source memory accuracy under self-referential encoding (SRE) and semantic encoding. Two types of source information were included: a "peripheral" source that was not inherent to the encoding activity, and source information about the encoding context. SRE benefits to source memory accuracy were observed in recollection for both types of source information, but not in familiarity-based memory. In contrast, for familiarity-based memory, semantic encoding resulted in higher source accuracy for the background source compared with self-referential encoding. These results suggest that self-referential encoding creates a richer, more detailed memory trace that can be recollected later on.


2018 ◽  
Vol 12 ◽  
Author(s):  
Cara E. Van Uden ◽  
Samuel A. Nastase ◽  
Andrew C. Connolly ◽  
Ma Feilong ◽  
Isabella Hansen ◽  
...  

Encoding models for mapping voxelwise semantic tuning are typically estimated separately for each individual, limiting their generalizability. In the current report, we develop a method for estimating semantic encoding models that generalize across individuals. Functional MRI was used to measure brain responses while participants freely viewed a naturalistic audiovisual movie. Word embeddings capturing agent-, action-, object-, and scene-related semantic content were assigned to each imaging volume based on an annotation of the film. We constructed both conventional within-subject semantic encoding models and between-subject models where the model was trained on a subset of participants and validated on a left-out participant. Between-subject models were trained using cortical surface-based anatomical normalization or surface-based whole-cortex hyperalignment. We used hyperalignment to project group data into an individual's unique anatomical space via a common representational space, thus leveraging a larger volume of data for out-of-sample prediction while preserving the individual's fine-grained functional–anatomical idiosyncrasies. Our findings demonstrate that anatomical normalization degrades the spatial specificity of between-subject encoding models relative to within-subject models. Hyperalignment, on the other hand, recovers the spatial specificity of semantic tuning lost during anatomical normalization, and yields model performance exceeding that of within-subject models.
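A leave-one-subject-out encoding analysis of the kind described above can be sketched as follows. This is an illustrative toy with synthetic data: ridge regression maps shared word-embedding features to voxel responses, the model is trained on the remaining subjects, tested on the left-out subject, and scored by per-voxel Pearson correlation. The array shapes, the regularization strength, and the use of scikit-learn's Ridge are assumptions; a real analysis would use responses that have been anatomically normalized or hyperaligned to a common space beforehand.

```python
# Illustrative sketch with synthetic data: between-subject voxelwise encoding
# model trained on the group and validated on a left-out participant.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_subjects, n_trs, n_features, n_voxels = 4, 300, 50, 200

# Shared stimulus features (e.g. word embeddings per imaging volume) and
# synthetic per-subject voxel responses driven by a common weight matrix.
features = rng.standard_normal((n_trs, n_features))
true_weights = rng.standard_normal((n_features, n_voxels))
responses = [
    features @ true_weights + rng.standard_normal((n_trs, n_voxels))
    for _ in range(n_subjects)
]

def loso_encoding_scores(features, responses, alpha=10.0):
    """Train on the averaged responses of all-but-one subject (first half of the
    movie), predict the left-out subject's responses (second half), and return
    per-voxel prediction correlations for each left-out subject."""
    train_idx = slice(0, features.shape[0] // 2)
    test_idx = slice(features.shape[0] // 2, features.shape[0])
    scores = []
    for test_subj in range(len(responses)):
        group_resp = np.mean(
            [responses[s] for s in range(len(responses)) if s != test_subj], axis=0
        )
        model = Ridge(alpha=alpha).fit(features[train_idx], group_resp[train_idx])
        pred = model.predict(features[test_idx])
        actual = responses[test_subj][test_idx]
        r = [np.corrcoef(pred[:, v], actual[:, v])[0, 1] for v in range(actual.shape[1])]
        scores.append(np.array(r))
    return np.stack(scores)  # shape: (n_subjects, n_voxels)

print(loso_encoding_scores(features, responses).mean())
```

In the study above, the key manipulation is what "common space" the group responses live in before this regression: surface-based anatomical normalization versus whole-cortex hyperalignment, with hyperalignment recovering the fine-grained spatial specificity that anatomical alignment alone loses.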


2018 ◽  
Vol 35 (6) ◽  
pp. 746-749 ◽  
Author(s):  
KA Honn ◽  
DA Grant ◽  
JM Hinson ◽  
P Whitney ◽  
HPA Van Dongen
