Embedding Metadata-Enriched Graphs
This paper presents ongoing research in which we study the problem of embedding metadata-enriched graphs, with a focus on knowledge graphs, in a vector space using transformer-based deep neural networks. Experimentally, we compare, ceteris paribus, the performance of a transformer-based model with that of non-transformer approaches. Due to their recent success in natural language processing, we hypothesize that the former is superior in performance. We test this hypothesis by comparing the performance of transformer embeddings with non-transformer embeddings on different downstream tasks. Our research might contribute to a better understanding of how random walks influence the learning of features, which might be useful in the design of deep learning architectures for graphs when the input is generated with random walks.
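To make the random-walk-based setup concrete, the sketch below illustrates one way such input could be generated and then embedded with a non-transformer baseline; the graph, walk parameters, and the use of gensim's Word2Vec are illustrative assumptions for exposition, not the evaluation setup of this work. The same walk corpus could equally be treated as token sequences and fed to a transformer encoder.

```python
# Illustrative sketch (not the authors' pipeline): generate uniform random walks
# over a graph and learn node embeddings from them with a skip-gram baseline.
import random

import networkx as nx
from gensim.models import Word2Vec  # non-transformer baseline embedder


def random_walks(G, num_walks=10, walk_length=20, seed=42):
    """Uniform random walks; biased (node2vec-style) walks would be a drop-in change."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in G.nodes():
            walk = [start]
            while len(walk) < walk_length:
                neighbors = list(G.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(rng.choice(neighbors))
            walks.append([str(n) for n in walk])  # node IDs as tokens
    return walks


if __name__ == "__main__":
    G = nx.karate_club_graph()          # toy stand-in for a metadata-enriched graph
    walks = random_walks(G)
    model = Word2Vec(sentences=walks, vector_size=64, window=5,
                     min_count=1, sg=1, epochs=5)
    print(model.wv["0"][:5])            # embedding of node 0 (first 5 dimensions)
```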