MusicTM-Dataset for Joint Representation Learning Among Sheet Music, Lyrics, and Musical Audio

Author(s): Donghuo Zeng, Yi Yu, Keizo Oyama
Author(s): Yuqiao Yang, Xiaoqiang Lin, Geng Lin, Zengfeng Huang, Changjian Jiang, ...

In this paper, we explore learning representations of legislation and legislators for predicting roll call results. The most popular approach to this task is the ideal point model, which relies on historical voting records to learn legislator representations but largely ignores the contextual information in legislative data. We therefore propose to incorporate context information to learn dense representations for both legislators and legislation. For legislators, we model the relations among them with graph convolutional networks (GCNs); for legislation, we encode its narrative description with recurrent neural networks (RNNs). To align the two kinds of representations in the same vector space, we introduce a triplet loss for joint training. Experimental results on a self-constructed dataset show the effectiveness of our model for roll call result prediction compared with several state-of-the-art baselines.
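The alignment step described above can be illustrated with a minimal sketch of the triplet loss. This is not the authors' implementation; it is a toy NumPy version under the usual hinge formulation, where a legislator embedding (anchor) should lie closer to a bill it supported (positive) than to one it opposed (negative). The variable names and toy vectors are illustrative assumptions.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: pull the anchor toward the positive
    example and push it away from the negative by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(0.0, margin + d_pos - d_neg)

# Toy embeddings: a legislator (anchor), a bill voted "yea" (positive),
# and a bill voted "nay" (negative), all in the same vector space.
leg = np.array([1.0, 0.0])
bill_yea = np.array([0.9, 0.1])
bill_nay = np.array([0.0, 1.0])

print(triplet_loss(leg, bill_yea, bill_nay))  # already satisfies the margin
print(triplet_loss(leg, bill_nay, bill_yea))  # violated triplet, positive loss
```

During joint training, the GCN and RNN outputs would play the roles of the anchor and bill vectors, and the loss would be minimized by gradient descent rather than evaluated once as here.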


2020, Vol. 385, pp. 132-147
Author(s): Adrien Lagrange, Mathieu Fauvel, Stéphane May, José Bioucas-Dias, Nicolas Dobigeon

Author(s): Pei Ke, Haozhe Ji, Yu Ran, Xin Cui, Liwei Wang, ...

Author(s): Han Zhao, Xu Yang, Zhenru Wang, Erkun Yang, Cheng Deng

By contrasting positive and negative counterparts, graph contrastive learning has become a prominent technique for unsupervised graph representation learning. However, existing methods ignore class information, so random negative sampling introduces false-negative samples and degrades performance. To this end, we propose a graph debiased contrastive learning framework that jointly performs representation learning and clustering. Specifically, representations are optimized by aligning with clustered class information, and the optimized representations in turn improve clustering, yielding more powerful representations and better clustering results. More importantly, we randomly select negative samples only from clusters different from the positive sample's cluster; used as supervisory signals in this way, the clustering results effectively reduce false negatives. Extensive experiments on five datasets demonstrate that our method achieves new state-of-the-art results on graph clustering and classification tasks.
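The debiased negative sampling described above can be sketched in a few lines. This is a simplified illustration, not the paper's code: it assumes cluster assignments have already been produced (here a toy label array) and draws negatives only from clusters other than the anchor's, which is exactly how clustering results act as a supervisory signal against false negatives.

```python
import numpy as np

def debiased_negatives(labels, anchor_idx, num_neg, rng):
    """Sample negative node indices only from clusters different from
    the anchor's cluster, so same-class nodes are never drawn as
    (false) negatives."""
    candidates = np.flatnonzero(labels != labels[anchor_idx])
    return rng.choice(candidates, size=num_neg, replace=False)

rng = np.random.default_rng(0)
labels = np.array([0, 0, 1, 1, 2, 2])  # toy cluster assignments for 6 nodes
negs = debiased_negatives(labels, anchor_idx=0, num_neg=2, rng=rng)
print(negs)  # indices drawn only from clusters 1 and 2
```

In the full framework, `labels` would come from the clustering head that is trained jointly with the encoder, and the sampled indices would feed the contrastive loss in place of uniformly random negatives.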


2021, Vol. 166, pp. 113913
Author(s): Guang-Yu Zhang, Yu-Ren Zhou, Chang-Dong Wang, Dong Huang, Xiao-Yu He
