Negative-supervised capsule graph neural network for few-shot text classification
Few-shot text classification aims to learn a classifier from only a few labeled text examples. Existing studies on this topic mainly adopt prototypical networks and focus on the interaction between support-set and query instances to learn generalized class prototypes. However, during encoding, these methods attend only to the matching information between the support set and query instances, ignoring much useful information about intra-class similarity and inter-class dissimilarity among the support samples. Therefore, in this paper we propose a negative-supervised capsule graph neural network (NSCGNN) which explicitly exploits the similarity and dissimilarity between samples to pull text representations of the same class closer together and push those of different classes farther apart, yielding representative and discriminative class prototypes. We first construct a graph to obtain text representations in the form of node capsules, where both intra-class similarity and inter-class dissimilarity among all samples are exploited through information aggregation and negative supervision. Then, to induce generalized class prototypes from the node capsules produced by the graph neural network, our model employs the dynamic routing algorithm. Experimental results demonstrate the effectiveness of the proposed NSCGNN model, which outperforms existing few-shot approaches on three benchmark datasets.
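Because the abstract only outlines the method, the snippet below is a minimal illustrative sketch in PyTorch rather than the authors' implementation. It assumes a standard Sabour-style dynamic routing step that collapses a class's support node capsules into a single prototype capsule, and a hypothetical cosine-margin hinge term as the negative-supervision signal; the capsule dimensionality, margin value, and number of routing iterations are all assumptions.

```python
# Illustrative sketch only: squash nonlinearity, dynamic routing from support
# node capsules to a class prototype, and a hinge-style negative-supervision
# term that pushes apart representations of different classes.
import torch
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Capsule squash: preserves direction, maps the norm into [0, 1)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)


def route_prototype(node_caps, num_iters=3):
    """Induce one class prototype from K node capsules via dynamic routing.

    node_caps: (K, D) capsules of the support samples of a single class.
    Returns a (D,) prototype capsule.
    """
    b = torch.zeros(node_caps.size(0), device=node_caps.device)  # routing logits
    for _ in range(num_iters):
        c = torch.softmax(b, dim=0)                    # coupling coefficients
        s = (c.unsqueeze(-1) * node_caps).sum(dim=0)   # weighted sum of capsules
        v = squash(s)                                  # candidate prototype
        b = b + node_caps @ v                          # agreement update
    return v


def negative_supervision_loss(reps, labels, margin=0.5):
    """Penalize pairs from *different* classes whose representations
    remain too similar (cosine similarity above the margin)."""
    sim = F.cosine_similarity(reps.unsqueeze(1), reps.unsqueeze(0), dim=-1)
    diff_class = labels.unsqueeze(1) != labels.unsqueeze(0)
    return F.relu(sim[diff_class] - margin).mean()


# Toy usage: a 3-way 5-shot episode with 64-dimensional node capsules.
support = torch.randn(3, 5, 64)
labels = torch.arange(3).repeat_interleave(5)
prototypes = torch.stack([route_prototype(support[c]) for c in range(3)])
neg_loss = negative_supervision_loss(support.reshape(15, 64), labels)
```

In this sketch the negative-supervision term would be added to the usual prototype-matching objective, so that the encoder is trained both to match queries to prototypes and to separate support representations of different classes.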