A graph neural network fused with multi-head attention for text classification
Abstract Graph neural networks (GNNs) excel at processing intricate structures and fusing global information, and research has explored GNNs for text classification. However, earlier models that fix the entire corpus as a single graph face problems such as high memory consumption and the inability to modify the graph construction. We propose an improved GNN-based model to address these problems. Instead of fixing the entire corpus as one graph, the model constructs a separate graph for each text. This method reduces memory consumption while still retaining global information. We conduct experiments on the R8, R52, and 20 Newsgroups datasets, using accuracy as the evaluation metric. Experiments show that, despite consuming less memory, our model achieves higher accuracy than existing models on multiple text classification datasets.
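The per-text graph construction described above can be sketched as follows. This is a minimal illustrative example, not the paper's exact construction: it assumes nodes are the unique tokens of a single document and that edges connect tokens co-occurring within a sliding window, with co-occurrence counts as weights (the window size and weighting scheme are assumptions).

```python
from collections import defaultdict

def build_text_graph(tokens, window=3):
    """Build a co-occurrence graph for a single document.

    Nodes are the unique tokens of this text; an edge links two tokens
    that appear within `window` positions of each other, weighted by
    co-occurrence count.  Hypothetical sketch: the actual model's
    window size and edge weighting may differ.
    """
    nodes = sorted(set(tokens))
    index = {w: i for i, w in enumerate(nodes)}
    edges = defaultdict(int)
    for i, w in enumerate(tokens):
        # look ahead within the sliding window
        for j in range(i + 1, min(i + window, len(tokens))):
            u, v = index[w], index[tokens[j]]
            if u != v:
                # store undirected edge with a canonical ordering
                edges[(min(u, v), max(u, v))] += 1
    return nodes, dict(edges)

nodes, edges = build_text_graph("the cat sat on the mat".split())
```

Because each graph contains only one document's vocabulary, its size is bounded by the document length rather than the corpus vocabulary, which is the source of the memory savings claimed above.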