A graph neural network fused with multi-head attention for text classification

2021 ◽  
Vol 2132 (1) ◽  
pp. 012032
Author(s):  
Bing Ai ◽  
Yibing Wang ◽  
Liang Ji ◽  
Jia Yi ◽  
Ting Wang ◽  
...  

Abstract Graph neural networks (GNNs) handle intricate structure well and fuse global information, so research has explored GNN techniques for text classification. However, earlier models that fix the entire corpus as a single graph face problems such as high memory consumption and an inability to modify the graph construction. We propose an improved GNN-based model to solve these problems. Instead of fixing the entire corpus as one graph, the model constructs a separate graph for each text. This reduces memory consumption while still retaining global information. We conduct experiments on the R8, R52, and 20newsgroups data sets, using accuracy as the evaluation metric. The experiments show that, even while consuming less memory, our model achieves higher accuracy than existing models on multiple text classification data sets.
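
A minimal sketch of the per-text graph construction described above, assuming a simple sliding-window co-occurrence scheme (the window size and edge weighting are illustrative choices, not details given in the abstract):

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Build a graph for a single text: nodes are unique tokens, edges
    connect tokens that co-occur within a sliding window.

    Sliding-window co-occurrence is an assumed construction; the
    abstract only states that one graph is built per text."""
    nodes = sorted(set(tokens))
    index = {w: i for i, w in enumerate(nodes)}
    edge_weights = defaultdict(int)
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window_size, len(tokens))):
            u, v = index[w], index[tokens[j]]
            if u != v:
                edge_weights[(min(u, v), max(u, v))] += 1
    return nodes, dict(edge_weights)

# Each document gets its own small graph, so memory no longer
# scales with the size of the whole corpus.
nodes, edges = build_text_graph("the cat sat on the mat".split())
print(nodes)
print(edges)
```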

2021 ◽  
Vol 2137 (1) ◽  
pp. 012052
Author(s):  
Bingxin Xue ◽  
Cui Zhu ◽  
Xuan Wang ◽  
Wenjun Zhu

Abstract Recently, Graph Convolutional Networks (GCNs) have been widely used in text classification and have performed well on tasks with rich relational structure. However, because the adjacency matrix constructed by a GCN is sparse, the GCN cannot make full use of context-dependent information in text classification and cannot capture local information. The Bidirectional Encoder Representations from Transformers (BERT) model has been shown to capture the contextual information within a sentence or document, but its ability to capture corpus-level global information about the vocabulary is relatively limited; the latter is precisely the strength of the GCN. Therefore, in this paper, Mutual Graph Convolution Networks (MGCN) is proposed to solve these problems. It introduces a semantic dictionary (WordNet), dependency parsing, and BERT. MGCN uses dependency parsing to address context dependence and WordNet to obtain additional semantic information. The local information generated by BERT and the global information generated by the GCN then interact through an attention mechanism, so that they influence each other and improve the classification performance of the model. Experimental results show that our model is more effective than previously reported methods on three text classification data sets.
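
A hedged PyTorch sketch of an attention-based interaction between BERT token features (local) and GCN node features (global). The dimensions, the use of nn.MultiheadAttention, and the fusion by concatenation are illustrative assumptions, not the authors' exact MGCN architecture:

```python
import torch
import torch.nn as nn

class LocalGlobalInteraction(nn.Module):
    """Cross-attention between BERT (local) and GCN (global) features.
    Illustrative stand-in for the interaction module described in the
    abstract; the real MGCN design may differ."""
    def __init__(self, dim=768, heads=8, num_classes=2):
        super().__init__()
        self.local_to_global = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_to_local = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, bert_feats, gcn_feats):
        # Each side queries the other, so both representations interact.
        local_ctx, _ = self.local_to_global(bert_feats, gcn_feats, gcn_feats)
        global_ctx, _ = self.global_to_local(gcn_feats, bert_feats, bert_feats)
        pooled = torch.cat([local_ctx.mean(dim=1), global_ctx.mean(dim=1)], dim=-1)
        return self.classifier(pooled)

# Toy shapes: batch of 4 documents, 128 BERT tokens, 50 graph nodes.
logits = LocalGlobalInteraction()(torch.randn(4, 128, 768), torch.randn(4, 50, 768))
print(logits.shape)  # torch.Size([4, 2])
```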


2021 ◽  
Vol 2137 (1) ◽  
pp. 012060
Author(s):  
Ping He ◽  
Yong Li ◽  
Shoulong Chen ◽  
Hoghua Xu ◽  
Lei Zhu ◽  
...  

Abstract In order to realize transformer voiceprint recognition, a transformer voiceprint recognition model based on a Mel-spectrogram convolutional neural network is proposed. First, transformer core looseness faults are simulated by setting different preloads, and the sound signals under each preload are collected. Second, each sound signal is converted into a spectrogram suitable for convolutional neural network training, and its dimensionality is then reduced with a Mel filter bank to produce a Mel spectrogram, so that spectrogram data sets under different preloads can be generated in batches. Finally, the data set is fed into a convolutional neural network for training, yielding the transformer voiceprint fault recognition model. The results show that the training accuracy of the proposed Mel-spectrogram convolutional neural network identification model is 99.91%, and that it identifies core loosening faults well.
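
A minimal sketch of the Mel-spectrogram step using librosa; the number of Mel bands and the file name are placeholder assumptions:

```python
import numpy as np
import librosa

def sound_to_mel_spectrogram(path, n_mels=128):
    """Load a recorded transformer sound signal and convert it to a
    log-scaled Mel spectrogram suitable for CNN training."""
    y, sr = librosa.load(path, sr=None)               # keep the native sampling rate
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)       # shape: (n_mels, frames)

# Hypothetical usage: one spectrogram per recording, labelled by preload level.
# spec = sound_to_mel_spectrogram("preload_10kN_rec01.wav")
```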


Author(s):  
Tien Thanh Thach ◽  
Radim Bris

The newly modified Weibull distribution defined in the literature combines the Weibull and modified Weibull distributions. It has been demonstrated to be the best model for fitting bathtub-shaped failure rate data sets. However, another model, combining the modified Weibull and Gompertz distributions, was later shown to be even better. In this article, we show how to improve the former model into a better one and, more importantly, we provide a full Bayesian analysis of the improved model. The Hamiltonian Monte Carlo and cross-entropy methods are exploited to empower traditional methods of statistical estimation. Bayes estimators are obtained using Hamiltonian Monte Carlo for posterior simulation. Bayesian model checking is also provided in order to validate the model when fitting real data sets. We also provide the maximum likelihood estimators of the model parameters, using the cross-entropy method to optimize the log-likelihood function. The results derived from the analysis of two well-known data sets show that the improved model is much better than its original form.
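
A hedged sketch of the cross-entropy method used to maximize a log-likelihood. For brevity it fits a standard two-parameter Weibull rather than the improved model in the article; the log-space sampling scheme and the hyperparameters are illustrative choices:

```python
import numpy as np

def weibull_loglik(params, t):
    """Log-likelihood of a two-parameter Weibull (shape k, scale lam).
    Stands in here for the improved model's log-likelihood."""
    k, lam = params
    return np.sum(np.log(k) - k * np.log(lam) + (k - 1) * np.log(t) - (t / lam) ** k)

def cross_entropy_mle(t, n_samples=200, n_elite=20, n_iter=50, seed=0):
    """Cross-entropy optimization: sample candidate parameters in log-space,
    keep the elite fraction with the highest log-likelihood, and refit the
    sampling distribution to the elites."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(2), np.ones(2)            # log-space mean and std
    for _ in range(n_iter):
        cand = np.exp(rng.normal(mu, sigma, size=(n_samples, 2)))
        scores = np.array([weibull_loglik(c, t) for c in cand])
        elite = np.log(cand[np.argsort(scores)[-n_elite:]])
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return np.exp(mu)                              # approximate MLE of (k, lam)

# Toy check on simulated data with k = 1.5 and lam = 2.0.
data = np.random.default_rng(1).weibull(1.5, size=500) * 2.0
print(cross_entropy_mle(data))
```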


Author(s):  
Ekaterina Popova ◽  
Vladimir Spicyn ◽  
Yuliya Ivanova ◽  
...  

The article is devoted to neural network algorithms for text classification. The relevance of this topic stems from the ever-growing volume of information on the Internet and the need to navigate it. In addition to the classification algorithm, the paper also describes the text preprocessing and vectorization methods; these steps are the starting point for most NLP tasks and make neural network algorithms effective on small data sets. A sample of 50,000 English-language IMDB movie reviews is used as the dataset for training and testing the neural network. To solve the problem, an approach based on a convolutional neural network is used. The maximum accuracy achieved on the test sample was 90.16%.
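
A hedged Keras sketch of the kind of convolutional text classifier described above; the vocabulary size, sequence length, filter counts, and preprocessing are illustrative assumptions, not the authors' exact configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Assumed preprocessing: reviews tokenized and padded to 400 integer ids
# drawn from a 20,000-word vocabulary (values chosen for illustration).
vocab_size, max_len = 20_000, 400

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, 128),           # learned word vectors
    layers.Conv1D(128, 5, activation="relu"),    # n-gram style features
    layers.GlobalMaxPooling1D(),                 # strongest response per filter
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),       # positive/negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

# Training would look like:
# model.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=5, batch_size=64)
```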


Geophysics ◽  
2004 ◽  
Vol 69 (4) ◽  
pp. 994-1004 ◽  
Author(s):  
Li‐Yun Fu

I propose a joint inversion scheme to integrate seismic data, well data, and geological knowledge for acoustic impedance estimation. I examine the problem of recovering acoustic impedance from band‐limited seismic data. Optimal estimation of impedance can be achieved by combined applications of model‐based and deconvolution‐based methods. I incorporate the Robinson seismic convolutional model (RSCM) into the Caianiello neural network for network mapping. The Caianiello neural network provides an efficient approach to decompose the seismic wavelet and its inverse. The joint inversion consists of four steps: (1) multistage seismic inverse wavelets (MSIW) extraction at the wells, (2) the deconvolution with MSIW for initial impedance estimation, (3) multistage seismic wavelets (MSW) extraction at the wells, and (4) the model‐based reconstruction of impedance with MSW for improving the initial impedance model. The Caianiello neural network offers two algorithms for the four‐step process: neural wavelet estimation and input signal reconstruction. The frequency‐domain implementation of the algorithms enables control of the inversion on different frequency scales and facilitates an understanding of reservoir behavior on different resolution scales. The test results show that, with well control, the joint inversion can significantly improve the spatial description of reservoirs in data sets involving complex continental deposits.
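
A minimal numpy sketch of the forward (Robinson) convolutional model that the inversion works against: impedance is converted to normal-incidence reflectivity and convolved with a wavelet. The Ricker wavelet and the toy impedance values are illustrative assumptions, not the paper's data:

```python
import numpy as np

def impedance_to_reflectivity(z):
    """Normal-incidence reflection coefficients from acoustic impedance:
    r_i = (z_{i+1} - z_i) / (z_{i+1} + z_i)."""
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def ricker_wavelet(freq=30.0, dt=0.002, length=0.128):
    """Ricker wavelet as a stand-in source wavelet (assumed, not from the paper)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_trace(impedance, wavelet):
    """Robinson convolutional model: trace = wavelet * reflectivity (noise-free)."""
    r = impedance_to_reflectivity(impedance)
    return np.convolve(r, wavelet, mode="same")

# Toy blocky impedance model, units (m/s)*(g/cm^3), illustrative values.
z = np.concatenate([np.full(50, 4500.0), np.full(50, 6000.0), np.full(50, 5200.0)])
trace = synthetic_trace(z, ricker_wavelet())
print(trace.shape)  # (149,)
```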


2019 ◽  
Vol 13 (1-2) ◽  
pp. 95-115
Author(s):  
Brandon Plewe

Historical place databases can be an invaluable tool for capturing the rich meaning of past places. However, this richness presents obstacles to success: the daunting need to simultaneously represent complex information such as temporal change, uncertainty, relationships, and thorough sourcing has been an obstacle to historical GIS in the past. The Qualified Assertion Model developed in this paper can represent a variety of historical complexities using a single, simple, flexible data model based on a) documenting assertions of the past world rather than claiming to know the exact truth, and b) qualifying the scope, provenance, quality, and syntactics of those assertions. This model was successfully implemented in a production-strength historical gazetteer of religious congregations, demonstrating its effectiveness and some challenges.
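
A hedged Python sketch of what a qualified assertion record might look like; the field names are illustrative readings of the qualifiers listed above (temporal scope, provenance, quality, syntactics), not the paper's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QualifiedAssertion:
    """One assertion about a past place, with qualifiers instead of a claim
    of exact truth. Field names are illustrative, not the published schema."""
    subject: str                         # e.g. a congregation or place identifier
    attribute: str                       # e.g. "name", "location", "parent unit"
    value: str                           # the asserted value
    valid_from: Optional[str] = None     # temporal scope (start), e.g. "1847"
    valid_to: Optional[str] = None       # temporal scope (end), if known
    source: str = ""                     # provenance: where the assertion comes from
    quality: str = "unverified"          # e.g. "certain", "approximate", "disputed"
    notes: str = ""                      # syntactic / interpretive remarks

# Conflicting assertions can coexist, each carrying its own source and quality.
a1 = QualifiedAssertion("congregation:42", "name", "First Parish",
                        valid_from="1847", source="1850 county atlas",
                        quality="approximate")
print(a1)
```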

