2021 ◽  
pp. 1-12
Author(s):  
Qinghui Zhang ◽  
Meng Wu ◽  
Pengtao Lv ◽  
Mengya Zhang ◽  
Hongwei Yang

In the medical field, Named Entity Recognition (NER) plays a crucial role in extracting information from electronic medical records and medical texts. To address the problems of long-distance entities, entity confusion, and difficult boundary division in the Chinese electronic medical record NER task, we propose a Chinese electronic medical record NER method based on a multi-head attention mechanism and character-word fusion. The method uses a new character-word joint feature representation built from the pre-trained BERT model and a self-constructed domain dictionary, which accurately divides entity boundaries and mitigates the impact of unregistered (out-of-vocabulary) words. On top of the BiLSTM-CRF model, a multi-head attention mechanism is then introduced to learn the dependencies between distant entities and the entity information carried in different semantic spaces, which effectively improves the performance of the model. Experiments show that our model outperforms the baselines by a significant margin: the F1 value on the Chinese electronic medical record data set reaches 95.22%, which is 2.67% higher than the F1 value of the baseline model.
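A minimal PyTorch sketch of the character-word fusion idea described above, not the authors' code: BERT character vectors are concatenated with embeddings of dictionary-matched words before the BiLSTM, multi-head attention, and CRF layers. The "bert-base-chinese" checkpoint, the third-party pytorch-crf package, the word_ids lookup, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # third-party package: pip install pytorch-crf

class CharWordFusionNER(nn.Module):
    def __init__(self, num_tags, word_vocab_size, word_dim=100,
                 lstm_hidden=256, num_heads=8):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        char_dim = self.bert.config.hidden_size        # 768 for bert-base
        # Hypothetical embedding table for words matched in a domain dictionary.
        self.word_emb = nn.Embedding(word_vocab_size, word_dim, padding_idx=0)
        self.bilstm = nn.LSTM(char_dim + word_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Multi-head attention over BiLSTM outputs to relate distant entities.
        self.mha = nn.MultiheadAttention(2 * lstm_hidden, num_heads,
                                         batch_first=True)
        self.fc = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, word_ids, tags=None):
        char_feats = self.bert(input_ids,
                               attention_mask=attention_mask).last_hidden_state
        # Character-word fusion: concatenate character and word features.
        fused = torch.cat([char_feats, self.word_emb(word_ids)], dim=-1)
        seq, _ = self.bilstm(fused)
        attn_out, _ = self.mha(seq, seq, seq,
                               key_padding_mask=~attention_mask.bool())
        emissions = self.fc(attn_out)
        mask = attention_mask.bool()
        if tags is not None:
            return -self.crf(emissions, tags, mask=mask)   # training loss
        return self.crf.decode(emissions, mask=mask)       # predicted tag paths
```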


2021 ◽  
Author(s):  
Shen Zhou Feng ◽  
Su Qian Min ◽  
Guo Jing Lei

Abstract The recognition of named entities in Chinese clinical electronic medical records is one of the basic tasks for realizing smart medical care. To address the insufficient semantic representation of traditional word-vector models and the inability of recurrent neural network (RNN) models to capture long-range dependencies, a Chinese clinical electronic medical record named entity recognition model based on XLNet, XLNet-BiLSTM-MHA-CRF, is proposed. The XLNet pre-trained language model serves as the embedding layer that vectorizes the medical record text and resolves ambiguity; a bidirectional long short-term memory network (BiLSTM) obtains the forward and backward semantic features of each sentence; the feature sequence is then fed into a multi-head attention (MHA) layer, which captures the information represented in different subspaces of the feature sequence, strengthens contextual relevance, and suppresses noise; finally, a conditional random field (CRF) layer identifies the globally optimal tag sequence. The experimental results show that the XLNet-BiLSTM-MHA-CRF model achieves good results on the CCKS-2017 named entity recognition data set.
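A minimal sketch of the embedding step only, not the paper's code: XLNet vectorizes a medical-record sentence into contextual representations that the BiLSTM, MHA, and CRF layers described above would consume. The "hfl/chinese-xlnet-base" checkpoint and the example sentence are assumptions for illustration.

```python
import torch
from transformers import AutoTokenizer, XLNetModel

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-xlnet-base")
xlnet = XLNetModel.from_pretrained("hfl/chinese-xlnet-base")

sentence = "患者出现头痛、恶心症状。"        # example clinical sentence (assumed)
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = xlnet(**inputs).last_hidden_state   # (1, seq_len, d_model)
print(hidden.shape)  # one contextual vector per token, fed to BiLSTM-MHA-CRF
```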


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Qiuli Qin ◽  
Shuang Zhao ◽  
Chunmei Liu

Because electronic medical record data for patients with cerebrovascular disease are difficult to process, there is little mature technology for recognizing named entities related to cerebrovascular disease. Excellent research results have been achieved in the field of named entity recognition (NER), but several problems remain in handling Chinese named entities with multiple meanings, one of which is neglecting contextual information. Therefore, to extract five categories of key entity information (diseases, symptoms, body parts, medical examinations, and treatments) from electronic medical records, this paper proposes a BERT-BiGRU-CRF named entity recognition method applied to the field of cerebrovascular disease. The BERT layer first converts the electronic medical record text into low-dimensional vectors, the BiGRU layer then takes these vectors as input to capture contextual features, and finally a conditional random field (CRF) layer captures the dependencies between adjacent tags. The experimental results show that the F1 score of the model reaches 90.38%.
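A minimal PyTorch sketch of the BERT-BiGRU-CRF pipeline described above, not the authors' implementation: BERT vectors feed a bidirectional GRU for contextual features, and a CRF scores the dependencies between adjacent tags. The "bert-base-chinese" checkpoint, the pytorch-crf dependency, and the hidden size are assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf

class BertBiGRUCRF(nn.Module):
    def __init__(self, num_tags, gru_hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.bigru = nn.GRU(self.bert.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * gru_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        x = self.bert(input_ids,
                      attention_mask=attention_mask).last_hidden_state
        x, _ = self.bigru(x)            # contextual features in both directions
        emissions = self.fc(x)
        mask = attention_mask.bool()
        if tags is not None:
            return -self.crf(emissions, tags, mask=mask)   # training loss
        return self.crf.decode(emissions, mask=mask)       # best label sequence
```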

