Faculty Opinions recommendation of BioBERT: a pre-trained biomedical language representation model for biomedical text mining.

Author(s):  
Anália Lourenço
Author(s):  
Jinhyuk Lee ◽  
Wonjin Yoon ◽  
Sungdong Kim ◽  
Donghyeon Kim ◽  
Sunkyu Kim ◽  
...  

Abstract

Motivation: Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows. With the progress in natural language processing (NLP), extracting valuable information from biomedical literature has gained popularity among researchers, and deep learning has boosted the development of effective biomedical text mining models. However, directly applying the advancements in NLP to biomedical text mining often yields unsatisfactory results due to a word distribution shift from general domain corpora to biomedical corpora. In this article, we investigate how the recently introduced pre-trained language model BERT can be adapted for biomedical corpora.

Results: We introduce BioBERT (Bidirectional Encoder Representations from Transformers for Biomedical Text Mining), which is a domain-specific language representation model pre-trained on large-scale biomedical corpora. With almost the same architecture across tasks, BioBERT largely outperforms BERT and previous state-of-the-art models in a variety of biomedical text mining tasks when pre-trained on biomedical corpora. While BERT obtains performance comparable to that of previous state-of-the-art models, BioBERT significantly outperforms them on the following three representative biomedical text mining tasks: biomedical named entity recognition (0.62% F1 score improvement), biomedical relation extraction (2.80% F1 score improvement) and biomedical question answering (12.24% MRR improvement). Our analysis results show that pre-training BERT on biomedical corpora helps it to understand complex biomedical texts.

Availability and implementation: We make the pre-trained weights of BioBERT freely available at https://github.com/naver/biobert-pretrained, and the source code for fine-tuning BioBERT available at https://github.com/dmis-lab/biobert.
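For readers who want to experiment with the released weights, the snippet below is a minimal sketch of loading BioBERT for a token-classification (biomedical NER) task with the Hugging Face transformers library. The checkpoint identifier dmis-lab/biobert-v1.1 and the three-label BIO tagging scheme are assumptions made for illustration; the paper itself distributes TensorFlow weights through the GitHub repositories linked above.

# Minimal sketch (assumptions noted above): load a community port of the
# BioBERT weights and run it as a token classifier for biomedical NER.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "dmis-lab/biobert-v1.1"  # assumed checkpoint name, not from the paper

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels depends on the tagging scheme, e.g. B/I/O tags for one entity type.
# The classification head is randomly initialized and still needs fine-tuning.
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=3)

sentence = "Mutations in BRCA1 increase the risk of breast cancer."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits      # shape: (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1)      # per-token label ids

print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]))
print(predictions[0].tolist())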


2020 ◽  
Vol 29 (01) ◽  
pp. 225-225

Guan J, Li R, Yu S, Zhang X. A method for generating synthetic electronic medical record text. IEEE/ACM Trans Comput Biol Bioinform 2019. https://ieeexplore.ieee.org/document/8880542

Lee J, Yoon W, Kim S, Kim D, Kim S, Ho So C, Kang J. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics 2019;36(4):1234-40. https://academic.oup.com/bioinformatics/article/36/4/1234/5566506

Rosemblat G, Fiszman M, Shin D, Kılıçoğlu H. Towards a characterization of apparent contradictions in the biomedical literature using context analysis. J Biomed Inform 2019;98:103275. https://www.sciencedirect.com/science/article/abs/pii/S1532046419301947?via%3Dihub


2020 ◽  
Author(s):  
Ibrahim Burak Ozyurt

Abstract

Neural language representation models such as BERT [1] have recently shown state-of-the-art performance on downstream NLP tasks, and a biomedical domain adaptation of BERT (Bio-BERT [2]) has shown the same behavior on biomedical text mining tasks. However, due to their large model size and the resulting computational cost, practical application of models such as BERT is challenging, making smaller models with comparable performance desirable for real-world applications. Recently, a new transformer-based language representation model named ELECTRA [3] was introduced, which makes efficient use of training data in a generative-discriminative neural model setting and shows performance gains over BERT. These gains are especially impressive for smaller models. Here, we introduce a small ELECTRA-based model named Bio-ELECTRA that is eight times smaller than BERT-Base and achieves comparable performance on biomedical question answering and yes/no question answer classification tasks. The model is pre-trained from scratch on PubMed abstracts using a consumer-grade GPU with only 8 GB of memory. For biomedical named entity recognition, however, the larger BERT-Base model outperforms both Bio-ELECTRA and ELECTRA-Small++.
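To illustrate the replaced-token-detection objective behind ELECTRA's generative-discriminative setting, the sketch below scores each token of a sentence with a small ELECTRA discriminator via the Hugging Face transformers library. The google/electra-small-discriminator checkpoint and the example sentence are assumptions for illustration only; the Bio-ELECTRA weights themselves are not referenced here.

# Illustrative sketch (not the Bio-ELECTRA release): ELECTRA-style
# replaced-token detection, where a discriminator predicts, per token,
# whether that token was swapped in by a small generator.
import torch
from transformers import ElectraTokenizerFast, ElectraForPreTraining

CHECKPOINT = "google/electra-small-discriminator"  # assumed general-domain checkpoint
tokenizer = ElectraTokenizerFast.from_pretrained(CHECKPOINT)
discriminator = ElectraForPreTraining.from_pretrained(CHECKPOINT)

# Example sentence in which one token is imagined to have been replaced
# by a generator ("enzyme" -> "protein"); the discriminator scores each token.
text = "The protein catalyzes the hydrolysis of ATP."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits   # shape: (1, seq_len), replaced-token scores

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for tok, score in zip(tokens, torch.sigmoid(logits[0]).tolist()):
    print(f"{tok:>12s}  replaced-prob={score:.2f}")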


2009 ◽  
Vol 5 (12) ◽  
pp. e1000597 ◽  
Author(s):  
Raul Rodriguez-Esteban

Molecules ◽  
2018 ◽  
Vol 23 (5) ◽  
pp. 1028 ◽  
Author(s):  
Yuting Xing ◽  
Chengkun Wu ◽  
Xi Yang ◽  
Wei Wang ◽  
En Zhu ◽  
...  
