MedAI at SemEval-2021 Task 10: Negation-aware Pre-training for Source-free Negation Detection Domain Adaptation

Author(s): Jinquan Sun, Qi Zhang, Yu Wang, Lei Zhang
2020, Vol 27 (4), pp. 584-591
Author(s): Chen Lin, Steven Bethard, Dmitriy Dligach, Farig Sadeque, Guergana Savova, ...

Abstract

Introduction: Classifying whether concepts in an unstructured clinical text are negated is an important, unsolved task. New domain adaptation and transfer learning methods can potentially address this issue.

Objective: We examine neural unsupervised domain adaptation methods, introducing a novel combination of domain adaptation with transformer-based transfer learning methods to improve negation detection. We also aim to better understand the interaction between the widely used bidirectional encoder representations from transformers (BERT) system and domain adaptation methods.

Materials and Methods: We use 4 clinical text datasets annotated with negation status. We evaluate a neural unsupervised domain adaptation algorithm and BERT, a transformer-based model pretrained on massive general text datasets. We develop an extension to BERT that uses domain adversarial training, a neural domain adaptation method that adds an adversarial objective alongside the negation objective: the classifier should not be able to distinguish instances drawn from 2 different domains.

Results: The domain adaptation methods we describe show positive results, but, on average, the best performance is obtained by plain BERT (without the extension). We provide evidence that the gains from BERT are likely not additive with the gains from domain adaptation.

Discussion: Our results suggest that, at least for the task of clinical negation detection, BERT subsumes domain adaptation: BERT appears to learn representations of negation phenomena general enough that fine-tuning even on a specific corpus does not lead to much overfitting.

Conclusion: Despite being pretrained on nonclinical text, models like BERT, with their massive training sets, yield large performance gains on the clinical negation detection task.
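Domain adversarial training of the kind described above is commonly implemented with a gradient reversal layer: a shared encoder feeds both a task head and a domain head, and the gradient flowing from the domain head into the encoder is flipped, so the encoder learns features the domain classifier cannot exploit. The following is a minimal sketch in PyTorch with Hugging Face transformers, not the authors' implementation; the class and parameter names (NegationDANN, lambd) are illustrative assumptions.

```python
# Minimal sketch of domain-adversarial training on top of BERT,
# assuming PyTorch and Hugging Face transformers are installed.
import torch
import torch.nn as nn
from transformers import AutoModel

class GradientReversal(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)  # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        # Flip (and scale) the gradient flowing back into the encoder.
        return -ctx.lambd * grad_output, None

class NegationDANN(nn.Module):
    """BERT encoder with a negation head and an adversarial domain head."""
    def __init__(self, model_name="bert-base-uncased", lambd=1.0):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.negation_head = nn.Linear(hidden, 2)  # negated vs. not negated
        self.domain_head = nn.Linear(hidden, 2)    # source vs. target domain
        self.lambd = lambd

    def forward(self, input_ids, attention_mask):
        # Use the [CLS] representation as the shared feature vector.
        pooled = self.encoder(
            input_ids=input_ids,
            attention_mask=attention_mask).last_hidden_state[:, 0]
        negation_logits = self.negation_head(pooled)
        # The domain head sees reversed gradients: it learns to predict the
        # domain while the encoder learns to make domains indistinguishable.
        domain_logits = self.domain_head(
            GradientReversal.apply(pooled, self.lambd))
        return negation_logits, domain_logits
```

In a typical training loop for this setup, the negation loss is computed only on labeled source-domain examples, the domain loss on both source and (unlabeled) target examples, and lambd trades off the two objectives.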


2017
Author(s): Timothy Miller, Steven Bethard, Hadi Amiri, Guergana Savova

2015
Author(s): Raghuraman Gopalan, Ruonan Li, Vishal M. Patel, Rama Chellappa

Author(s): Masayuki Suzuki, Ryuki Tachibana, Samuel Thomas, Bhuvana Ramabhadran, George Saon

2020
Author(s): Hongji Wang, Heinrich Dinkel, Shuai Wang, Yanmin Qian, Kai Yu

2019
Author(s): Shota Horiguchi, Naoyuki Kanda, Kenji Nagamatsu

2020, Vol 155, pp. 113404
Author(s): Peng Liu, Ting Xiao, Cangning Fan, Wei Zhao, Xianglong Tang, ...
