YNU-HPCC at SemEval-2021 Task 10: Using a Transformer-based Source-Free Domain Adaptation Model for Semantic Processing

Author(s): Zhewen Yu, Jin Wang, Xuejie Zhang

2021
Author(s): Egoitz Laparra, Xin Su, Yiyun Zhao, Özlem Uzuner, Timothy Miller, ...

Author(s): Jogendra Nath Kundu, Naveen Venkat, M. V. Rahul, R. Venkatesh Babu

Methods, 2020, Vol. 173, pp. 69-74
Author(s): Yongping Du, Bingbing Pei, Xiaozheng Zhao, Junzhong Ji

Author(s): Ximei Wang, Liang Li, Weirui Ye, Mingsheng Long, Jianmin Wang

Recent work in domain adaptation bridges different domains by adversarially learning a domain-invariant representation that cannot be distinguished by a domain discriminator. Existing adversarial domain adaptation methods mainly align whole images across the source and target domains. However, not all regions of an image are transferable, and forcefully aligning untransferable regions may lead to negative transfer. Furthermore, some images are significantly dissimilar across domains, resulting in weak image-level transferability. To this end, we present Transferable Attention for Domain Adaptation (TADA), which focuses the adaptation model on transferable regions or images. We implement two complementary types of transferable attention: transferable local attention, generated by multiple region-level domain discriminators to highlight transferable regions, and transferable global attention, generated by a single image-level domain discriminator to highlight transferable images. Extensive experiments validate that the proposed models exceed state-of-the-art results on standard domain adaptation datasets.
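The attention mechanism described in this abstract can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: the names (RegionDiscriminator, TransferableAttention, binary_entropy), the mean-pooling of regions, and the residual 1 + entropy weighting are assumptions made here for clarity. The sketch assumes an entropy-based transferability score, where a region whose domain discriminator is maximally confused (entropy near its peak) is treated as domain-invariant and therefore transferable; the exact scoring in the paper may differ.

```python
import torch
import torch.nn as nn


class RegionDiscriminator(nn.Module):
    """Domain classifier for one region (or the whole image): feature -> P(source)."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim // 2), nn.ReLU(),
            nn.Linear(dim // 2, 1), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def binary_entropy(p: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Entropy of Bernoulli(p); maximal when the discriminator is confused,
    which is taken here as the signal that a region/image is transferable."""
    p = p.clamp(eps, 1 - eps)
    return -(p * p.log() + (1 - p) * (1 - p).log())


class TransferableAttention(nn.Module):
    """Local attention from K region-level discriminators plus global
    attention from a single image-level discriminator (hypothetical layout)."""
    def __init__(self, dim: int, num_regions: int):
        super().__init__()
        self.local_discs = nn.ModuleList(
            RegionDiscriminator(dim) for _ in range(num_regions))
        self.global_disc = RegionDiscriminator(dim)

    def forward(self, region_feats: torch.Tensor) -> torch.Tensor:
        # region_feats: (batch, num_regions, dim), e.g. a flattened conv grid.
        local_w = torch.cat(
            [binary_entropy(d(region_feats[:, k]))             # (batch, 1) each
             for k, d in enumerate(self.local_discs)], dim=1)  # -> (batch, K)
        # Residual weighting keeps every region but emphasizes transferable ones.
        attended = region_feats * (1 + local_w).unsqueeze(-1)
        image_feat = attended.mean(dim=1)                      # pool regions
        global_w = binary_entropy(self.global_disc(image_feat))  # (batch, 1)
        return image_feat * (1 + global_w)                     # weight whole image


# Usage: a batch of 4 images, each with 12 regions of 256-d features.
feats = torch.randn(4, 12, 256)
out = TransferableAttention(dim=256, num_regions=12)(feats)
print(out.shape)  # torch.Size([4, 256])
```

In training, each discriminator would be optimized adversarially against the feature extractor (for example, through a gradient reversal layer), so that high-entropy, domain-confusable regions and images come to dominate the attended representation.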


Author(s): Zhao-Hua Liu, Bi-Liang Lu, Hua-Liang Wei, Lei Chen, Xiao-Hua Li, ...
