Biomedical Relation Classification by single and multiple source domain adaptation

2019 ◽  
Author(s):  
Sinchani Chakraborty ◽  
Sudeshna Sarkar ◽  
Pawan Goyal ◽  
Mahanandeeshwar Gattu
PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0255754
Author(s):  
Seongmin Lee ◽  
Hyunsik Jeon ◽  
U. Kang

Given multiple source datasets with labels, how can we train a target model with no labeled data? Multi-source domain adaptation (MSDA) aims to train a model using multiple source datasets that differ from a target dataset, in the absence of target data labels. MSDA is a crucial problem applicable to many practical cases where labels for the target data are unavailable due to privacy issues. Existing MSDA frameworks are limited since they align data without considering the labels of the features of each domain. They also do not fully utilize the unlabeled target data and rely on limited feature extraction with a single extractor. In this paper, we propose Multi-EPL, a novel method for MSDA. Multi-EPL exploits label-wise moment matching to align the conditional distributions of the features given the labels, uses pseudo-labels for the unavailable target labels, and introduces an ensemble of multiple feature extractors for accurate domain adaptation. Extensive experiments show that Multi-EPL achieves state-of-the-art performance for MSDA tasks on both image and text domains, improving accuracy by up to 13.20%.
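The label-wise moment matching idea can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch-style illustration, not the authors' implementation: function names and tensor shapes are assumptions. It matches the first and second moments of source and target features separately per class, using target pseudo-labels in place of the missing target labels.

```python
# Minimal sketch of label-wise moment matching (assumption: PyTorch tensors).
# Not the authors' code; names and shapes are illustrative only.
import torch

def label_wise_moment_loss(src_feat, src_lbl, tgt_feat, tgt_pseudo_lbl, num_classes):
    """Match 1st and 2nd feature moments per class between source and target."""
    loss = src_feat.new_zeros(())
    for c in range(num_classes):
        s = src_feat[src_lbl == c]
        t = tgt_feat[tgt_pseudo_lbl == c]
        if len(s) < 2 or len(t) < 2:                         # skip classes missing in the batch
            continue
        loss = loss + (s.mean(0) - t.mean(0)).pow(2).sum()   # 1st moment (per-class mean)
        loss = loss + (s.var(0) - t.var(0)).pow(2).sum()     # 2nd moment (per-class variance)
    return loss
```

In the paper the matching is applied across domain pairs and combined with classification losses on source labels and target pseudo-labels; the sketch only shows the per-class matching term for one source-target pair.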


PLoS ONE ◽  
2021 ◽  
Vol 16 (7) ◽  
pp. e0253415
Author(s):  
Hyunsik Jeon ◽  
Seongmin Lee ◽  
U. Kang

Given trained models from multiple source domains, how can we predict the labels of unlabeled data in a target domain? Unsupervised multi-source domain adaptation (UMDA) aims to predict the labels of unlabeled target data by transferring the knowledge of multiple source domains. UMDA is a crucial problem in many real-world scenarios where no labeled target data are available. Previous approaches in UMDA assume that data are observable over all domains. However, in many practical scenarios source data are not easily accessible due to privacy or confidentiality issues, although classifiers learned on the source domains are readily available. In this work, we target data-free UMDA, where source data are not observable at all, a novel problem that has not been studied before despite being very realistic and crucial. To solve data-free UMDA, we propose DEMS (Data-free Exploitation of Multiple Sources), a novel architecture that adapts target data to source domains without exploiting any source data and estimates the target labels by exploiting pre-trained source classifiers. Extensive experiments for data-free UMDA on real-world datasets show that DEMS achieves state-of-the-art accuracy, up to 27.5 percentage points higher than that of the best baseline.
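To make the data-free setting concrete, the sketch below shows one simple way frozen source classifiers could be combined to estimate target labels without touching any source data. This is an illustrative assumption, not DEMS itself: the confidence-based weighting and all names are hypothetical.

```python
# Illustrative sketch only (assumption): labeling target data with frozen,
# pre-trained source classifiers when no source data are available.
# This is not the DEMS architecture, just a confidence-weighted ensemble.
import torch
import torch.nn.functional as F

@torch.no_grad()
def ensemble_predict(source_models, target_x):
    probs, weights = [], []
    for model in source_models:                    # pre-trained, kept frozen
        p = F.softmax(model(target_x), dim=1)
        probs.append(p)
        # weight a source model higher when its predictions are confident (low entropy)
        entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1).mean()
        weights.append((-entropy).exp())
    w = torch.stack(weights)
    w = w / w.sum()                                # normalize domain weights
    combined = sum(wi * pi for wi, pi in zip(w, probs))
    return combined.argmax(dim=1)                  # estimated target labels
```

DEMS additionally adapts the target data toward the source domains before querying the classifiers; the sketch covers only the label-estimation step.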


2021 ◽  
Author(s):  
Bin Wang ◽  
Gang Li ◽  
Chao Wu ◽  
WeiShan Zhang ◽  
Jiehan Zhou ◽  
...  

Unsupervised federated domain adaptation uses the knowledge from several distributed unlabelled source domains to complete learning on the unlabelled target domain. Some of the existing methods have limited effectiveness and involve frequent communication. This paper proposes a framework to solve the distributed multi-source domain adaptation problem, referred to as self-supervised federated domain adaptation (SFDA). Specifically, a multi-domain model generalization balance (MDMGB) is proposed to aggregate the models from multiple source domains in each round of communication. A weighted strategy based on centroid similarity is also designed for SFDA. SFDA conducts self-supervised training on the target domain to tackle domain shift. Compared with the classical federated adversarial domain adaptation algorithm, SFDA not only reduces communication cost and strengthens privacy protection but also improves model accuracy.
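The centroid-similarity weighting can be pictured with a minimal sketch. The code below is an assumption about how such an aggregation might look (it is not the paper's MDMGB): source models whose feature centroids lie closer to the target centroid receive larger weights when their parameters are averaged into the global model. Identical architectures across domains are assumed.

```python
# Minimal sketch (assumption, not the paper's MDMGB): weighted federated
# aggregation where each source model's weight comes from the cosine
# similarity between its feature centroid and the target feature centroid.
import copy
import torch
import torch.nn.functional as F

def weighted_aggregate(source_models, source_centroids, target_centroid):
    """source_centroids: one feature centroid per source domain (same dim as target_centroid)."""
    sims = torch.stack([
        F.cosine_similarity(c.unsqueeze(0), target_centroid.unsqueeze(0)).squeeze(0)
        for c in source_centroids
    ])
    weights = F.softmax(sims, dim=0)               # normalize weights to sum to 1
    global_model = copy.deepcopy(source_models[0])
    global_state = global_model.state_dict()
    for key in global_state:                       # weighted average of parameters
        global_state[key] = sum(
            w * m.state_dict()[key].float() for w, m in zip(weights, source_models)
        )
    global_model.load_state_dict(global_state)
    return global_model
```

In the federated setting this aggregation would run once per communication round, after which the aggregated model is refined by self-supervised training on the unlabelled target data.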


2021 ◽  
Vol 211 ◽  
pp. 106569
Author(s):  
Qiang Zhou ◽  
Wen’an Zhou ◽  
Shirui Wang ◽  
Ying Xing

2020 ◽  
Vol 199 ◽  
pp. 105962 ◽  
Author(s):  
Chaoqi Chen ◽  
Weiping Xie ◽  
Yi Wen ◽  
Yue Huang ◽  
Xinghao Ding
