dual subspace
Recently Published Documents

TOTAL DOCUMENTS: 12 (five years: 6)
H-INDEX: 3 (five years: 1)

2021, Vol 111, pp. 107581. Author(s): Gregg Belous, Andrew Busch, Yongsheng Gao.

2020, Vol 126, pp. 132-142. Author(s): Fei Shang, Huaxiang Zhang, Jiande Sun, Liqiang Nie, Li Liu.

2020, Vol 103, pp. 101786. Author(s): Ying Chen, Yibin Tang, Chun Wang, Xiaofeng Liu, Li Zhao, et al.

2019, Vol 66, pp. 381-410. Author(s): Ying Chen, Zhongzhe Xiao, Xiaojun Zhang, Zhi Tao.

Traditional machine learning methods share a common hypothesis: the training and testing datasets must lie in a common feature space with the same distribution. In reality, however, labeled target data may be scarce, so the target domain may not share the same feature space or distribution as the available training set (the source domain). To address this domain mismatch, we propose a Dual-Subspace Transfer Learning (DSTL) framework that considers both the common and the specific information of the two domains. In DSTL, a latent common subspace is first learned to preserve the data properties and reduce the discrepancy between domains. We then propose a mapping strategy to transfer the source-specific information to the target subspace. The integration of the domain-common and domain-specific information constitutes the proposed DSTL framework. Compared with state-of-the-art works, the main contribution of our work is that the DSTL framework not only considers the commonalities but also exploits the specific information. Experiments on three emotional speech corpora verify the effectiveness of our approach. The results show that methods which include both domain-common and domain-specific information outperform baseline methods that exploit only the domain commonalities.
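The two-step idea in the abstract (learn a shared subspace first, then map source-specific information toward the target representation) can be illustrated with a minimal numpy sketch. This is not the authors' DSTL algorithm, whose objectives are not given in the abstract; it assumes PCA-style subspaces and a hypothetical least-squares map, and the function name and dimensions are invented for illustration:

```python
import numpy as np

def dual_subspace_transfer(Xs, Xt, k_common=5, k_specific=3):
    """Illustrative sketch (not the published DSTL method):
    1) learn a common subspace from pooled source + target data,
    2) learn a source-specific subspace from the source residual,
    3) map source-specific coordinates into the common/target span.
    Xs: (n_s, d) source samples, Xt: (n_t, d) target samples.
    """
    # 1) common subspace: top principal directions of the pooled data
    X = np.vstack([Xs, Xt])
    X = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    W_common = Vt[:k_common].T                     # (d, k_common)

    # 2) source-specific subspace: principal directions of the source
    #    residual after removing the common component
    Rs = Xs - Xs @ W_common @ W_common.T
    _, _, Vts = np.linalg.svd(Rs - Rs.mean(axis=0), full_matrices=False)
    W_spec = Vts[:k_specific].T                    # (d, k_specific)

    # 3) least-squares linear map from specific to common coordinates
    #    (a hypothetical choice of "mapping strategy")
    Zs_common = Xs @ W_common                      # (n_s, k_common)
    Zs_spec = Xs @ W_spec                          # (n_s, k_specific)
    M, *_ = np.linalg.lstsq(Zs_spec, Zs_common, rcond=None)

    # final representations: common part plus mapped specific part
    Zs = Zs_common + Zs_spec @ M
    Zt = Xt @ W_common
    return Zs, Zt
```

Both domains end up in the same k_common-dimensional space, with the source representation augmented by its mapped domain-specific component, mirroring the "integration of domain-common and specific information" described above.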


2013, Vol 5 (5), pp. 753-759. Author(s): Lijun Liu, Rendong Ge, Jiana Meng, Guangjie You.
