Compact Cluster-based Balanced Distribution Adaptation for Transfer Learning

Author(s):  
Xu Zhang ◽  
Zuyu Zhang ◽  
HaeYoung Bae
2019 ◽  
Vol 34 (5) ◽  
pp. 1039-1062
Author(s):  
Zhou Xu ◽  
Shuai Pang ◽  
Tao Zhang ◽  
Xia-Pu Luo ◽  
Jin Liu ◽  
...  

Sensors ◽  
2019 ◽  
Vol 20 (1) ◽  
pp. 234
Author(s):  
Ning Cao ◽  
Zhinong Jiang ◽  
Jinji Gao ◽  
Bo Cui

Bearing state recognition, especially under variable working conditions, suffers from low reusability of monitoring data, low state-recognition accuracy, and poor generalization of the model. Feature-based transfer learning can address these problems, but it relies on signal-processing knowledge and expert diagnostic experience to extract the cross-domain characteristics of data from different working conditions in advance. Therefore, this paper proposes an improved balanced distribution adaptation (BDA), named multi-core balanced distribution adaptation (MBDA). The method constructs a weighted mixed kernel function to map data from different working conditions into a unified feature space. It does not require the cross-domain characteristics to be obtained in advance, which simplifies data processing and enables end-to-end state recognition in practical applications. At the same time, MBDA adopts the A-distance algorithm to estimate the balance factor of the distribution and the balance factor of the kernel function, which not only reduces the distribution difference between working conditions but also improves efficiency. Further, feature self-learning and rolling-bearing state recognition are realized by a stacked autoencoder (SAE) neural network with a classification function. The experimental results show that, compared with other algorithms, the proposed method effectively improves transfer learning performance and accurately identifies the bearing state under different working conditions.
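
As a rough illustration of two ingredients named in this abstract, the sketch below builds a weighted mixed kernel from standard kernels and estimates a proxy A-distance with a linear domain classifier. The kernel choices, weights, and classifier are illustrative assumptions, not the authors' exact MBDA formulation.

```python
# Hypothetical sketch: (1) a weighted mixed kernel mapping source/target data
# into one feature space, and (2) a proxy A-distance estimate that could be
# used to set a balance factor. Kernels, weights and the domain classifier
# are assumptions for illustration only.
import numpy as np
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel, polynomial_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC


def mixed_kernel(X, Y=None, weights=(0.4, 0.4, 0.2), gamma=1.0, degree=2):
    """Weighted combination of linear, RBF and polynomial kernels."""
    w_lin, w_rbf, w_poly = weights
    return (w_lin * linear_kernel(X, Y)
            + w_rbf * rbf_kernel(X, Y, gamma=gamma)
            + w_poly * polynomial_kernel(X, Y, degree=degree))


def a_distance(Xs, Xt):
    """Proxy A-distance: 2 * (1 - 2 * err) of a source-vs-target classifier."""
    X = np.vstack([Xs, Xt])
    y = np.hstack([np.zeros(len(Xs)), np.ones(len(Xt))])
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LinearSVC(max_iter=5000).fit(Xtr, ytr)
    err = 1.0 - clf.score(Xte, yte)
    return 2.0 * (1.0 - 2.0 * err)


# Toy source/target data standing in for two working conditions.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (200, 16))   # source-condition features
Xt = rng.normal(0.5, 1.2, (200, 16))   # target-condition features

K = mixed_kernel(np.vstack([Xs, Xt]))          # unified kernel space
mu = np.clip(a_distance(Xs, Xt) / 2.0, 0, 1)   # balance-factor estimate
print(K.shape, round(mu, 3))
```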


Author(s):  
Jindong Wang ◽  
Yiqiang Chen ◽  
Shuji Hao ◽  
Wenjie Feng ◽  
Zhiqi Shen

2019 ◽  
Author(s):  
Qi Yuan ◽  
Alejandro Santana-Bonilla ◽  
Martijn Zwijnenburg ◽  
Kim Jelfs

The chemical space for novel electronic donor-acceptor oligomers with targeted properties was explored using deep generative models and transfer learning. A General Recurrent Neural Network model was trained on the ChEMBL database to generate chemically valid SMILES strings. The parameters of the General Recurrent Neural Network were fine-tuned via transfer learning using the electronic donor-acceptor database from the Computational Material Repository to generate novel donor-acceptor oligomers. Six different transfer learning models were developed with different subsets of the donor-acceptor database as training sets. We concluded that electronic properties such as HOMO-LUMO gaps and dipole moments of the training sets can be learned using the SMILES representation with deep generative models, and that the chemical space of the training sets can be efficiently explored. This approach identified approximately 1700 new molecules with promising electronic properties (HOMO-LUMO gap <2 eV and dipole moment <2 Debye), six times more than in the original database. Amongst the molecular transformations, the deep generative model has learned how to produce novel molecules by trading off between selected atomic substitutions (such as halogenation or methylation) and molecular features such as the spatial extension of the oligomer. The method can be extended as a plausible source of new chemical combinations to effectively explore the chemical space for targeted properties.
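
A minimal sketch of the pretrain-then-fine-tune workflow described in this abstract, assuming a character-level LSTM language model over SMILES tokens in PyTorch; the architecture, vocabulary size, and the placeholder data loaders (`chembl_batches`, `donor_acceptor_batches`) are assumptions rather than the authors' exact setup.

```python
# Hypothetical sketch: a recurrent next-character model over SMILES strings,
# pretrained on a large general corpus and then fine-tuned on a small
# donor-acceptor set. Data loaders and hyperparameters are placeholders.
import torch
import torch.nn as nn


class SmilesRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        out, _ = self.lstm(self.embed(x))
        return self.head(out)          # next-character logits per position


def train(model, batches, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in batches:           # x, y: (batch, seq_len) token ids
            opt.zero_grad()
            logits = model(x)
            loss = loss_fn(logits.reshape(-1, logits.size(-1)), y.reshape(-1))
            loss.backward()
            opt.step()


vocab_size = 48                        # assumed SMILES character vocabulary
model = SmilesRNN(vocab_size)
# 1) Pretrain on the large general corpus (e.g. ChEMBL-derived SMILES):
# train(model, chembl_batches, epochs=10, lr=1e-3)
# 2) Fine-tune on the small donor-acceptor subset at a lower learning rate:
# train(model, donor_acceptor_batches, epochs=5, lr=1e-4)
```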


2014 ◽  
Author(s):  
Hiroshi Kanayama ◽  
Youngja Park ◽  
Yuta Tsuboi ◽  
Dongmook Yi

2020 ◽  
Author(s):  
Pathikkumar Patel ◽  
Bhargav Lad ◽  
Jinan Fiaidhi

Over the last few years, RNN models have been used extensively and have proven well suited to sequence and text data. RNNs have achieved state-of-the-art performance in several applications, such as text classification, sequence-to-sequence modelling, and time-series forecasting. In this article we review different machine learning and deep learning based approaches for text data and examine the results obtained from these methods. This work also explores the use of transfer learning in NLP and how it affects model performance on a specific application: sentiment analysis.
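
To make the transfer-learning idea concrete, here is a minimal sketch of one common approach: reusing a pretrained encoder and fine-tuning it for binary sentiment classification with the Hugging Face Transformers library. The `distilbert-base-uncased` checkpoint, the example sentences, and the single optimization step are illustrative assumptions, not the models evaluated in the article.

```python
# Hypothetical sketch: fine-tune a pretrained encoder for binary sentiment
# classification. Checkpoint and example data are placeholders.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "distilbert-base-uncased"          # assumed pretrained backbone
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

texts = ["The battery life is excellent.", "The screen cracked within a week."]
labels = torch.tensor([1, 0])                   # 1 = positive, 0 = negative

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)         # loss computed internally
outputs.loss.backward()                         # one fine-tuning step
optimizer.step()

model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())
```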


2007 ◽  
Author(s):  
Nicholas A. Gorski ◽  
John E. Laird
