Calibrating process variation at system level with in-situ low-precision transfer learning for analog neural network processors

Author(s):  
Kaige Jia ◽  
Zheyu Liu ◽  
Qi Wei ◽  
Fei Qiao ◽  
Xinjun Liu ◽  
...  
2021 ◽  
Author(s):  
Jian Hu ◽  
Xianlong Zhang ◽  
Xiaohua Shi

Abstract Deep learning has achieved results competitive with human performance in many fields. Traditionally, deep learning networks are executed on CPUs and GPUs. In recent years, a growing number of neural network (NN) accelerators have been introduced in both academia and industry to improve the performance and energy efficiency of deep learning workloads. In this paper, we introduce a flexible and configurable functional NN accelerator simulator, which can be configured to simulate the micro-architectures of different NN accelerators. The extensible, configurable simulator supports system-level micro-architecture exploration as well as the development of operator optimization algorithms. We have also integrated the simulator into the TVM compilation stack as an optional back-end, so users can write operators in TVM and execute them on the simulator. The simulator will be open-sourced.
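
As a minimal sketch of the workflow the abstract describes, the snippet below writes a simple operator in TVM's tensor expression language and builds it for the stock "llvm" host target. The paper's simulator back-end name is not given, so the target and execution details here are illustrative assumptions, not the authors' integration.

```python
# Minimal sketch: writing an operator in TVM and building it for a back-end.
# Assumption: the paper's simulator target name is unpublished, so we build
# for the stock "llvm" target; a custom simulator back-end would follow the
# same pattern with a different target string.
import numpy as np
import tvm
from tvm import te

n = te.var("n")
A = te.placeholder((n,), name="A", dtype="float32")
B = te.placeholder((n,), name="B", dtype="float32")
C = te.compute((n,), lambda i: A[i] + B[i], name="C")  # element-wise add

s = te.create_schedule(C.op)
mod = tvm.build(s, [A, B, C], target="llvm", name="vector_add")

# Execute the compiled operator (on the simulator back-end, this call would
# run through its functional model instead of the host CPU).
dev = tvm.cpu(0)
a = tvm.nd.array(np.random.rand(1024).astype("float32"), dev)
b = tvm.nd.array(np.random.rand(1024).astype("float32"), dev)
c = tvm.nd.array(np.zeros(1024, dtype="float32"), dev)
mod(a, b, c)
np.testing.assert_allclose(c.numpy(), a.numpy() + b.numpy(), rtol=1e-5)
```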


Mathematics ◽  
2021 ◽  
Vol 9 (17) ◽  
pp. 2039
Author(s):  
Junliang Wang ◽  
Pengjie Gao ◽  
Zhe Li ◽  
Wei Bai

Accurate cycle time (CT) prediction for wafer fabrication remains a difficult task, as the system-level work in process (WIP) fluctuates. Aiming to construct a unified CT forecasting model under dynamic WIP levels, this paper proposes a transfer learning method that fine-tunes the prediction network hierarchically. First, a two-dimensional (2D) convolutional neural network was constructed to predict the CT under a primary WIP level, taking spatial-temporal characteristics as input, obtained by reorganizing the input parameters. Then, to predict the CT under another WIP level, a hierarchical optimization transfer learning strategy was designed to fine-tune the prediction model and improve the accuracy of the CT forecasting. The experimental results demonstrated that the hierarchical transfer learning approach outperforms the compared methods in CT forecasting under fluctuating WIP levels.
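
The paper's exact architecture and fine-tuning schedule are not reproduced here; the PyTorch sketch below illustrates the general idea, assuming a small 2D CNN pretrained at one WIP level whose parameter groups are then unfrozen stage by stage (regression head first, then convolutional blocks) for a new WIP level. The layer sizes, data loaders, and training schedule are assumptions.

```python
# Hedged sketch of hierarchical fine-tuning for CT prediction (PyTorch).
# Assumptions: the input is a 2D spatial-temporal feature map per lot;
# model sizes and the staging order are illustrative, not the published ones.
import torch
import torch.nn as nn

class CTPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # convolutional feature blocks
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(                # regression head -> scalar CT
            nn.Flatten(), nn.LazyLinear(64), nn.ReLU(), nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.head(self.features(x))

def hierarchical_finetune(model, loader, stages, epochs_per_stage=5, lr=1e-4):
    """Unfreeze parameter groups one stage at a time, fine-tuning after each."""
    for p in model.parameters():
        p.requires_grad = False                   # start fully frozen
    loss_fn = nn.MSELoss()
    for group in stages:                          # e.g. head first, then convs
        for p in group.parameters():
            p.requires_grad = True
        opt = torch.optim.Adam(
            [p for p in model.parameters() if p.requires_grad], lr=lr)
        for _ in range(epochs_per_stage):
            for x, y in loader:                   # x: [B,1,H,W], y: [B,1]
                opt.zero_grad()
                loss = loss_fn(model(x), y)
                loss.backward()
                opt.step()
    return model

# Usage: pretrain `model` at the primary WIP level, then for a new WIP level:
#   model = hierarchical_finetune(model, target_wip_loader,
#                                 stages=[model.head, model.features])
```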


2019 ◽  
Author(s):  
Qi Yuan ◽  
Alejandro Santana-Bonilla ◽  
Martijn Zwijnenburg ◽  
Kim Jelfs

The chemical space of novel electronic donor-acceptor oligomers with targeted properties was explored using deep generative models and transfer learning. A general recurrent neural network (RNN) model was trained on the ChEMBL database to generate chemically valid SMILES strings. The parameters of the general RNN were fine-tuned via transfer learning on the electronic donor-acceptor database from the Computational Materials Repository to generate novel donor-acceptor oligomers. Six different transfer learning models were developed with different subsets of the donor-acceptor database as training sets. We concluded that electronic properties of the training sets, such as HOMO-LUMO gaps and dipole moments, can be learned from the SMILES representation with deep generative models, and that the chemical space of the training sets can be explored efficiently. This approach identified approximately 1700 new molecules with promising electronic properties (HOMO-LUMO gap < 2 eV and dipole moment < 2 Debye), six times more than in the original database. Amongst the molecular transformations, the deep generative model learned to produce novel molecules by trading off selected atomic substitutions (such as halogenation or methylation) against molecular features such as the spatial extension of the oligomer. The method can be extended as a plausible source of new chemical combinations for effectively exploring the chemical space for targeted properties.
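
To illustrate the two-stage pipeline described above (a SMILES-generating RNN pretrained on a large general corpus, then fine-tuned on a small targeted set), here is a compressed PyTorch sketch. The vocabulary, model size, learning rates, and the placeholder batch variables are assumptions for illustration, not the authors' published settings.

```python
# Hedged sketch: character-level SMILES generator with transfer learning.
# Assumptions: hyperparameters are illustrative; `chembl_batches` and
# `donor_acceptor_batches` stand in for tokenized versions of the datasets.
import torch
import torch.nn as nn

class SmilesRNN(nn.Module):
    def __init__(self, vocab_size, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 64)
        self.lstm = nn.LSTM(64, hidden, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.out(h), state

def train(model, batches, epochs, lr):
    """Next-token prediction on tokenized SMILES (targets shifted by one)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for seq in batches:                       # seq: LongTensor [B, T]
            logits, _ = model(seq[:, :-1])
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           seq[:, 1:].reshape(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()

# Stage 1: pretrain on the large general corpus (ChEMBL) to learn SMILES syntax.
#   train(model, chembl_batches, epochs=10, lr=1e-3)
# Stage 2: transfer learning -- fine-tune the same weights on the small
# donor-acceptor set at a lower learning rate to bias generation toward it.
#   train(model, donor_acceptor_batches, epochs=30, lr=1e-4)
```

Sampled strings would then be checked for chemical validity (for example with RDKit's `Chem.MolFromSmiles`) before property screening against the HOMO-LUMO gap and dipole moment thresholds.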


Author(s):  
Maolin Wang ◽  
Kelvin C. M. Lee ◽  
Bob M. F. Chung ◽  
Sharatchandra Varma Bogaraju ◽  
Ho-Cheung Ng ◽  
...  
