Knowledge distillation based on decision boundary instances generated by DBI-GAN

2021 ◽  
Author(s):  
Ziqi Zhu ◽  
Xi Liu ◽  
Chunhua Deng ◽  
Jing Liu ◽  
Jixin Zou

Author(s):  
Byeongho Heo ◽  
Minsik Lee ◽  
Sangdoo Yun ◽  
Jin Young Choi

Many recent works on knowledge distillation have provided ways to transfer the knowledge of a trained network to improve the learning of a new one, but finding a good technique for knowledge distillation remains an open problem. In this paper, we provide a new perspective based on the decision boundary, which is one of the most important components of a classifier. The generalization performance of a classifier is closely related to the adequacy of its decision boundary, so a good classifier bears a good decision boundary. Therefore, transferring information closely related to the decision boundary is a promising approach to knowledge distillation. To realize this goal, we utilize an adversarial attack to discover samples supporting the decision boundary. Based on this idea, to transfer more accurate information about the decision boundary, the proposed algorithm trains a student classifier on the adversarial samples supporting the decision boundary. Experiments show that the proposed method indeed improves knowledge distillation and achieves state-of-the-art performance.
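The idea in the abstract can be sketched in a minimal toy form: attack an input so that the teacher's top-two logit gap shrinks to near zero (a boundary-supporting sample), then distill the teacher's softened outputs at such samples into a student. Everything below is an illustrative assumption, not the authors' implementation: the teacher is a toy linear model, the attack uses the closed-form gradient step available for linear logits, and all names and hyperparameters are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "teacher": logits = W @ x + b. This stands in for a trained
# deep classifier; the paper's method attacks a deep network instead.
W = rng.normal(size=(3, 5))
b = rng.normal(size=3)

def teacher_logits(x):
    return W @ x + b

def boundary_supporting_sample(x, steps=20):
    """Move x toward the teacher's decision boundary by closing the gap
    between the top-two logits (an idealized adversarial attack)."""
    x = x.copy()
    for _ in range(steps):
        z = teacher_logits(x)
        order = np.argsort(z)
        j, k = order[-1], order[-2]     # predicted class and runner-up
        gap = z[j] - z[k]
        if gap < 1e-6:                  # already on the boundary
            break
        d = W[j] - W[k]                 # grad of (z_j - z_k) for a linear teacher
        x -= (gap / np.dot(d, d)) * d   # exact step onto the j/k boundary
    return x

def softmax(z, T=4.0):
    e = np.exp(z / T - np.max(z / T))
    return e / e.sum()

def distillation_loss(student_z, teacher_z, T=4.0):
    """KL(teacher || student) on temperature-softened outputs; evaluating it
    at boundary-supporting samples transfers boundary information."""
    p_t, p_s = softmax(teacher_z, T), softmax(student_z, T)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

def margin(x):
    z = np.sort(teacher_logits(x))
    return z[-1] - z[-2]

x0 = 2.0 * W[0] + rng.normal(scale=0.1, size=5)  # start well inside class 0
xb = boundary_supporting_sample(x0)
print(margin(x0), margin(xb))  # the attacked sample sits near the boundary
```

In the full method, a student network would minimize `distillation_loss` on batches of such `xb` samples (alongside its ordinary training loss), so that it learns where the teacher's decision boundary lies rather than only matching outputs on the original data.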


2020 ◽  
Author(s):  
Myeongho Jeong ◽  
Seungtaek Choi ◽  
Hojae Han ◽  
Kyungho Kim ◽  
Seung-won Hwang

Author(s):  
Hideki Tsunashima ◽  
Hirokatsu Kataoka ◽  
Junji Yamato ◽  
Qiu Chen ◽  
Shigeo Morishima

Author(s):  
Jieming Zhu ◽  
Jinyang Liu ◽  
Weiqi Li ◽  
Jincai Lai ◽  
Xiuqiang He ◽  
...  
