Knowledge Distillation Papers

Author: AAAAAAIIIIII | Published 2019-12-31 21:26

Romero, A.; Ballas, N.; Kahou, S. E.; Chassang, A.; Gatta, C.; and Bengio, Y. 2015. Fitnets: Hints for thin deep nets.
In International Conference on Learning Representations (ICLR).

A Comprehensive Overhaul of Feature Distillation — the parts of the feature map below zero are not strongly supervised.
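The "don't supervise negative responses" idea can be sketched as a partial L2 loss in NumPy. This is a minimal illustration, not the paper's full method (which also involves a margin-ReLU teacher transform and a student projection); `partial_l2_loss` is an illustrative name.

```python
import numpy as np

def partial_l2_loss(student_feat, teacher_feat):
    # Positions where the teacher's pre-ReLU response is non-positive AND the
    # student is already at or below it contribute nothing: negative teacher
    # responses are not strongly supervised. Everything else is plain L2.
    s = np.asarray(student_feat, dtype=float)
    t = np.asarray(teacher_feat, dtype=float)
    mask = (t > 0) | (s > t)  # skip the "teacher<=0 and student<=teacher" case
    return float(np.sum(((s - t) ** 2) * mask))
```

For example, a student value of -2.0 against a teacher value of -1.0 incurs no penalty, while undershooting a positive teacher value does.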

A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning — proposes the FSP method, which distills the relationships between feature maps of different layers.
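A minimal NumPy sketch of the FSP (Flow of Solution Procedure) matrix between two layers, assuming feature maps of shape (channels, height, width) with matching spatial size; the function names here are mine, not the paper's.

```python
import numpy as np

def fsp_matrix(feat_a, feat_b):
    # feat_a: (C1, H, W), feat_b: (C2, H, W) feature maps from two layers.
    # The FSP matrix is the channel-wise inner product averaged over spatial
    # positions, summarizing the "flow" from one layer to the other.
    c1, h, w = feat_a.shape
    c2 = feat_b.shape[0]
    fa = feat_a.reshape(c1, h * w)
    fb = feat_b.reshape(c2, h * w)
    return fa @ fb.T / (h * w)  # shape (C1, C2)

def fsp_loss(student_pair, teacher_pair):
    # Match the student's FSP matrix to the teacher's with an L2 loss.
    g_s = fsp_matrix(*student_pair)
    g_t = fsp_matrix(*teacher_pair)
    return 0.5 * float(np.mean((g_s - g_t) ** 2))
```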

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons — the distillation targets are the activation boundaries (the separating hyperplanes formed by hidden neurons) of each layer.
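The boundary-transfer idea can be sketched as a hinge-style loss on pre-activations: only the sign pattern (which side of the boundary each neuron falls on) is transferred, not the exact magnitudes. A simplified NumPy sketch under that assumption; `ab_loss` and the margin value are illustrative.

```python
import numpy as np

def ab_loss(student_pre, teacher_pre, margin=1.0):
    # Where the teacher's pre-activation is positive, push the student above
    # +margin; where it is non-positive, push the student below -margin.
    # Only the activation boundary (sign pattern) matters, not the values.
    s = np.asarray(student_pre, dtype=float)
    t = np.asarray(teacher_pre, dtype=float)
    pos = (t > 0).astype(float)
    return float(np.sum(pos * np.maximum(margin - s, 0.0) ** 2 +
                        (1.0 - pos) * np.maximum(margin + s, 0.0) ** 2))
```

A student that agrees with the teacher's signs by at least the margin incurs zero loss, however different the magnitudes are.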

Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer — distillation via attention maps.
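A NumPy sketch of activation-based attention transfer, assuming feature maps of shape (channels, height, width): the attention map is the sum of squared channel activations, L2-normalized, and student and teacher maps are matched with an L2 distance. Function names are illustrative.

```python
import numpy as np

def attention_map(feat):
    # feat: (C, H, W). Collapse channels by summing squared activations,
    # flatten spatially, then L2-normalize so scale differences cancel.
    a = np.sum(np.asarray(feat, dtype=float) ** 2, axis=0).ravel()
    return a / (np.linalg.norm(a) + 1e-12)

def at_loss(student_feat, teacher_feat):
    # L2 distance between the normalized attention maps.
    return float(np.linalg.norm(attention_map(student_feat)
                                - attention_map(teacher_feat)))
```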

Distilling the Knowledge in a Neural Network — the seminal knowledge distillation paper.
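The core of this paper, temperature-softened targets, can be sketched in a few lines of NumPy. A minimal version of the soft-target loss (the full objective also mixes in a hard-label cross-entropy term, omitted here); `kd_loss` is my name for it.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T gives a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Cross-entropy between teacher and student soft targets at temperature T,
    # scaled by T^2 so its gradient magnitude matches the hard-label term.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return -float(np.mean(np.sum(p_t * np.log(p_s + 1e-12), axis=-1))) * T * T
```

The loss is minimized when the student's tempered distribution matches the teacher's, so a student fed the teacher's own logits cannot score worse than any other student.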

Deep Mutual Learning — a cohort of student networks trained together, each learning from the others (mutual learning).
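For two peers, each network's loss gains a KL term pulling it toward the other's predictions; note the KL direction differs for the two networks. A NumPy sketch of just those mimicry terms (each peer's supervised cross-entropy loss is omitted); function names are mine.

```python
import numpy as np

def softmax(z):
    z = np.asarray(z, dtype=float)
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q):
    # KL(p || q), averaged over the batch.
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)),
                                axis=-1)))

def mutual_losses(logits_1, logits_2):
    # Extra loss term for net 1 is KL(p2 || p1); for net 2 it is KL(p1 || p2).
    p1, p2 = softmax(logits_1), softmax(logits_2)
    return kl(p2, p1), kl(p1, p2)
```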

Born Again Neural Networks — born-again networks: a student with the same architecture as its teacher is retrained from the teacher's outputs.

FitNets: Hints for Thin Deep Nets — the student network fits not only the teacher's soft targets but also the teacher's hidden-layer outputs (the features the teacher extracts).
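The hidden-layer ("hint") part can be sketched in NumPy: since the thin student's intermediate features usually have a different width than the teacher's, a regressor maps them into the teacher's space before the L2 match. Here a plain linear map `W_r` stands in for the paper's convolutional regressor; names are illustrative.

```python
import numpy as np

def hint_loss(student_feat, teacher_feat, W_r):
    # FitNets hint loss: project the student's intermediate features through a
    # regressor (here a linear map W_r) into the teacher's feature space, then
    # penalize the squared distance to the teacher's "hint" features.
    projected = np.asarray(student_feat, dtype=float) @ W_r
    return 0.5 * float(np.sum((projected - np.asarray(teacher_feat)) ** 2))
```

In training, `W_r` is learned jointly with the student during the hint stage, then discarded for the final soft-target distillation stage.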


Permalink: https://www.haomeiwen.com/subject/vefmoctx.html