In this work, we propose a curriculum-learning knowledge distillation framework based on instance-level sequence learning. It takes a snapshot of the student network at an early epoch and uses it to create a curriculum for the student network's next training phase. We carry out extensive experiments on CIFAR-...
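The snapshot-based curriculum above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes the snapshot student's per-instance losses are already computed, ranks instances from easy to hard, and carves out progressively larger easy-to-hard subsets for successive training phases. The function names are hypothetical.

```python
import numpy as np

def rank_by_snapshot_loss(losses):
    """Order training instances from easy to hard by the snapshot student's loss."""
    return np.argsort(losses)

def curriculum_subsets(order, n_phases):
    """Progressively larger easy-to-hard subsets, one per training phase."""
    n = len(order)
    return [order[: int(np.ceil(n * (p + 1) / n_phases))] for p in range(n_phases)]

# Toy example: four instances, losses from a snapshot of the student.
losses = np.array([0.9, 0.1, 0.5, 0.3])
order = rank_by_snapshot_loss(losses)      # easiest instance first
phases = curriculum_subsets(order, 2)      # phase 1: easy half; phase 2: all
```

In a real pipeline the snapshot would be re-taken periodically, so the ranking tracks what the current student already finds easy.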
Paper Reading Note: Knowledge Distillation via Instance Relationship Graph.
Knowledge Distillation via Instance Relationship Graph. Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li and Yunqiang Duan. NLPR, Institute of Automation, Chinese Academy of Sciences; Ant Financial; National Computer Network Emergency Response Te...
IRG is an open-source implementation of the paper "Knowledge Distillation via Instance Relationship Graph": Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu et al. Knowledge Distillation via Instance Relationship Graph. IEEE Conference on Computer Vision and Pattern Recognition (CVPR...
Affinity Distillation: cosine distance is used to express similarity. Backbone: teacher (T): ResNet-50; student (S): MobileNetV2. Knowledge Distillation via Instance Relationship Graph (CVPR 2019). Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li and Yunqiang Duan ...
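The affinity-distillation idea above can be sketched in NumPy: build a pairwise cosine-similarity affinity matrix over a batch of teacher features and another over the corresponding student features, then penalize the squared difference between the two matrices. This is a simplified illustration under those assumptions, not the paper's exact loss; the function names are hypothetical.

```python
import numpy as np

def cosine_affinity(feats):
    """Pairwise cosine-similarity affinity matrix for a batch of feature vectors."""
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def affinity_distillation_loss(teacher_feats, student_feats):
    """Mean squared difference between teacher and student affinity matrices."""
    a_t = cosine_affinity(teacher_feats)
    a_s = cosine_affinity(student_feats)
    return float(np.mean((a_t - a_s) ** 2))
```

Because only pairwise similarities are matched, the teacher (e.g. ResNet-50) and student (e.g. MobileNetV2) may have different feature dimensions as long as each produces one feature vector per instance.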
Few curriculum learning (CL) methods have been combined with KD. Knowledge Distillation via Instance-level Sequence Learning distinguishes easy from hard at the instance level: it uses snapshots of the student to rank sample complexity and progressively increases the complexity of the training data to gain generalization ability. Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy adjusts the training difficulty by dynamically generating pseudo-samples. These two ...
Knowledge distillation improves student model performance. However, using a larger teacher model does not necessarily result in better distillation gains due to significant architecture and output gaps with smaller student networks. To address this issue, we reconsider teacher outputs and find that categor...
Knowledge Distillation via Instance Relationship Graph. Liu, Yufan et al. CVPR 2019. Knowledge Distillation via Route Constrained Optimization. Jin, Xiao et al. ICCV 2019. Similarity-Preserving Knowledge Distillation. Tung, Frederick, and Greg Mori. ICCV 2019. MEAL: Multi-Model Ensemble via Adversarial ...
Liu, Y., Cao, J., Li, B., Yuan, C., Hu, W., Li, Y. & Duan, Y. (2019g). Knowledge distillation via instance relationship graph. In CVPR. Liu, Y., Chen, K., Liu, C., Qin, Z., Luo, Z. & Wang, J. (2019h). Structured knowledge distillation for semantic segmentation. ...
The key challenge of knowledge distillation is to extract general, moderate and sufficient knowledge from a teacher network to guide a student network. In this paper, a novel Instance Relationship Graph (IRG) is proposed for knowledge distillation. It models three kinds of knowledge, including ...
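One ingredient of the IRG idea above, the instance-relationship edges, can be sketched in NumPy: treat each instance's feature vector as a graph vertex, take pairwise squared Euclidean distances as edge weights, and distill by matching the teacher's and student's edge matrices. This is a hedged, simplified sketch of the edge-matching component only (the full IRG also models vertex and transformation knowledge); the function names are hypothetical.

```python
import numpy as np

def instance_relationship_edges(feats):
    """Edge matrix of an instance relationship graph: pairwise squared
    Euclidean distances between instance features (the graph vertices)."""
    sq = np.sum(feats ** 2, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * feats @ feats.T

def irg_edge_loss(teacher_feats, student_feats):
    """Match student edge structure to teacher edge structure."""
    e_t = instance_relationship_edges(teacher_feats)
    e_s = instance_relationship_edges(student_feats)
    return float(np.mean((e_t - e_s) ** 2))
```

Matching relationships between instances, rather than raw per-instance outputs, is what lets this kind of knowledge transfer across teacher/student architecture gaps.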