The key challenge of knowledge distillation is to extract general, moderate and sufficient knowledge from a teacher network to guide a student network. In this paper, a novel Instance Relationship Graph (IRG) is proposed for knowledge distillation. It models three kinds of knowledge, including instance features, instance relationships and the feature space transformation, while the latter two kinds are often neglected. Firstly, the IRG of one network layer is constructed by treating instance features and instance relationships as vertices and edges respectively, to model the knowledge distilled from that layer...
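As a rough illustration of how such a graph could be built, here is a minimal sketch in PyTorch. It is my own paraphrase of the vertex/edge idea, not code from the paper; the batch size, feature shapes and the use of squared Euclidean distance for the edges are assumptions.

```python
import torch

def build_irg(features: torch.Tensor):
    """Build an Instance Relationship Graph from one layer's batch of features.

    features: (N, C, H, W) activations of one layer for N instances.
    Returns (vertices, edges):
      vertices -- (N, D) flattened per-instance features (graph vertices)
      edges    -- (N, N) pairwise squared Euclidean distances (graph edges)
    """
    n = features.size(0)
    vertices = features.reshape(n, -1)                 # one vertex per instance
    edges = torch.cdist(vertices, vertices, p=2) ** 2  # instance relationships
    return vertices, edges

# Toy usage: a batch of 8 instances with 64x4x4 feature maps.
feats = torch.randn(8, 64, 4, 4)
v, e = build_irg(feats)
print(v.shape, e.shape)   # torch.Size([8, 1024]) torch.Size([8, 8])
```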
Knowledge Distillation via Instance Relationship Graph
Authors: Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li and Yunqiang Duan
Motivation
This paper is an improvement on traditional KD, similar in spirit to Relational Knowledge Distillation, which also appeared at CVPR 2019. The main problem it aims to solve in Knowledge Distillation (KD) is...
Link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Liu_Knowledge_Distillation_via_Instance_Relationship_Graph_CVPR_2019_paper.pdf
Venue: CVPR 2019
Code: none
Editor: lzc
The authors point out that, for knowledge distillation, besides the conventional outputs and feature maps, the "distance" between the feature maps of two inputs is also something the student can learn to mimic.
Learning the IRG
The IRG...
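A hedged sketch of what mimicking such pairwise "distances" could look like: the student's edge matrix is pushed toward the teacher's for the same batch. The layer choice, the normalization and the MSE penalty below are my assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def edge_matrix(features: torch.Tensor) -> torch.Tensor:
    """(N, C, H, W) features -> (N, N) pairwise squared-distance edges."""
    v = features.reshape(features.size(0), -1)
    return torch.cdist(v, v, p=2) ** 2

def irg_edge_loss(teacher_feats: torch.Tensor, student_feats: torch.Tensor) -> torch.Tensor:
    """Push the student's instance-relationship edges toward the teacher's.

    Teacher and student may have different channel widths; only the N x N
    relational structure over the same batch of N instances is compared.
    """
    et = edge_matrix(teacher_feats).detach()   # teacher is frozen during distillation
    es = edge_matrix(student_feats)
    # Normalize so the penalty is insensitive to feature dimensionality.
    et = et / (et.norm() + 1e-8)
    es = es / (es.norm() + 1e-8)
    return F.mse_loss(es, et)

# Toy usage: teacher block is wider than the student block.
t = torch.randn(8, 256, 4, 4)
s = torch.randn(8, 64, 4, 4)
loss = irg_edge_loss(t, s)
```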
In this work, we provide a curriculum learning knowledge distillation framework via instance-level sequence learning. It employs the student network of the early epoch as a snapshot to create a curriculum for the student network's next training phase. We carry out extensive experiments on CIFAR-...
Not many curriculum learning (CL) methods have been combined with KD. Knowledge Distillation via Instance-level Sequence Learning distinguishes easy from hard examples at the instance level: it uses snapshots of the student to rank sample difficulty and gradually increases the complexity of the training data to improve generalization. Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy instead adjusts the training difficulty by dynamically generating pseudo samples. These two papers...
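To make the snapshot-based curriculum idea concrete, below is a minimal sketch assuming the per-sample cross-entropy of an early student snapshot as the difficulty score and a fixed three-phase schedule; the function names and the train_one_phase helper are hypothetical, not from either paper.

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, Subset

@torch.no_grad()
def rank_by_difficulty(snapshot, dataset, device="cpu", batch_size=128):
    """Score every sample with an early student snapshot; lower loss = easier."""
    snapshot.eval().to(device)
    losses = []
    for x, y in DataLoader(dataset, batch_size=batch_size):
        logits = snapshot(x.to(device))
        losses.append(F.cross_entropy(logits, y.to(device), reduction="none").cpu())
    scores = torch.cat(losses)            # one difficulty score per instance
    return torch.argsort(scores)          # indices sorted easy -> hard

def curriculum_subsets(dataset, order, fractions=(0.3, 0.6, 1.0)):
    """Yield progressively larger, easier-first training subsets per phase."""
    for frac in fractions:
        k = int(len(order) * frac)
        yield Subset(dataset, order[:k].tolist())

# Usage sketch: re-rank with a fresh snapshot before each phase, then train
# the student with the usual KD loss on the selected subset.
# order = rank_by_difficulty(student_snapshot, train_set)
# for phase_set in curriculum_subsets(train_set, order):
#     train_one_phase(student, teacher, phase_set)   # hypothetical helper
```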
Paper Reading Note: Knowledge Distillation via Instance Relationship Graph
The key challenge of knowledge distillation is to extract general, moderate and sufficient knowledge from a teacher network to guide a student network. In this paper, a novel Instance Relationship Graph (IRG) is proposed for knowledge distillation. It models three kinds of knowledge, including instance features, instance relationships and the feature space transformation, while the latter two kinds of knowledge are often neglected...
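The abstract's third kind of knowledge, the feature space transformation across layers, is not spelled out in the fragments collected here. The sketch below is only one plausible reading, with every detail (which pair of layers, representing the transformation as the change in the edge matrix, the MSE penalty) assumed by me rather than taken from the paper: the student mimics how the teacher's instance relationships evolve from one layer to the next, instead of the raw intermediate features.

```python
import torch
import torch.nn.functional as F

def edge_matrix(features: torch.Tensor) -> torch.Tensor:
    """(N, C, H, W) features -> (N, N) pairwise squared-distance edges."""
    v = features.reshape(features.size(0), -1)
    return torch.cdist(v, v, p=2) ** 2

def transformation_loss(t_l1, t_l2, s_l1, s_l2) -> torch.Tensor:
    """Match how instance relationships change between two layers.

    t_l1/t_l2: teacher features at an earlier/later layer for the same batch.
    s_l1/s_l2: student features at the corresponding layers.
    """
    t_delta = edge_matrix(t_l2) - edge_matrix(t_l1)   # teacher's cross-layer change
    s_delta = edge_matrix(s_l2) - edge_matrix(s_l1)   # student's cross-layer change
    return F.mse_loss(s_delta, t_delta.detach())
```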
Affinity Distillation: cosine distance is used to measure similarity. Backbone: teacher ResNet-50, student MobileNetV2. Knowledge Distillation via Instance Relationship Graph (CVPR 2019), Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li and Yunqiang Duan ...
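As a rough illustration of the cosine-based affinity distillation mentioned above, here is a minimal sketch under my own assumptions: the batch-wise cosine-similarity (affinity) matrix of the student is matched to the teacher's with an MSE penalty, and the 2048/1280 embedding widths simply mirror ResNet-50 and MobileNetV2 penultimate features. It is not taken from any specific paper's code.

```python
import torch
import torch.nn.functional as F

def affinity(features: torch.Tensor) -> torch.Tensor:
    """(N, D) embeddings -> (N, N) cosine-similarity (affinity) matrix."""
    v = F.normalize(features.reshape(features.size(0), -1), dim=1)
    return v @ v.t()

def affinity_distillation_loss(teacher_emb, student_emb) -> torch.Tensor:
    """Match the student's batch affinity matrix to the (frozen) teacher's."""
    return F.mse_loss(affinity(student_emb), affinity(teacher_emb).detach())

# Toy usage standing in for ResNet-50 (teacher, 2048-d) and
# MobileNetV2 (student, 1280-d) penultimate features.
t_emb = torch.randn(16, 2048)
s_emb = torch.randn(16, 1280)
loss = affinity_distillation_loss(t_emb, s_emb)
```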
Knowledge Distillation via Instance Relationship Graph. Liu, Yufan et al. CVPR 2019
Knowledge Distillation via Route Constrained Optimization. Jin, Xiao et al. ICCV 2019
Similarity-Preserving Knowledge Distillation. Tung, Frederick and Mori, Greg. ICCV 2019 ...
IRG is an open source implementation of the paper "Knowledge Distillation via Instance Relationship Graph": Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu et al. Knowledge Distillation via Instance Relationship Graph. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019.