Object-Detection-Knowledge-Distillation-ICLR2021 The official implementation of the ICLR 2021 paper "Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors". Please refer to the supplementary material on OpenReview for the code for now. ...
Code 1: https://github.com/HikariTJU/LD Code 2: https://github.com/open-mmlab/mmdetection/tree/master/configs/ld Method summary: take the KD (knowledge distillation) normally applied to the classification head and apply it to the localization head of the detector, which gives LD (Localization Distillation). Approach: first discretize the 4 bbox logit outputs into 4n logit outputs, then ...
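The discretize-then-distill step can be illustrated with a minimal sketch, assuming GFL-style n-bin distributions per bbox edge and a standard temperature-scaled KD loss; the function name and tensor shapes below are illustrative, not the authors' code:

```python
import torch.nn.functional as F

def localization_distillation(student_logits, teacher_logits, T=10.0):
    # student_logits, teacher_logits: (num_boxes, 4, n) logits, i.e. one n-bin
    # distribution per bbox edge (the 4 regression values discretized into 4*n logits)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    # temperature-scaled KL divergence, scaled by T^2 as in standard KD
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)
```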
Knowledge distillation effectively improves the performance of small models by transferring dark knowledge from the teacher detector. However, most existing distillation-based detection methods mainly imitate features near bounding boxes, which suffers from two limitations. First, they ignore th...
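For context, here is a rough sketch of the bounding-box-masked feature imitation that this passage refers to; it is a generic illustration, not any particular paper's method, and the function name, shapes, and masking scheme are assumed:

```python
import torch

def masked_feature_imitation(student_feat, teacher_feat, gt_boxes, stride):
    # student_feat, teacher_feat: (C, H, W) feature maps (already channel-aligned)
    # gt_boxes: (N, 4) ground-truth boxes in image coordinates; stride: feature-map stride
    C, H, W = teacher_feat.shape
    mask = torch.zeros(H, W, device=teacher_feat.device)
    for x1, y1, x2, y2 in gt_boxes / stride:
        mask[int(y1):int(y2) + 1, int(x1):int(x2) + 1] = 1.0  # mark regions near boxes
    diff = (student_feat - teacher_feat.detach()) ** 2 * mask  # imitate only inside the mask
    return diff.sum() / (mask.sum().clamp(min=1.0) * C)
```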
https://github.com/wangbingnan136/Knowledge-Distillation-Zoo https://github.com/HobbitLong/RepDistiller (this one is good; the implementation is very complete) https://github.com/bhheo/BSS_distillation https://github.com/clovaai/overhaul-distillation https://github.com/passalis/probabilistic_kt GitHub - lenscloth/RKD: Official ...
Paper: Localization Distillation for Object Detection, TPAMI 2023. Code 1: GitHub - HikariTJU/LD Code 2: GitHub - Zzh-tju/Rotated-LD Jishi livestream replay: bilibilib23.tv/6qaSk9p Object Detection: Localization Distillation (LD, CVPR 2022) ...
The proposed capsule network extracts object-part information by using convolutional capsules with locally-connected routing and predicts the final saliency map based on the deconvolutional capsules. Experimental results on four RGB-D benchmark datasets show that our proposed method outperforms 23 state-...
Therefore, we conjecture that supplying the refinements to the feature-map distillation can be a breakthrough in auxiliary-network-based self-knowledge distillation. Hence, we propose an auxiliary self-teacher network to generate a refined fe...
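A hedged sketch of the auxiliary self-teacher idea described above: a small refinement head produces a refined feature map, and the backbone features are regressed toward it. Module and function names are illustrative, not taken from the paper:

```python
import torch.nn as nn
import torch.nn.functional as F

class SelfTeacherRefiner(nn.Module):
    # auxiliary "self-teacher" head that refines a backbone feature map
    def __init__(self, channels):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, feat):
        return self.refine(feat)

def self_distillation_loss(backbone_feat, refiner):
    refined = refiner(backbone_feat)                     # refined map from the auxiliary network
    return F.mse_loss(backbone_feat, refined.detach())   # pull backbone features toward it
```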
• A Classification Error Impurity (CEI) algorithm is proposed as a frequency-based filter ranker.
• An Adaptive Genetic Algorithm with an External Repository (AGAwER) is proposed as a wrapper method to augment the exploration of GA.
• It is explored that the ensemble of the top features ob...
In this work, we explore multiteacher distillation for aggregating knowledge from four diverse image enhancements to provide multitask student models with robustness and generalization.

3. Methodology

3.1. Overview

The network proposed in this paper is based on the U-Net architecture with denoising ...
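A minimal sketch of one plausible form of the multi-teacher distillation mentioned in this excerpt: the student output is regressed toward a (weighted) average of enhancement-specific teacher outputs. The exact aggregation used in the paper may differ; names and shapes are assumed:

```python
import torch.nn.functional as F

def multi_teacher_distillation(student_out, teacher_outs, weights=None):
    # student_out: (B, C, H, W); teacher_outs: list of (B, C, H, W) outputs,
    # one per enhancement-specific teacher
    if weights is None:
        weights = [1.0 / len(teacher_outs)] * len(teacher_outs)
    target = sum(w * t.detach() for w, t in zip(weights, teacher_outs))
    return F.mse_loss(student_out, target)
```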
Thus, we describe several methods based on memory replay to deal with the catastrophic forgetting problem. We group these methods according to the problem they address. Less forgetting. Knowledge distillation [48] was introduced as a regularizer on the outputs of a reference network and a new ...
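A small sketch of knowledge distillation used as a less-forgetting regularizer, in the spirit described above: the new network's outputs on old-task classes are kept close to those of a frozen reference network. The function name and temperature are illustrative assumptions:

```python
import torch.nn.functional as F

def less_forgetting_loss(new_logits, ref_logits, T=2.0):
    # new_logits: outputs of the network being trained, restricted to old-task classes
    # ref_logits: outputs of the frozen reference (old) network on the same inputs
    p_ref = F.softmax(ref_logits.detach() / T, dim=1)
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    return F.kl_div(log_p_new, p_ref, reduction="batchmean") * (T ** 2)
```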