Keywords: Subclass knowledge distillation; Self-distillation; Deep neural networks

Knowledge distillation has been established as a highly promising approach for training compact, faster models by transferring knowledge from heavier, more powerful models, so as to satisfy the computation and storage requirements of...
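As a minimal sketch of the knowledge-transfer idea described above, the classic distillation objective combines a temperature-softened KL term against the teacher's outputs with the usual cross-entropy on ground-truth labels. The function names and the hyperparameters `T` (temperature) and `alpha` (mixing weight) below are illustrative assumptions, not taken from this paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; larger T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft term: KL(teacher || student) at temperature T, scaled by T^2
    # so its gradient magnitude stays comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean() * T * T
    # Hard term: standard cross-entropy with the true labels at T = 1.
    p = softmax(student_logits, 1.0)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1.0 - alpha) * hard

teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.5, 0.8, -0.5]])
labels = np.array([0])
loss = distillation_loss(student, teacher, labels)
```

When the student matches the teacher exactly, the soft term vanishes and only the hard cross-entropy remains, which is one quick sanity check on an implementation like this.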