Awesome Knowledge Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014–2021), organized by category.
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments
PyTorch implementation of various Knowledge Distillation (KD) methods.
The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation'. (CVPR 2019 ORAL) and extension to other tasks.
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
knowledge distillation papers
Knowledge distillation for text classification with PyTorch: Chinese text classification with BERT and XLNet teacher models and a biLSTM student model.
Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral)
A machine learning experiment
Knowledge Distillation using TensorFlow
Knowledge Distillation from BERT
[ECCV2020] Knowledge Distillation Meets Self-Supervision
A large scale study of Knowledge Distillation.
A general framework for knowledge distillation
Blog https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322
Knowledge Distillation Toolbox for Semantic Segmentation
A PyTorch-based knowledge distillation toolkit for natural language processing
Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019
This is a knowledge distillation toolbox based on mmdetection.
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression
This is a knowledge distillation toolbox based on mmsegmentation.
Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation (CVPR 2022)
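The repositories above implement many variants of knowledge distillation, but most build on the classic objective from Hinton et al.: a weighted sum of a cross-entropy term against the teacher's temperature-softened outputs and a standard cross-entropy term against the hard labels. A minimal NumPy sketch of that baseline loss (function name, temperature `T`, and weight `alpha` are illustrative choices, not taken from any repo above):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the class axis."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic KD loss (Hinton et al. style).

    soft term: cross-entropy between teacher and student distributions
               at temperature T, scaled by T^2 to keep gradient
               magnitudes comparable across temperatures.
    hard term: ordinary cross-entropy against the ground-truth labels.
    """
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=1).mean() * (T * T)

    log_p_hard = np.log(softmax(student_logits))
    hard = -log_p_hard[np.arange(len(labels)), labels].mean()

    return alpha * soft + (1 - alpha) * hard

# A student that matches the teacher incurs a lower loss than one that disagrees.
teacher = np.array([[5.0, 1.0, 0.0]])
good_student = teacher.copy()
bad_student = np.array([[0.0, 5.0, 1.0]])
labels = np.array([0])
assert kd_loss(good_student, teacher, labels) < kd_loss(bad_student, teacher, labels)
```

Methods such as CRD or Relational KD listed above replace or augment the soft term with representation-level objectives, but typically keep this hard-label term as an anchor.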