#Computer Science# Awesome Knowledge Distillation
#Computer Science# Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021).
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments.
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 oral), with extensions to other tasks.
Knowledge distillation papers.
[ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods.
Knowledge distillation for text classification with PyTorch: Chinese text classification with BERT and XLNet teacher models and a BiLSTM student model.
Knowledge distillation: 'Revisiting Knowledge Distillation via Label Smoothing Regularization' (CVPR 2020 oral).
#Large Language Models# This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...
Knowledge Distillation using TensorFlow.
Knowledge Distillation from BERT
[ECCV 2020] Knowledge Distillation Meets Self-Supervision.
#Computer Science# A PyTorch knowledge distillation library for benchmarking and extending work in the domains of Knowledge Distillation, Pruning, and Quantization.
A large-scale study of Knowledge Distillation.
A general framework for knowledge distillation
[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation
Blog: https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322
Knowledge Distillation Toolbox for Semantic Segmentation
#Natural Language Processing# A PyTorch-based knowledge distillation toolkit for natural language processing.
#Computer Science# Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019).
#Computer Science# Focal and Global Knowledge Distillation for Detectors (CVPR 2022).
A knowledge distillation toolbox based on mmdetection.
PyTorch implementation of 'Patient Knowledge Distillation for BERT Model Compression'.
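
Most of the PyTorch repositories above build on the same core objective from Hinton et al.'s "Distilling the Knowledge in a Neural Network" (2015): the student matches the teacher's temperature-softened output distribution in addition to the hard labels. Below is a minimal sketch of that soft-target loss; the function name `kd_loss` and the defaults `T=4.0` and `alpha=0.9` are illustrative choices, not values taken from any repository listed here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    Combines KL divergence between temperature-softened teacher and
    student distributions with the usual cross-entropy on hard labels.
    T and alpha are illustrative defaults, not settings from any repo above.
    """
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # The T^2 factor rescales gradients so the soft term stays
    # comparable in magnitude across temperatures (as in the paper).
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1.0 - alpha) * hard
```

Methods in the list such as CRD, relational KD, patient KD, and logit standardization replace or augment this KL term with their own matching objectives, but the teacher-student training loop stays essentially the same.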