GitHub 中文社区

Search results for "knowledge-distillation"

awesome-knowledge-distillation
@dkozlov

#Computer Science# Awesome Knowledge Distillation (a sketch of the classic teacher-student loss follows this entry)

knowledge-distillation · teacher-student · distillation
3.71 k
1 month ago
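Nearly every repository on this page implements some variant of the classic teacher-student objective from Hinton et al. (2015). As a reference point, here is a minimal sketch of that loss in PyTorch; the function name and the defaults for T and alpha are illustrative, not taken from any repository listed here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge distillation loss (Hinton et al., 2015).

    Blends a soft term (KL divergence between temperature-softened
    teacher and student distributions) with the usual hard-label
    cross-entropy. T and alpha are illustrative defaults.
    """
    # Soft targets: KL(teacher || student) at temperature T.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training loop the teacher's forward pass runs under torch.no_grad() and only the student's parameters receive gradients.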

Related topics

knowledge-distillation · model-compression · PyTorch · distillation · deep-learning · computer-vision · bert

Awesome-Knowledge-Distillation
@FLHonker

#Computer Science# Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).

distillation · deep-learning · transfer-learning
2.61 k
2 years ago
knowledge-distillation-pytorch
@haitongli

A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments.

PyTorch · knowledge-distillation · deep-neural-networks · cifar10 · model-compression
Python · 1.95 k
2 years ago
Knowledge-Distillation-Zoo
@AberHu

PyTorch implementations of various Knowledge Distillation (KD) methods (a hint-based example is sketched below).

knowledge-distillation · teacher-student
Python · 1.71 k
4 years ago
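Beyond logit matching, many of the methods collected in zoos like this one match intermediate features instead (FitNets-style hint training). A minimal sketch, assuming 2D feature maps whose channel widths differ between student and teacher; the channel counts and the 1x1 regressor are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style hint loss: MSE between a regressed student feature
    map and the teacher's feature map. Channel counts are illustrative."""

    def __init__(self, student_channels=64, teacher_channels=256):
        super().__init__()
        # A 1x1 conv maps the student's feature space onto the teacher's.
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # The teacher feature is a fixed target: detach to block gradients.
        return F.mse_loss(self.regressor(student_feat), teacher_feat.detach())
```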
structure_knowledge_distillation
@irfanICMLL

Official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 Oral), with extensions to other tasks.

Python · 727
5 years ago
knowledge-distillation-papers
@lhyfst

A curated list of knowledge distillation papers.

knowledge-distillation · model-compression · reading-list
757
2 years ago
RepDistiller
@HobbitLong

[ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods.

Python · 2.35 k
2 years ago
KnowledgeDistillation
@HoyTta0

Knowledge distillation for Chinese text classification in PyTorch: teacher models BERT and XLNet, student model BiLSTM.

knowledge-distillation · bert · PyTorch · model-compression · distillation
Python · 225
3 years ago
Teacher-free-Knowledge-Distillation
@yuanli2333

CVPR 2020 Oral: 'Revisiting Knowledge Distillation via Label Smoothing Regularization' (the virtual-teacher view is sketched below).

knowledge-distillation · PyTorch · paper-implementations
Python · 584
2 years ago
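The paper behind this entry observes that label smoothing behaves like distillation from a virtual teacher that puts 1 - eps on the true class and spreads eps uniformly over the others. A minimal sketch of that virtual-teacher loss; the value of eps is illustrative.

```python
import torch
import torch.nn.functional as F

def virtual_teacher_loss(student_logits, labels, eps=0.1):
    """Label smoothing viewed as teacher-free KD: the 'teacher' assigns
    1 - eps to the true class and eps/(K-1) to each wrong class."""
    num_classes = student_logits.size(1)
    with torch.no_grad():
        # Build the virtual teacher's soft target distribution.
        soft_targets = torch.full_like(student_logits, eps / (num_classes - 1))
        soft_targets.scatter_(1, labels.unsqueeze(1), 1.0 - eps)
    # Cross-entropy against the smoothed (virtual-teacher) distribution.
    return -(soft_targets * F.log_softmax(student_logits, dim=1)).sum(dim=1).mean()
```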
Awesome-Knowledge-Distillation-of-LLMs
@Tebmer

#Large Language Models# This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...

data-augmentation · instruction-following · knowledge-distillation · large-language-models
1.1 k
4 months ago
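At the algorithmic end of such a taxonomy, the most basic distillation objective for autoregressive LLMs is token-level KD: matching the student's next-token distribution to the teacher's at every position. A minimal sketch, assuming both models share a vocabulary; the shapes and names are illustrative.

```python
import torch
import torch.nn.functional as F

def token_level_kd(student_logits, teacher_logits, mask, T=1.0):
    """Per-token KL between teacher and student next-token distributions.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    mask: (batch, seq_len) -- 1.0 for real tokens, 0.0 for padding.
    """
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="none",
    ).sum(-1)  # (batch, seq_len): KL summed over the vocabulary
    # Average over non-padding positions; T*T rescales gradients as usual.
    return (kl * mask).sum() / mask.sum() * (T * T)
```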
kdtf
@DushyantaDhyani

Knowledge Distillation using TensorFlow.

Python · 139
6 years ago
distill-bert
@kevinmtian

Knowledge Distillation from BERT

Python · 52
7 years ago
SSKD
@xuguodong03

[ECCV 2020] Knowledge Distillation Meets Self-Supervision.

Python · 237
3 years ago
KD_Lib
@SforAiDl

#Computer Science# A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.

knowledge-distillation · model-compression · pruning · quantization · PyTorch
Python · 634
2 years ago
distiller
@karanchahal

A large scale study of Knowledge Distillation.

Python · 209
5 years ago
KnowledgeDistillation
@DunZhang

A general framework for knowledge distillation.

Python · 42
5 years ago
logit-standardization-KD
@sunshangquan

[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation

computer-vision · cvpr2024 · knowledge-distillation · resnet · vision-transformer
Jupyter Notebook · 375
9 months ago
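The core idea of that paper is a preprocessing step: z-score standardize teacher and student logits before the temperature softmax, so the student matches the teacher's logit relations rather than their absolute scale. A rough sketch under that reading; epsilon and the temperature are illustrative.

```python
import torch
import torch.nn.functional as F

def standardize(logits, eps=1e-7):
    """Z-score standardization over the class dimension."""
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (std + eps)

def logit_std_kd_loss(student_logits, teacher_logits, T=2.0):
    """KD loss with both logit vectors standardized before softening."""
    s = standardize(student_logits) / T
    t = standardize(teacher_logits) / T
    return F.kl_div(
        F.log_softmax(s, dim=-1),
        F.softmax(t, dim=-1),
        reduction="batchmean",
    ) * (T * T)
```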
Knowledge-Distillation
@ujjwal-9

Blog: https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322

Jupyter Notebook · 60
7 years ago
SegDistill
@wzpscott

Knowledge Distillation Toolbox for Semantic Segmentation

Python · 17
3 years ago
TextBrewer
@airaria

#NLP# A PyTorch-based knowledge distillation toolkit for natural language processing.

bert · PyTorch · nlp · knowledge · distillation
Python · 1.66 k
2 years ago
RKD
@lenscloth

#Computer Science# Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 (the distance-wise term is sketched below).

deep-neural-networks · computer-vision · metric-learning · knowledge-distillation · deep-learning
Python · 396
4 years ago
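Relational KD transfers structure among examples (pairwise distances and angles) rather than individual outputs. A minimal sketch of the distance-wise term, assuming one embedding vector per example; the mean-distance normalization follows the paper's recipe, but the names here are illustrative.

```python
import torch
import torch.nn.functional as F

def pairwise_distances(embeddings):
    """Pairwise Euclidean distances within a batch, normalized by the
    mean of the nonzero distances (the RKD normalization)."""
    d = torch.cdist(embeddings, embeddings, p=2)
    mean = d[d > 0].mean()
    return d / (mean + 1e-8)

def rkd_distance_loss(student_emb, teacher_emb):
    """Distance-wise RKD: match the structure of pairwise distances,
    not the embeddings themselves, using a Huber (smooth L1) loss."""
    with torch.no_grad():
        t = pairwise_distances(teacher_emb)  # fixed target structure
    s = pairwise_distances(student_emb)
    return F.smooth_l1_loss(s, t)
```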
FGD
@yzd-v

#Computer Science# Focal and Global Knowledge Distillation for Detectors (CVPR 2022).

object-detection · knowledge-distillation · deep-learning · PyTorch
Python · 369
3 years ago
mmdetection-distiller
@pppppM

A knowledge distillation toolbox based on mmdetection.

Python · 75
4 years ago
PKD-for-BERT-Model-Compression
@intersun

PyTorch implementation of 'Patient Knowledge Distillation for BERT Model Compression'.

glue · bert
Python · 203
6 years ago