# knowledge-distillation

huawei-noah

Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Python 3.08k
1 year ago
IDEA-Research

"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)

Python 2.42k
1 year ago
intel

SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime.

Python 2.37k
1 day ago
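As a reference point for the low-bit techniques this entry lists, symmetric per-tensor INT8 quantization can be sketched in a few lines. This is a generic illustration of the idea, not the API of the Intel repository above; the function names are made up for this sketch:

```python
# Symmetric per-tensor INT8 quantization: one scale maps floats to [-127, 127].
def quantize_int8(values):
    scale = max(abs(v) for v in values) / 127 or 1.0  # guard against all-zero input
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

weights = [0.31, -1.27, 0.05, 0.98]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Round-trip error is bounded by half a quantization step (scale / 2).
assert all(abs(a - b) <= scale / 2 for a, b in zip(weights, restored))
```

Real toolkits add per-channel scales, zero-points for asymmetric ranges, and calibration over activation statistics, but the core float-to-integer mapping is this simple.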
haitongli

A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments.

Python 1.93k
2 years ago
microsoft

This is a collection of our NAS and Vision Transformer work.

[NeurIPS'20] Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search

Python 1.74k
9 months ago
AberHu

PyTorch implementations of various Knowledge Distillation (KD) methods.

Python 1.68k
3 years ago
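Most KD repositories in this list implement some variant of the original soft-target objective from Hinton et al. (2015): train the student to match the teacher's temperature-softened output distribution. A minimal dependency-free sketch (function names are illustrative, not taken from any repository above):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradient magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient; identical student and teacher logits give a distillation loss of zero.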
open-mmlab

OpenMMLab Model Compression Toolbox and Benchmark.

Python 1.58k
10 months ago
yoshitomo-matsubara/torchdistill

#Natural Language Processing# A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented...

Python 1.48k
24 days ago
microsoft

Python 1.45k
2 years ago
szagoruyko

#Computer Science# Improving Convolutional Networks via Attention Transfer (ICLR 2017)

Jupyter Notebook 1.45k
7 years ago
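Attention transfer, the method behind the ICLR 2017 entry above, distills by matching spatial attention maps between teacher and student: sum squared activations across channels, L2-normalize the resulting map, and penalize the difference. A rough dependency-free sketch with illustrative names (the repository itself works on PyTorch tensors):

```python
import math

def attention_map(activations):
    """activations: list of channels, each a flat list of spatial values.
    Returns the channel-wise sum of squares, L2-normalized."""
    n = len(activations[0])
    amap = [sum(ch[i] ** 2 for ch in activations) for i in range(n)]
    norm = math.sqrt(sum(a * a for a in amap)) or 1.0  # guard against all-zero maps
    return [a / norm for a in amap]

def at_loss(student_acts, teacher_acts):
    """Squared L2 distance between normalized attention maps."""
    s = attention_map(student_acts)
    t = attention_map(teacher_acts)
    return sum((si - ti) ** 2 for si, ti in zip(s, t))
```

Because the maps are normalized, uniformly rescaling a network's activations leaves the loss unchanged; only the spatial pattern of attention is transferred.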
lxztju

A complete PyTorch image-classification codebase: training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction, classification with SVM or random forests, and model distillation.

Jupyter Notebook 1.41k
2 years ago
huawei-noah

Jupyter Notebook 1.26k
5 months ago
Tebmer

#Large Language Models# This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...

986
1 month ago
alibaba

EasyTransfer is designed to make the development of transfer learning for NLP applications easier.

Python 862
3 years ago