PaddleClas is an image recognition toolkit built for both industry and academia
#Computer Science#Awesome Knowledge Distillation
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
#Computer Science#Collection of AWESOME vision-language models for vision tasks
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
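To make "low-bit quantization" concrete, here is a minimal stock-PyTorch sketch of post-training dynamic INT8 quantization; it is not this toolkit's own API, and the toy model is hypothetical:

```python
import torch
import torch.nn as nn

# Hypothetical toy model; any network containing nn.Linear layers works.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).eval()

# Dynamic quantization: weights are stored as INT8, and activations are
# quantized on the fly at inference time.
qmodel = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

print(qmodel(torch.randn(1, 128)).shape)  # torch.Size([1, 10])
```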
#NLP#EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments
This is a collection of our NAS and Vision Transformer work, including [NeurIPS'20] "Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search".
Pytorch implementation of various Knowledge Distillation (KD) methods.
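For orientation, nearly all of these methods extend the classic soft-target loss of Hinton et al. (2015). A minimal sketch (the temperature T and weight alpha are illustrative defaults, not values from any particular repo):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 so gradients stay comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```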
OpenMMLab Model Compression Toolbox and Benchmark.
#LLM#A curated list of Efficient Large Language Models
#NLP#A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far.
#NLP#NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
#Computer Science#Improving Convolutional Networks via Attention Transfer (ICLR 2017)
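The core idea of attention transfer, sketched below under the assumption of paired student/teacher feature maps of shape (N, C, H, W); function names are illustrative:

```python
import torch.nn.functional as F

def attention_map(feat):
    # Collapse (N, C, H, W) to (N, H*W) by averaging squared activations
    # over channels, then L2-normalize so the maps are scale-invariant.
    return F.normalize(feat.pow(2).mean(dim=1).flatten(1), dim=1)

def at_loss(student_feat, teacher_feat):
    # Penalize the distance between normalized spatial attention maps.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```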
A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forests, and model distillation.
Efficient computing methods developed by Huawei Noah's Ark Lab
#Computer Science#Collection of recent methods on (deep) neural network compression and acceleration.
#LLM#This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.