Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
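As a rough illustration of what low-bit quantization means, the snippet below does a symmetric per-tensor INT8 round trip in NumPy. This is a minimal sketch, not the toolkit's actual API; the helper names are hypothetical.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    # Symmetric per-tensor quantization: map floats onto the INT8 range [-127, 127].
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover an approximation of the original float tensor.
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print("max round-trip error:", np.abs(w - dequantize_int8(q, s)).max())
```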
#Natural Language Processing# EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
This is a collection of our NAS and Vision Transformer work ([NeurIPS'20] Cream of the Crop: Distilling Prioritized Paths for One-Shot Neural Architecture Search).
OpenMMLab Model Compression Toolbox and Benchmark.
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
#Learning & Skill Development# Knowledge is a tool for saving, searching, accessing, exploring and chatting with all of your favorite websites, documents and files.
Efficient computing methods developed by Huawei Noah's Ark Lab
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Segmind Distilled Diffusion