Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
#Computer Science# Awesome Knowledge Distillation
#Computer Science# Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
PyTorch implementation of various Knowledge Distillation (KD) methods.
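For context, the classic logit-distillation objective (Hinton et al., 2015) that repos like this implement can be sketched in a few lines. This is a minimal, generic illustration, not any particular repo's API; the function name and the `T`/`alpha` defaults are illustrative choices:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft-target KL divergence with the usual hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term's magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```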
#Computer Science# PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
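Of the methods listed, EWC is the simplest to sketch: a quadratic penalty that anchors parameters to their values after the previous task, weighted by an estimate of the diagonal Fisher information. A minimal sketch, assuming `fisher` and `old_params` are hypothetical dicts (keyed by parameter name) computed when the previous task finished:

```python
def ewc_penalty(model, fisher, old_params, lam=1000.0):
    """EWC regularizer: Fisher-weighted squared distance from old parameters."""
    loss = 0.0
    for name, p in model.named_parameters():
        loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * loss  # added to the current task's loss during training
```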
#Natural Language Processing# A PyTorch-based knowledge distillation toolkit for natural language processing
A collection of industry-classic and cutting-edge papers in the fields of recommendation, advertising, and search.
PaddleSlim is an open-source library for deep model compression and architecture search.
#Computer Science# The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
Pruning and distillation for mobilev2-yolov5s, with ncnn and TensorRT deployment support. Ultra-light, but with better performance!
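As a generic illustration of the magnitude-pruning step such pipelines rely on (unrelated to this repo's actual implementation), PyTorch's built-in `torch.nn.utils.prune` utilities can zero out low-magnitude convolution weights; the toy model below is a placeholder:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))

# Zero out the 30% of weights with the smallest L1 magnitude in each conv layer.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the mask into the weight tensor
```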
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
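The core idea behind MEAL V2 is distilling from an ensemble of teachers via their averaged soft labels. A minimal sketch of that soft-label averaging (the paper's discriminator component and exact weighting are omitted; function names here are illustrative):

```python
import torch
import torch.nn.functional as F

def ensemble_soft_targets(teacher_logits_list, T=1.0):
    """Average the softmax outputs of several teachers into one soft target."""
    probs = [F.softmax(t / T, dim=1) for t in teacher_logits_list]
    return torch.stack(probs).mean(dim=0)

def ensemble_kd_loss(student_logits, soft_targets):
    """KL divergence between the student and the averaged teacher distribution."""
    return F.kl_div(F.log_softmax(student_logits, dim=1),
                    soft_targets, reduction="batchmean")
```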
#Large Language Model# Prompt engineering for developers
Segmind Distilled Diffusion
⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation (AAAI 2025 Oral)
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
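Benchmarks of this kind typically include FGSM as the one-step baseline attack. A minimal, library-independent sketch (the `eps=8/255` budget and inputs in `[0, 1]` are common conventions, assumed here):

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=8 / 255):
    """One-step FGSM: perturb inputs along the sign of the loss gradient."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad, = torch.autograd.grad(loss, x)  # gradient w.r.t. the input only
    return (x + eps * grad.sign()).clamp(0, 1).detach()
```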
#Large Language Model# irresponsible innovation. Try now at https://chat.dev/
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
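For reference, the uniform affine ("fake") quantization underlying most low-precision schemes rounds floats onto an integer grid and maps them back. A minimal sketch of the idea, not this library's API:

```python
import torch

def fake_quantize(x, num_bits=8):
    """Uniform affine quantization: map floats to ints, then dequantize."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = ((x.max() - x.min()) / float(qmax - qmin)).clamp_min(1e-8)
    zero_point = qmin - torch.round(x.min() / scale)
    q = torch.clamp(torch.round(x / scale + zero_point), qmin, qmax)
    return (q - zero_point) * scale  # values the quantized model would see
```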
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.