Implementation of model compression with the knowledge distillation method.
#Computer Science#Papers on deep neural network compression and acceleration
OpenMMLab Model Compression Toolbox and Benchmark.
YOLO model compression and multi-dataset training
#Awesome#Awesome machine learning model compression research papers, quantization, tools, and learning material.
A list of awesome papers on deep model compression and acceleration
#Face Recognition#Deep Face Model Compression
A toolkit for automated analysis and modification of PyTorch model structures, including a model compression algorithm library that analyzes model structure automatically
Tool to compress trained Caffe weights
PyTorch Model Compression
Papers about model compression
Model compression: 1) pruning (BN pruning), 2) knowledge distillation (Hinton), 3) quantization (MNN), 4) deployment (MNN); a minimal distillation-loss sketch is given after this list
📚 Collection of token-level model compression resources.
#Computer Science#A model compression and acceleration toolbox based on PyTorch.
Model Compression Toolbox for Large Language Models and Diffusion Models
Geometry-based point cloud compression (G-PCC) test model
Model compression demo (pruning, quantization, knowledge distillation)
#Computer Science#An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications.
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression
Video-codec-based point cloud compression (V-PCC) test model
#Large Language Model#Model compression toolkit engineered for enhanced usability, comprehensiveness, and efficiency.
PaddleSlim is an open-source library for deep model compression and architecture search.
Code release for "Adversarial Robustness vs Model Compression, or Both?"
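Several of the entries above (the knowledge distillation implementations, the Hinton-style demos, and Patient Knowledge Distillation for BERT) build on the same soft-target distillation loss. The sketch below is a minimal PyTorch illustration of that loss, not code from any listed repository; the function name, temperature `T`, and weighting `alpha` are assumed defaults chosen for the example.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style knowledge distillation loss (names and defaults are illustrative)."""
    # Soft-target term: KL divergence between temperature-softened distributions,
    # scaled by T^2 so its gradients stay comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage sketch: logits from a frozen teacher and a trainable student.
teacher_logits = torch.randn(8, 10)
student_logits = torch.randn(8, 10, requires_grad=True)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```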