Distilled variant of Whisper for speech recognition. 6x faster, 50% smaller, within 1% word error rate.
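A minimal usage sketch with the Hugging Face transformers pipeline, assuming the distilled checkpoint is published on the Hub under an id such as distil-whisper/distil-large-v2 (the model id and audio filename are placeholders, not taken from the repository itself):

```python
# Sketch: transcribe an audio file with a distilled Whisper checkpoint.
# "distil-whisper/distil-large-v2" and "sample.wav" are assumed placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="distil-whisper/distil-large-v2",  # assumed Hub id
    chunk_length_s=30,                       # chunk long-form audio
)

result = asr("sample.wav")
print(result["text"])
```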
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Simplify deployments in Elixir with OTP releases!
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
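The full CRD objective uses a memory buffer and an NCE estimator; as a rough illustration of contrastive distillation only, a simplified in-batch InfoNCE variant between dimension-matched student and teacher features could look like this sketch (function name and shapes are assumptions, not the paper's exact loss):

```python
import torch
import torch.nn.functional as F

def contrastive_distill_loss(f_s, f_t, temperature=0.1):
    """Simplified in-batch contrastive distillation loss (not exact CRD).

    f_s: student features, shape (B, d)
    f_t: teacher features, shape (B, d), assumed already projected to d
    Positive pair: (student_i, teacher_i); negatives: teacher_j for j != i.
    """
    f_s = F.normalize(f_s, dim=1)
    f_t = F.normalize(f_t, dim=1)
    logits = f_s @ f_t.t() / temperature               # (B, B) similarities
    labels = torch.arange(f_s.size(0), device=f_s.device)
    return F.cross_entropy(logits, labels)
```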
Awesome Knowledge Distillation
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
#web crawler# Custom Selenium Chromedriver | Zero-Config | Passes ALL bot mitigation systems (like Distil / Imperva / DataDome / Cloudflare IUAM)
Distilabel is a framework for synthetic data and AI feedback for engineers who need fast, reliable and scalable pipelines based on verified research papers.
Pytorch implementation of various Knowledge Distillation (KD) methods.
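The methods implemented differ per repository, but most build on the classic logit-matching objective of Hinton et al. (2015); a minimal PyTorch sketch, with the temperature T and mixing weight alpha as illustrative hyper-parameters:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic logit-matching knowledge distillation loss.

    Soft term: KL divergence between temperature-softened teacher and
    student distributions, scaled by T^2 to keep gradients comparable.
    Hard term: ordinary cross-entropy against the ground-truth labels.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```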
Scripts to train a bidirectional LSTM with knowledge distillation from BERT
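One common recipe for this kind of distillation (e.g., Tang et al., 2019) precomputes the fine-tuned BERT teacher's logits and trains the BiLSTM student to regress them; the sketch below follows that idea, with all module and variable names hypothetical rather than taken from the scripts themselves:

```python
import torch
import torch.nn as nn

class BiLSTMStudent(nn.Module):
    """Hypothetical bidirectional-LSTM student for sequence classification."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=256, num_labels=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        out, _ = self.lstm(x)            # (B, L, 2*hidden)
        pooled, _ = out.max(dim=1)       # max-pool over the time dimension
        return self.classifier(pooled)

def distill_step(student, token_ids, teacher_logits, optimizer):
    """One training step: match precomputed BERT teacher logits with MSE."""
    student_logits = student(token_ids)
    loss = nn.functional.mse_loss(student_logits, teacher_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```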
The paper "PSSM-Distil: Protein Secondary Structure Prediction (PSSP) on Low-Quality PSSM by Knowledge Distillation with Contrastive Learning" (IEEE conference paper).