#Computer Science# Awesome Knowledge Distillation
A school management software
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
#Computer Science# DALI: a large Dataset of synchronised Audio, LyrIcs and vocal notes.
Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals. WSDM 2021.
[ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang
An online student-teacher portal where teachers can upload assignments for their subjects and students can download them.
College-Based Data Management System
PyTorch implementation of "Distilling the Knowledge in a Neural Network" (see the KD loss sketch after this list)
Deep Neural Network Compression based on Student-Teacher Network
PyTorch implementation of a graph convolutional network (Kipf et al., 2017) trained with the vanilla teacher-student knowledge distillation setup (Hinton et al., 2015).
Student-Teacher interactive platform
Semi-supervised teacher-student framework (see the pseudo-labeling sketch after this list)
Teaching materials for Procedural Programming Lab
Code for our JSTARS paper "Semi-MCNN: A semisupervised multi-CNN ensemble learning method for urban land cover classification using submeter HRRS images"
REST API in Django using Django REST Framework.
Mobile-first education software for teachers.
[ICLR 2022 workshop PAIR^2Struct] Sparse Logits Suffice to Fail Knowledge Distillation
#Computer Science# The main objective of this repository is to become familiar with the task of domain adaptation applied to real-time semantic segmentation networks.
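Several entries above (for example, the implementation of "Distilling the Knowledge in a Neural Network" and the GCN teacher-student repository) revolve around the vanilla KD loss of Hinton et al. (2015). The following is a minimal sketch of that loss in PyTorch, not code taken from any listed repository; the temperature `T` and mixing weight `alpha` are assumed values chosen only for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Vanilla KD loss (Hinton et al., 2015): soft-target KL term + hard-label CE term.

    T and alpha are illustrative hyperparameters, not values from any listed repo.
    """
    # Soften both distributions with temperature T; scale by T^2 so the gradient
    # magnitude stays comparable to the plain cross-entropy term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Usage with random tensors; in practice the teacher logits come from a frozen,
# pretrained teacher evaluated under torch.no_grad().
if __name__ == "__main__":
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(loss.item())
```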
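For the semi-supervised teacher-student entries, a common pattern is to let a frozen teacher predict on unlabeled data, keep only confident predictions as pseudo-labels, and train the student on labeled and pseudo-labeled samples together. The sketch below illustrates that generic pattern under assumed names and an assumed 0.95 confidence threshold; it is not the method of any specific repository above.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def make_pseudo_labels(teacher, unlabeled_images, threshold=0.95):
    """Generate hard pseudo-labels from the teacher, keeping only confident
    predictions. The 0.95 threshold is an illustrative choice."""
    teacher.eval()
    probs = F.softmax(teacher(unlabeled_images), dim=1)
    confidence, pseudo_labels = probs.max(dim=1)
    mask = confidence >= threshold  # keep only high-confidence samples
    return unlabeled_images[mask], pseudo_labels[mask]

def student_step(student, optimizer, labeled_batch, unlabeled_images, teacher):
    """One training step: supervised cross-entropy on labeled data plus
    cross-entropy on the teacher's confident pseudo-labels."""
    images, labels = labeled_batch
    sup_loss = F.cross_entropy(student(images), labels)

    pseudo_images, pseudo_labels = make_pseudo_labels(teacher, unlabeled_images)
    unsup_loss = (
        F.cross_entropy(student(pseudo_images), pseudo_labels)
        if len(pseudo_images) > 0
        else torch.zeros((), device=images.device)
    )

    loss = sup_loss + unsup_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```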