Knowledge Distillation
General • 11 methods
Methods
| Method | Paper | Year | Papers |
|---|---|---|---|
| Knowledge Distillation | Distilling the Knowledge in a Neural Network | 2015 | 2173 |
| Ontology | Ontology-Based Production Simulation with OntologySim | 2022 | 366 |
| SFT | Pre-trained Summarization Distillation | 2020 | 75 |
| STD | Spatial-Channel Token Distillation for Vision MLPs | 2022 | 16 |
| Hydra | Hydra: Preserving Ensemble Diversity for Model Distillation | 2020 | 15 |
| Collaborative Distillation | Collaborative Distillation for Ultra-Resolution Universal Style Transfer | 2020 | 7 |
| Teacher-Tutor-Student Knowledge Distillation | Parser-Free Virtual Try-on via Distilling Appearance Flows | 2021 | 4 |
| OMGD | Online Multi-Granularity Distillation for GAN Compression | 2021 | 2 |
| LFME | Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification | 2020 | 1 |
| SSKD | Semi-Supervised Domain Generalizable Person Re-Identification | 2021 | 1 |
| b2b transfer learning | Transfer Learning in Deep Learning Models for Building Load Forecasting: Case of Limited Data | 2023 | 1 |