Structured Knowledge Distillation for Dense Prediction

CVPR 2019. Yifan Liu, Changyong Shu, Jingdong Wang, Chunhua Shen

In this paper, we consider transferring the structure information from large networks to small ones for dense prediction tasks. Previous knowledge distillation strategies for dense prediction tasks often directly borrow the distillation scheme used for image classification and perform knowledge distillation on each pixel separately, leading to sub-optimal performance.
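To make the baseline concrete, the pixel-wise scheme the paper criticizes can be sketched as a per-pixel KL divergence between teacher and student class distributions, averaged over all spatial positions. The snippet below is a minimal NumPy illustration of that naive scheme (function names and shapes are our own, not from the paper); it treats every pixel as an independent classification problem, which is exactly the structural information the paper's method aims to recover.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pixelwise_kd_loss(student_logits, teacher_logits, T=1.0):
    """Per-pixel KL(teacher || student), averaged over all pixels.

    Both inputs have shape (H, W, C): one C-way class score per pixel.
    Each pixel is distilled independently, so no cross-pixel structure
    is transferred -- the scheme the paper argues is sub-optimal.
    """
    p_t = softmax(teacher_logits / T)
    p_s = softmax(student_logits / T)
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1)
    return kl.mean()

rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 4, 3))
student = rng.normal(size=(4, 4, 3))
print(pixelwise_kd_loss(teacher, teacher))  # identical logits -> 0.0
print(pixelwise_kd_loss(student, teacher))  # positive for mismatched logits
```

In contrast, the paper's structured distillation adds terms that depend on relations among pixels (e.g. pairwise similarities), rather than only this per-pixel term.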
