Structured Knowledge Distillation for Semantic Segmentation

CVPR 2019 · Yifan Liu, Ke Chen, Chris Liu, Zengchang Qin, Zhenbo Luo, Jingdong Wang

In this paper, we investigate the knowledge distillation strategy for training small semantic segmentation networks by making use of large networks. We start from the straightforward scheme, pixel-wise distillation, which applies the distillation scheme adopted for image classification and performs knowledge distillation for each pixel separately...
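The pixel-wise scheme described above treats each pixel as an independent classification problem: the student's per-pixel class distribution is pushed toward the teacher's. A minimal NumPy sketch of such a per-pixel KL-divergence loss follows; the function names and the temperature parameter `T` are illustrative assumptions, not details from the paper:

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pixel_wise_distillation_loss(student_logits, teacher_logits, T=1.0):
    """KL(teacher || student), averaged over all pixels.

    Both logit arrays have shape (H, W, C): per-pixel class scores
    from the student and teacher segmentation networks. T is a
    softmax temperature (an assumed knob, as in classification KD).
    """
    p_t = softmax(teacher_logits / T, axis=-1)
    log_p_s = np.log(softmax(student_logits / T, axis=-1) + 1e-12)
    log_p_t = np.log(p_t + 1e-12)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=-1)  # per-pixel KL
    return kl.mean()
```

Because each pixel's loss is computed independently, this baseline ignores the spatial correlations between pixels, which is what motivates the paper's structured (pair-wise and holistic) distillation terms.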
