Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation

29 Sep 2021  ·  Linfeng Zhang, Kaisheng Ma ·

Recent progress in image-to-image translation has witnessed the success of generative adversarial networks (GANs). However, GANs usually contain a huge number of parameters, which leads to intolerable memory and computation consumption and limits their deployment on edge devices. To address this issue, knowledge distillation has been proposed to transfer the knowledge learned by a cumbersome teacher model to an efficient student model. However, previous knowledge distillation methods directly train the student to mimic the teacher in all spatial regions of the images, ignoring the fact that in image-to-image translation many regions (e.g., background regions) should not be translated, and teacher features in these regions are not worth distilling. To tackle this challenge, in this paper we propose Region-aware Knowledge Distillation, which first localizes the crucial regions in the images with an attention mechanism. Teacher features in these crucial regions are then distilled to the student with a region-wise contrastive learning framework. Besides distilling teacher knowledge in features, we further introduce perceptual distillation to distill teacher knowledge in the generated images. Experiments against four comparison methods demonstrate the substantial effectiveness of our method on both paired and unpaired image-to-image translation. For instance, our 7.08X compressed and 6.80X accelerated CycleGAN student outperforms its teacher by 1.36 and 1.16 FID on Horse-to-Zebra and Zebra-to-Horse, respectively. Code has been released in the supplementary material and will be released on GitHub soon.
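To make the core idea concrete, below is a minimal NumPy sketch of region-aware feature distillation. It is an illustration of the general technique described in the abstract, not the authors' implementation: the attention map (channel-wise mean absolute activation of the teacher feature), the `keep_ratio` threshold, and the masked mean-squared-error loss are all simplifying assumptions, and the paper's actual method uses a region-wise contrastive objective rather than MSE.

```python
import numpy as np

def attention_mask(feat, keep_ratio=0.5):
    """Binary mask of 'crucial' spatial positions from a teacher feature map.

    feat: array of shape (C, H, W). The channel-wise mean absolute
    activation serves as a simple spatial attention map; the top
    `keep_ratio` fraction of positions is kept for distillation.
    (Both the attention definition and the thresholding rule are
    assumptions made for this sketch.)
    """
    attn = np.abs(feat).mean(axis=0)              # (H, W) attention map
    k = max(1, int(attn.size * keep_ratio))
    thresh = np.sort(attn, axis=None)[-k]         # value of the k-th largest
    return (attn >= thresh).astype(feat.dtype)    # (H, W) binary mask

def region_aware_distill_loss(student_feat, teacher_feat, keep_ratio=0.5):
    """Mean squared feature error, restricted to the crucial regions.

    Positions outside the mask (e.g., background) contribute nothing,
    so the student is not forced to imitate the teacher there.
    """
    mask = attention_mask(teacher_feat, keep_ratio)
    diff = (student_feat - teacher_feat) ** 2     # (C, H, W)
    masked = diff * mask                          # mask broadcasts over channels
    return masked.sum() / (mask.sum() * teacher_feat.shape[0] + 1e-8)
```

In a real pipeline the mask would be computed per training batch from the teacher's intermediate activations, and the masked loss would be added to the GAN objective; swapping the MSE term for a contrastive loss over the masked regions recovers the spirit of the paper's region-wise contrastive framework.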
