Collaborative Distillation

Introduced by Wang et al. in Collaborative Distillation for Ultra-Resolution Universal Style Transfer

Collaborative Distillation is a knowledge distillation method for encoder-decoder based neural style transfer that reduces the number of convolutional filters. The main idea is underpinned by the finding that the encoder and decoder form an exclusive collaborative relationship, which is regarded as a new kind of knowledge for style transfer models: a student encoder is useful only insofar as the teacher's decoder can still work from its features.

Source: Collaborative Distillation for Ultra-Resolution Universal Style Transfer
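The following PyTorch sketch illustrates the idea under simplifying assumptions: a pretrained teacher encoder-decoder pair (e.g. a VGG-based autoencoder, as used in universal style transfer) is kept frozen, and a thin student encoder is trained both to mimic the teacher's features and to produce features from which the teacher's own decoder can reconstruct the input. The module names (`SmallEncoder`, `collaboration_loss`), layer sizes, and loss weights are illustrative stand-ins, not the authors' exact architecture or objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Student encoder with far fewer convolutional filters than the teacher.
    The 1x1 `embed` layer maps the thin student features back up to the
    teacher's channel count so the frozen teacher decoder can consume them."""

    def __init__(self, in_ch=3, student_ch=16, teacher_ch=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, student_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(student_ch, student_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.embed = nn.Conv2d(student_ch, teacher_ch, 1)

    def forward(self, x):
        return self.embed(self.features(x))


def collaboration_loss(student, teacher_enc, teacher_dec, images,
                       lambda_feat=1.0, lambda_rec=1.0):
    """Train the student so the *teacher's* decoder can still reconstruct
    the input from the student's embedded features -- the exclusive
    encoder-decoder collaboration treated here as the distilled knowledge.
    Assumes the teacher's parameters have requires_grad=False, and that
    teacher and student feature maps share spatial dimensions."""
    with torch.no_grad():                                # teacher encoder is frozen
        target_feat = teacher_enc(images)
    student_feat = student(images)
    feat_loss = F.mse_loss(student_feat, target_feat)    # mimic teacher features
    recon = teacher_dec(student_feat)                    # decode via teacher decoder
    rec_loss = F.mse_loss(recon, images)                 # reconstruct the input
    return lambda_feat * feat_loss + lambda_rec * rec_loss
```

Only the student's parameters receive gradient updates here; the reconstruction term is what enforces the collaborative relationship, since the student must stay compatible with a decoder it never alters.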


Categories

Knowledge Distillation