Knowledge Distillation

Collaborative Distillation

Introduced by Wang et al. in Collaborative Distillation for Ultra-Resolution Universal Style Transfer

Collaborative Distillation is a knowledge distillation method for encoder-decoder based neural style transfer that reduces the number of convolutional filters. The main idea is underpinned by the finding that an encoder-decoder pair constructs an exclusive collaborative relationship, which is regarded as a new kind of knowledge for style transfer models.

Source: Collaborative Distillation for Ultra-Resolution Universal Style Transfer
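A minimal linear sketch of the idea described above: a slimmer student encoder is trained so that the frozen teacher decoder can still reconstruct the input from the student's features, exploiting the teacher pair's exclusive collaboration. All shapes, variable names, and the purely linear setup are illustrative assumptions for this toy, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear stand-in for an encoder-decoder pair; dimensions are illustrative.
d, k, m, n = 16, 12, 4, 256        # input dim, teacher filters, student filters, samples
X = rng.standard_normal((d, n))

# "Teacher": an orthonormal encoder with its transpose as decoder, so the pair
# reconstructs inputs up to a projection -- their exclusive collaboration.
Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
W_e, W_d = Q.T, Q                  # encoder (k x d), decoder (d x k)

def recon_loss(M):
    """Mean squared reconstruction error through the frozen teacher decoder."""
    R = W_d @ (M @ X) - X
    return float(np.mean(R ** 2))

teacher_loss = recon_loss(W_e)

# Student: a slimmer encoder V (fewer filters) plus a linear bridge B into the
# teacher's feature space; both are trained while the teacher decoder is frozen.
V = 0.3 * rng.standard_normal((m, d))
B = 0.3 * rng.standard_normal((k, m))

lr, steps = 0.5, 1000
initial = recon_loss(B @ V)
for _ in range(steps):
    R = W_d @ B @ V @ X - X                    # reconstruction residual (d x n)
    G = (2.0 / (d * n)) * (W_d.T @ R @ X.T)    # gradient w.r.t. the product B @ V
    B, V = B - lr * (G @ V.T), V - lr * (B.T @ G)

final = recon_loss(B @ V)
print(f"teacher {teacher_loss:.3f}  student before {initial:.3f}  after {final:.3f}")
```

The student's error stays above the teacher's (it has fewer filters, hence lower rank), but training through the frozen teacher decoder drives it well below its starting point; the paper applies the same principle to deep convolutional encoder-decoder pairs.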


Tasks


Task                    Papers  Share
Knowledge Distillation  2       33.33%
Meta-Learning           1       16.67%
Continual Learning      1       16.67%
Facial Makeup Transfer  1       16.67%
Style Transfer          1       16.67%

