Collaborative Distillation is a knowledge distillation method for encoder-decoder based neural style transfer that reduces the number of convolutional filters. The main idea is underpinned by the finding that encoder-decoder pairs form an exclusive collaborative relationship, which is regarded as a new kind of knowledge for style transfer models.
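The sketch below illustrates this idea in PyTorch under simplifying assumptions: small stand-in convolutional networks replace the paper's VGG-based encoder-decoder, and the names (`student_enc`, `embed`, `train_step`) and the 4x filter reduction are hypothetical. It only shows the collaborative constraint, i.e. training a slimmer student encoder (plus a 1x1 embedding) whose features the frozen teacher decoder must still be able to reconstruct from, not the authors' exact training pipeline.

```python
# Minimal sketch of the collaborative-distillation idea with stand-in networks.
import torch
import torch.nn as nn

def conv_block(cin, cout):
    return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True))

# Teacher encoder-decoder pair (pretrained in practice; randomly initialised here).
teacher_enc = nn.Sequential(conv_block(3, 64), conv_block(64, 128))
teacher_dec = nn.Sequential(conv_block(128, 64), conv_block(64, 3))

# Student encoder with fewer filters (hypothetical 4x reduction), plus a 1x1
# embedding that lifts student features back to the teacher's channel width so
# the frozen teacher decoder can still work with them (the "collaborative"
# relationship treated as knowledge).
student_enc = nn.Sequential(conv_block(3, 16), conv_block(16, 32))
embed = nn.Conv2d(32, 128, 1)

# Freeze the teacher pair; only the student encoder and embedding are trained.
for p in teacher_enc.parameters():
    p.requires_grad_(False)
for p in teacher_dec.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(list(student_enc.parameters()) + list(embed.parameters()), lr=1e-4)
mse = nn.MSELoss()

def train_step(x):
    """One distillation step on a batch of content images x of shape (N, 3, H, W)."""
    with torch.no_grad():
        t_feat = teacher_enc(x)                 # teacher features as a mimicking target
    s_feat = embed(student_enc(x))              # student features lifted to teacher width
    recon = teacher_dec(s_feat)                 # frozen teacher decoder must reconstruct
    loss = mse(recon, x) + mse(s_feat, t_feat)  # reconstruction + feature-mimic losses
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random tensors standing in for content images.
loss = train_step(torch.rand(2, 3, 64, 64))
```

Because the teacher decoder stays frozen, the student encoder is forced to produce features that remain compatible with its original collaborator, which is the constraint this method exploits for compression.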
Source: *Collaborative Distillation for Ultra-Resolution Universal Style Transfer*
| Task | Papers | Share |
|---|---|---|
| Decoder | 2 | 18.18% |
| Federated Learning | 1 | 9.09% |
| Image Classification | 1 | 9.09% |
| Emotion Recognition | 1 | 9.09% |
| Multimodal Sentiment Analysis | 1 | 9.09% |
| Combinatorial Optimization | 1 | 9.09% |
| Meta-Learning | 1 | 9.09% |
| Continual Learning | 1 | 9.09% |
| Facial Makeup Transfer | 1 | 9.09% |