Teacher-Tutor-Student Knowledge Distillation

Introduced by Ge et al. in Parser-Free Virtual Try-on via Distilling Appearance Flows

Teacher-Tutor-Student Knowledge Distillation is a method for image-based virtual try-on. It treats the fake images produced by a parser-based method as "tutor knowledge," whose artifacts are corrected by real "teacher knowledge" extracted from real person images in a self-supervised way. Rather than using real images only as direct supervision, knowledge distillation is formulated in the try-on problem as distilling the appearance flows between the person image and the garment image, which yields dense correspondences between them and produces high-quality results.
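The idea above can be sketched as a simplified training objective: the parser-based tutor's appearance flows supervise the parser-free student's flows, while the real person photo (teacher knowledge) supervises the student's final try-on image. This is a minimal NumPy sketch under assumed simplifications (grayscale images, nearest-neighbor warping, plain L2 losses, a hypothetical `lam` weighting); the actual method uses learned flow networks and richer losses.

```python
import numpy as np

def warp(garment, flow):
    """Warp a garment image with a dense appearance flow.

    garment: (H, W) grayscale image; flow: (H, W, 2) per-pixel
    (dy, dx) sampling offsets. Nearest-neighbor sampling keeps the
    sketch short; the real method uses differentiable bilinear warping.
    """
    H, W = garment.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, W - 1)
    return garment[src_y, src_x]

def distillation_loss(student_flow, tutor_flow, student_tryon, real_image,
                      lam=1.0):
    """Combine the two supervision signals (lam is a hypothetical weight):
    - flow term: tutor (parser-based) flows guide the student's flows;
    - image term: the real photo corrects artifacts in the student output.
    """
    flow_term = np.mean((student_flow - tutor_flow) ** 2)
    image_term = np.mean((student_tryon - real_image) ** 2)
    return image_term + lam * flow_term
```

With a zero flow, `warp` returns the garment unchanged, and the loss vanishes exactly when the student matches both the tutor's flows and the real image.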
