
Semi-Supervised Knowledge Distillation

Introduced by He et al. in Semi-Supervised Domain Generalizable Person Re-Identification

Semi-Supervised Knowledge Distillation (SSKD) is a knowledge distillation method for person re-identification that exploits weakly annotated data by assigning soft pseudo labels to the YouTube-Human dataset, improving the model's generalization ability. SSKD first trains a student model (e.g. ResNet-50) and a teacher model (e.g. ResNet-101) on labeled data from multiple source-domain datasets. It then adds an auxiliary classifier to the student that learns to imitate the soft predictions the teacher generates on the unlabeled data. Meanwhile, on labeled data, the student model is also supervised by both the hard labels and the soft labels predicted by the teacher.

Source: Semi-Supervised Domain Generalizable Person Re-Identification
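The two supervision signals described above can be sketched as loss terms in PyTorch. This is a minimal illustration, not the authors' implementation: the temperature `T`, the weight `alpha`, and the function names are assumptions, and the teacher's logits are assumed to be precomputed with gradients detached.

```python
import torch
import torch.nn.functional as F

def soft_label_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened student and teacher outputs."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # Scaling by T^2 keeps gradient magnitudes comparable to the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

def labeled_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Labeled data: hard identity labels plus the teacher's soft labels."""
    hard = F.cross_entropy(student_logits, labels)
    soft = soft_label_loss(student_logits, teacher_logits, T)
    return (1.0 - alpha) * hard + alpha * soft

def unlabeled_loss(aux_logits, teacher_logits, T=4.0):
    """Unlabeled data: the auxiliary classifier imitates the teacher's
    soft pseudo labels."""
    return soft_label_loss(aux_logits, teacher_logits, T)

# Dummy usage: a batch of 8 images and an illustrative 751 identity classes.
student_logits = torch.randn(8, 751)
teacher_logits = torch.randn(8, 751)  # would come from the frozen teacher
labels = torch.randint(0, 751, (8,))
loss = labeled_loss(student_logits, teacher_logits, labels)
```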
