Semi-Supervised Knowledge Distillation (SSKD) is a knowledge distillation method for person re-identification that exploits weakly annotated data by assigning soft pseudo labels to the YouTube-Human data to improve a model's generalization ability. SSKD first trains a student model (e.g. ResNet-50) and a teacher model (e.g. ResNet-101) on labeled data from multi-source domain datasets. It then trains an auxiliary classifier to imitate the teacher model's soft predictions on unlabeled data. Meanwhile, on labeled data, the student model is also supervised by both the hard labels and the soft labels predicted by the teacher model.
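The two supervision signals above can be sketched as loss terms: a labeled-data loss that mixes cross-entropy on hard labels with distillation toward the teacher's soft labels, and an unlabeled-data loss in which the auxiliary classifier imitates the teacher's soft pseudo labels. This is a minimal PyTorch sketch, not the authors' implementation; the temperature `T`, the weight `alpha`, and the function names are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard soft-label distillation: soften both distributions with
    # temperature T, then match them with KL divergence (scaled by T^2).
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def student_labeled_loss(student_logits, teacher_logits, labels, alpha=0.5, T=4.0):
    # Labeled data: hard-label cross-entropy plus distillation toward
    # the teacher's soft labels, mixed by the (assumed) weight alpha.
    ce = F.cross_entropy(student_logits, labels)
    return (1 - alpha) * ce + alpha * kd_loss(student_logits, teacher_logits, T)

def auxiliary_unlabeled_loss(aux_logits, teacher_logits, T=4.0):
    # Unlabeled data: the auxiliary classifier imitates the teacher's
    # soft pseudo labels; no hard-label term is available here.
    return kd_loss(aux_logits, teacher_logits, T)
```

In practice the teacher's logits would be computed under `torch.no_grad()` so that only the student and auxiliary classifier receive gradients.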
Source: Semi-Supervised Domain Generalizable Person Re-Identification
| Task | Papers | Share |
|---|---|---|
| Generalizable Person Re-identification | 1 | 50.00% |
| Person Re-Identification | 1 | 50.00% |
| Component | Type |
|---|---|
| Auxiliary Classifier | Miscellaneous Components |
| ResNet | Convolutional Neural Networks |