First, we present a Domain-Specific Contrastive Learning (DSCL) mechanism to fully exploit intra-domain information by contrasting samples only from the same domain.
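The same-domain restriction above can be sketched as a supervised contrastive loss whose candidate set is masked to the anchor's own domain. This is a minimal illustrative reconstruction, not the paper's exact formulation; the function name, temperature value, and loss form are assumptions.

```python
import numpy as np

def domain_specific_contrastive_loss(features, labels, domains, temperature=0.1):
    """Hypothetical sketch of DSCL: each anchor is contrasted only against
    samples from its own domain, so cross-domain pairs never enter the loss."""
    # L2-normalize embeddings, then cosine similarities scaled by temperature
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature
    n = len(labels)
    eye = np.eye(n, dtype=bool)
    same_domain = (domains[:, None] == domains[None, :]) & ~eye
    same_class = (labels[:, None] == labels[None, :]) & same_domain

    # exclude cross-domain candidates (and self) from the softmax
    sim = np.where(same_domain, sim, -np.inf)
    row_max = np.max(sim, axis=1, keepdims=True)
    log_prob = sim - row_max - np.log(np.exp(sim - row_max).sum(1, keepdims=True))

    # average negative log-likelihood over same-domain positives
    losses = []
    for i in range(n):
        pos = same_class[i]
        if pos.any():
            losses.append(-log_prob[i, pos].mean())
    return float(np.mean(losses))
```

Swapping `same_domain` for an all-ones mask recovers an ordinary supervised contrastive loss, which makes the intra-domain restriction easy to ablate.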
By combining two fundamental learning approaches in DML, i.e., classification training and pairwise training, we set up a strong baseline for ZS-SBIR.
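A baseline of this kind typically sums a classification loss (cross-entropy over identity logits) and a pairwise loss (here a triplet hinge). The sketch below is illustrative only; the weighting `lam`, the margin, and the choice of triplet loss as the pairwise term are assumptions.

```python
import numpy as np

def softmax_ce(logits, label):
    """Classification branch: cross-entropy for one sample (numerically stable)."""
    z = logits - logits.max()
    return float(np.log(np.exp(z).sum()) - z[label])

def triplet_loss(anchor, positive, negative, margin=0.3):
    """Pairwise branch: hinge on Euclidean distances between embeddings."""
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    return float(max(0.0, d_ap - d_an + margin))

def baseline_loss(logits, label, anchor, positive, negative, lam=1.0):
    """Joint objective: classification training + pairwise metric learning."""
    return softmax_ce(logits, label) + lam * triplet_loss(anchor, positive, negative)
```

The classification term shapes a discriminative decision boundary while the pairwise term directly structures the embedding distances used at retrieval time.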
Large-scale labeled training data is often difficult to collect, especially for person identities.
Specifically, to reconcile conflicts among multiple objectives, we simplify the standard tightly coupled pipeline and establish a deeply decoupled multi-task learning framework.
In the conventional person Re-ID setting, it is widely assumed that well-aligned cropped person images are available for each individual.
If a sample belongs to a tail class, the corresponding feature cloud will have a relatively large distribution range, compensating for its lack of diversity.
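One way to realize such a feature cloud is to sample features around a class's embedding with a spread that grows as the class count shrinks. This is an illustrative sketch under assumed names; the inverse-square-root scaling rule is my assumption, not necessarily the paper's exact formulation.

```python
import numpy as np

def feature_cloud_sample(mean_feat, class_count, max_count,
                         base_sigma=0.1, rng=None):
    """Sample one point from a Gaussian 'feature cloud' around mean_feat.
    Tail classes (small class_count) get a larger standard deviation,
    compensating for their lack of intra-class diversity.
    (Hypothetical scaling rule: sigma grows as class_count shrinks.)"""
    rng = rng if rng is not None else np.random.default_rng()
    sigma = base_sigma * np.sqrt(max_count / class_count)
    return mean_feat + rng.normal(scale=sigma, size=mean_feat.shape)
```

During training, augmented features drawn from the cloud stand in for the unseen real samples a tail class is missing, widening the region the classifier must separate.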
State-of-the-art methods train the detector separately from the re-ID model, so the detected bounding boxes may be sub-optimal for the downstream re-ID task.