Noisy-Correspondence Learning for Text-to-Image Person Re-identification

19 Aug 2023  ·  Yang Qin, Yingke Chen, Dezhong Peng, Xi Peng, Joey Tianyi Zhou, Peng Hu

Text-to-image person re-identification (TIReID) is a compelling topic in the cross-modal community, which aims to retrieve the target person based on a textual query. Although numerous TIReID methods have been proposed and achieved promising performance, they implicitly assume that the training image-text pairs are correctly aligned, which is not always the case in real-world scenarios. In practice, image-text pairs are inevitably under-correlated or even falsely correlated, a.k.a. noisy correspondence (NC), due to low image quality and annotation errors. To address this problem, we propose a novel Robust Dual Embedding method (RDE) that can learn robust visual-semantic associations even with NC. Specifically, RDE consists of two main components: 1) a Confident Consensus Division (CCD) module that leverages the dual-grained decisions of the dual embedding modules to obtain a consensus set of clean training data, enabling the model to learn correct and reliable visual-semantic associations; and 2) a Triplet Alignment Loss (TAL) that relaxes the conventional triplet ranking loss over the hardest negative samples to a log-exponential upper bound over all negatives, which prevents model collapse under NC while still emphasizing hard negatives for strong performance. We conduct extensive experiments on three public benchmarks, namely CUHK-PEDES, ICFG-PEDES, and RSTPReid, to evaluate the performance and robustness of our RDE. Our method achieves state-of-the-art results both with and without synthetic noisy correspondences on all three datasets. Code is available at https://github.com/QinYang79/RDE.
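The idea behind TAL, as described above, is that instead of penalizing only the hardest negative in each triplet, the max over negatives is relaxed to a temperature-scaled log-sum-exp, which upper-bounds the hardest-negative loss while spreading gradient across every negative pair. The following is a minimal PyTorch sketch of that idea only; the function name, the use of cosine similarity, and the margin/temperature values are illustrative assumptions rather than the authors' exact formulation (see the linked repository for the official implementation).

```python
import torch
import torch.nn.functional as F


def triplet_alignment_loss(image_feats, text_feats, margin=0.2, tau=0.02):
    """Sketch of a TAL-style loss: the hardest-negative term of the usual
    triplet ranking loss is replaced by a temperature-scaled log-sum-exp
    over all negatives, which upper-bounds the max and spreads gradient
    across every negative pair instead of only the single hardest one.
    The margin and temperature values here are illustrative defaults.
    """
    # Cosine similarities between all images and all texts in the batch.
    img = F.normalize(image_feats, dim=-1)
    txt = F.normalize(text_feats, dim=-1)
    sims = img @ txt.t()                           # (B, B); diagonal = matched pairs

    pos = sims.diag()                              # positive (matched) similarities
    eye = torch.eye(sims.size(0), dtype=torch.bool, device=sims.device)
    neg = sims.masked_fill(eye, float('-inf'))     # exclude positives from negatives

    # tau * logsumexp(s / tau) >= max(s): a smooth upper bound on the
    # hardest-negative similarity, aggregated over all negatives.
    i2t = tau * torch.logsumexp(neg / tau, dim=1)  # image -> text negatives
    t2i = tau * torch.logsumexp(neg / tau, dim=0)  # text  -> image negatives

    loss = F.relu(margin - pos + i2t) + F.relu(margin - pos + t2i)
    return loss.mean()
```

Because the log-sum-exp never collapses to a single pair, a few mismatched (noisy) pairs cannot dominate the gradient the way they can under a hardest-negative triplet loss, which is the robustness property the abstract attributes to TAL.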


Results from the Paper


Ranked #2 on Text based Person Retrieval on ICFG-PEDES (using extra training data)

| Task | Dataset | Model | Metric | Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Text based Person Retrieval | CUHK-PEDES | RDE | R@1 | 75.94 | #3 |
| | | | R@5 | 90.63 | #1 |
| | | | R@10 | 94.12 | #3 |
| | | | mAP | 67.56 | #4 |
| | | | mINP | 51.44 | #1 |
| Text based Person Retrieval | ICFG-PEDES | RDE | R@1 | 67.68 | #2 |
| | | | R@5 | 82.47 | #1 |
| | | | R@10 | 87.36 | #1 |
| | | | mAP | 40.06 | #3 |
| | | | mINP | 7.87 | #2 |
| Text based Person Retrieval | RSTPReid | RDE | R@1 | 65.35 | #3 |
| | | | R@5 | 83.95 | #4 |
| | | | R@10 | 89.90 | #3 |
| | | | mAP | 50.88 | #3 |
| | | | mINP | 28.08 | #1 |