I^2R-Net: Intra- and Inter-Human Relation Network for Multi-Person Pose Estimation

In this paper, we present the Intra- and Inter-Human Relation Network (I^2R-Net) for multi-person pose estimation. It consists of two basic modules. First, the Intra-Human Relation Module operates on a single person and aims to capture intra-human dependencies. Second, the Inter-Human Relation Module considers the relations between multiple instances and focuses on capturing inter-human interactions. The Inter-Human Relation Module can be made very lightweight by reducing the resolution of the feature maps, yet it still learns useful relation information that significantly boosts the performance of the Intra-Human Relation Module. Even without bells and whistles, our method competes with or outperforms current competition winners. We conduct extensive experiments on the COCO, CrowdPose, and OCHuman datasets. The results demonstrate that the proposed model surpasses all state-of-the-art methods. Concretely, the proposed method achieves 77.4% AP on the CrowdPose dataset and 67.8% AP on the OCHuman dataset, outperforming existing methods by a large margin. In addition, an ablation study and visualization analysis further prove the effectiveness of our model.
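The two-stage relation idea described above can be sketched with plain dot-product attention: self-attention within each person instance (intra-human), then attention across instances on a low-resolution summary of each person (inter-human). This is a minimal NumPy illustration, not the authors' implementation; the learned query/key/value projections, positional encodings, and the paper's actual pooling and fusion scheme are omitted, and all shapes here are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Plain scaled dot-product self-attention (projection weights
    # omitted for brevity): each token attends to all tokens in x.
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores, axis=-1) @ x

# Hypothetical shapes: 3 person instances, each represented by 16
# feature tokens of dimension 32 (the paper's token layout may differ).
n_person, n_tok, d = 3, 16, 32
feats = np.random.randn(n_person, n_tok, d)

# Intra-Human Relation: attention within each instance separately.
intra = np.stack([self_attention(p) for p in feats])

# Inter-Human Relation: reduce each instance to one low-resolution
# token (here, mean pooling) and let instances attend to each other.
pooled = intra.mean(axis=1)          # (n_person, d)
inter = self_attention(pooled)       # cross-instance relations
out = intra + inter[:, None, :]      # broadcast back to each instance

print(out.shape)  # (3, 16, 32)
```

The key efficiency point from the abstract is visible in the shapes: the inter-human step operates on only `n_person` tokens instead of `n_person * n_tok`, which is why that module can stay lightweight.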

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Pose Estimation | COCO | I²R-Net (1st stage: HRFormer-B) | AP | 77.3 | #3 |
| | | | AR | 82.1 | #1 |
| | | | AP50 | 91 | #2 |
| | | | AP75 | 83.6 | #2 |
| | | | APM | 73 | #3 |
| | | | APL | 84.5 | #2 |
| Multi-Person Pose Estimation | CrowdPose | I²R-Net (1st stage: HRFormer-B) | mAP @0.5:0.95 | 77.4 | #2 |
| | | | AP Easy | 83.8 | #2 |
| | | | AP Medium | 78.1 | #2 |
| | | | AP Hard | 69.3 | #2 |
| Multi-Person Pose Estimation | OCHuman | I²R-Net (1st stage: TransPose-H) | Validation AP | 67.8 | #2 |
| | | | AP50 | 85 | #2 |
| | | | AP75 | 72.8 | #2 |

