Pose2Seg: Detection Free Human Instance Segmentation

The standard approach to image instance segmentation is to perform object detection first and then segment the object within the detected bounding box. More recently, deep learning methods such as Mask R-CNN perform the two tasks jointly. However, little research takes into account the uniqueness of the "human" category, which can be well defined by the pose skeleton. Moreover, the human pose skeleton can distinguish instances under heavy occlusion better than bounding boxes can. In this paper, we present a new pose-based instance segmentation framework for humans that separates instances based on human pose rather than on proposal-region detection. We demonstrate that our pose-based framework achieves better accuracy than the state-of-the-art detection-based approach on the human instance segmentation problem and, moreover, handles occlusion better. Furthermore, few public datasets contain many heavily occluded humans along with comprehensive annotations, which makes this a challenging problem seldom noticed by researchers. Therefore, in this paper we introduce a new benchmark, "Occluded Human (OCHuman)", which focuses on occluded humans and provides comprehensive annotations including bounding boxes, human poses, and instance masks. The dataset contains 8110 human instances, annotated in detail, within 4731 images. With an average MaxIoU of 0.67 per person, OCHuman is the most complex and challenging dataset related to human instance segmentation. Through this dataset, we want to emphasize occlusion as a challenging problem for researchers to study.
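The MaxIoU figure quoted above measures how heavily each person is occluded. A minimal sketch of how such a statistic can be computed is shown below, under the assumption (not spelled out in the abstract) that a person's MaxIoU is the maximum intersection-over-union between that person's bounding box and any other person's box in the same image; the function names here are illustrative, not from the paper's code.

```python
# Hedged sketch of a per-person occlusion statistic in the spirit of
# OCHuman's MaxIoU. Assumption: MaxIoU(person) = max IoU between that
# person's bounding box and any other person's box in the same image.

def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def max_iou_per_person(boxes):
    """For each box, the maximum IoU with any other box in the image
    (0.0 for a lone, unoccluded person)."""
    return [
        max((iou(box, other) for j, other in enumerate(boxes) if j != i),
            default=0.0)
        for i, box in enumerate(boxes)
    ]
```

Averaging `max_iou_per_person` over every annotated instance in a dataset yields a single occlusion score of the kind the abstract reports for OCHuman.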

PDF Abstract (CVPR 2019)



| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Human Instance Segmentation | OCHuman | Pose2Seg (plus ground-truth keypoints) | AP | 0.552 | #1 |
| Pose Estimation | OCHuman | Pose2Seg | Test AP | 23.8 | #13 |
| Keypoint Detection | OCHuman | Pose2Seg | Test AP | 23.8 | #9 |
| 2D Human Pose Estimation | OCHuman | Pose2Seg | Test AP | 23.8 | #10 |