Unsupervised skin tissue segmentation for remote photoplethysmography

Segmentation is a critical step for many algorithms, especially for remote photoplethysmography (rPPG) applications, since only the skin surface carries the relevant information. Moreover, it has been shown that the rPPG signal is not distributed homogeneously across the skin. Most of the time, algorithms obtain their input from face detection, which relies on supervised learning of physical appearance, combined with skin-pixel selection. However, both methods have several limitations. In this paper, we propose a simple approach that implicitly selects skin tissues based on their distinctive pulsatility. The input video frames are decomposed into several temporal superpixels, from which pulse signals are extracted. A pulsatility measure computed for each temporal superpixel is then used to merge the pulse traces and estimate the photoplethysmogram signal. Since the most pulsatile signals carry the highest-quality information, the areas where this information predominates are favored. We evaluated our contribution on a new, publicly available dataset dedicated to the comparison of rPPG algorithms. The results of our experiments show that our method outperforms state-of-the-art algorithms without requiring any critical face or skin detection step.
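
To make the pulsatility-based merging concrete, below is a minimal Python sketch. It assumes that per-superpixel mean-color traces are already available, a 30 fps frame rate, a 0.7-3.0 Hz heart-rate band, and a simple in-band/out-of-band power ratio as the pulsatility measure; these choices are illustrative assumptions, not necessarily the ones used in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 30.0                 # assumed camera frame rate (Hz)
HR_BAND = (0.7, 3.0)      # assumed heart-rate band (42-180 bpm)

def bandpass(trace, fs=FS, band=HR_BAND, order=3):
    """Band-pass a raw superpixel trace to the heart-rate band."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, trace)

def pulsatility(trace, fs=FS, band=HR_BAND):
    """Illustrative pulsatility score: spectral power inside the HR band
    divided by power outside it (a simple SNR-like measure; the paper's
    actual measure may differ)."""
    freqs, psd = welch(trace, fs=fs, nperseg=min(len(trace), 256))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].sum() / (psd[~in_band].sum() + 1e-12)

def merge_superpixel_traces(traces):
    """Merge per-superpixel pulse traces into one PPG estimate, weighting
    each trace by its pulsatility so that skin regions dominate.

    traces: array of shape (n_superpixels, n_frames), e.g. the mean
    green-channel value of each temporal superpixel over time.
    """
    filtered = np.array([bandpass(t) for t in traces])
    weights = np.array([pulsatility(t) for t in filtered])
    weights /= weights.sum() + 1e-12
    return weights @ filtered    # weighted sum across superpixels

if __name__ == "__main__":
    # Toy demo: 20 "superpixels", half carrying a 1.2 Hz pulse plus noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 1 / FS)
    pulse = np.sin(2 * np.pi * 1.2 * t)
    traces = np.array([pulse + rng.normal(0, 0.5, t.size) if i < 10
                       else rng.normal(0, 0.5, t.size) for i in range(20)])
    ppg = merge_superpixel_traces(traces)
    print("estimated PPG trace shape:", ppg.shape)
```

In this sketch, non-skin superpixels receive low weights because their traces have little power in the heart-rate band, so the merged signal is implicitly dominated by pulsatile skin regions, which is the core idea described above.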


Datasets

Introduced in the Paper: UBFC-rPPG
