# Learning Expectation of Label Distribution for Facial Age and Attractiveness Estimation

3 Jul 2020

Facial attribute (e.g., age and attractiveness) estimation performance has been greatly improved by convolutional neural networks. However, existing methods have an inconsistency between the training objectives and the evaluation metric, so they may be suboptimal. In addition, these methods typically adopt image classification or face recognition models with a large number of parameters, which incur expensive computation and storage overhead. In this paper, we first analyze the essential relationship between two state-of-the-art methods (Ranking-CNN and DLDL) and show that the ranking method is in fact learning a label distribution implicitly. This result thus unifies two existing popular state-of-the-art methods into the DLDL framework. Second, in order to alleviate the inconsistency and reduce resource consumption, we design a lightweight network architecture and propose a unified framework which can jointly learn the facial attribute distribution and regress the attribute value. The effectiveness of our approach is demonstrated on both facial age and attractiveness estimation tasks. Our method achieves new state-of-the-art results using a single model with 36$\times$ (6$\times$) fewer parameters and 2.6$\times$ (2.1$\times$) faster inference speed on facial age (attractiveness) estimation. Moreover, our method achieves results comparable to the state of the art even when the number of parameters is further reduced to 0.9M (3.8MB disk storage).
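The joint learning idea described above can be sketched in a few lines: the network is trained to match a target label distribution (here assumed to be a discretized Gaussian centered at the ground-truth value, a common choice in label distribution learning) while the final prediction is the expectation of the predicted distribution, so the training objective and the evaluation metric (MAE on the expected value) are aligned. The following NumPy sketch uses illustrative function names and an assumed equal weighting of the two loss terms; it is not the paper's code.

```python
import numpy as np

def label_distribution(y, labels, sigma=2.0):
    """Discretized Gaussian label distribution centered at the true value y."""
    d = np.exp(-(labels - y) ** 2 / (2.0 * sigma ** 2))
    return d / d.sum()

def expectation(p, labels):
    """Predicted attribute value: expectation of the label distribution."""
    return float(np.dot(p, labels))

def joint_loss(p, y, labels, lam=1.0, eps=1e-12):
    """KL divergence to the target distribution plus L1 error on the expectation."""
    q = label_distribution(y, labels)
    kl = float(np.sum(q * (np.log(q + eps) - np.log(p + eps))))
    l1 = abs(expectation(p, labels) - y)
    return kl + lam * l1
```

A perfectly predicted distribution drives both terms to zero, and because the regressed value is just the expectation of the learned distribution, no separate regression head is needed at inference time.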


## Results from the Paper

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Attractiveness Estimation | CFD | DLDL-v2 (ThinAttNet) | MAE | 0.364 | #1 |
| Age Estimation | ChaLearn 2015 | DLDL-v2 (ThinAgeNet) | MAE | 3.135 | #1 |
| Age Estimation | ChaLearn 2016 | DLDL-v2 (ThinAgeNet) | MAE | 3.452 | #1 |
| Age Estimation | MORPH Album2 | DLDL-v2 (ThinAgeNet) | MAE | 1.969 | #1 |
| Attractiveness Estimation | SCUT-FBP | DLDL-v2 (ThinAttNet) | MAE | 0.212 | #1 |