Sex-Prediction from Periocular Images across Multiple Sensors and Spectra

In this paper, we provide a comprehensive analysis of periocular-based sex-prediction (commonly referred to as gender classification) using state-of-the-art machine learning techniques. In order to reflect a more challenging scenario, where periocular images are likely to be obtained from an unknown source, i.e., sensor, convolutional neural networks are trained on fused sets composed of several near-infrared (NIR) and visible wavelength (VW) image databases. In a cross-sensor scenario within each spectrum, an average classification accuracy of approximately 85% is achieved. When sex-prediction is performed across spectra, an average classification accuracy of about 82% is obtained. Finally, multi-spectral sex-prediction yields a classification accuracy of 83% on average. Compared to previously proposed works, the obtained results provide a more realistic estimate of the feasibility of predicting a subject's sex from the periocular region.
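The abstract does not specify the network architecture, input resolution, or training setup used for the fused NIR/VW training sets. The following is a minimal, hedged sketch of a CNN-based binary sex classifier in PyTorch; the layer sizes, grayscale 120x120 inputs, single-logit output, and optimizer settings are all assumptions for illustration, not the authors' configuration.

```python
import torch
import torch.nn as nn

class PeriocularSexCNN(nn.Module):
    """Illustrative CNN for binary sex prediction from periocular crops.

    Assumption: architecture, input size (1x120x120) and single-logit
    output are placeholders, not taken from the paper.
    """
    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                 # 120 -> 60
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                 # 60 -> 30
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),         # global average pooling
        )
        self.classifier = nn.Linear(128, 1)  # single logit: female/male

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)


if __name__ == "__main__":
    # Toy batch standing in for a fused NIR+VW training set (labels 0/1).
    model = PeriocularSexCNN(in_channels=1)
    images = torch.randn(8, 1, 120, 120)
    labels = torch.randint(0, 2, (8, 1)).float()

    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    logits = model(images)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    print(f"toy training step, loss = {loss.item():.4f}")
```

In a cross-sensor or cross-spectral evaluation of this kind, the training batches would mix images from several sensors (or from one spectrum) while the test set comes from a sensor or spectrum not seen during training; the sketch above only shows the classifier and a single toy training step.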
