Improving Deep Regression with Ordinal Entropy

21 Jan 2023 · Shihao Zhang, Linlin Yang, Michael Bi Mi, Xiaoxu Zheng, Angela Yao

In computer vision, it is often observed that reformulating a regression problem as a classification task yields better performance. We investigate this curious phenomenon and provide a derivation showing that classification, trained with the cross-entropy loss, outperforms regression trained with a mean squared error loss in its ability to learn high-entropy feature representations. Based on this analysis, we propose an ordinal entropy loss that encourages a higher-entropy feature space while preserving ordinal relationships, thereby improving the performance of regression tasks. Experiments on synthetic and real-world regression tasks demonstrate the importance and benefits of increasing entropy for regression.
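To make the idea concrete, below is a minimal PyTorch sketch of an ordinal-entropy-style regularizer in the spirit of the abstract: it spreads features apart (raising feature-space entropy) while weighting pairs by their target distance so the layout stays ordinal. The function name `ordinal_entropy_loss`, the diversity/tightness split, and the 0.1 trade-off weight are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def ordinal_entropy_loss(features, targets, eps=1e-8):
    """Regularizer that spreads features apart (higher entropy) while
    keeping pairwise feature distances ordered like target distances.

    features: (N, D) tensor of backbone features for a batch
    targets:  (N,) tensor of continuous regression targets
    """
    z = F.normalize(features, dim=1)              # unit-norm features
    feat_dist = torch.cdist(z, z, p=2)            # (N, N) pairwise feature distances
    tgt_dist = torch.cdist(targets.view(-1, 1), targets.view(-1, 1), p=2)
    tgt_dist = tgt_dist / (tgt_dist.max() + eps)  # normalize target gaps to [0, 1]

    # Diversity term: push pairs apart, weighting pairs with distant targets
    # more heavily, so entropy increases without scrambling ordinal structure.
    diversity = -(tgt_dist * feat_dist).sum() / (tgt_dist.sum() + eps)

    # Tightness term: pull pairs with (near-)identical targets together.
    tightness = ((1.0 - tgt_dist) * feat_dist).sum() / ((1.0 - tgt_dist).sum() + eps)
    return diversity + tightness

# Illustrative usage: add the regularizer to the main regression loss.
# loss = F.mse_loss(pred, targets) + 0.1 * ordinal_entropy_loss(features, targets)
```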


Results from the Paper


Task                        Dataset         Model           Metric                   Value   Global Rank
Monocular Depth Estimation  NYU-Depth V2    OrdinalEntropy  RMSE                     0.321   #25
                                                            absolute relative error  0.089   #25
                                                            Delta < 1.25             0.932   #25
                                                            log10                    0.039   #26
Crowd Counting              ShanghaiTech A  OrdinalEntropy  MAE                      65.6    #15
                                                            MSE                      105.0   #7
Crowd Counting              ShanghaiTech B  OrdinalEntropy  MAE                      9.1     #15
                                                            MSE                      14.5    #3
