
Evaluating Bregman Divergences for Probability Learning from Crowd

Crowdsourcing scenarios are a good example of settings where each instance carries a probability distribution over categories, reflecting what people collectively think. Learning a predictive model of this probability distribution can be far more valuable than learning only a discriminative model that outputs the most likely category of the data. Here we present different models adapted to use a probability distribution as the target when training a machine learning model. We focus on the Bregman divergence framework as a source of objective functions to minimize. The results show that special care must be taken when building an objective function and when comparing seemingly equivalent optimizations of a neural network in the Keras framework.
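To make the objective-function framework concrete, the following is a minimal sketch (not the authors' implementation) of a generic Bregman divergence D_φ(p, q) = φ(p) − φ(q) − ⟨∇φ(q), p − q⟩ in NumPy. Choosing φ as the negative entropy recovers the Kullback–Leibler divergence, a standard loss for probability targets; the function names are illustrative.

```python
import numpy as np

def bregman_divergence(p, q, phi, grad_phi):
    """Generic Bregman divergence induced by a convex function phi.

    D_phi(p, q) = phi(p) - phi(q) - <grad_phi(q), p - q>
    """
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Negative entropy phi(p) = sum_i p_i log p_i and its gradient.
def neg_entropy(p):
    return np.sum(p * np.log(p))

def neg_entropy_grad(q):
    return np.log(q) + 1.0

# Two probability distributions over the same categories.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

# For phi = negative entropy, the Bregman divergence equals KL(p || q).
d_bregman = bregman_divergence(p, q, neg_entropy, neg_entropy_grad)
d_kl = np.sum(p * np.log(p / q))
print(d_bregman, d_kl)  # the two values agree
```

Other choices of φ give other members of the family, e.g. φ(p) = ‖p‖² yields the squared Euclidean distance, which is why the framework covers both cross-entropy-style and least-squares-style objectives.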
