Regional dropout strategies have been proposed to enhance the performance of convolutional neural network classifiers.
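Cutout (DeVries & Taylor, 2017) is one well-known regional dropout strategy: it zeroes out a random square patch of the input image during training. A minimal NumPy sketch, assuming an H×W(×C) image array (the function name and patch-sampling details are illustrative, not taken from any specific paper's code):

```python
import numpy as np

def cutout(image, mask_size=8, rng=None):
    """Regional dropout a la Cutout: zero a random square patch."""
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    # Sample the patch centre anywhere in the image; the patch may be
    # partially clipped at the borders, as in the original method.
    cy, cx = rng.integers(0, h), rng.integers(0, w)
    y0, y1 = max(cy - mask_size // 2, 0), min(cy + mask_size // 2, h)
    x0, x1 = max(cx - mask_size // 2, 0), min(cx + mask_size // 2, w)
    out = image.copy()
    out[y0:y1, x0:x1] = 0
    return out
```

Applied per-sample at training time, this forces the classifier to rely on broader context rather than any single discriminative region.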
Modern neural networks are very powerful predictive models, but they are often incapable of recognizing when their predictions may be wrong.
We propose to reinterpret a standard discriminative classifier of p(y|x) as an energy based model for the joint distribution p(x, y).
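Under this reinterpretation, the classifier's logits define an unnormalised joint density, and the energy of an input is the negative log-sum-exp of its logits. A minimal sketch of that quantity, assuming `logits` is a batch of raw classifier outputs (the function name is ours):

```python
import numpy as np

def energy(logits):
    """E(x) = -logsumexp_y f(x)[y]; lower energy = higher unnormalised p(x)."""
    m = logits.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    return -(m.squeeze(-1) + np.log(np.exp(logits - m).sum(axis=-1)))
```

Note that softmax class probabilities are unchanged by this reading: p(y|x) still falls out of the same logits, while E(x) adds a density-like score over inputs.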
He et al. (2018) have called into question the utility of pre-training by showing that training from scratch can often yield similar performance to pre-training.
We introduce an out-of-distribution detector that determines whether the video features belong to a seen or unseen action category.
When neural networks process images that do not resemble the training distribution, so-called out-of-distribution images, they often make wrong predictions, and do so with excessive confidence.
We present an analysis of predictive-uncertainty-based out-of-distribution detection across different approaches to estimating models' epistemic uncertainty, and contrast it with open set recognition based on extreme value theory.
We present a new method for uncertainty estimation and out-of-distribution detection in neural networks with softmax output.
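A standard baseline in this space (not necessarily the method of the paper above) scores inputs by their maximum softmax probability, following Hendrycks & Gimpel (2017): in-distribution inputs tend to receive a high maximum probability, out-of-distribution inputs a lower one. A minimal sketch over raw logits:

```python
import numpy as np

def max_softmax_score(logits):
    """OOD score: a low maximum softmax probability flags a likely OOD input."""
    z = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return p.max(axis=-1)
```

Thresholding this score gives a simple detector; more elaborate methods (temperature scaling, energy scores, ensembles) are typically compared against it.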