| Trend | Dataset | Best Method | Paper title | Paper | Code | Compare |
|---|---|---|---|---|---|---|
It is claimed that deep neural networks generalize well because they learn not only how to map inputs to outputs but also how to compress information about the training inputs (Shwartz-Ziv & Tishby, 2017).
We examine how recently documented, fundamental phenomena in deep learning models subject to pruning are affected by changes in the pruning procedure.
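For context, one widely used pruning procedure is one-shot magnitude pruning, which zeroes the smallest-magnitude weights. A minimal NumPy sketch (the function name and API are illustrative, not from any specific paper):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest magnitude.

    Ties at the threshold may prune slightly more than the requested fraction.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only larger weights
    return weights * mask

w = np.array([[0.1, -0.5], [0.9, 0.05]])
print(magnitude_prune(w, 0.5))  # the two smallest-magnitude entries become 0
```

Variations in exactly this kind of procedure (when to prune, how to rank weights, whether to prune in one shot or iteratively) are what such studies compare.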
Artificial neural networks have achieved unprecedented success in a wide variety of domains, such as classifying, predicting, and recognizing objects.
The confusion matrix is an established way of visualizing these class errors, but it was not designed with temporal or comparative analysis in mind.
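As a reminder of what that visualization counts, a confusion matrix tallies how often each true class (rows) is predicted as each class (columns); off-diagonal entries are the class errors. A minimal sketch (helper name is illustrative):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows: true class. Columns: predicted class. Entry [i, j]:
    number of samples of class i predicted as class j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

y_true = [0, 0, 1, 1, 2]
y_pred = [0, 1, 1, 1, 2]
print(confusion_matrix(y_true, y_pred, 3))
# diagonal = correct predictions; cm[0, 1] = one class-0 sample mislabeled as 1
```

A single such matrix is a snapshot; comparing models or training epochs requires laying several of them side by side, which is the limitation the sentence above refers to.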
Today's deep neural networks require substantial computational resources for training, storage, and inference, which limits their effective use on resource-constrained devices.
Convolutional neural networks (CNNs) have delivered impressive achievements in computer vision and machine learning.