
# Image Classification

472 papers with code · Computer Vision

# LIP: Local Importance-based Pooling

12 Aug 2019 · sebgao/LIP

Spatial downsampling layers are favored in convolutional neural networks (CNNs) to downscale feature maps for larger receptive fields and less memory consumption.

47 ★
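The core idea — pooling weighted by a learned importance map rather than uniformly or by the max — can be sketched in one dimension. A minimal sketch, assuming the importance logits are given (in LIP they come from a small learned sub-network):

```python
import math

def lip_pool_1d(x, logits, window=2):
    """Local importance-based pooling (sketch): within each window,
    output a weighted average whose weights are exp(logit), i.e. a
    softmax over the window. The exp-weighting lets informative
    positions dominate; uniform logits recover average pooling."""
    out = []
    for i in range(0, len(x) - window + 1, window):
        w = [math.exp(l) for l in logits[i:i + window]]
        s = sum(w)
        out.append(sum(xi * wi / s for xi, wi in zip(x[i:i + window], w)))
    return out

# Uniform importance reduces to average pooling:
print(lip_pool_1d([1.0, 3.0, 2.0, 6.0], [0.0, 0.0, 0.0, 0.0]))  # [2.0, 4.0]
# A large logit on the second element pushes the first window toward 3.0:
print(lip_pool_1d([1.0, 3.0, 2.0, 6.0], [0.0, 10.0, 0.0, 0.0]))
```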

# AutoGAN: Neural Architecture Search for Generative Adversarial Networks

11 Aug 2019 · TAMU-VITA/AutoGAN

Neural architecture search (NAS) has witnessed prevailing success in image classification and (very recently) segmentation tasks.

100 ★

# On the Variance of the Adaptive Learning Rate and Beyond

8 Aug 2019 · namisan/mt-dnn

The learning rate warmup heuristic achieves remarkable success in stabilizing training, accelerating convergence and improving generalization for adaptive stochastic optimization algorithms like RMSprop and Adam.

943 ★
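The warmup heuristic the paper analyses is simple to state: ramp the learning rate up linearly before letting the optimizer run at full rate. A minimal sketch (schedule shape and hyperparameters are illustrative, not the paper's):

```python
def warmup_lr(step, base_lr=1e-3, warmup_steps=1000):
    """Linear learning-rate warmup (sketch): ramp from ~0 to base_lr over
    warmup_steps, then hold. Warmup tames the high early-training variance
    of adaptive optimizers' second-moment estimates -- the effect this
    paper analyses and replaces with a rectification term (RAdam)."""
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    return base_lr

print(warmup_lr(0))     # 1e-06
print(warmup_lr(499))   # 0.0005
print(warmup_lr(5000))  # 0.001
```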

# Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning

8 Aug 2019 · EricArazo/PseudoLabeling

Semi-supervised learning, i.e., jointly learning from labeled and unlabeled samples, is an active research topic due to its key role in relaxing human annotation constraints.

5 ★
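The pseudo-labeling setting can be sketched as: predict on unlabeled data, then keep confident predictions as training targets. A minimal confidence-threshold sketch (the threshold and hard-label selection are illustrative; the paper studies how this naive loop induces confirmation bias and how to mitigate it):

```python
def pseudo_label(probs, threshold=0.95):
    """Basic pseudo-labeling (sketch): for each unlabeled sample's
    predicted class distribution, keep the argmax as a hard label only
    if the model is confident enough; return (index, label) pairs.
    Naive thresholding like this can reinforce wrong-but-confident
    predictions -- the confirmation bias the paper addresses."""
    selected = []
    for i, p in enumerate(probs):
        conf = max(p)
        if conf >= threshold:
            selected.append((i, p.index(conf)))
    return selected

preds = [[0.98, 0.01, 0.01],   # confident -> pseudo-label 0
         [0.40, 0.35, 0.25],   # uncertain -> skipped
         [0.02, 0.02, 0.96]]   # confident -> pseudo-label 2
print(pseudo_label(preds))     # [(0, 0), (2, 2)]
```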

In addition to this dataset, we disseminate an additional real-world handwritten dataset (with $10k$ images), which we term the Dig-MNIST dataset, that can serve as an out-of-domain test set.

31 ★ · 03 Aug 2019

# Quality Assessment of In-the-Wild Videos

1 Aug 2019 · lidq92/VSFA

We propose an objective no-reference video quality assessment method by integrating both effects into a deep neural network.

8 ★

# Compact Global Descriptor for Neural Networks

23 Jul 2019 · HolmesShuan/Compact-Global-Descriptor

Long-range dependencies modeling, widely used in capturing spatiotemporal correlation, has shown to be effective in CNN dominated computer vision tasks.

10 ★
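As a rough illustration of the cheap-global-context idea (this is a squeeze-style channel reweighting, not the paper's exact compact global descriptor), global statistics can gate each channel at far lower cost than full pairwise non-local attention:

```python
import math

def channel_reweight(feats):
    """Global-descriptor-style channel attention (sketch): squeeze each
    channel to a scalar by global average pooling, gate it with a
    sigmoid, and rescale the channel. One scalar per channel captures
    cross-position statistics at negligible cost compared to computing
    attention between all spatial positions."""
    gates = []
    for ch in feats:                     # feats: list of channels, each a list of activations
        mean = sum(ch) / len(ch)         # squeeze: global average per channel
        gates.append(1.0 / (1.0 + math.exp(-mean)))  # sigmoid gate in (0, 1)
    return [[v * g for v in ch] for ch, g in zip(feats, gates)]

x = [[1.0, 1.0], [-4.0, 0.0]]
print(channel_reweight(x))  # channel 1 kept mostly intact, channel 2 suppressed
```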

# MixConv: Mixed Depthwise Convolutional Kernels

22 Jul 2019 · tensorflow/tpu

In this paper, we systematically study the impact of different kernel sizes, and observe that combining the benefits of multiple kernel sizes can lead to better accuracy and efficiency.

2,313 ★
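The idea — partition the channels and give each group its own depthwise kernel size, so one layer mixes small kernels (fine detail) with large ones (context) — can be sketched in 1-D, with fixed averaging kernels standing in for learned weights:

```python
def depthwise_conv1d(x, k):
    """'Same'-padded 1-D convolution with a uniform averaging kernel of size k
    (a stand-in for a learned depthwise kernel)."""
    pad = k // 2
    xp = [0.0] * pad + x + [0.0] * pad
    return [sum(xp[i:i + k]) / k for i in range(len(x))]

def mixconv1d(channels, kernel_sizes=(3, 5)):
    """MixConv-style mixed depthwise convolution (sketch): split the
    channels into equal groups and apply a different kernel size per
    group, so a single layer sees multiple receptive-field sizes."""
    out = []
    per_group = len(channels) // len(kernel_sizes)
    for g, k in enumerate(kernel_sizes):
        for ch in channels[g * per_group:(g + 1) * per_group]:
            out.append(depthwise_conv1d(ch, k))
    return out

chs = [[1.0, 2.0, 3.0, 4.0]] * 2   # two channels: first group gets k=3, second k=5
print(mixconv1d(chs))
```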

# Lookahead Optimizer: k steps forward, 1 step back

The vast majority of successful deep neural networks are trained using variants of stochastic gradient descent (SGD) algorithms.

123 ★ · 19 Jul 2019
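The title summarizes the algorithm: an inner "fast" optimizer takes k steps, then the slow weights interpolate a fraction of the way toward the result and the fast weights restart from there. A minimal 1-D sketch with plain gradient descent as the inner optimizer (hyperparameters are illustrative):

```python
def lookahead(fast_step, w0, k=5, alpha=0.5, outer_steps=2):
    """Lookahead optimizer (sketch, scalar weights): run the inner
    optimizer for k fast steps, then move the slow weights a fraction
    alpha toward the fast weights ('k steps forward, 1 step back') and
    reset the fast weights to the slow weights. fast_step is any
    w -> w update rule supplied by the caller."""
    slow = w0
    for _ in range(outer_steps):
        fast = slow
        for _ in range(k):
            fast = fast_step(fast)        # k fast steps forward
        slow = slow + alpha * (fast - slow)  # one slow interpolation step back
    return slow

# Inner optimizer: gradient descent on f(w) = w^2 (gradient 2w, lr 0.1).
sgd = lambda w: w - 0.1 * 2 * w
print(lookahead(sgd, w0=1.0))
```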