Lets keep it simple, Using simple architectures to outperform deeper and more complex architectures

Major winning Convolutional Neural Networks (CNNs), such as AlexNet, VGGNet, ResNet, and GoogLeNet, contain tens to hundreds of millions of parameters, which impose considerable computation and memory overhead. This limits their practical use for training, optimization, and memory efficiency.


Results from the Paper


TASK | DATASET | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK
Image Classification | CIFAR-10 | SimpleNetv1 | Percentage correct | 95.51 | #50
Image Classification | CIFAR-100 | SimpleNetv1 | Percentage correct | 78.37 | #44
Image Classification | MNIST | SimpleNetv1 | Percentage error | 0.25 | #7

Methods used in the Paper


METHOD | TYPE
SGD | Stochastic Optimization
Weight Decay | Regularization
SimpleNet | Convolutional Neural Networks
1x1 Convolution | Convolutions
Batch Normalization | Normalization
Convolution | Convolutions
ReLU | Activation Functions
Max Pooling | Pooling Operations
Softmax | Output Functions
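
To make the table above concrete, here is a minimal PyTorch sketch of how these building blocks (3x3 and 1x1 convolutions, batch normalization, ReLU, max pooling, softmax, and SGD with weight decay) compose into a SimpleNet-style classifier. This is not the authors' reference implementation; the layer widths, depth, and hyperparameters are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, kernel_size=3):
    # The repeated unit: Convolution -> Batch Normalization -> ReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class SimpleNetSketch(nn.Module):
    # Illustrative SimpleNet-style stack; widths/depth are assumptions.
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(3, 64),
            conv_block(64, 128),
            nn.MaxPool2d(2),                      # downsample with max pooling
            conv_block(128, 128),
            conv_block(128, 128),
            nn.MaxPool2d(2),
            conv_block(128, 256),
            conv_block(256, 256),
            nn.MaxPool2d(2),
            conv_block(256, 512),
            conv_block(512, 256, kernel_size=1),  # 1x1 convolution to reduce channels
        )
        self.pool = nn.AdaptiveMaxPool2d(1)
        self.classifier = nn.Linear(256, num_classes)

    def forward(self, x):
        x = self.features(x)
        x = self.pool(x).flatten(1)
        return self.classifier(x)                 # logits; softmax is applied in the loss

model = SimpleNetSketch(num_classes=10)
# SGD with weight decay, as listed above; learning rate and momentum are placeholders.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
criterion = nn.CrossEntropyLoss()                 # softmax output stage + negative log-likelihood
```

The design choice the sketch reflects is the paper's central claim: a plain, homogeneous stack of small convolutions with batch normalization and occasional pooling, trained with standard SGD and weight decay, rather than a deeper or more elaborate topology.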